More being read into surveys than justified

POLLS have played a conspicuous part in monitoring opinion over the past 15 years. They've also made increasingly reliable contributions to election campaigns. But, from a professional point of view, things are not what they ought to be.

Much more is being read into survey results than the figures justify. Misleading headlines and a tendency towards the melodramatic need to be addressed in the interests of the market research industry - and the public.

Ideally, an interpretation of findings should be based on adequate knowledge of a survey's objectives, methodology, characteristics and limitations. Personal experience of the task should make it less likely that readers, viewers or listeners are misled to the detriment of the market research industry - and the media.

A poll published in the national media is the most visible example of market research in action; and the Irish companies that have contributed most to the industry's credibility have notably displayed high sociological and statistical standards and avoided misinterpretation. Figures should not be stretched to mean more than they do.

National opinion polls cover a statistical sample of the electorate. The 1,000 (or more) who are interviewed by questionnaire reflect its composition, as broken down by age, sex, socio-economic categories and regional distribution.

The attitudes and opinions expressed are those of the 1,000 but cannot be assumed to represent the views of the electorate at large, which now amounts to 2.6 million. What is known, however, is the extent to which each figure in the survey is likely to vary from the figure that would have been obtained if all 2.6 million had been interviewed.

Measured by a formula familiar to all professional market researchers, in a sample of 1,000 that margin is 3.16 percentage points, usually expressed as plus or minus three per cent. (It was for this reason that I lately described all published national polls this year as broadly consistent. Purists - and textbooks - refer to sample survey figures as estimates, which is what they are.)
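
For readers who want to see the arithmetic, the sketch below reproduces that rule of thumb - assuming, as is conventional, the worst-case split of 50 per cent and roughly two standard errors (about a 95 per cent confidence level):

```python
import math

def margin_of_error(sample_size: int, proportion: float = 0.5, z: float = 2.0) -> float:
    """Approximate margin of error, in percentage points, for a simple random sample.

    Assumes the conventional worst-case split (50 per cent) and roughly two
    standard errors, i.e. about a 95 per cent confidence level.
    """
    return z * math.sqrt(proportion * (1 - proportion) / sample_size) * 100

print(round(margin_of_error(1000), 2))  # 3.16 - usually quoted as "plus or minus three per cent"
```

Because the sample size sits under a square root, quadrupling the sample to 4,000 would only halve that margin.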

In all Irish Times/MRBI Polls over the past 15 years, the question has been:

"If a General Election was held today, to which party would you give your first preference vote?"

What's being measured is first preference voting intentions on the day of the survey and, though the question may be hypothetical, the respondents are in a position to answer factually. Only when the survey is close to an election should the figures be taken as foreshadowing the result.

As Table A shows, in all general elections since 1973, the final MRBI Poll has accurately mirrored the first preference outcome.

Some commentators of late have tended to predict seats won or lost in certain regions on the strength of national opinion poll data. This can be very misleading for these reasons:

First, the poll question is about first preference voting intentions and nothing else; second, the relationship between first preference votes and Dail seats is not consistent - it changes from party to party and from election to election. A few examples illustrate the point. In 1987 and 1989, Fianna Fail received precisely the same share of first preference votes - 44 per cent - but in 1987 the party won 81 seats and in 1989 dropped to 77. In November 1982, the contrast was even greater: 45 per cent of the votes;

SINCE 1987, Fine Gael's conversion ratio has been slightly more consistent. But with Labour and the Progressive Democrats the inconsistency evident in the last three elections has been remarkable. The PDs' achievement of 10 seats from 4.68 per cent of the votes in 1992 was the highest conversion ratio since the introduction of a 166-seat Dail in 1981.
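
To make the inconsistency concrete, the sketch below works out a crude "conversion ratio" - defined here, purely for illustration, as share of Dail seats divided by share of first preference votes - using only the figures already quoted:

```python
# Figures quoted above; the "conversion ratio" is defined here, for
# illustration only, as share of Dail seats divided by share of votes.
DAIL_SEATS = 166

results = {
    "Fianna Fail 1987": (44.0, 81),
    "Fianna Fail 1989": (44.0, 77),
    "Progressive Democrats 1992": (4.68, 10),
}

for label, (vote_pct, seats) in results.items():
    seat_pct = seats / DAIL_SEATS * 100
    print(f"{label}: {vote_pct}% of votes -> {seats} seats "
          f"({seat_pct:.1f}% of seats, ratio {seat_pct / vote_pct:.2f})")
```

On that rough measure the same 44 per cent of votes converts into quite different seat totals, and the PDs' 1992 ratio stands out - which is the point: a national first preference figure alone cannot tell you how many seats will follow.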

Since 1973, MRBI has used the same procedure to exclude the undecided; and the fact that the net figures in the final polls were close to (or identical with) the first preference outcome would appear to justify the exercise.

The calculation is made on the assumption that those among the undecided who eventually vote follow the patterns set by the rest of the electorate. A senior Minister has lately criticised this methodology, without suggesting an alternative. But procedural consistency is essential in monitoring and it would be unwise to change.
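
In practical terms, that assumption amounts to reallocating the undecided in proportion to the decided respondents - or, equivalently, renormalising the decided shares so that they sum to 100 per cent. The sketch below illustrates this interpretation; the party figures in it are invented placeholders, not results from any MRBI poll:

```python
def exclude_undecided(raw_shares: dict[str, float]) -> dict[str, float]:
    """Renormalise the decided respondents' shares so they sum to 100 per cent.

    This mirrors the stated assumption that undecided voters who eventually
    vote split in the same proportions as the decided respondents.
    """
    decided = {party: share for party, share in raw_shares.items()
               if party != "Undecided"}
    decided_total = sum(decided.values())
    return {party: share / decided_total * 100 for party, share in decided.items()}

# Hypothetical illustration only - these are not figures from any MRBI poll.
raw = {"Party A": 36.0, "Party B": 24.0, "Party C": 12.0,
       "Others": 8.0, "Undecided": 20.0}
for party, share in exclude_undecided(raw).items():
    print(f"{party}: {share:.1f}%")   # e.g. Party A rises from 36% to 45%
```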

However, in the latest Irish Times/MRBI Poll an attempt was made - for information only - to discover whether the undecided were leaning towards a particular party or candidate. The result (as in Table B) showed that the undecided in the May 5th poll tended towards parties other than FF and FG; Labour, Independents and others benefited.

The ESOMAR (European) Code of Practice, which has been adopted by the Marketing Society of Ireland, states that when an opinion poll is published in the national media it should always be accompanied by the following information:

(a) the name of the research organisation carrying out the survey;
(b) the universe effectively represented (who was interviewed?);
(c) the achieved sample size and its geographical coverage;
(d) the dates of fieldwork;
(e) the sampling method used;
(f) the method by which the information was collected (personal or telephone interview, etc);
(g) the relevant questions asked, unless already familiar to the reader or audience.
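
Those seven requirements amount to a simple checklist that any publication could apply before printing a poll. The sketch below expresses the list in that form; the field names are my own shorthand, not official ESOMAR terminology:

```python
# The seven disclosure items above, expressed as a simple checklist.
# Field names are my own shorthand, not official ESOMAR terminology.
REQUIRED_DISCLOSURES = (
    "research_organisation",     # (a) who carried out the survey
    "universe_represented",      # (b) who was interviewed
    "sample_size_and_coverage",  # (c) achieved sample size and geographical coverage
    "fieldwork_dates",           # (d) when the interviewing took place
    "sampling_method",           # (e) how respondents were selected
    "collection_method",         # (f) personal or telephone interview, etc.
    "questions_asked",           # (g) unless already familiar to the audience
)

def missing_disclosures(report: dict) -> list[str]:
    """Return any required items absent from a published poll report."""
    return [item for item in REQUIRED_DISCLOSURES if not report.get(item)]

# A report naming only the research company and the universe would fail on five counts.
print(missing_disclosures({"research_organisation": "MRBI",
                           "universe_represented": "the electorate"}))
```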