Friday, September 23, 2011

Finding out what others think, by Jolito Ortizo Padilla

What are opinion polls? That is a question I have thought long and hard about. Public opinion, I finally decided, could be defined as "the collective view of a defined population". So in seven words I tried to encapsulate the fine nuances weighed by the editors of the Oxford Dictionary, who took 842 words! In its essence, public opinion polling (and market research, which uses the same techniques) can be defined as "the collective view of a (sample of a) defined population".

PUBLIC OPINION
Public opinion is important. Why do I think this? Because the public does, and, even more important, acts on its beliefs. When I first got into this business of market research I asked a sample of Filipino adults the extent to which they agreed or disagreed that "a company that has a good reputation would not sell poor products", and was astonished to have three people in four say they agreed. Now that obviously isn't entirely true; all smart companies do test marketing.

Even more astonishing, over a third, 37%, said "I never buy products made by companies I've never heard of". Of course that was nonsense, but that is what they perceived!

So it's important to have good products and services; it's important to be price/quality competitive in the marketplace, whether running a fruit and vegetable stall or a bank. And it is important to know what your customers (and prospective customers) think.

MARKET RESEARCH
That's where market research comes in. I describe market research as the "marriage of the art of asking questions and the science of sampling". It's a very simple business; all you have to do is ask the right questions, of the right sample, and add up the figures correctly.

THE ART OF ASKING QUESTIONS
I have a favorite question to give students to critique which, in just a few words, breaks all five rules of good question construction: "Are you in favor of direct retaliatory action against Franco's piracy?"
1. Ask a balanced question: do you favor or oppose...?
2. Define your terms: one man's direct retaliatory action is a punch on the nose; another's is nuclear bombs.
3. Use language in common usage: "retaliatory" would likely be misunderstood by many people.
4. Explain who's who: I wonder how many didn't know who Franco was.
5. Don't use pejoratives: "piracy"? What was surprising was that 22% were against taking action even with a loaded question like that one.

HERE ARE SOME OTHER TIPS ON WHAT TO LOOK FOR IN QUESTIONS:
Are they clear? Read the question aloud. If you've forgotten the point by the time you get to the end, or if you stumble over the words, chances are others will as well.

Does the question ask for a dual response? We call these "double-barrelled questions" (or even "triple"). All too often a question will be drafted in such a way that a respondent can properly respond with two, or even three or more, answers. For instance, "Is the staff friendly and efficient?" really asks two questions in one.

Is the question precise? A good survey question says precisely what object the item refers to, leaving no room for ambiguity. Here's an example of a problem question sometimes asked by researchers: "When did you buy your watch?" The question is incomplete in that it fails to tell the respondent which watch, if the respondent has more than one; and it may be that the watch in question was a gift, not bought at all.

Is the time defined? The period to which the question relates is crucial to the respondent's answer. Time is a difficult concept for many people. For example: "Did you go abroad last year?" As well as being imprecise as to whether the trip was for business or holiday, the question leaves "last year" undefined: does it refer to the previous calendar year, the twelve months before the question was asked, or even, for those at school or with children at school, the school year?

Is the question loaded? Reputable market researchers have too much at stake in their work to be caught intentionally biasing a question. Special interest groups, however, sometimes have a vested interest in loading a question to get a certain result.

Does the question assume knowledge on the part of the respondent which they might not have? Another common error in survey questions is assuming the respondents know something about the subject of the question, with a resulting distortion of the extent and direction of public opinion. Questions about opinion towards advertising provide an excellent example. For instance: "Are you more or less likely to buy the watch as a result of seeing the ad?" Those who had no intention of buying any watch would likely say "less", but this would have nothing to do with whether they saw the ad or not. Any question asking "more or less" can fall into this trap, yet you frequently see polls that ask questions which empirical research has shown will give nonsense answers.

Does the question ask for a comparison that is meaningful to the respondent? If a question asks a respondent to compare something they know with something unknown, the resulting answer will be meaningless. For instance: "Do you think that the price you are being asked to pay (for this or that TV) is fair in light of what other TVs are costing?" This question assumes a lot: not only that they know the level of their own local prices, but that they have a basis of comparison that is meaningful. It may well be that the respondent has a view on this related to what they paid last year for their television, but has no idea of how it compares with other TVs on the market now.

Is the question obscured by asking about a very complicated behavior in simplistic terms? For example: "Where do you usually get most of your news about what's going on in the world today: from newspapers or radio or television or talking to people or where?" Without knowing what type of news, it's hard to argue that this question has much meaning. A modern professional might get fashion news from magazines and business news from the Financial Times and The Economist, while her main source of national news is watching television.

Does the question use a balanced scale? Years ago the market research manager of the Singapore Post Office said to me that seven out of ten people in Singapore were satisfied with the postal service. When I saw the questionnaire, I saw why. He'd asked: "Are you very satisfied, satisfied, or dissatisfied?" With two favorable options against one unfavorable, the scale itself tilted the result. It may have made his boss happy, but it certainly gave them bad research.

Is it a "yes/no" question asking for an attitude? We virtually never ask "yes/no" questions other than on factual or behavioral matters, such as "Do you normally wear glasses for reading?" In general, any question that has a yes/no answer is likely to inflate the favorable response to an item, regardless of whether the question is loaded. More subtle forms of loading involve prestige attachment or social desirability, which give some respondents a tendency toward "yea-saying".

Conscientious researchers test questions by trying them out on their colleagues first, then on members of the public in a pilot test. They ask respondents the question, listen carefully for the response or any queries arising, and then ask what their understanding of the question was.

THE SCIENCE OF SAMPLING
The street corner survey has been the basis of thousands of research reports. Typically it uses haphazard sampling, which may even, erroneously, be called "random". The lay person uses "random" to mean haphazard; the statistician uses it with precision to mean "having an equal probability of selection".
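To make the statistician's definition concrete, here is a minimal Python sketch. The population, the sample size of 1,000, and the 37% agreement figure are assumptions for illustration, not from any real survey; the point is only that a simple random sample gives every member of the defined population an equal chance of selection, and that the sample size sets the margin of error on any percentage reported:

    import math
    import random

    # The defined population: every member must be listable so that each
    # has an equal probability of selection. (Illustrative names only.)
    population = [f"adult_{i}" for i in range(100_000)]
    n = 1_000

    # "Random" in the statistician's sense: each member equally likely.
    sample = random.sample(population, n)

    # Suppose 37% of the sample agree with a statement; the usual 95%
    # margin of error for a proportion from a simple random sample is:
    p = 0.37
    margin = 1.96 * math.sqrt(p * (1 - p) / n)
    print(f"{p:.0%} +/- {margin:.1%}")  # about 37% +/- 3.0%

A street corner sample offers no such guarantee: whoever happens to pass by selects themselves, and no formula can tell you how far wrong the result may be.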

Another important factor in survey sampling is to be certain that the precise community, group, or class being talked about in the results is carefully defined.

Internet surveys are becoming more popular, but the people who can be reached via the internet are not "representative" of the population. They are, as a generalization, more middle class, more middle aged, and more educated. If a researcher presents data from a telephone or an internet survey, good questions to ask are:
"What steps were taken to deal with unlisted telephones, multiple email addresses, mobile phones, engaged or "dud" email addresses, etc?"
"What is the bias inherent in leaving people who are not on the telephone/internet out of the sample?"
"What was the refusal rate, by subgroup, of the sample?"
"How was the effect of differential refusal dealt with?"
"Were the questions suitable for asking over the telephone/internet?"
" How many questions were asked and is there an "order" bias in questions asked earlier to influence the result of questions asked later in the questionnaire, (sometimes called "position bias")

CONCLUSION
Survey research is widely misunderstood. It can provide understanding, analysis and tracking of the behavior, knowledge, opinions, attitudes and values of the public. By measuring these, within the limits of the science of sampling and the art of asking questions, surveys can determine what people do and what they think.

