This Dec. 11, 2012 photo shows a 1947 survey for the Gallup Poll. The document is part of the personal papers of the late George Gallup Sr., founder of the Gallup Poll, that have been donated to the University of Iowa, his alma mater, providing the public with an intimate glimpse into the development of nationwide public opinion polling. (Ryan J. Foley/AP)

If Mark Twain were a public opinion pollster today, here’s how he might sum up Americans and their grasp of the facts: “It’s not only what they don’t know that gets ’em into trouble, it’s what they know for sure that just ain’t so.”

“Democracy and Political Ignorance” and “Just How Stupid Are We?” are two recent books whose titles hint at the scope of the problem. Quizzing Americans about their country, past and present, has become a gold mine for TV’s late-night comics. (Jay Leno to woman on the street: “Name the ship the Pilgrims sailed to America on. The ship has the same name as a famous moving company.” Answer: “U-Haul?”)

The laughs are good, but ultimately, the joke’s on us. To cite one example: most Americans think foreign aid makes up 10 percent of the U.S. federal budget (it’s actually about 1 percent), and don’t know that defense is the single largest budget item (about 20 percent). That suggests Americans have a badly skewed impression of the nature and extent of their country’s generosity, priorities, and international image and comportment.

Closer to home, consider a 2011 nationwide study by Michael Norton of Harvard Business School and Dan Ariely of Duke University (based on income analysis by Edward Wolff of New York University). Asked to estimate their society’s wealth distribution, a broad spectrum of 5,000 Americans believed, on average, that the richest 20 percent owned 59 percent of the wealth, when in reality the top quintile owns 84 percent. Respondents also believed that the poorest 40 percent controlled 10 percent of the wealth, when in fact the poorest control close to zero percent.

Then the Norton/Ariely survey did something interesting: after respondents had given their answers, they were told the correct numbers and asked what they considered the ideal wealth distribution. Americans of all stripes and persuasions said that the richest 20 percent should have only 32 percent of the wealth, a scenario that in the real world most resembles Sweden, the socialist bogeyman of so many American nightmares. This raises the question: When we are so uninformed about the realities of our society, how can fair and effective public policy be made?

To that end, and building on Norton/Ariely’s follow-up questioning, here’s a modest proposal. Why not add a simple feature to opinion polls so that they educate us as well as hold up a mirror to what we believe (no matter how much of what we believe just ain’t so)? Respondents could be asked a few factual questions about the public policy issue (e.g., environment, immigration, health care) about which they are being polled. Those with a perfect or high score on this quiz could be dubbed the “Smart Set,” and their responses to the poll’s questions highlighted as a sidebar to the poll’s overall results. Then I, as a citizen, could compare my opinions to those of the respondents who are demonstrably knowledgeable about the issue. If we differed on some aspect, I might say to myself, “Gee, maybe I should learn more about that question; maybe I don’t know all the facts.”
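
To make the proposal concrete, here is a minimal sketch of how such a “Smart Set” sidebar could be tallied. Everything in it — the quiz questions, the answer key, the scoring threshold, and the sample respondents — is invented for illustration, not drawn from any real poll:

```python
from collections import Counter

# Hypothetical answer key for the factual quiz attached to the poll.
QUIZ_KEY = {"q1": "B", "q2": "A", "q3": "C"}

def quiz_score(answers):
    """Fraction of factual quiz questions a respondent answered correctly."""
    return sum(answers.get(q) == correct for q, correct in QUIZ_KEY.items()) / len(QUIZ_KEY)

def smart_set_sidebar(respondents, threshold=1.0):
    """Tally poll opinions twice: once for everyone, and once for the
    'Smart Set' (respondents at or above the quiz-score threshold)."""
    overall = Counter(r["opinion"] for r in respondents)
    smart = Counter(r["opinion"] for r in respondents
                    if quiz_score(r["quiz"]) >= threshold)
    return overall, smart

# Invented sample: three respondents with quiz answers and a poll opinion.
respondents = [
    {"quiz": {"q1": "B", "q2": "A", "q3": "C"}, "opinion": "support"},  # 3/3 correct
    {"quiz": {"q1": "B", "q2": "A", "q3": "B"}, "opinion": "oppose"},   # 2/3 correct
    {"quiz": {"q1": "A", "q2": "A", "q3": "A"}, "opinion": "support"},  # 1/3 correct
]

overall, smart = smart_set_sidebar(respondents)
print(overall)  # Counter({'support': 2, 'oppose': 1})
print(smart)    # Counter({'support': 1})
```

The point of the sketch is only that the sidebar requires no new polling machinery: the same questionnaire yields both the overall result and the well-informed subset, which can then be reported side by side.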

Traditional polling might even have a deleterious effect. Because we credit “the wisdom of crowds,” because opinion polling is presented as scientifically conducted, and because poll results are legitimized by respectful treatment in the media, polls lend unwarranted authority to what are only people’s opinions. Opinions, after all, are merely attitudes and beliefs, often formed on the basis of inadequate or incorrect information.

For the many academic and nonprofit institutions engaged in polling, adding a “smart set” to their surveys would support their mission to educate. In a crowded field of pollsters and polls, a “smart poll” would also create a competitive advantage by drawing heightened interest to its results. Such polling could not only plumb opinion but could also act as a tool to spur the citizenry to become curious and better informed, thus leading to sounder thinking about issues facing society.

Our opinions shape our votes, and it’s our votes — and the demands we make on our legislators — that truly shape our economy, our country, and our democracy. If polling were “smart,” it might just help us fashion a better society, more like the one that prompted those brave folks aboard the original U-Haul to move to these shores in the first place.

The views and opinions expressed in this piece are solely those of the writer and do not in any way reflect the views of WBUR management or its employees.

  • giel

    Completely agree. Any effort to educate the public (including me) should be valued and every opportunity seized.

  • Gregg

    Shouldn’t this post be titled “The Sad State of Education”? A poorly educated public is not a failing of pollsters, scientific or otherwise. Given that a majority of Americans are ignorant on a wide variety of important issues, understanding their views is all the more important. I gather your labeling idea was suggested with tongue in cheek, but naming the informed “The smart set” would not enhance the credibility of pollsters.

  • Jim

    Wait, what kind of polling are we talking about here? Proposing the addition of a “simple add-on to opinion polls so that they might educate us”?

    There’s a name for that, already, I think. It’s called the “Push Poll”. And it is typically used to manipulate opinion, not measure it. True and honest pollsters stick to measuring, and avoid trying to manipulate or “educate”.

    • Pamela D

      Push Polls are not used to inform or educate us, but rather to mislead and manipulate us. This is not at all what the author is suggesting.

      There are many factual questions with accurate right and wrong answers. The kind of thing the author suggests would improve the world, by making accurate information available to curious poll takers at the time they take the polls. The feedback would be immediate and useful, and not anything like adding more push polls.

  • Bob

    How does something like this get labeled as “The Sad State of Public Opinion Polling”? Why wasn’t it “The Sad State of Public Knowledge as Revealed by Public Opinion Polling”? Nothing in the article questioned the accuracy or validity of the polls, only the sad state of public knowledge. Doesn’t it tell us how poor a job the media is doing in educating the public, and how good a job the “public opinion pollsters” are doing at pointing out those shortcomings?

  • Stas K

    A heated discussion of this blog post followed on the mailing list of the American Association for Public Opinion Research, which is available only to AAPOR members. Bob’s response below is one of the messages on that thread. Some other opinions offered were (don’t attribute these to me; they just serve as an indication of the discomfort the industry felt about your post):

    I’m with Bob – this story has nothing to do with public opinion polling.
    BTW, surveys are a VERY expensive way to inform the public.
    I tend to believe that most people are exactly as informed as they want to be, or at least are willing to work for. I would also add that what people do not know is just as informative as what they do.
    Around 1950 Darwin Teilhet wrote a thriller, The Fear Makers, in which a partner in a polling firm comes back from serving in the war and finds that he has been shunted off to a do-nothing role, and that the new partners are not giving him information about the current surveys, which seem to have wealthy clients behind them. Nosing around, he discovers the firm is being hired to spread politically damaging rumors in selected districts in the guise of doing surveys. He makes off with the master sample list (on IBM cards) to stop their latest campaign of planting rumors with a large sample, but is pursued by thugs and finds it very hard to dispose of 10,000 IBM cards in a wood stove. In the end he exposes the scheme somehow. An early example of using polling to “educate.”
    I generally get uncomfortable when someone suggests only the “smart set”…should be allowed to pontificate or decide…anything…

    Not sure what’s next: special tests for voting? (We’ve been there, done that.)
    Special tests to qualify as a politician, perhaps? (That begins to sound appealing.)

    Meanwhile we know the old saw about the messenger.
    Putting aside the fact that the author took the Norton & Ariely study out of context, his main arguments — i.e., that media reports of scientific poll results somehow legitimize uninformed opinions and that pollsters can create a better informed electorate by providing the correct answers — constitute an incredible leap in logic.
    This issue seems to come up in one form or another at least every ten years, usually accompanied by misdiagnosis and misprescription. Alas, true again.

    A small part of the public may sometimes become better informed as a by-product of participating in research, but public education is not a responsibility of the profession. On the other hand, some of us feel there is a responsibility not to produce findings which are easily misused, especially for antidemocratic purposes.

    When “opinion research” assesses only opinions and not knowledge, reports are unable to distinguish between informed opinion, uninformed opinion, and misinformed opinion. (Yes, cut-points are always arbitrary, but they can be specified and at least allow the ranges to be distinguished from one another). Of course, ascertaining knowledge is hard and time-consuming and risks alienating the respondent, and some clients don’t care, but failure to do it means lumping the three categories together. Although that may serve the ends of a firm or industry that has conducted a campaign of disinformation about a policy or a piece of legislation, and then uses “opinion research” to claim that the public supports or opposes it, it is dysfunctional for all of us in the long run.
    I would like to note that neither the author of this article nor any of the posts mentioned the relevance of the work of our 2012 AAPOR Award winner, Daniel Yankelovich, in his book, Coming to Public Judgment: Making Democracy Work in a Complex World.

    I have not read the book in more than a decade, but if I recall correctly, Dr. Yankelovich’s assertion, which I have always found compelling, is that volatile and often uninformed “mass opinion” is what exists on most topics that make up what is typically called “public opinion,” but on some topics the public’s views have matured and are well-informed and stable. Yankelovich considered the latter to have reached a state of wisdom he calls “public judgment.” He uses a good deal of solid empirical evidence to demonstrate issues on which public judgment has been reached versus issues where only mass opinion exists. Most important (at least to me) is the responsibility he places on pollsters and the news media to use polling methods that will effectively differentiate the “quality” of the public’s opinions. And he suggests how this can be done.

  • JJ Marks

    Any researcher knows that “testing” has an impact on outcomes. Too bad the pollsters ignore this. This shows the lack of research and ethical fundamentals in polling.

  • bbshea

    Excellent commentary! Polls in general are too prevalent in our society when one considers the utter lack of knowledge of those participating. I have been concerned for years by those close to me who I assumed had a similar grasp of the facts on various issues. They really do not know the difference between news reporting and commentary, which is exactly the confusion our major news agencies have been fomenting for years.

  • alexMass

    The biggest issue for me: which pieces of information to include. I feel this would inevitably be used for political ends. A fact by itself is apolitical, but selecting which facts to educate the masses with becomes a propaganda game, and no short briefing can possibly present a fully balanced viewpoint.

  • Dan

    Ah yes, if we, the “smart set,” could just educate those poor rubes, we might be able to pull them out of their dull, uninformed lives and into enlightenment. Surely the populace at large is no better informed than the 3 or 4 drooling idiots they manage to find and show on Leno. Ignore the bias this education would introduce — we’d only want them to learn what the “smart set” wants them to know anyway. Normal people would have no inclination to learn on their own. We must perform our duty as the chosen ones and lead them to drink from the pool of knowledge.

    If you’re going to comment on the sad state of public opinion polling, at least have the appearance of being better informed about the science of polling or polling in general. Or hide your ignorance better. If only there was a smart set to educate you…….

  • Pamela D

    Polls often ask such unhelpful (I started to write “stupid”) questions as, “Which do you consider the most important issue facing America today?”, listing such choices as the economy, jobs, agriculture, energy, the environment, lowering government spending, and the deficit, when none of those oversimplified issues can be separated from the others. This is particularly a problem because some of the best solutions solve many issues at once, in a systemic way.

    For example, a gradually rising CO2 fee on fossil fuels that was refunded to the citizens would go far toward solving our problems with the economy, increasing jobs, spurring innovation, removing uncertainty, and mitigating climate change. It would level the economic playing field and let the free market work its magic. It’d be a win-win-win for everyone on any number of important issues. Worse, the fact that such ideas are comprehensive economic solutions to more than one problem often gets them dismissed as not being relevant to any specific issue. And solutions are rarely polled.

    How would such ideas ever show up in any of the current polling methods?