Every day it seems there’s a headline about which candidate is surging in the polls — but how accurate are those assessments? Philip Elliott, senior correspondent for Time magazine, joins host Krys Boyd to discuss how the layman can look at polls and glean the most relevant information, how polls gather their data and why that margin of error is super important. His article is “How to Read Political Polls Like a Pro.”
Transcript
Krys Boyd [00:00:00] If there’s one thing political campaigns tend to have in common, it’s the way they talk about polls. They’re eager to tout the ones that show them ahead and dismiss the ones that don’t. They may claim not to even pay attention to public polling, but they are obsessively interested in the results of so-called internals: polls whose data are not released publicly but may have a lot of influence over campaign strategy. So what can pre-election surveys tell us about the only poll that really counts for anything, which is the one that happens at the ballot box? From KERA in Dallas, this is Think. I’m Krys Boyd. We don’t have to track polls as obsessively as candidates and their staffs, but the truth is, many of us who have clear preferences on how elections might turn out are interested in these possible glimpses into the minds of fellow citizens. This can be frustrating, because in a closely divided country, presidential polling doesn’t necessarily point early toward a clear winner. My guest is Philip Elliott. He has spent a lot of time parsing poll data in his job as senior correspondent at Time, where he shares some very useful insights in his article “How to Read Political Polls Like a Pro.” Philip, welcome to Think.
Philip Elliott [00:01:10] Great to be with you, Krys.
Krys Boyd [00:01:12] You remind us here something that we don’t want to hear, which is that polls don’t actually predict the future. How should we think about what they actually can measure?
Philip Elliott [00:01:22] Well, the polling right now is merely a snapshot of what the race was like when people were being asked about it. And there’s usually a frustrating gap of a day, or three, or five, or seven between when the pollsters were asking the questions and when they actually break it all down. Because it’s not just raw data they’re looking at; they can’t just release the raw data, because that would just be garbage. We all know how this goes. I don’t answer phone calls from numbers I don’t recognize. Senior citizens at home watching Jeopardy! and Wheel of Fortune disproportionately answer. You take a look at some of the racial crosstabs: if a state is 10% Black, we should have 10% of the respondents, 10% of the weighted responses in the poll, be Black. And there’s a lot of work behind the scenes that the magicians of these polling operations do to try to match up who they talked to with how much of the pie they should be allocated. And really, there’s a lot of trouble deciding how big the pie is in the first place and who’s going to show up. That’s why there’s a lot of alchemy involved here. Are we going to have electorates close to what we saw in the midterms, when women disproportionately showed up, upset about the Dobbs decision, which ended a half century of Roe v. Wade protections? Is it going to be closer to the 2020 turnout, when a lot of people voted by mail for the first time? Or are we looking at a 2016 electorate, when a lot of Democrats, frankly, just stayed home, depressed about their choices? And the late shift at the top of the Democratic ticket, with Kamala Harris taking over for Joe Biden, is another factor these pollsters are trying to figure out. So really, the polling represents just a narrow piece of what the race looks like, and it changes every day. Because remember, every day seems to bring a new revelation, a new crisis, a new headline that ten years ago you and I would have treated like an Aaron Sorkin plotline out of The West Wing. Now it’s: that happened, and that happened, and that happened. Your news director and anchors are trying to prioritize what matters in real time. We are at least lucky enough at a magazine to have a little bit of perspective. But even then, on our daily web coverage and in my daily column, The D.C. Brief, we have some trouble figuring out what is going to matter and break through. And pollsters are no different, trying to figure out what the impact of any micro development will be on their numbers.
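To make the weighting he describes concrete, here is a minimal sketch in Python. It is not any pollster’s actual method; the population shares, the tiny sample, and the candidate labels are invented purely for illustration. The idea is simply that each respondent is weighted by the ratio of their group’s share of the population to its share of the sample, so underrepresented groups count for more.

# Minimal post-stratification sketch: weight respondents so the sample's
# racial makeup matches the state's. All figures below are hypothetical.
population_share = {"black": 0.10, "white": 0.70, "hispanic": 0.15, "other": 0.05}

sample = [  # (respondent's group, candidate they support)
    ("white", "A"), ("white", "B"), ("white", "A"), ("white", "A"),
    ("white", "B"), ("white", "A"), ("white", "B"), ("black", "B"),
    ("hispanic", "B"), ("other", "A"),
]

n = len(sample)
# Share of the raw sample in each group
sample_share = {g: sum(1 for grp, _ in sample if grp == g) / n for g in population_share}

# Weight = population share / sample share, so underrepresented groups count more
weights = {g: population_share[g] / sample_share[g] for g in population_share}

# Compare raw and weighted support for candidate "B"
raw_b = sum(1 for _, c in sample if c == "B") / n
weighted_b = sum(weights[g] for g, c in sample if c == "B") / sum(weights[g] for g, _ in sample)
print(f"Raw support for B: {raw_b:.0%}")            # 50%
print(f"Weighted support for B: {weighted_b:.0%}")  # 55%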
Krys Boyd [00:03:49] Campaigns will use polls to obviously to determine strategy, and we’ll talk about that in a minute. But why do news outlets commission polling on political issues? Does this guide coverage? Is this just something readers and listeners and viewers want from their news outlets?
Philip Elliott [00:04:06] It is a quantifiable way for news organizations to figure out what is most important to viewers, listeners, readers. And there is a little bit of calibration. I mean, the number of meetings I’ve had where I pitch really obscure stories and the response is: where is that showing up anywhere in the polling? Who wants to read that, besides the ten people you talk to on the regular about this one niche topic? And that’s how, during the 2008 campaign, CNN launched a standalone economic show, Issue #1, to reflect that the economy was the only thing voters cared about, and they actually had room for economic and fiscal policy coverage for an hour every day. It really does shape what we need to be focusing on, because if that’s what voters say they’re going to cast their ballots on, we have an obligation to help them be the most informed about that topic. And to my mind (not having to make those decisions myself, thank you), it helps me track the trend: is this trending away, is this trending toward, how much of a shift? For instance, thanks to the folks at Gallup and Pew, we had an almost immediate, quantifiable read on the shift in American sentiment after the Dobbs decision. We were able to see that move very quickly, and we were able to see how, over the last 40-plus years since pollsters have been asking this question, since Roe, the electorate has shifted and changed. And really, the most dramatic example of this has been the question of LGBTQ rights after the Obergefell decision in 2015. There has never been a shift in public sentiment as great as after that case made same-sex marriage the law of the land. Once the court weighed in and it was a fait accompli, it just became so much less of a divisive issue, even on the Republican side, with majorities now supporting the right of same-sex couples to wed. Polling is really useful for that over time; its utility doesn’t end the minute the last phone call is made.
Krys Boyd [00:06:21] So how do campaigns use polls, both the ones that are publicly released and taken by other organizations and the so-called internals that they’re doing for their own purposes?
Philip Elliott [00:06:30] Well, at their core, they’re helping to inform decision making. Sometimes they do this wrongly, but if a campaign thinks it’s ten points ahead, you don’t need to send your candidate to that state. I mean, it’s why Hillary Clinton never set foot in Wisconsin during the general election of ’16. She thought that state was so far out of reach for Trump that she didn’t need to invest her precious time in going there. The same for Ohio right now. Ohio is getting no love from the Democratic ticket because, frankly, they don’t think they can get there and they don’t need it to get to 270 electoral votes; it’s not part of their blue wall. And they came to that conclusion after years of surveys, among the candidates, among the party, among the super PACs, that all consistently showed Sherrod Brown might stand a chance of holding a Senate seat in Ohio, but Kamala Harris does not need that state’s 17 electoral votes. There’s a way to get to 270 without it. Joe Biden lost there by eight. Hillary Clinton lost there by eight. That’s a huge shift Harris would have to make up, and she’d spend a lot of money to get there and maybe not get close. So really, this is about efficient use of resources. It’s also really useful for scaring people, to be honest. The number of emails I get: we are trailing in the polls, and it’s just ominous. Help us close the gap in the polls. Chip in $15. There is some fun to be had there for the marketing teams, the digital teams and the finance teams. And then there’s, frankly, the utility that they can dispute the polls that are public and say, that is not what our internals show, and they can still be telling the truth when they’re, like, 0.1% off. They can dispute it and parse it and split hairs, and there is use for that. Once you get to know the campaigns and their spokespeople, you can tell when they’re actually upset about a poll result, and then there are times when they go out there like, I am told, my boss told me to come out here and dispute this poll, which is probably more accurate than our polling, but we have to do this. And finally, it helps the candidate hone, even before they launch, what that candidate’s Svengalis think they need to be. And I’ve seen many a candidate try to reshape themselves between their last Senate race and when they run for president based on feedback from polling. And I have to tell you, it always backfires. Trying to create a candidate as a Frankenstein based on the polls, we can all smell the inauthenticity. We can all see when a candidate is just not in it. And you can see that shift, frankly, in current Vice President Harris when she ran for president versus when she’s running now. It is a completely different vibe. And I know vibes are overused this campaign cycle, but it’s like watching a different candidate than the one who ran for the nomination in ’20 versus where she is now. She’s still informed by polling; let’s not pretend she’s ignoring it entirely. Polling is helping inform where she’s spending her time and what she’s talking about, but not who she is. They’re not doing a Princess Diaries makeover the way they did four years ago.
Krys Boyd [00:10:04] I mean, to what extent do you think her remarkable initial success has to do simply with the fact that she is not Joe Biden?
Philip Elliott [00:10:13] I think there’s plenty of that. I mean, there was a fatigue with Joe Biden. You talk to people who were at the convention, who were even at the U.N. for his final General Assembly: there’s respect, but no one is really eager to hear more from Joe Biden. It’s like that “sit down, uncle” scene in the Game of Thrones finale: okay, you’ve said your piece. Thank you, but no. There is just this dread. I mean, we wrote a cover story about Democrats gnashing their teeth over Joe Biden back in the spring, when they were running far behind. They didn’t have offices. They weren’t taking it seriously. They had no footprint in the states, and they just assumed they could win by running the old playbook. And eventually that caught up with them. Keeping the candidate under glass did him no favors. And by the time the debate happened at the end of June, there was just this panic in the party. We crashed a Time magazine cover on June 28th with just the word “Panic” on it. The Democrats’ worries finally came through, and Vice President Harris, yes, she is part of the Biden administration, but no one’s mistaking her for the avatar of an 81-year-old Joe Biden who was first elected to the Senate in 1972. It’s a much different campaign. And even though she has brought in a few of her loyalists and has tweaked the staff, it’s largely the same staff deploying completely different tactics. And really, it feels so much more like the campaign those staffers wanted to be running. They’re just openly trolling former President Trump with so much talent; it’s actually impressive that they had the skills to do this all along. They were following orders from a candidate who thought trolling was beneath him and that they should run this campaign like it was 1988.
Krys Boyd [00:12:12] What drives advocacy groups to fund polling in and around election seasons in particular?
Philip Elliott [00:12:19] So advocacy groups are this weird mix: they are running their own campaigns, often with tax-exempt status, the 501(c)(3)s and (c)(4)s, the super PACs, dark money groups, however you want to organize, and they need to make the most efficient choices. But a lot of times they will pack in a question as a standalone, as a news-driving event: hey, X percentage of our people think this is an issue candidates should be talking about, or this is a serious issue that’s not getting enough attention. It’s useful for them to see what they think is a movable question, what they think is a potent issue. But it’s also to drive headlines, because it’s good for them to show their members, whether it’s members of a union or donors to a super PAC, that they are on the ball, that they are pushing a message, that their priorities are not a lost cause, and that they can break through if they talk about things a certain way.
Krys Boyd [00:13:38] Philip, you mentioned the Pew Research Center and Gallup a few minutes ago. You say in the piece they are the two kind of gold standard polling operations. What makes their work especially trustworthy and how can we gauge the accuracy and reliability of any organization’s poll?
Philip Elliott [00:13:56] Well, those are two great questions, Krys. I mean, Pew and Gallup have been nonpartisan. They do not do partisan polling. They play it really straight down the middle, and they make a point that even playing politics there is just something you don’t do. If you want to do that, you can, frankly, make more money working for a partisan pollster than at Pew or Gallup. The people there are absolutely professional, and they don’t get over their skis. They don’t overinterpret. They put out their data, and it’s data that goes back decades. That is really why they have been so valuable to reporters and advocacy groups and everyone else, though I’m just talking as a reporter, to see objectively how things have changed. And they publish all their data. So you know not just how America feels, but how the views of people who identify as Democrats or vote Democratic have shifted, and how Republicans’ views have shifted. And you can take a look at young people specifically, or retirees, people with college educations, people without, people with graduate degrees, terminal degrees. It’s really fun to slice the electorate in different ways to see how this country has changed. Census data is useful on that, but it lags by up to ten years; these polls are instantaneous or near instantaneous. And because they’ve been at it so long, asking the questions in some cases in the exact same way for 40 years, you can really tell what has changed, what has stayed consistent and what is embedded in the DNA of this country and not movable. There are a lot of things in this country that are just not going to shift no matter when you ask the question. And that’s really worth remembering: as much as these campaigns are about change, there is an inertia in this country, and things are really tough to change absent a seismic event. To your second question about why some are better than others, frankly, methodology matters. And I am not a data scientist. I’ve taken the courses, I’ve sat through the seminars, but I would not profess to have the expertise to grade any one poll or pollster. I do know that at the end of every cycle there is an accounting: how many races they got right, how many races they got wrong. That win-loss ratio is instructive when trying to figure out what matters and who can be trusted, and so, frankly, is how many races they’re polling in. If they’re polling in just two or three races, that tells you something. The University of New Hampshire, for instance, would never poll in Florida; that’s just beyond their mandate. But Quinnipiac, which is polling in a bunch of different races, has been at this long enough and been in enough markets that they have real expertise in specific areas, and you know you’re getting a good-faith product. Some of these other pollsters who pop up out of nowhere, pollsters who only do opt-in online panels (although those are getting better), pollsters who clearly have a partisan agenda, it’s almost like they’re just setting money on fire at this point; no one’s taking those groups seriously. And it usually takes at least a cycle before anyone is going to take them seriously. I mean, remember when Morning Consult was just getting off the ground? All of us were skeptical: okay, what are they doing?
How are they doing this? We weren’t sure who these folks were beyond who they said they were. Credibility is something earned over time. And then, frankly, there are some states where you just cannot trust the polling. It’s notoriously bad, inconsistent. I’m going to call out Nevada on this, through no fault of its own. But you talk to the campaigns in that state: 20% of the electorate this cycle wasn’t registered to vote there in 2020. It’s an incredibly transient state. If you call someone at dinnertime, everyone’s working; they’re all working in the hospitality industry. Las Vegas is the center of the political universe out there, and those voters are working the casino floors, they’re working the restaurants, they’re working the parking garages. They are doing everything in the Vegas ecosystem that comes alive on that third shift. So are you going to start calling people at 2 a.m. when their shift is over and hope you’re not waking grandma? It’s just really tough to figure out who the electorate is and get in touch with them. That’s why these polls out in Nevada are probably good, but they’re incredibly difficult and super expensive. And if you’re a news organization on the fence about whether to do all seven swing states or just focus on a handful, the Nevada ones are going to be the most problematic and potentially embarrassing if you get them completely wrong through no fault of your own. It’s just tough to poll in that state.
Krys Boyd [00:19:11] Well, there’s also this question of where people are, whether they live in a swing state or a state that is solidly red or blue. How are pollsters able to account for this and account for sort of the relative weight of someone’s vote if they live in a swing state versus a state that is unlikely to shift?
Philip Elliott [00:19:31] Well, if you’re doing a national poll, you have to do a national poll. You can’t overemphasize the swing states. So you need to first make a choice whether you’re going to do all 50 states, and that includes Alaska and Hawaii, which bring their own time-zone challenges and require pollsters to work a very different set of hours than if you’re trying to poll Georgia. So if you’re doing a national poll, that’s fine. I mean, I’m looking at the RealClearPolitics average of polls (and averages of polls are their own conversation for probably another hour), and Harris is up two points in the RCP average. But if you take a look at the battleground states, Trump is up by 0.1. And that is important, because both parties will tell you that for Harris to have a shot at getting to 270 electoral votes in the Electoral College, spread out across this great country, she’s going to have to run two to three points ahead of Trump in the national polls, because of the advantages baked into the Electoral College. States that are super small still get a minimum of three electoral votes, because it’s your House seats plus two. Whereas big states that are solidly Democratic, California comes to mind with its 54 electoral votes, New York, I’m looking at, has 28 electoral votes; that’s going to be a Harris-plus-20 state. So those are not in play either. And you take a look at basically the middle of the country: there is very little chance that Vice President Harris picks up any of Texas’s 40 electoral votes. Montana, Wyoming, the Dakotas, those are not in play. Those go automatically, although in a national poll you might need only a handful of people from those states to be representative. You’ve got to weight for where people are, and every state is different. Now, if you’re looking at battlegrounds, that’s a whole different ballgame. The good folks at Bloomberg are consistently in the field with what they’re calling their seven-state poll, and there they basically do a mini national poll. Georgia is weighted a little more heavily than Michigan, given the disparity; Pennsylvania is given more weight than, say, Arizona at this point, just based on the populations and the electoral votes. Those are reflected in those polls. So before you start passing around polling to your crazy uncle on either side, you need to figure out whether it’s a national poll, a state poll or a poll of the swing states. Also, I would just add, the states we’re calling swing states right now may not be where this election is decided. Democrats are suddenly bullish on Florida; the Senate race there and the statewide abortion ban have come into relief. And no one’s really counting on Colin Allred there in Texas with you guys, and yes, Ted Cruz has been ahead in every measure. But if this thing starts sliding and a national wave starts getting behind the Democrats, who knows what this final map could look like? It is the closest race we’ve had in 60 years in terms of polling, and that is something that is keeping both campaigns up at night.
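The Electoral College arithmetic behind those numbers is easy to check: a state’s electoral votes equal its House seats plus its two Senate seats, which is why the smallest states are guaranteed three. A quick sketch, using post-2020 reapportionment House seat counts:

# Electoral votes = House seats + 2 senators, which is why small states get a floor of 3
house_seats = {"Wyoming": 1, "California": 52, "New York": 26, "Texas": 38, "Ohio": 15}

electoral_votes = {state: seats + 2 for state, seats in house_seats.items()}
print(electoral_votes)
# {'Wyoming': 3, 'California': 54, 'New York': 28, 'Texas': 40, 'Ohio': 17}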
Krys Boyd [00:22:58] Philip, it is tempting for those of us who managed to get out of college without having taken a statistics course to look –
Philip Elliott [00:23:04] I’m so jealous.
Krys Boyd [00:23:06] To look at the central number that we’re given in polls and ignore this thing called the margin of error. But we should not ignore the margin of error, should we?
Philip Elliott [00:23:14] Absolutely not. The margin of error tells the real story of the poll. I worked for a decade at the Associated Press, and this was just hammered into us; it’s in the stylebook. You must never say someone is leading or ahead or winning in the polls until the gap is outside the margin of error times two. So if a good poll has a three-point margin of error and a campaign is only up four, they’re not ahead by any stretch, because if they’re at 40%, they could also be at 43%, they could also be at 37%. There’s that swing: okay, we think you’re at 40, you might be at 37, you might be at 43. You could be anywhere in that range with 95% confidence, which means there’s a one-in-20 chance, a 5% chance, that the poll is just completely out there. That is the industry-accepted standard: you have 95% confidence that this is right, and then there’s the 5%, well, maybe. And it’s really tough to figure out which polls are the outliers. They usually stand out like sore thumbs, but sometimes the data tells you something and you just have a problem with it, and sometimes you have funky numbers that are accurate even though they look completely off base. That is why anyone running a campaign solely based on polling, or making choices about where to give money or where to volunteer, or gauging their level of engagement based solely on these surveys, is really missing the broader dynamic of what’s happening in these races.
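The margin of error he cites falls out of basic sampling math. Below is a minimal sketch assuming a simple random sample; real polls apply weighting and design effects that widen the interval, so treat the numbers as illustrative.

import math

def margin_of_error(p, n, z=1.96):
    # 95% margin of error for a proportion p from a simple random sample of n people
    return z * math.sqrt(p * (1 - p) / n)

n = 1000                     # respondents
p = 0.40                     # candidate polling at 40%
moe = margin_of_error(p, n)  # about 0.03, i.e. roughly 3 points
print(f"Margin of error: +/- {moe:.1%}")
print(f"Plausible range: {p - moe:.1%} to {p + moe:.1%}")   # roughly 37% to 43%

# The AP-style rule he cites: don't call a lead real unless the gap
# is larger than about twice the margin of error.
lead = 0.04
print("Real lead" if lead > 2 * moe else "Within the noise")  # Within the noise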
Krys Boyd [00:25:00] Is this what happened in 2016 when many Democrats thought our candidate is going to win? Republicans kept saying, we’ve got a really good shot here. And the Republicans ultimately were correct.
Philip Elliott [00:25:11] Ultimately, Republicans were proven right. Some of it was the polling; you can’t deny that, although the polling was not as bad as some people say it was. It just accurately reflected what the pollsters thought the electoral universe was going to be. They did not see the MAGA movement as a reliable turnout bloc. There were a ton of people who came off the sidelines, and there were some people who voted in ways they had never voted before. You take a look at some of these bellwether counties. I’m looking at Ohio, for instance, and Trumbull County outside of Youngstown: it went red for the first time since 1972, and it’s a union stronghold. Just no one saw that one coming. The WOW counties that surround Milwaukee, those were huge harbingers of change to come. And these polls right now, you’re not going to find a county-by-county poll of a presidential campaign. The money is just not there, at least not in public polling, to get accurate counts for some of these counties. I mean, take Dallas: it would cost you, I have no idea, I would say close to $1 million, to get a statewide poll in Texas that would give us an accurate count county by county. That’s why you’re just not seeing it happen. You’re getting the best guesstimate, and you’re still spending six figures to get a guesstimate of the state of Texas. Maybe you’re lucky and get it by region. But that is a huge investment that might not pay dividends.
Krys Boyd [00:26:57] In election polling, of course, Philip, there’s this question of who is being surveyed. What is the operative difference between registered voters and likely voters?
Philip Elliott [00:27:07] So this is a judgment call these pollsters have to make. We start every campaign asking registered voters, because we know they’re in the system. We know who they are, and we generally know how they voted in the past if they voted in partisan primaries. But as we get closer, pollsters start refining who is actually likely to vote. They can start applying filters; or I should say pollsters can, and we just benefit from them as consumers. They start figuring out: did they vote in the last election? Did they vote in the last presidential election? Do they vote in primaries? When was the last time they voted? When, in some states, did they last check their voter registration? All of this data is available. The pollsters then try to build a model of who is likely to show up, of what the voting universe looks like. For instance, I live in Washington, D.C., and everyone is registered in Washington, D.C.; one way or another, you wind up on the voting rolls. I have never voted in Washington, D.C., and I’ve been here 20 years, so putting me in a polling universe would be a really bad idea for these pollsters. Knowing that I have never shown up, I am not a likely voter; my opinion here is not going to really matter. Not that I, as a journalist, would participate in a poll. But you can pretty easily filter out the no-shows. If you’re looking at first-time voters, it’s probably better than even odds that they’re going to show up if they took the time to get registered this cycle, especially close to the deadline; that shows interest. And then you can just ask them: do you plan to vote? And then there’s the follow-up question, in most cases: are you extremely likely, somewhat likely, and so on. There’s a different way to ask the question every time, but as long as the polls are consistent, you can figure out how the universe is shifting and gauge voter intensity. Someone who’s newly registered to vote and says they’re definitely showing up and super interested, that is probably a likely voter. Someone who sat out the last three elections and says, I’m not at all interested, but I plan to vote? Planning to vote is not the same as voting, so that person maybe ends up falling out of the model, if not getting cut altogether. Again, these are really sensitive judgment calls. And in many of the states that are already voting, you start adding on questions: have you voted already? That helps you figure out the banking of votes. TargetSmart, a great data firm I use pretty frequently, is now starting to model and publish a voter advantage in the early states based on the early votes, so you can see their model in real time, and it’s usually pretty good, about who they see showing up and casting ballots and how. And really, as much as campaigns are about the charisma of a candidate, it’s the quants in the back room who do this, in many cases hired mercenaries who know voter behavior and who, during non-election years, spend their time figuring out which markets to test the new McDonald’s sandwich in.
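There is no single likely voter model, and pollsters guard theirs closely. The sketch below is a hypothetical scoring filter of the kind he describes, combining vote history with stated intent; the field names, weights, and cutoff are invented purely for illustration.

# Hypothetical likely-voter screen: score registrants on vote history and stated
# intent, then keep those above a cutoff. Not any pollster's actual model.
voters = [
    {"name": "A", "voted_2020": True,  "voted_2022": True,  "intent": "extremely likely"},
    {"name": "B", "voted_2020": False, "voted_2022": False, "intent": "plan to vote"},
    {"name": "C", "voted_2020": True,  "voted_2022": False, "intent": "somewhat likely"},
    {"name": "D", "voted_2020": False, "voted_2022": False, "intent": "extremely likely",
     "newly_registered": True},
]

INTENT_SCORE = {"extremely likely": 3, "somewhat likely": 2, "plan to vote": 1}

def likely_voter_score(v):
    score = INTENT_SCORE.get(v["intent"], 0)
    score += 2 if v["voted_2020"] else 0
    score += 2 if v["voted_2022"] else 0
    score += 1 if v.get("newly_registered") else 0  # a new registration signals interest
    return score

likely = [v["name"] for v in voters if likely_voter_score(v) >= 4]
print(likely)  # ['A', 'C', 'D'] under these made-up weights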
Krys Boyd [00:30:34] Is there any evidence, Philip, that a significant number of people tell pollsters they care about a certain issue or prefer a certain candidate, but actually will end up voting a different way?
Philip Elliott [00:30:46] There’s a huge risk of that, and there’s really no way of us knowing. Even in the exit polls taken on Election Day, the thousands upon thousands of folks who stand outside voting locations and ask, okay, who did you vote for? A couple of months later, the polling groups (Brookings does a great one; Pew and Gallup also do them) compare the exit polls with who actually showed up, and there’s usually a huge delta. My former colleague Molly Ball did a great piece for us after the ’16 election on white women voting for Donald Trump. In the exit polls, that was the finding. Then you take a look a couple of months later, once you have the benefit of seeing who showed up and voted, and you basically sleuth your way through it like a forensic accountant: the women at these polling places were embarrassed to say it out loud, they thought it would be more socially acceptable to say otherwise, but they had voted for Trump in their communities. So there’s some of that. There’s also, and I won’t say anyone’s engaging in it right now, but I have a flashback to 2008, when the late Rush Limbaugh was telling Republican voters, as the primary went on, to vote in the Democratic primary, Operation Chaos, just to mess with things. You see that sometimes with some nihilists in polling; they just answer questions like that. There are some things you just know are not accurate. You see it more often in focus groups, for folks who get a chance to watch them. There’s always a Cro-Magnon in the group who is just trying to throw the dynamic off and create chaos, similar to the streaming series Jury Duty; there’s always a character in there who wants to be noticed. And that blip does show up in the polls, and pollsters try to correct for it. The answer there is just talking to more people, but that takes more time and more money. And if you’re a news organization with finite resources for polling, is that the best use of your money versus a reporting trip on the ground, which might not be as statistically accurate? It might be entirely anecdotal, but anecdata does matter. Anecdotes make better stories that readers want to read. I’m surprised we’re talking about my piece on polling, considering how dry it was.
Krys Boyd [00:33:30] I loved it. Are you kidding me? I loved it.
Philip Elliott [00:33:32] Thank you. I will justify that to my editor, then. So anyway, it is a choice news organizations have to make, and polling organizations have to make: is the return on investment worth it to take an extra day and spend an extra couple tens of thousands of dollars to get a wider sample, when there’s still no guarantee you’re going to filter the cranks out?
Krys Boyd [00:33:55] Right. And they’re still really perishable, right, leading up to a national election?
Philip Elliott [00:34:00] Yes. I mean, the shelf life: by the time you get the polling data off the printer, or I should say off the Xerox machine (I love printing things out, highlighting them, Post-it notes everywhere in my office), or read it as a PDF on your smartphone or whatever, it’s already out of date. That window is over. It was reflecting a moment; maybe it was yesterday, maybe it was this evening, but it is still out of date. By the time your readers are consuming it, it is the most accurate and knowable snapshot of the campaign available, but who knows what changed overnight? I mean, Vice President Harris gave an interview to MSNBC; it probably didn’t reset the race in any meaningful way. But for those voters who somehow kept harboring the thought, well, I can’t vote for her because I don’t know where she stands on this, and she’s only done one interview, she hasn’t done a solo interview: well, you lose that talking point. So either you come up with a new one, or you just consider that this is a binary choice on a couple of levels. You vote or you don’t vote, and if you’re voting, Harris or Trump are your viable options.
Krys Boyd [00:35:13] So as you mentioned, sometimes the question is fully binary. Do you prefer this candidate or that candidate? But often the questions are a little bit more detailed and nuanced. Reputable polling firms are going to work hard to craft neutral queries that don’t shape the responses that people might give. But how do they figure out what kind of language works best for any given thing they want to understand?
Philip Elliott [00:35:35] Right. So for a lot of these polling companies, even the ones working for the campaigns, getting the wording right the first time really matters, because that is the question they’re going to be asking the entire time if they can help it, and that will show a longitudinal shift in attitudes, what is resonating and what is not. And this is where I have a problem: there are a couple of super PACs out there that are asking different versions of a question every week and trying to pass it off as a trend. Hey, there’s this. I’m like, well, what did you ask last time? Well, now we ask this other question; we didn’t ask that one, so we asked it this way. I’m like, that doesn’t tell me anything. A standalone question asked for the first time can matter: the standalone question about Dobbs right after Roe fell, that matters. Ask a standalone question about eating dogs and cats? I don’t know, have we ever asked about killing animals before? Do we have any way of measuring that? And to be clear, no one is killing dogs and cats and eating them in Springfield, Ohio; that just needs to be a statement of fact here. But they’re asking the question trying to get a buzzy story out of it. I just feel bad for the donors who are funding those sorts of garbage questions. But to your point about the things that matter, there are certain questions that have been asked in every poll for ages: is your top issue the economy, is your top issue national security? After 9/11, terrorism came on as a separate category outside of military strength, because it does change, and it has been asked consistently through the years. And therefore you can tell what people are worried about, and when things become less scary, less urgent. We got complacent on terrorism for a very long time; we got complacent on military spending. But the economy is still up there pretty strongly. People got complacent on abortion rights, despite Hillary Clinton being so consistent in 2016 about the Supreme Court and Roe being on the ballot. No one acted on that; no one heeded her warning, because it was such an established piece of the American legal system. Suddenly, after Dobbs, it spiked up. But if you take a look at it over time, it has become, not unimportant, but less urgent as other things have crept up, especially inflation.
Krys Boyd [00:38:17] Philip, is there any evidence that voting behavior is influenced by the direction polls seem to be leaning at any given time? Like, are people who aren’t quite sure more likely to go with whoever seems to be leading?
Philip Elliott [00:38:32] Well, there’s also the question of whether people are going to show up and vote for a candidate they think is doomed, or stay home thinking their candidate is a slam dunk. I remember talking to the Clinton folks at roughly this time in ’16, when they were so far ahead and were legitimately worried that their supporters would not turn out, because there was just a complacency there; people thought, my vote doesn’t count, she’s got this in the bag. There’s also a dynamic where people have convinced themselves that the polls are wrong or that their candidate’s getting an unfair shake, and that motivates them, too; that’s why they show up. That’s why you saw a surge in ’16, and you might see it again this time. I mean, the polling at this point is the closest we’ve seen in 60 years. That means voters realize their votes matter, especially in the seven core states. But they also might matter in places we haven’t really noticed yet. There’s always a sleeper race. I remember in 2006, after the blue wave that Rahm Emanuel orchestrated as DCCC chair, Howard Dean was giving a speech and listing off the candidates, and he had no idea who had just won the New Hampshire House race. We had Carol, Carol, Carol something; it was Carol Shea-Porter who won. He didn’t even know the candidate’s name, and he was the DNC chair giving what should have been jubilant remarks. There are always out-of-the-blue races you just never expect, and I think this fall, with this environment, we’re going to be seeing a lot of them. And also, don’t discount the districts that were split in ’22: there are five Democrats in districts Donald Trump won and 17 Republicans in districts Joe Biden won. Those are already super expensive races, and the bulk of them are in California and New York, thanks to some gerrymandering. So you’re going to be seeing a lot of money spent there and a lot of attention paid there, especially among voters who might not be so prone to split their ticket between a D at the top of the ticket and an R in the middle of it.
Krys Boyd [00:40:54] So it makes perfect sense that journalists care a lot about these polls that obviously politicians and their campaign staffs care a lot about them. Why do you think so many people who are not actively engaged in running a campaign or covering a campaign are so interested in polling data?
Philip Elliott [00:41:12] Well, until we start getting election results at, like, 7:01 p.m. on November 5th, this is the only real objective measure that is widely available. And I say this knowing there are other pieces you can look at. You can look at social media impressions, Google searches; if you spend the time, you can figure out Google searches by congressional district. You can take a look at what’s trending on social media in a given place. And then there’s the Facebook ad library, which shows you what candidates are spending on social media and what their ads are; Meta has done a very nice job of being transparent on that, although political content is less prominent on the platform this cycle than in the past. And then there’s what they’re spending on TV and cable. The broadcasters have to disclose their contracts, but none of them use the same form, and none of it is digitized in a meaningful way that you can easily sort the way FEC filings are. There are also ad firms that literally make millions of dollars just sending freelancers to every station every day to look at the ad sheets and feed it back into the system: who’s spending what, in which weeks and which markets, and sometimes what the ads are. That’s the tracking most media companies rely on, so you can see where the campaigns are putting their money, and truly, where they’re spending the money is where they think it makes a difference. But polling is just easy to understand. There’s a top line, there’s a number, there’s a spread. There’s no math involved for the consumer. It’s an easy stand-in, although it is deeply imperfect, if not flawed, to say this one snapshot of a race tells the story, because it ignores what’s happening on the ground. Voters are very fickle. There are very few loyalties, especially for non-presidential candidates, and we just don’t know what the candidate quality is in some of these cases. Especially down ballot, these are just generic races: who’s the Democrat, who’s the Republican, and people are voting generic party. I mean, I was talking with a pretty informed Ohio voter this week, and she was talking about the race, and I mentioned Tim Ryan, who was the nominee just two years ago. She just looked at me: I don’t know who that is. I’m like, he was the Democrat who ran against J.D. Vance. Oh yeah, I voted for him; I have no idea who he is. And I’m like, okay, this perfectly encapsulates the generic-D, generic-R factor in non-presidential candidates. And to some degree, yes, Vice President Harris has been in the Senate, has been vice president, ran for president, was attorney general of a huge state, but people still don’t know her. The fact that she has a D next to her name and is running against Donald Trump, that alone might be enough.
And this might be the first time you’ve ever had a generic-D factor at the top of the ticket, just because she came on so late and has not really had the same level of scrutiny as someone like Joe Biden, who has been president for four years and been running for president since 1988, or someone who’s been through brutal primaries. I mean, we all knew who George W. Bush was by Election Day 2000, just because his primary was so intense, the general election was incredibly hard fought, and in the aftermath we all got to know a lot of lawyers on TV that November and December. I think that is a big factor. But to your point about why people care about polls, the short answer is they’re easy. You can look at them, you don’t have to think about them too much, and you can move on with your day, either comforted or despondent based on where you think your candidate stands.
Krys Boyd [00:45:31] Philip Elliott is senior correspondent at Time, which published his article “How to Read Political Polls Like a Pro.” Philip, thanks for sharing your insights.
Philip Elliott [00:45:40] Of course, any time, Krys.
Krys Boyd [00:45:42] Think is distributed by PRX, the Public Radio Exchange. Again, I’m Krys Boyd. Thanks for listening. Have a great day.