Yardsigns. Still don't vote (unless they do)
Or how I learned to love them in down ballot races.

TLDR of the Day
- Yard signs may have some (small) impact in low-information races
- Online panels still limited
- VOTING IS STARTING
Yard Signs. What the science says.
So what does the science say about their impact? The answer, somewhat surprisingly, is a bit complicated.
There have been two studies on the topic, and both have their flaws, especially when it comes to top-of-the-ticket races where name ID is no longer a real issue.
The first was conducted in 2011 and involved creating a fake candidate for a local race, blitzing the area with signs, and then mailing out a survey. The fake candidate shot up into the leading pack, but the design has some problems. The researchers did not survey voters before the signs went up, only after, and the race was an ultra-low-information school board contest. All that considered, the signs still lifted the fake Griffin’s vote by 9 points.
The second experiment was run by Professor Donald Green of Columbia (the godfather of political experiments) in 2015. His team randomized yard signs into some precincts and not others to address that issue. The study found that, again in down-ballot races, yard signs might explain a statistically significant 1.5-point bump for a candidate, though the authors admit that people drive outside their own precincts, so the effect is harder to assess.
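For the curious, here is a minimal sketch of how an analysis like that precinct-randomized design could work. The vote shares below are made up for illustration; they are not data from either study.

```python
# Minimal sketch of a precinct-randomized yard sign analysis.
# All vote shares below are hypothetical, not data from either study.
import statistics

# Candidate's vote share (%) in precincts that got signs vs. those that didn't
treated = [42.1, 39.8, 44.3, 41.0, 40.6, 43.2]   # hypothetical sign precincts
control = [40.2, 38.9, 42.0, 39.7, 39.1, 41.8]   # hypothetical no-sign precincts

# Difference in means is the basic estimate of the yard sign effect
effect = statistics.mean(treated) - statistics.mean(control)

# Standard error of a difference in means (independent samples)
se = (statistics.variance(treated) / len(treated)
      + statistics.variance(control) / len(control)) ** 0.5

print(f"Estimated effect: {effect:.1f} points (SE {se:.1f})")
# Note: this simple comparison ignores spillover -- people drive through
# precincts other than their own, which is exactly the caveat the authors flag.
```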
Online panels. Still not there yet.
The great hope for survey researchers has been online data collection, where high-quality data could be gathered without the crazy-high expense of doing it over the phone. Unfortunately, the main way of collecting data online has been panels. What is an online panel? It is basically a group of people who ‘opt in’ to receive survey questions and are usually compensated in some way for their opinions. In some ways this is similar to how focus groups are recruited.
The problem lies in the opt-in nature of these groups, which introduces a confound: are the people who opt in, by the very act of opting in, different from the rest of the world? The answer is largely yes.
That is not the only problem with panels, though. The other issue is how large the panels are by geography. You might be able to get 500 interviews statewide in a big state like Pennsylvania, but getting 400 interviews in a congressional seat is tough, and 300 in a state house seat is impossible. That has again limited their usefulness.
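To see why, a quick back-of-the-envelope sketch helps. The panel size and response rate below are assumptions for illustration, not real panel figures.

```python
# Back-of-the-envelope sketch: why small geographies are hard for panels.
# Panel size and response rate are hypothetical; district counts are rough.

panel_size = 20_000      # active statewide panelists (assumed)
response_rate = 0.10     # share who take any given survey (assumed)

# Approximate share of a state's population in each kind of district
# (Pennsylvania has had roughly 18 congressional seats and 203 state house seats)
geo_share = {
    "statewide": 1.0,
    "congressional district": 1 / 18,
    "state house district": 1 / 203,
}

for geo, share in geo_share.items():
    expected_completes = panel_size * share * response_rate
    print(f"{geo}: ~{expected_completes:.0f} expected completes")
```

Under those assumptions you get roughly 2,000 statewide completes, about 110 in a congressional district, and about 10 in a state house district. The exact numbers do not matter; the shrinkage does.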
A firm called Pollfish has created a new approach called “random device engagement” to try to solve both of these problems. They basically deliver survey invitations as in-app ad units to people on mobile devices, which avoids the opt-in issue, but they still struggle to get surveys in smaller geographic regions. There are also HUGE response bias issues: in my experience it skews heavily educated, Democratic, and female.
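When the raw sample skews like that, the standard partial fix is weighting it back to known population margins (raking). Here is a minimal sketch; the respondents and targets are made up for illustration, and this is not Pollfish’s own procedure.

```python
# Minimal raking sketch: iteratively adjust weights so the sample matches
# assumed population margins on gender, party, and education.
# Respondents and targets below are hypothetical.

respondents = [
    {"gender": "F", "party": "D", "college": True},
    {"gender": "F", "party": "D", "college": True},
    {"gender": "F", "party": "R", "college": True},
    {"gender": "M", "party": "D", "college": False},
    {"gender": "M", "party": "R", "college": True},
    {"gender": "F", "party": "I", "college": False},
]

# Assumed population shares for each margin
targets = {
    "gender": {"F": 0.52, "M": 0.48},
    "party": {"D": 0.40, "R": 0.40, "I": 0.20},
    "college": {True: 0.35, False: 0.65},
}

weights = [1.0] * len(respondents)

for _ in range(50):  # a handful of passes is enough to converge here
    for var, margin in targets.items():
        total = sum(weights)
        # Current weighted share of each category for this variable
        current = {cat: 0.0 for cat in margin}
        for person, w in zip(respondents, weights):
            current[person[var]] += w / total
        # Rescale so this variable's margin matches its target
        for i, person in enumerate(respondents):
            cat = person[var]
            if current[cat] > 0:
                weights[i] *= margin[cat] / current[cat]

for person, w in zip(respondents, weights):
    print(person, round(w, 2))
```

The catch, of course, is that weighting only fixes the variables you weight on; it cannot fix whatever else makes the people who answer different from the people who don’t.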
Still no easy answers.
Voting is starting.
We’ll start to have real data soon. Stay safe out there, campaign cowboys.
Mark Harris is a founding partner at ColdSpark and a leading Republican political consultant who has worked for clients including Ambassador Nikki Haley, Sen. Pat Toomey, Sen. Marco Rubio, Rep. Alex Mooney, Rep. Byron Donalds, Rep. Russ Fulcher, Rep. Guy Reschenthaler, State Auditor JB McCuskey, and others across the country. He’s worked in 48 of the 50 states at this point.