The Maryland Public Opinion Survey was the product of two years of research on what it would take to start an Election Studies and Public Opinion Research Center at St. Mary's College of Maryland. It is part of a pedagogical service-learning experiment that began in 2012 and will continue through the end of 2015. I am Susan Grogan, Professor of Political Science at St. Mary's College of Maryland and a member of the Advisory Board of the College's Center for the Study of Democracy.
The commentaries in this blog supplement the static website of The Maryland Public Opinion Survey. In a few days, we will have completed our fourth public opinion survey experimenting with practical ways to conduct polling at a small college. We conducted our first poll from April 10–13 and published its results on April 18, 2014. The method used was Interactive Voice Response (IVR): questions are recorded as audio tracks and played over the phone according to a programmed script, and respondents key in their answers using their phone's number pad. That fall, we conducted an Exit Poll of voters in St. Mary's County on Election Day 2014. By Exit Poll, I mean we stood at the exit of randomly selected polling places and interviewed every nth voter as they left. Our purpose was different from that of most Exit Polls run by media organizations; we were not there to get a jump on the election results.
We conducted two public opinion surveys during 2015. The first was a mixed-method survey: a second IVR survey that also offered participants the choice to take the survey online. About 15% of our responses came in online. Having an online version also allowed us to administer a few surveys directly over the one phone in my office, mostly to elderly persons.
In November 2015, we conducted an online survey using a random sample of email addresses, whereas we had previously obtained random samples of landline phone numbers for the IVR surveys. The email sample is quite expensive. Due to budget limitations, I had to cut that sample in half, and I was unable to afford a supplementary random sample of phone numbers to ensure a good number of responses. Consequently, the November 2015 survey will have a rather small number of responses and a larger margin of error than usual. But the information will still be useful in terms of politics and public policy and, more importantly, will establish a baseline for projecting what an online survey with a higher number of responses would cost.
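As a rough illustration of why a smaller number of responses widens the margin of error, the standard formula for a proportion from a simple random sample can be sketched as follows. The sample sizes below are hypothetical examples, not our actual response counts:

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """95% margin of error for a proportion from a simple random
    sample of size n. Uses the conservative assumption p = 0.5,
    which maximizes the standard error."""
    return z * math.sqrt(p * (1 - p) / n)

# Halving the sample size increases the margin of error by sqrt(2):
print(round(margin_of_error(400) * 100, 1))  # 4.9 (±4.9 points)
print(round(margin_of_error(200) * 100, 1))  # 6.9 (±6.9 points)
```

The key point is that precision scales with the square root of the sample size, so cutting a sample in half costs about 40% more uncertainty, not double.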
I had hoped to do more along these lines, but you will find commentary that my students and I have published as background information on surveys and polling in general and on how to interpret polling data. You will also find plenty of commentary on our own survey results, as well as a few comments about other surveys and polls that caught our attention for one reason or another.