By: Matthew Jenetopulos

November 18, 2020

Most of what we’ve shared about the national study that we’re co-leading with our colleagues at LaPlaca Cohen, Culture and Community in a Time of Crisis, has focused on research findings; in this post we’d like to provide a different perspective on the work—a behind-the-scenes view of the research process itself. Below, I’ll share the highlights of how we scaled the study up from a planned sample size of 10,000 to more than 120,000 respondents across the country in just a few weeks. For a more in-depth discussion of the process, please check out the FocusVision website, where you’ll find a webinar with our teams as well as a detailed case study about the research project.

A quick note that when designing this research, we wanted to hear and learn from two main groups: the general public and cultural attenders/participants. For the general public, we collected survey responses from AmeriSpeak’s nationally-representative panel—a fairly straightforward process. The more complex part of this study was collecting data from cultural attenders and participants through the email lists of arts and cultural organizations around the country.

Getting the word out about the opportunity to participate in the research

For the list survey side, four consulting firms agreed to reach out to their clients and existing lists of cultural participants: ourselves, LaPlaca Cohen, Advisory Board for the Arts, and Wilkening Consulting. But we didn’t want to limit the call for participation to our own networks, so we also reached out to some of the larger service and advocacy groups that oversee specific parts of the cultural sector, including Opera America, the League of American Orchestras, the American Alliance of Museums, the Association of Art Museum Curators, and the Alliance for the Arts in Research Universities, among others. We tailored the invitation to each organization using a combination of approaches: extending it on webinars, via blog posts, and through language included in emails to members. As organizations began to sign up, we noticed that few of them focused on serving specific BIPOC audiences or attenders, so we began doing more direct outreach to BIPOC-serving organizations to personally invite them to participate in the research.

Managing the overwhelming level of interest in participation

These efforts to extend the invitation paid off: sign-ups began slowly but quickly picked up speed. In the first week of the open call for participation, 50 organizations had signed up, followed by 200 more in the second week, and by the time the sign-up deadline came a week after that, an additional 500 organizations had asked to participate. Requests were coming in so rapidly near the deadline that I had to set up an email auto-response because, despite being at my computer all day and night, I couldn’t keep up with the communication. As the dust settled, I realized that in those few weeks I’d received emails from over 750 organizations across all 50 US states, Puerto Rico, Washington, DC, and two Canadian provinces.

Protecting personally-identifiable information from the lists

In our initial conversations about data collection, we planned to have organizations send us their lists so we could remove duplicates and encode some information about respondents (member, subscriber, etc.) in their links. However, many participating organizations told us they had concerns about sharing personally identifiable information, and in some cases they couldn’t legally send that information to another organization. So we adjusted our distribution plan: we shared invitation language and a unique link with each organization so they could pull their own lists and distribute the survey themselves. As the scale of participation mounted, it became clear that cleaning and compiling the lists of hundreds of organizations wouldn’t have been feasible, so we were grateful for the change we made early on in this process!
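To give a flavor of how this can work, here’s a minimal sketch (in Python) of generating one unique, source-tagged link per organization so responses can be attributed without any list sharing; the base URL, query parameter, and organization name are placeholders, not the actual survey platform’s setup, and the real links were generated by our survey tooling.

```python
# Minimal sketch, assuming a hypothetical survey endpoint and "src" parameter:
# each organization gets a unique link it can send to its own list, and responses
# arrive tagged by source rather than by any personally identifiable information.
import secrets

BASE_URL = "https://survey.example.com/cctc"  # placeholder, not the real endpoint

def make_org_link(org_name: str, registry: dict) -> str:
    """Create and record a unique, hard-to-guess link for one organization."""
    token = secrets.token_urlsafe(8)    # random source code identifying this org
    registry[token] = org_name          # the token <-> org mapping stays on our side
    return f"{BASE_URL}?src={token}"    # the org emails this link to its own list

registry = {}
print(make_org_link("Example Art Museum", registry))
```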

Creative solutions to survey length

There were so many questions we wanted to include in our survey, but we had to manage survey length carefully to limit fatigue and reduce drop-out rates. We were also keen to follow up on many of the questions we did include, which would have made the survey quite long, so we tried something new. After about 15 minutes’ worth of questions, we let respondents know that they’d completed the main survey and invited them to answer a few more. They could say no and be finished right then, or they could continue on to the additional questions. More than 75% of the list sample agreed to complete another 3-5 minutes of follow-up questions.
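In survey-logic terms, this amounts to a required core module, an explicit opt-in gate, and an optional follow-up block. Below is a rough Python sketch of that flow; the question lists and the ask/ask_yes_no helpers are illustrative stand-ins, not our actual instrument or survey platform.

```python
# Rough sketch of the opt-in design: a core module (~15 minutes of questions),
# then a clear "you're done, want to keep going?" gate before the follow-up block.
CORE_QUESTIONS = ["core_q1", "core_q2", "core_q3"]        # placeholder questions
FOLLOW_UP_QUESTIONS = ["follow_up_q1", "follow_up_q2"]    # optional 3-5 minute block

def run_survey(ask, ask_yes_no):
    answers = {q: ask(q) for q in CORE_QUESTIONS}
    # Respondents are told the main survey is complete before being offered more.
    if ask_yes_no("You've finished the main survey. Answer a few more questions?"):
        answers.update({q: ask(q) for q in FOLLOW_UP_QUESTIONS})
    return answers

# Example: a respondent who answers everything and opts in to the follow-up.
print(run_survey(ask=lambda q: "answer", ask_yes_no=lambda prompt: True))
```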

Sending out over two million survey invitations

The week of our survey launch, we heard from Matt Dill, Global Director of Client Services at FocusVision, that given the unanticipated number of invitations we had planned, we wouldn’t be able to launch all 2,000,000+ at once without risking server issues. We talked through Matt’s recommended caps on survey invitations (65,000 max per hour) and completes (6,500 max per hour) and started to work out how many hours of invitation time we’d need to cover all participating organizations. We decided to create a five-day launch window, with each participating organization assigned a two-hour time slot on a specific day. I assigned each organization a day and time at random (taking their time zone into account), and anyone who couldn’t make their initially assigned time could use a backup slot on the following Monday or Tuesday. Everyone was incredibly understanding and kind about the last-minute change in plan, and the launch went smoothly.
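For a sense of the arithmetic and the slot assignment, here’s a simplified sketch; the slot times, number of slots per day, and organization names are illustrative, and the real assignment also accounted for time zones and backup slots, which this doesn’t model.

```python
# Simplified sketch: check the total invitation volume against the hourly cap,
# then randomly assign each organization a two-hour slot in a five-day window.
import math
import random

INVITES_PER_HOUR_CAP = 65_000
TOTAL_INVITES = 2_000_000

# At the hourly cap, the full invitation volume needs roughly 31 hours of send time.
min_hours = math.ceil(TOTAL_INVITES / INVITES_PER_HOUR_CAP)
print(f"Minimum hours of sending time: {min_hours}")

# Five launch days with four two-hour slots each (start times are placeholders).
slots = [(day, start) for day in ["Mon", "Tue", "Wed", "Thu", "Fri"]
         for start in ["9am", "11am", "1pm", "3pm"]]

orgs = [f"org_{i}" for i in range(653)]   # stand-ins for participating organizations
random.shuffle(orgs)
assignments = {org: slots[i % len(slots)] for i, org in enumerate(orgs)}
```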

When all was said and done, we had 122,000 responses from 653 organizations and another 2,000 from our AmeriSpeak panel. For more on our findings, check out the reports we’ve published, or play around with our interactive data tool. Thanks again to FocusVision for having us as guests on their webinar.

As always, we’d love to hear your thoughts; send us a note here.

 

Photo: Jen Benoit-Bryan, Madeline Smith, and Matthew Jenetopulos join FocusVision’s Zoe Dowling for a recent webinar about the CCTC research process.
