How do we make meaning of the data we’ve collected?
SFS (arts org) found a gap in their demographics: their audience did not match the census. The largest gap was within the community that was already doing well (Cupertino), among Asian Americans. And they set out to close it.
- “The numbers surfaced something our unconscious bias had played into”
- Goal = audience that is representative of the community
- Narrowed down the very large “Asian” category and found that, among Santa Clara Asians, more than half self-identify as either Chinese or Asian Indian
Designed targeted research to explain what’s going on in that gap.
Hypothesis: “There are identifiable barriers that prevent Chinese and Asian Indian communities from participating at SF Shakes”
- Hypothesis allows you to MAKE MEANING OUT OF DATA! Helps when you feel overwhelmed by the raw data.
- Surveys (July – August)
- Instrument developed with help of Arts Research firm
- Targeted all attendees for the entire run
- Interviews (October – February)
- Allies and their networks
- Started with 3 people they knew were good interviewees, and at the end of each interview asked who the next person they should talk to would be (I love this idea)
- Focus groups (September – October)
- “Users” (target community members who came to a show)
- “Non-users” (target community members who’ve never come to a show)
All of their data was overwhelming; it was a whole lot to organize. When that happens, what you can always do is go back to your hypothesis.
Is a 4% difference something real and actionable? How about a 10% difference?
- To decide, you want to look at statistical significance.
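Whether a gap like 4% or 10% is “real” depends on how many people you surveyed. A minimal sketch of a two-proportion z-test, the standard way to check a gap like this (all numbers here are hypothetical, not SFS’s actual figures):

```python
import math

def two_proportion_z(p1, n1, p2, n2):
    """Two-proportion z-test: is the gap between two sample
    proportions larger than chance alone would explain?"""
    # Pooled proportion under the null hypothesis of no real difference
    pooled = (p1 * n1 + p2 * n2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se

# Hypothetical: 16% of 400 surveyed attendees vs. 26% of 1,000 census respondents
z = two_proportion_z(0.16, 400, 0.26, 1000)
print(f"z = {z:.2f}, significant: {abs(z) > 1.96}")  # |z| > 1.96 => p < .05

# The same 10-point gap with small samples is NOT significant
z_small = two_proportion_z(0.16, 30, 0.26, 50)
print(f"z = {z_small:.2f}, significant: {abs(z_small) > 1.96}")
```

The point: the same percentage gap can be strong evidence or pure noise depending on sample size, which is why “is 4% real?” has no answer without the counts behind it.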
How do you make sense out of all of this?
- We use data to anchor the conversation, rather than replace it with “answers”
- We hoped we’d find some specific barrier we could remove. But as experienced arts administrators, we know those sorts of things don’t exist
- We try not to focus on single data points, but on a bigger picture that emerged from those data points
- We don’t rush to conclusions and when conclusions did emerge, we looked back in search of data that did and did not support those conclusions
“We are ruminators. We are our own prosecution, our own defense.”
Reflecting on the whole picture that had emerged, we asked what we could do differently. We threw things on a whiteboard and asked “What do we have time and capacity to do?” What might make a difference, based on what barriers arose from the data?
- Advance information –> program enhancements, more program material online in advance of the show, move the Green Show (education show) to main stage
- Availability of food –> food truck
- Feeling welcome, included, invited –> volunteers/front of house, signage
Next, they needed to see whether these enhancements DID make a difference. So SFS started the next phase of their research with two initial guiding questions.
- What’s the best way to measure the new activities as they relate to engagement or creating a welcoming environment?
- Even though we won’t have a baseline of activity, is there a way to determine a measure of success of these activities?
- Define Research Questions
- Choose methodology
- Write instruments & analytical plan
- Define hypotheses, think about the final analysis
- Sampling/recruiting criteria
- Recruit participants
- Pilot instruments
- Collect data
- Analyze Data
- Create a strong Analytical Plan before the data comes back: a research question, a specific hypothesis, a specific plan for analysis, and the kinds of conversations or final product you plan to produce afterward
- Write up findings: responding to the hypotheses, documenting your findings, reporting your findings