Sometimes I get bogged down in the technical details of learning to program in R and Python and lose sight of the endgame. Today I went down a little Google rabbit hole of “business analytics 101” top clicks, because I wanted to gain some perspective on the full lifecycle of data for making business decisions.
I found this blogpost, which is basically a mini-course-lecture on Business Analytics in the Current Moment: https://practicalanalytics.co/bianalytics-basics/
- The data supply chain: “Raw Data -> Aggregated Data -> Intelligence -> Insights -> Decisions -> Operational Impact -> Financial Outcomes -> Value creation.”
- There is a race to turn this “big data” into a personalized, comprehensive portrait of a consumer, customer, prospect, or visitor.
- Pattern detection, personalization, visualization, and test-and-learn experimentation via rapid prototyping are the new norm.
- A whole new form of self-service consumer engagement enabled by analytics (discover, analyze, visualize, explore) is just starting to take shape.
- World-class companies excel because they’ve made tough decisions about which analytical processes they must execute well, and they’ve implemented the platforms needed to streamline those processes. The result? Their platforms have become an asset rather than a cost, and tend to foster new experiences for customers.
- I’m looking at you, Tessitura 😡
- And finally, some tools to hone:
Business Analytics 101
Loved this article about analyzing text data from the beginning of a relationship to marriage 6 years later:
I’m not a huge fan of wordclouds, but I think these are actually somewhat useful. Maybe it’s the color schemes or the scaling — but I find these easy to read and understand.
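If you’d rather skip the wordcloud entirely, a plain frequency table gets you the same information with none of the font-scaling guesswork. A minimal sketch in Python; the sample messages and the stopword list are made up for illustration, not taken from the article:

```python
import re
from collections import Counter

# Illustrative stopword list -- a real analysis would use a fuller one.
STOPWORDS = frozenset({"the", "a", "an", "and", "to", "of", "i", "you"})

def top_words(text, k=3):
    """Return the k most common non-stopword tokens in `text`."""
    tokens = re.findall(r"[a-z']+", text.lower())
    return Counter(t for t in tokens if t not in STOPWORDS).most_common(k)

# Made-up sample messages, standing in for the article's text-message corpus.
messages = "Miss you already! Love you. Dinner tonight? Love you too."
print(top_words(messages))
```

A sorted frequency table preserves the exact ordering and magnitudes that a wordcloud’s scaling tends to obscure.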
More data-centric podcasts (if I ever have the time!)
How do we make meaning of the data we’ve collected?
SFS (SF Shakespeare Festival, an arts org) found a gap in their demographics – their audience did not match the census. They found the largest gap within a community that was already doing well (Cupertino) – Asian Americans. And they set out to close it.
- “The numbers surfaced something our unconscious bias had played into”
- Goal = audience that is representative of the community
- Narrowed down the very large Asian category and found that among Santa Clara Asians more than half self-identify as either Chinese or Asian Indian
Designed targeted research to explain what’s going on in that gap.
Hypothesis: “There are identifiable barriers that prevent Chinese and Asian Indian communities from participating at SF Shakes”
- Hypothesis allows you to MAKE MEANING OUT OF DATA! Helps when you feel overwhelmed by the raw data.
- Surveys (July – August)
- Instrument developed with help of Arts Research firm
- Targeted all attendees for the entire run
- Interviews (October – February)
- Allies and their networks
- Started with 3 people they knew would be good, and at the end of each interview asked who the next person they should talk to is (I love this idea)
- Focus groups (September – October)
- “Users” (target community members who came to a show)
- “Non-users” (target community members who’ve never come to a show)
All of their data was overwhelming – a whole lot to organize. What you can always do is go back to your hypothesis.
Is a 4% difference something real and actionable? How about a 10% difference?
- Before acting on a gap, look at its statistical significance.
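Whether a 4-point gap is “real” depends on the sample size as much as on the gap itself. A minimal one-sample proportion z-test sketched in Python, with made-up survey counts and a hypothetical 19% census benchmark (not SFS’s actual numbers):

```python
from math import sqrt, erf

def one_prop_ztest(successes, n, p0):
    """Two-sided z-test: does an observed share differ from a benchmark p0?"""
    phat = successes / n
    se = sqrt(p0 * (1 - p0) / n)          # standard error under the null
    z = (phat - p0) / se
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))  # 2 * (1 - Phi(|z|))
    return z, p_value

# Hypothetical numbers: 15% of surveyed attendees vs. a 19% census benchmark.
_, p_small = one_prop_ztest(30, 200, 0.19)    # 4-point gap, 200 surveys
_, p_large = one_prop_ztest(300, 2000, 0.19)  # same 4-point gap, 2,000 surveys
print(f"n=200: p={p_small:.3f}   n=2000: p={p_large:.2e}")
```

With these illustrative numbers, the same 4-point gap is indistinguishable from noise at 200 surveys (p ≈ 0.15) but clearly significant at 2,000 – which is exactly why the sample size matters before you act.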
How do you make sense out of all of this?
- We use data to anchor the conversation, rather than replace it with “answers”
- We hoped we’d find some specific barrier we could remove. But as experienced arts administrators, we know those sorts of things don’t exist
- We try not to focus on single data points, but on a bigger picture that emerged from those data points
- We don’t rush to conclusions; when conclusions did emerge, we looked back in search of data that did and did not support them
“We are ruminators. We are our own prosecution, our own defense.”
Reflecting on the whole picture that had emerged, we asked what we could do differently. We threw things on a whiteboard and asked “What do we have time and capacity to do?” What might make a difference, based on what barriers arose from the data?
- Advance information –> program enhancements, more program material online in advance of the show, move the Green Show (education show) to the main stage
- Availability of food –> food truck
- Feeling welcome, included, invited –> volunteers/front of house, signage
Next, they needed to see if these enhancements DID make a difference. So, SFS started the next phase of their research with two initial guiding questions.
- What’s the best way to measure the new activities as they relate to engagement or creating a welcoming environment?
- Even though we won’t have a baseline of activity, is there a way to determine a measure of success of these activities?
- Define Research Questions
- Choose methodology
- Write instruments & analytical plan
- Define hypotheses, think about the final analysis
- Sampling/recruiting criteria
- Recruit participants
- Pilot instruments
- Collect data
- Analyze Data
- Create a strong Analytical Plan before the data comes back: a research question, a specific hypothesis, a specific plan for analysis, and the kinds of conversations or final product you plan to have afterward
- Write up findings: responding to the hypotheses, documenting your findings, reporting your findings
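One way to keep yourself honest about “deciding the analysis before the data comes back” is to write the plan down as a structured record you can check for completeness. A hedged sketch – every field and its wording here is illustrative, not SFS’s actual plan:

```python
# Illustrative only: a pre-registered analytical plan captured as data,
# so the analysis is committed to before any results arrive.
analytical_plan = {
    "research_question": "Did the new amenities create a more welcoming environment?",
    "hypothesis": "Post-enhancement 'felt welcome' survey scores exceed a chosen benchmark",
    "methodology": "post-show surveys plus follow-up interviews",
    "analysis": "compare 'felt welcome' scores against the benchmark; report effect size",
    "deliverable": "written findings that respond directly to the hypothesis",
}

# The plan is only usable if every piece is filled in before fielding the survey.
required = {"research_question", "hypothesis", "methodology", "analysis", "deliverable"}
missing = required - analytical_plan.keys()
print("plan complete" if not missing else f"missing: {missing}")
```

The point isn’t the dictionary itself – it’s that an empty or missing field is visible before data collection starts, instead of being discovered during write-up.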
The question we had was simple: can we predict where people look when exposed to a dashboard they’ve never seen before?
Great article about designing an effective dashboard based on some data gathered about viewing patterns: https://www.tableau.com/about/blog/2017/6/eye-tracking-study-5-key-learnings-data-designers-everywhere-72395
Aspects to look for in a BI tool: https://www.tableau.com/resource/checklist-6-must-haves-your-advanced-analytics?