Are We Setting the Bar Too High on Telling Stories with Data?

Rigorous data gathering and analysis can get in the way of effective storytelling.

I recently co-facilitated the “Impact Leadership Track” at the NTEN Leading Change Summit with John Kenyon, Elissa Perry, and Londell Jackson. Our track was one of three where participants could take a deep dive into a topic and learn from peers through dialogue. The event also included plenary speakers, including a provocative talk about storytelling with data from Alexandra Samuel.

Her most controversial point was:

“Rigorous data gathering and analysis can get in the way of effective storytelling by non-profits.”

Beth Kanter

Beth Kanter has over 30 years of experience in the nonprofit sector, spanning technology, training, capacity building, evaluation, fundraising, and marketing. Beth is an internationally recognized trainer who has developed and implemented effective sector capacity building programs that help organizations integrate social media, network building, and relationship marketing best practices.

She was speaking to a room filled with nonprofit leaders who have the power to use data as part of communications campaigns to raise awareness or inspire people to take action on important causes. Yet many are held back because they lack the resources or skills. In her talk, Alex encouraged the audience to discern between scenarios where rigorous, objective research using scientific methods is needed and those where they can simply tell an interesting or compelling story with data. She created a 2×2 framework to help nonprofits think through which level of rigor in research methods is the best fit for their quantitative data storytelling.

As Alex notes, in an ideal world, nonprofits would have access to high-quality data collected through rigorous methods and to professional data scientists to design, implement, and analyze it. However, this isn’t always the reality. As Alex points out, “Sometimes you can get the results you need (like inbound traffic generated by an eye-catching infographic) by sharing data that is useful and interesting, if imperfect. That’s why I think it’s time for organizations to get comfortable doing at least some of their data work in the ‘relaxed’ zone, because that is better than missing out on doing any data storytelling at all.”

This idea upset a few colleagues, especially those trained in rigorous scientific methods, like Deborah Finn, who strongly disagreed with Alex’s points about relaxed data.

“I don’t think we should encourage nonprofit professionals to be more relaxed about how they use data to tell their stories; I think that #nptech professionals should be concentrating on how we can deliver training (and other forms of capacity building) that will assist them in telling their stories with valid, reliable information that will withstand scrutiny. These nonprofit professionals don’t need graduate degrees in statistics, social research methods, and database administration – they need something that is pared down to the skills they will use in their daily work lives.”

I interpreted “relaxed” as still generating quality data, just not at the social science academic standard – and that is what nonprofits who are not professional measurement/data geeks could learn how to do. Perhaps Alex’s 2×2 framework could be a useful instructional device to help nonprofits decide a) when we can do it ourselves (e.g., get training for that) or b) when we need a more rigorous approach. If the latter, they would have enough understanding of the methods to find the right data scientist/geek and be able to work effectively with them. I don’t think Alex is advocating for “use crappy data to make your point.”

Deborah is concerned that the permission to “relax” will encourage sloppiness in data research methods, although we all agreed that nonprofits do not need to do the kind of quantitative analysis that would pass peer review for an academic journal.

Alex shared that her reason for advocating that nonprofits relax a little is that data projects can be very costly and time-consuming. She hopes that nonprofits can think about ways to do data work “that is more focused, economical and yes, relaxed, as long as they are still putting forth information they believe is accurate. What I am trying to do is to encourage people to rethink their risk/reward calculus, though how that lands will depend on how high they had set the bar to begin with. From what I observe, it seems like folks may be setting the bar too high, because I can’t think of any other explanation for why more nonprofits aren’t seizing the opportunity to tell their stories with data. When so much of the data-driven stories online are produced by large media companies and other companies with deep pockets, it can seem very daunting for smaller organizations to try their hand at the same thing. Not everyone is Jawbone or the New York Times or Nielsen. That doesn’t mean we can’t use data, and use it effectively, if more modestly.”

Cross-posted with permission from Beth Kanter.




Natasha Freidus replied to the author

Provocative article, thank you! Here’s the thing, though: I’d challenge the premise that we use data to tell stories. While of course that can be true, it overlooks the fact that stories are data in and of themselves. It’s important to differentiate here between quantitative data and qualitative data. Too often we only value the “hard data,” so one idea for nonprofits is to think of ways to bring together both quantitative data and narrative data to make our case. We’re exploring this now with “StoryMapping,” where users can embed digital stories in GIS data sets. We have samples up here:
With that in mind, I’d love to think through how we can reframe the question you ask here: “How has your organization used data to tell stories?”

Thanks for sharing this, Natasha – it looks like a really interesting project. We are looking to do something similar late next year as part of a project exploring the impact of technology on work opportunities in urban slum areas, collecting micro-narratives and mapping them alongside other digital data to be scraped and aggregated across our study cities.
Is your tool open-source, for others to build upon and/or adapt for other needs?