Lessons Learned from "Managing the Data-Driven Organization"

May 10, 2016

Bill DeBaun, Program Analyst 

Two weeks ago, the second annual Do Good Data conference was held in Chicago. The conference, which is organized by The Impact Lab and Data Analysts for Social Good, brings together analysts, researchers, policymakers, funders, and leaders who are all interested in using data, analytics, and evaluation in the social sector. This isn’t an education-focused conference (although many of the sessions have an education theme); rather, it is a skill- and capacity-building opportunity for those who want to add more tools to their repertoires that they can take back to their organizations and policy sectors.

I was fortunate to attend, along with five NCAN members, for the second year in a row (I certainly recommend it to members). Prior to the conference’s kick-off, I attended a session titled “Managing the Data-Driven Organization,” a topic near and dear to an increasing number of NCAN members. The session was led by Andrew Means, who provided many of the nuggets, takeaways, and pieces of advice I gleaned from that session, which appear below. Many thanks to the attendees in the room whose conversations also provided rich insights into this important topic!

What does it mean to be data-driven?

The phrase “data-driven” gets tossed around a lot, but what does it actually mean? How does it express itself in an organization? Some responses from the crowd:

  • Evidence is routinely gathered, called upon, and listened to when making decisions.
  • “Evidence, hard and soft data, are used to defend decisions and conclusions”
  • “Incorporating the scientific method into an organization to test hypotheses”
  • “Accepting that there are hard decisions to make if the data say you’re doing something wrong”
  • “Organizational curiosity”
  • “If data has never helped you to make a different decision than you would have, you’re not using data well.”
  • “Routineness: [using data is a] routine part of how organizations operate. They actively and routinely go out and find that data and collect it.”

The data-driven process, abbreviated!

  1. Start with a question.
  2. Find the right data.
  3. Bring order and structure to that data.
  4. Create change.

Start with a question.

  1. Actionable > interesting. Don’t spend too much time going down the rabbit hole of questions that are “interesting.” Instead, ask…
  2. “What, if I knew it now, would change my behavior/decisions?” Looking at the weather each morning informs the decision-making around whether or not you need an umbrella. What are the repeatable decisions your organization has to make on a regular basis? How would you optimize those decisions?
  3. Optimizing decisions is rarely easy or quick, and there are resource constraints (time, human capital, technology) that interfere with doing so.
  4. Start somewhere. Document and accumulate wins to build organizational buy-in. Start slow with low-priority or low-stakes questions. Get a few successes (in the form of influencing or changing a decision in a way that otherwise would not have been made without asking the question), and then build on those successes.

Find the right data.

  1. If there is data you need that you’re not already collecting yourself, someone else is likely already collecting it; you just need access to it. There is often too much repeated data collection, which is a waste of organizational resources and a burden on respondents. (It’s unclear to what extent this is true for college access organizations; there is some overlap in service provision, but for the most part organizations seem to need to collect their own data on the students they serve.)
  2. Data collection is often costly, and it’s hard to get right. Be thoughtful about what you’re collecting and how you do so. Sketch out the data points that you need and map them to the instruments/methods with which you’re collecting data; don’t put out a survey or questionnaire only to realize that the items you’re using don’t actually map to the data that you need. (A small sketch of this mapping exercise appears after this list.) It may take some iteration to get it right, so it might make sense to do some user testing with a pilot group before you roll out to the whole population. Also keep in mind that not all questions require a lot of data to answer. Be lean with your collection to keep time costs and respondent burden down.
  3. Keep the cost and benefit of collecting data in mind in general, but especially for frontline staff, who often bear the brunt of that collection. For the time that frontline staff spend collecting data, make sure they’re getting a benefit out of it as well. Is the data they’re collecting being turned into actionable insights or reports they can use? If so, they’re probably more likely to (a) collect good data and (b) be happy about doing so. If not, how can you derive some benefit for them? (Hint: ask them!)
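To make that mapping exercise concrete, here is a minimal sketch in Python (my own, not from the session). The field names, survey items, and wording are all made-up examples; the point is simply to check that every data point you need is covered by some item and that every item you ask earns its place.

```python
# A minimal sketch of the mapping exercise: list the data points you need,
# list the items on your instrument, then check the mapping in both directions.
# All field and question names here are made-up examples, not from the session.
data_points_needed = {
    "fafsa_completed": "Did the student complete the FAFSA?",
    "intended_enrollment": "Where does the student plan to enroll in the fall?",
    "first_gen_status": "Is the student a first-generation college-goer?",
}

# Each survey item maps to the data point it is meant to capture.
survey_items = {
    "Q1": "fafsa_completed",
    "Q2": "intended_enrollment",
    "Q3": "favorite_color",  # collected, but maps to nothing we actually need
}

covered = set(survey_items.values())
missing = set(data_points_needed) - covered   # needed but not collected anywhere
unused = covered - set(data_points_needed)    # collected but not needed

print("Data points not collected by any item:", missing or "none")
print("Items that map to no needed data point:", unused or "none")
```

Even a back-of-the-envelope version of this check, done before a survey goes out, can save a round of re-collection later.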

Bring order and structure to that data.

  1. Take a look (literally) at your data by graphing it. Anscombe’s Quartet tells us that looking only at summary statistics (mean, variance, correlation, etc.) can be misleading; visual representations of data can reveal patterns that statistics cannot. (See the short example after this list.)
  2. Ask simple questions first: How many records are there? What is the shape of the data? What patterns appear?
  3. Classify your data. What groups are there? How can you divide them?
  4. Dashboards track vital signs, but they do not diagnose. The work doesn’t end with the creation of a dashboard.
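As a quick illustration of the Anscombe’s Quartet point above, here is a small Python sketch (mine, not from the session) using the standard published quartet values. The summary statistics come out nearly identical for all four datasets, even though scatter plots of the same data look completely different.

```python
# A quick check of the Anscombe's Quartet point: four datasets with nearly
# identical summary statistics but very different shapes when plotted.
# Values are the standard published quartet (Anscombe, 1973).
import pandas as pd

x = [10, 8, 13, 9, 11, 14, 6, 4, 12, 7, 5]
quartet = {
    "I": pd.DataFrame({"x": x, "y": [8.04, 6.95, 7.58, 8.81, 8.33, 9.96,
                                     7.24, 4.26, 10.84, 4.82, 5.68]}),
    "II": pd.DataFrame({"x": x, "y": [9.14, 8.14, 8.74, 8.77, 9.26, 8.10,
                                      6.13, 3.10, 9.13, 7.26, 4.74]}),
    "III": pd.DataFrame({"x": x, "y": [7.46, 6.77, 12.74, 7.11, 7.81, 8.84,
                                       6.08, 5.39, 8.15, 6.42, 5.73]}),
    "IV": pd.DataFrame({"x": [8, 8, 8, 8, 8, 8, 8, 19, 8, 8, 8],
                        "y": [6.58, 5.76, 7.71, 8.84, 8.47, 7.04,
                              5.25, 12.50, 5.56, 7.91, 6.89]}),
}

for name, df in quartet.items():
    # The means, variances, and correlation are nearly identical across
    # all four datasets...
    print(name,
          "mean x:", round(df["x"].mean(), 2),
          "mean y:", round(df["y"].mean(), 2),
          "var x:", round(df["x"].var(), 2),
          "var y:", round(df["y"].var(), 2),
          "corr:", round(df["x"].corr(df["y"]), 3))

# ...but plotting each dataset (e.g., df.plot.scatter(x="x", y="y")) reveals
# four completely different patterns: a linear trend, a curve, and two
# datasets dominated by a single outlier.
```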

Create change.

  1. Identify where you are, including the political climate around data, organizational history, and context.
  2. What would a data-driven version of your organization look like? Be specific. Where are you going?
  3. Identify roadblocks. What are the benefits and costs to change?
  4. Develop a clear plan and path forward. (This is going to change constantly.)
  5. Execute your plan and iterate based on what you learned.
  6. People often see only the downsides of sharing data and being held responsible for findings; make sure there are visible successes, too.
  7. Remember that change in most organizations happens incrementally. Do not let the perfect be the enemy of the good or useful. The goal should be to do a better job than you did last time.
  8. Work toward using data in a way that changes behavior or actions. The example given during the presentation was using data to identify which students are more likely to drop out than others; that’s the question. The resulting action was an email listing the 10 students most likely to drop out; that’s the actionable outcome.
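To show what that kind of actionable output might look like in practice, here is a hypothetical Python sketch. The file name, column names, and the idea of a precomputed risk score are assumptions for illustration; the presentation only described the general idea of turning a predictive question into a short list someone can act on.

```python
# A hypothetical sketch of the "top 10 students" output described above.
# The file name, column names, and precomputed risk_score (assumed to be a
# probability between 0 and 1) are illustrative assumptions, not details
# from the presentation.
import pandas as pd

students = pd.read_csv("students_with_risk_scores.csv")  # assumed input file

# Rank students by estimated dropout risk and keep the ten highest.
top_10 = (
    students.sort_values("risk_score", ascending=False)
            .head(10)[["student_id", "name", "risk_score"]]
)

# Turn the table into something a person can act on, e.g., an email body
# an advisor could read and follow up on today.
email_body = "\n".join(
    f"{row['name']} (ID {row['student_id']}): estimated risk {row['risk_score']:.0%}"
    for _, row in top_10.iterrows()
)
print("Students most likely to drop out this term:\n" + email_body)
```

However the risk score is produced, the key move is the last step: the analysis ends as a short, concrete list that changes what someone does next week.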

Some closing perspectives on data.

  • Think of data as a raw resource that can be gathered, but it’s only truly useful when it’s tempered or processed.
  • Think about data as a raw block of stone. Your organization needs to invest the time to chip away at it, determining the most valuable questions to ask in order to gain insight.
  • “Data is a team sport. You need a lot of people to make it work.”
  • “Paralysis by analysis – don’t let lack of data stall your work.”
  • “The trend in data is toward measuring impact; it’s relentless.”
  • Do your best to avoid the mindset of “I don’t want to look because I don’t want to know.” No organization can grow that way.

This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivs 3.0 Unported License