NCAN 2019 Conference Roundup: Using Data to Improve Outcomes and Scale Capacity

Monday, September 23, 2019  
Posted by: Bill DeBaun, Director of Data and Evaluation

"Our commitment to equity is our insistence on excellence," Steve Colón, CEO of NCAN member Bottom Line and vice president of the NCAN board of directors, exhorted the Tuesday afternoon plenary crowd at our national conference last week. Colón went on to demand that NCAN members not be satisfied with moving their students’ postsecondary outcomes to the national average. Instead, Colón insisted members focus on verifiable impacts and practices with demonstrable returns that push students’ outcomes to those matching their more privileged peers.

Colón's speech, one of three on what it will really take to achieve equity, was one of the conference's most powerful moments. His remarks spoke to the heart of an important idea: The data work advanced by NCAN and its members matters because it aims to collect evidence, demonstrate impact, and pursue equitable and just outcomes for students.

A few years ago, data at the national conference was largely confined to a handful of sessions attended by a similarly small group of logic model adherents and evaluation supporters. Fast forward to 2019, and research, data, and evaluation topics infused a multitude of sessions and plenaries, some more overtly than others.

This year, pre-conference sessions on FAFSA completion and expanding members' postsecondary success work both focused substantially on data collection, cleaning, and reporting (to both program stakeholders and funders). With more data available and member capacity in this area steadily increasing, there's little reason not to weave data into these important initiatives.

The Data Learning Community session on Monday morning was standing room only for the second year in a row (a point of pride for your author). Nearly 70 attendees came to hear more about NCAN’s research goals:

  • Connecting research to practice by sharing new literature and disseminating research-backed best practices.
  • Helping organizations measure their impact by releasing benchmarks for the field and capturing organizational strategies and practices.
  • Providing technical assistance by advising on organizational research and evaluation strategy and connecting peer organizations.

Attendees also shared their thoughts and resources around topics like administering a senior survey, getting the most out of the National Student Clearinghouse StudentTracker, and their appetite for assistance related to planning for or executing program evaluations.

The learning community session was a harbinger of things to come at the conference.

A full “Data Into Practice” conference track featured sessions on using predictive analytics to guide students, on K-12 and higher education partnerships whose lifeblood is data sharing and collaboration, and on program evaluation.

One session in particular highlighted the best of what data at the national conference can be. Co-led by iMentor's Jim Lauckhardt and Max Polaner and OneGoal's Keith Zander, the session focused on defining and measuring process and outcome metrics. Process metrics (“outputs” for readers familiar with logic modeling) measure the direct results of a program's activities; outcome metrics measure the longer-term results to which those outputs lead. For example, the number of advising sessions a program delivers is a process metric, while its students' eventual college enrollment rate is an outcome metric. The distinction can be subtle, but it is critical for understanding what a program should be looking for and can reasonably expect from its hard work.

This session encapsulated the fact that no matter how much or how little experience NCAN conference attendees had with data, there was something to learn about programmatic processes and priorities. It's an apt reflection of where member programs are overall.

For its part, NCAN's data-related work in the past year has focused on products in three main categories: benchmarking and outcomes (e.g., results from the Benchmarking Project and the Impact Survey Project); policy (“The Growing Gap” white paper on the affordability of public higher education institutions, the Form Your Future FAFSA Tracker, and analyses of verification melt and of the association between FAFSA completion and enrollment); and member practices (“Keys to Success,” which investigated the nature of members' postsecondary success practices, and “Data Usage in College Access and Success,” which reported survey results on members' practices around data).

All of these resources and presentations connect directly to Steve Colón's charge. NCAN's calls for increased (and improved) data usage among members have never been about data for data's sake. Instead, they have been about improving outcomes and scaling capacity. The field needs to make significant progress in both of these areas to achieve equitable outcomes for the low-income, first-generation students, many of them students of color, whom we collectively serve.

When our field demonstrates excellence for students by using demonstrably excellent approaches, it will be delivering on its commitment to equity. Colón's point is one that should stick with college access and success stakeholders for some time as they consider their own usage of data, research, and effective practice.