Data Talk: When Is It Time for a New Data System?

January 15, 2015

Bill DeBaun, Program Analyst

Inertia, the tendency of objects to stay in motion (or at rest) unless acted upon by another force, is well known as Newton’s first law of motion. It’s also a powerful force in many facets of most organizations. Organizations tend to do things a certain way because they have done them that way for a while. “But we’ve always done it that way!” is something more than a few coordinators and advisors have surely said while throwing their hands up in the air. Inertia can be good. For instance, policies and practices that are efficient and successful are often worth keeping (though a culture of self-examination is valuable too; that’s a topic for another day). Sometimes, however, inertia can be bad, namely when you or your organization have done something a certain way for a long time but haven’t reaped benefits from it in a while.

Recently, in the Common Measures Learning Community, we had a question about whether it was time for a member to consider a new data platform and how to know whether that time had come. It’s a tough question. These platforms, which store, manage, and analyze student data, are often ingrained in an organization’s processes and practices. When they work well, they’re invaluable and as trustworthy as an old friend. When they don’t, they’re the uncle you keep inviting to holidays out of a sense of obligation.

Many CMLC members chimed in with their takes on how to know when it was time to move on to a new platform. Among these replies came one from Will Monin, Chief Technology Officer at NCAN member Degrees of Change in Tacoma, WA. His advice was both detailed and well reasoned. Read it below while thinking about data management at your organization. If you come away thinking that your system is more of an annoyance than an asset, it may be time to shake things up. Will’s advice is presented here:

“Objectively assessing the scope of the problem isn't rocket science. There are several axes on which you can measure capability, even if it’s just categorizing your system in a high / medium / low model.

For example:

  • Functional requirement gaps. What necessary tasks is the system not capable of performing? This is usually the best place to begin. Simply asking your peers if there’s something they wish their software package could do for them will often provide good examples of these gaps. Also, "capable" means "can the average user, with average training, complete the task in an acceptable timeframe?" If it requires ninjas and miracles to get work done, that's not "capable."

  • Functional adaptability. It’s not unusual for delivered systems to miss important functional requirements. What's important is whether the system is sufficiently malleable to get where you want to go. This is typically a question for a software person, but you can get a decent handle on it just by comparing what your system can already do well to what you need and looking for similarities. You can also evaluate it by how hard it is to make changes (i.e., do you need a specialist every time, or just for big changes?).

  • Organizational readiness and support. Does the team know how to use what's been built, and are the team members committed to embracing the changes that using the system will force upon them?

  • Operational costs vs. cost of transition. Some software is expensive to keep alive, so retiring it may make good financial sense. More frequently, this question will combine with your assessment of Functional Adaptability and lead to determining whether to build the needed functionality inside your existing core system versus building (and perhaps integrating) an adjunct tool to address the Functional Gaps.

  • Consider the cost of doing nothing. Quite often, the functionality of available tools defines how work is done. Less capable tools push more work back onto your team. When this is low-value or repetitive work, you're using manpower that is usually scarce for a job that a computer is much better at. Asking yourself if this is a choice you would make again can be very illuminating.

  • Consider the opportunity cost. Beyond estimating the number of hours spent, consider the opportunity cost of those hours. Are there more important projects that could be accomplished if you could free up the staff? Would recruiting volunteers be easier if they didn't have to do "that job"? Would getting a faster or more accurate answer to a complex question give your team the ability to be more effective or efficient?

If your tools are holding your team back, it’s time to rethink. Revamping or replacing software may prove to be easy to justify when strategic costs are considered.”
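The high / medium / low model Will mentions at the top of his advice can be turned into a simple scoring rubric. The sketch below is purely illustrative; the axis names, weights, and thresholds are assumptions for demonstration, not part of his advice or any formal methodology:

```python
# Illustrative sketch of a high/medium/low assessment rubric.
# Axis names and cutoffs are hypothetical, chosen only to show the idea.

SCORES = {"high": 2, "medium": 1, "low": 0}

def assess(ratings):
    """Sum high/medium/low ratings across axes and suggest a rough direction."""
    total = sum(SCORES[r] for r in ratings.values())
    max_total = 2 * len(ratings)  # best possible score
    if total <= max_total // 3:
        return "system is holding the team back; consider replacing it"
    if total <= 2 * max_total // 3:
        return "significant gaps; weigh adapting the core system vs. adjunct tools"
    return "system is serving you well; keep and refine"

# Example ratings for a hypothetical organization:
ratings = {
    "functional fit": "low",        # necessary tasks the system can't perform
    "adaptability": "medium",       # changes require a specialist
    "team readiness": "high",
    "cost to operate": "medium",
    "cost of doing nothing": "low", # staff time lost to manual, repetitive work
}
print(assess(ratings))
```

Even a back-of-the-envelope rubric like this can make the conversation concrete: it forces the team to rate each axis explicitly rather than argue from general frustration.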

Degrees of Change is working through the hard questions around up-front cost versus long-term benefit as it develops tools to make its program more scalable and its affiliates more efficient. You can contact Will to discuss this further.

Continue to follow NCAN’s blog as we offer up more advice from members on data-related topics!



This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivs 3.0 Unported License.