Moving the Needle in Higher Ed: Data, AI, and the Student-First Philosophy
- Published on: March 9, 2026
- Updated on: March 9, 2026
- Reading Time: 5 mins
Analytics. Artificial Intelligence. Automation. Navigating this new world of EdTech isn’t easy, especially when every vendor is announcing their latest “upgrade”. But how do you wade through the noise to find meaningful impact? How can tech really move the needle for your institution?
Thomas Cavanagh, Vice Provost for Digital Learning at the University of Central Florida (UCF), has spent 17 years at the university focused on instructional design and educational technology. He joined the Tech in EdTech podcast to discuss everything from “toxic course combinations” to AI Teaching Assistants and what leaders can learn from UCF’s initiatives. Here are four major lessons on moving the needle in Higher Ed today:
Create a Culture of “Data Curiosity”
“Data doesn’t inform decisions unless you have the culture to act on it.”
As universities and colleges collect more data than ever, ironically, the biggest hurdle to modernization can be the school culture behind the technology. Data can live in all kinds of “buckets” at an institution: your Learning Management System (LMS), your institutional research department, and so on. But if you’re not going to act on it…why collect it at all?
Data for Data’s Sake Is Just as Bad as No Data
On one end of the spectrum, some institutions have all the data they could want but simply don’t do anything with it. On the other end, schools don’t even know what data they could be collecting! The best place to fall on that spectrum? Somewhere data equals action.
For example, Cavanagh’s team discovered that they could use historical student performance data to identify what they call “toxic course combinations.” Say you have two courses, “Course X” and “Course Y.” Taken in separate semesters, students pass both and their GPAs stay healthy. Taken together in the same semester, those GPAs tank. By applying simple data analytics to historical grades, advisors can now pinpoint which students have registered for these combinations and alert them ahead of time that they may want to reconsider their schedules.
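As a rough illustration of what that analysis might look like, here is a minimal sketch that scans a hypothetical enrollments.csv (with student_id, term, course_id, and grade_points columns) for course pairings associated with lower term GPAs. The columns, thresholds, and approach are assumptions for illustration, not UCF’s actual pipeline.

```python
# A minimal sketch of the "toxic course combination" idea, assuming a
# hypothetical enrollments.csv with student_id, term, course_id, and
# grade_points columns. Illustrative only, not UCF's actual model.
from itertools import combinations

import pandas as pd

enrollments = pd.read_csv("enrollments.csv")

# Average grade points per student per term (a rough term GPA).
term_gpa = (
    enrollments.groupby(["student_id", "term"])["grade_points"]
    .mean()
    .rename("term_gpa")
    .reset_index()
)

# Every pair of courses a student took in the same term.
def same_term_pairs(group: pd.DataFrame) -> pd.DataFrame:
    return pd.DataFrame(
        combinations(sorted(group["course_id"].unique()), 2),
        columns=["course_a", "course_b"],
    )

pairs = (
    enrollments.groupby(["student_id", "term"])
    .apply(same_term_pairs)
    .reset_index()[["student_id", "term", "course_a", "course_b"]]
    .merge(term_gpa, on=["student_id", "term"])
)

# Pairings whose average term GPA falls well below the overall average
# are candidate "toxic combinations" worth a closer look by advisors.
overall = term_gpa["term_gpa"].mean()
flagged = (
    pairs.groupby(["course_a", "course_b"])["term_gpa"]
    .agg(["mean", "count"])
    .query("count >= 30")  # ignore pairings with too little history
    .assign(gpa_gap=lambda d: d["mean"] - overall)
    .sort_values("gpa_gap")
)
print(flagged.head(10))
```

A real implementation would also control for confounders such as credit load and modality before acting on a flagged pairing (see the FAQs below).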
Of course, these “buckets” of data can’t just sit in their respective divisions. At UCF, a Chief Data Officer and data governance councils help “make sure those buckets are talking to each other.” From there, decisions about data are coordinated at the highest levels of the university to avoid dirty data and duplicated effort.
Stop Measuring for Measuring’s Sake: Kill “Adoption”
Asked what metric he advises college leaders to retire, Thomas said he would kill “adoption,” as in the in-class adoption numbers for the latest tech platform. Sure, EdTech companies want you to know how many people are using their product, but does that really tell you how well it’s working?
Focus on Impactful Metrics
Does your technology actually improve student engagement and retention? Are students graduating faster? Did they improve their grade in a course as a result of the intervention? What matters is what you do with the tech, not how many people are using it.
Cavanagh recommends looking at your LMS as your “sharpest stick,” so to speak. As your front-line tool that students and faculty interact with daily, your LMS tracks real-time data that can alert you to intervene much earlier.
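To make that concrete, here is a minimal, hypothetical sketch of turning a few LMS activity signals into an early outreach flag. The field names and thresholds are assumptions for illustration, not any particular LMS vendor’s API.

```python
# A minimal sketch of routing hypothetical LMS signals to human outreach.
# Field names and thresholds are illustrative assumptions.
from dataclasses import dataclass


@dataclass
class LmsSignal:
    student_id: str
    days_since_login: int
    missed_assignments: int
    low_stakes_failures: int


def outreach_reasons(signal: LmsSignal) -> list[str]:
    """Return the reasons a student should get a supportive check-in."""
    reasons = []
    if signal.days_since_login >= 7:
        reasons.append("no course activity in the past week")
    if signal.missed_assignments >= 2:
        reasons.append("two or more missed submissions")
    if signal.low_stakes_failures >= 3:
        reasons.append("repeated struggles on low-stakes quizzes")
    return reasons


# Flagged students are routed to an advisor, not just logged on a dashboard.
for s in [LmsSignal("s001", 9, 1, 0), LmsSignal("s002", 2, 3, 4)]:
    if reasons := outreach_reasons(s):
        print(f"Contact {s.student_id}: " + "; ".join(reasons))
```

The specific thresholds matter less than the design choice: every flag ends in a human conversation rather than a report.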
Collect with Caution: Ethics and “The Hoarder’s Dilemma”
“One of the mantras I live by is that uncollected data cannot be examined or analyzed.” But with great collection power comes great responsibility. Cavanagh stressed that institutions should always be transparent about what data they collect.
Offer Transparency
Tell students and faculty what your school collects as part of your onboarding practices. Make it public on your website. Give students and faculty the option to opt out of certain data collection (where legally possible).
Surprisingly, when UCF studied how students feel about their data being collected, it turned out they largely already know it’s happening and don’t really mind. As long as you’re collecting data to make their lives better and more successful, students are cool with it. Faculty, on the other hand, are less trusting about data collected on themselves, but generally support collecting data on students to improve their experiences.
Artificial Intelligence: Preparing Students & Avoiding “The Black Box”
Thomas believes K-12 and higher education institutions have both an opportunity and a responsibility to prepare students to interpret and use AI as they enter the workforce. Many white-collar careers won’t even consider hiring someone who doesn’t know how to use AI as a tool.
Teach Students to Learn with AI…and How to Be Ethical
This can take many forms, but at a baseline it should include teaching students not to pass off AI-generated work as their own and how to check citations to make sure they aren’t AI-generated “hallucinations.”
Avoid the “Black Box” Syndrome
AI can write your course outline. AI can write your multiple-choice test questions. But AI should not write the test, deliver it, and grade it without a human in the loop. That’s unfair to students. As Thomas puts it, we have to remember that AI does not have empathy. It doesn’t understand that this stuff matters.
Teaching students how to use AI effectively and ethically is the only way to avoid this “black box” syndrome.
Put Faculty First: They Drive Innovation
“It doesn’t matter how shiny your technology is. If your faculty isn’t empowered, it won’t move the needle.”
This advice was true when EdTech started twenty years ago, and it remains true today. In fact, Cavanagh shared two major ways UCF supports its faculty, both of which could empower your teachers faster than anything else.
Pay Them
Simple as that. Pay faculty for their time so they can further their own education on how to become better instructors. When you pay your faculty to sit in a professional development webinar, it centers professional development as a part of their career, not a chore they have to do.
Make Course Modules Consistent
This strategy is specific to online and blended learning. By making your modules consistent (for example, everything due by midnight on Sunday), you instantly help online students, who are typically busier than traditional students, schedule their lives around their education.
The Student-First Philosophy
With all of the lofty ideas about performance-based funding models, predictive analytics, and shiny new technology, Thomas agreed that faculty are the real MVPs in this game.
They refuse to go to meetings about costly new platforms just so they can save their students $20. They spend their nights ensuring their courses are responsive and that students know they care. When it comes down to it, if your technology and strategies positively impact the students, your faculty will rally behind you.
So long as we’re using the right “yardstick” to measure modernization efforts, the future of EdTech will shine.
Want to learn how to modernize your institution’s approach to data? Interested in how AI can help (or hurt) your initiatives? Visit Magic EdTech for more insights on AI in the classroom and how to build trust in your digital tools.
FAQs
What metrics should institutions track instead of adoption numbers?
Use outcome metrics tied to a specific decision, not usage counts. Pick a small set of targets, such as reduced DFW rates, improved retention in gateway courses, faster time-to-degree, or increased course completion, then define the baseline and the intervention path. If a metric does not change a decision or trigger support for a student, it is just dashboard decor.
How can leaders build a culture of “data curiosity”?
Start by naming the decisions you want data to improve, then assign an owner for each decision and the action it should trigger. Make definitions boring and consistent so teams trust the numbers, then reward follow-through, not reporting. Culture shifts when people see data leading to better outcomes, not more meetings.
Which LMS signals are worth acting on?
Focus on signals that map to learning behaviors, such as missed submissions, lack of participation in required activities, or repeated low-stakes assessment failures. Pair every alert with a human outreach workflow that is supportive, specific, and time-bound. When the LMS becomes a tool for earlier help rather than compliance policing, trust increases, and outcomes usually follow.
How can institutions identify “toxic course combinations”?
Analyze historical performance by course pairing while controlling for obvious confounders like credit load, modality, prerequisites, and cohort differences. Turn the finding into a simple advising rule with clear alternatives, such as recommended sequencing or support resources, not a scary prediction. Recheck the pattern each term and monitor equity impacts so the model does not become a bias amplifier.
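As a rough sketch of that confounder check, the snippet below fits an ordinary least squares model with a hypothetical took_both flag alongside a few placeholder covariates. The file name, columns, and covariates are assumptions for illustration; a real analysis would need more careful modeling and validation.

```python
# A minimal sketch of checking a suspect course pairing while controlling
# for obvious confounders. student_terms.csv, took_both, credit_load,
# modality, and cohort are hypothetical placeholders.
import pandas as pd
import statsmodels.formula.api as smf

terms = pd.read_csv("student_terms.csv")  # one row per student per term

# took_both = 1 when the student took Course X and Course Y the same term.
model = smf.ols(
    "term_gpa ~ took_both + credit_load + C(modality) + C(cohort)",
    data=terms,
).fit()

# A reliably negative took_both coefficient suggests the pairing itself,
# not just a heavier schedule, is associated with lower term GPA.
print(model.summary().tables[1])
```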
Where should AI stay assistive rather than autonomous?
Keep AI in assistive roles where humans review and own the final call, such as drafting outlines, generating practice questions, or suggesting feedback language. Avoid full automation for high-stakes steps like grading, final assessments, or decisions that affect student standing without accountable review. Require transparency about what the AI did, what data it used, and how errors are handled, because “the model said so” is not a defensible policy.
