A primary obstacle to artificial intelligence (AI) adoption is not the availability of the technology but rather the human aspect of it.
To realize AI’s potential, people need to understand, accept, trust, and embrace it as an extension of their work and processes. Given how new the technology is and how quickly products are being developed and brought to market, many individuals and institutions are rightly not entirely confident that the benefits outweigh the risks. As more use cases emerge, so do more questions.
AI is, well, intelligent, but it lacks the strategic mindset and the moral and ethical compass that are intrinsically human. That missing compass has become central to mainstream discussions about the ethics of AI applications as US legislators consider potential regulations to address its use.
AI’s appeal is in productivity and cost-efficiency, but those gains should not outweigh or overshadow other metrics of success.
Generally, AI has not been widely adopted in higher education, with institutions prioritizing guidelines and policies for student use of AI before internal deployments. Institutions—and businesses outside of higher education—have not yet had time to understand the impacts of these emerging (and, in some instances, unproven) AI tools, which have entered the market swiftly, often to help software companies develop or maintain a competitive advantage.
The colleges and universities that have adopted certain AI use cases and applications are primarily taking advantage of features baked into their existing SaaS solutions, such as workflows that automate business processes, especially in finance and HR. Some institutions are further leveraging the capabilities of technology like robotic process automation (RPA) by combining it with AI’s data analysis and pattern recognition.
Many colleges and universities remain focused on migrating from legacy enterprise on-premises systems to modern cloud solutions and other modernization priorities, delaying their AI adoption. As noted in our 2023 US Higher Education Market Share, Trends, and Leaders reports for student, finance, and human capital management (HCM) systems, more than 84 percent of US higher education institutions with student solutions have not yet selected a SaaS-architected student solution. Further, more than 80 percent of US institutions with finance and HCM solutions have not yet selected a SaaS-architected solution.
Another factor impacting AI adoption in higher education is the industry’s aging workforce, which still struggles to fill holdover employment gaps after many employees retired or left higher education during the COVID-19 pandemic.
Many institutions know that before selecting an AI solution, they should identify the problems they need to solve and specific use cases to deploy. It’s easy to get caught up in the AI hype, which sometimes claims to solve a problem that doesn’t really exist. It is also critical to determine the institution’s capacity to support AI technology, whether embedded into a solution or built from scratch.
Not all AI is created equal, and none of it is immune to mistakes. AI depends on human-written algorithms and code, and its use can inadvertently produce unexplainable, unreliable, and unethical outcomes if left unchecked. AI algorithms evolve with ongoing insights, behaviors, and data, which can result in replicating existing inequalities and biases. Thus, institutions must prepare for the unexpected, and AI quality, ethics, and fairness should be top of mind during development and testing.
Consider AI’s use in student success. Existing solutions are designed to support student success by helping to identify at-risk students. At face value, that sounds good, but institutions also need to consider what types of data are being collected to develop those predictions. Could a given combination of attributes, such as age, gender, or location, cause the same risk level to be applied to every student who shares them? How are these tools collecting, processing, using, and storing personal (sometimes biometric) information?
The increase in privacy concerns is one reason why Tambellini Group predicts that higher education will see a surge in the number of institutions appointing chief privacy officers (CPOs) in 2024 and beyond. This position brings yet another human element to AI. CPOs play an important role serving as the central authority to establish data processing and protection policies, working in close collaboration with the president, chief information officer, chief data officer, chief technology officer, chief financial officer, chief human resources officer, and other members of leadership.
Read more Tambellini Group analyst predictions in Predictions and Recommendations: A Blueprint for Taking Action and Driving Success in 2024 and Beyond.
As with any technology or organizational change, institutions must have a change management strategy and plan. This is particularly true with AI because there is still a widespread fear of AI taking jobs. News stories about AI replacing niche roles in particular industries have reinforced that fear. However, those situations are still the exception, not the norm, and we are not seeing that trend in higher education.
For the most part, across all industries, we continue to see AI augmenting and supporting employee work rather than replacing workers. RPA and similar technology are focused on freeing employees from manual, often repetitive tasks that might be more prone to error so they can concentrate on more creative, engaging, impactful, or high-touch work.
Tambellini Group clients can explore current AI trends in higher education, along with more detailed guidance on all the factors you need to consider for institutional readiness in our white paper, The Human Elements of Artificial Intelligence, or our AI Readiness Checklist for Colleges and Universities.
© Copyright 2024, The Tambellini Group. All Rights Reserved.