
As generative AI tools become more common in higher education, institutions are beginning to explore their potential for improving student support. But how do students themselves feel about these tools? What kinds of support are they willing to receive from AI—and where do they still want human connection?
New findings from WGU Labs’ Student Insights Council suggest a nuanced picture: students are open to using AI for practical, task-based support, but they still draw a clear line around emotionally meaningful interactions. In short, most students see AI as a tool, not a touchpoint.
Most Students Are Open to AI, But Only for Certain Tasks
Across a sample of more than 4,500 WGU students, 81% said they were willing to use AI for support. But when asked about specific use cases, a pattern emerged: students are interested in speed and utility, not in forming a relationship with the technology.
- Answering general questions: 81%
- Tracking academic progress: 54%
- Navigating student resources: 50%
- Emotional or motivational support: just 14%
Interviews reinforced this finding. Students consistently noted the value of AI’s availability and efficiency—especially when they were busy or overwhelmed—but were more skeptical about its ability to offer nuanced or empathetic support. As one student put it:
“AI can point me in the right direction quickly, but it’s no replacement for a mentor.”
For Some Students, AI May Fill Gaps in Support
Although only a small share of students said they would use AI for emotional or motivational support, women, first-generation students, and students of color were more open to these uses than their peers. For example, 16% of women expressed interest, compared to 8% of men, and 17% of first-generation students did, compared to 8% of continuing-generation students.
This suggests that for some students—particularly those who may have faced barriers to consistent support—AI is less a replacement for human connection and more a fallback when other options are out of reach. Cultural and structural factors may also shape students’ willingness to seek help, and AI may feel like a more neutral or accessible first step.
Trust Is a Limiting Factor
Despite general openness, only 36% of students said they were comfortable receiving academic or administrative support from AI, pointing to a gap between interest and trust. Even when AI can offer help, students remain skeptical about its reliability, accuracy, or intent—especially in higher-stakes or more personal contexts.
Here again, differences emerged by student background: students of color were more comfortable receiving support from AI (44%) than white students (32%), reinforcing that openness to AI often reflects previous experiences with institutional support structures.
Preferences Reflect a Division of Labor
Students made clear distinctions about what types of support should be handled by AI versus people. In a follow-up question, 84% said they would still want to maintain human mentor relationships, even if AI tools were available.
When asked about specific support scenarios, students overwhelmingly preferred human mentors for:
- Emotional support during challenges (84%)
- Complex academic planning (75%)
- Motivation and encouragement (71%)
- Accountability (64%)
In contrast, students were more open to AI managing logistical or routine tasks—support that doesn’t require interpretation or interpersonal nuance.
Interestingly, some students noted that AI could reduce the frequency of their mentor interactions—not out of avoidance, but as a way to protect mentor time for higher-order conversations. This points to an emerging model in which AI supplements human support by handling routine needs, enabling staff and faculty to focus on deeper, relational work.
Why This Matters
These findings have clear implications for the future of student support:
- AI is most effective when aligned with students’ stated needs—especially around speed, convenience, and clarity.
- Design matters. Institutions should be transparent about when and how AI is being used, and ensure that students retain access to human guidance for more complex needs.
- Demographic differences are important to consider. Students with less access to support may be more open to AI as a supplement. But openness does not equate to preference.
At WGU Labs, we are continuing to study how AI can be integrated into student support systems in ways that are equitable, transparent, and grounded in what students actually want. The challenge ahead is not just about building smarter tools—but about building systems that are responsive to student experience.