Generative AI has emerged as a powerful tool capable of generating essays, creating images, conducting background research, brainstorming creative ideas, and more. These tools became widely and freely available to the public in November of 2022, when OpenAI released its natural language processing tool, ChatGPT. The higher education community has responded with a mix of excitement about generative AI’s potential to benefit students and reservations about its implications for teaching, how knowledge is valued, and its impact on learning. 

Researchers have recently begun to examine student adoption and use of AI tools in education but, to date, have paid little attention to how students' views on AI differ across diverse groups. Higher education institutions face AI on two fronts: as a resource for delivering the educational experience and as a fundamental skill they must equip their students to use. Yet our results suggest that, rather than taking decisive action to help students learn and navigate these new tools, administrators and faculty remain uncertain about how to integrate AI into the postsecondary experience, and that this uncertainty and inaction is beginning to trickle down to their students. Although AI awareness has grown in the last year, there are clear gaps across user groups. Most students responding to our survey are not confident in their ability to use the tools, and many are not sure the tools will ultimately have a positive impact on their learning experience.

To examine student perspectives on these issues, we asked 2,365 students across six institutions about their perceptions of and experiences with AI as part of our Student EdTech Survey Series. Here is more detail on what we learned:

  • Key Takeaway 1: Awareness and usage have increased, but first-generation students are twelve percentage points less likely to know about ChatGPT and other AI tools.
  • Key Takeaway 2: Fewer than half of students are confident in their ability to effectively use AI, and very few students are getting support to build confidence.
  • Key Takeaway 3: Students are positive about using AI in higher education but not to replace faculty.
  • Key Takeaway 4: Most students believe that using ChatGPT to generate coursework is unethical but are more accepting of other uses. 

Methodology

In April of 2024, the CIN research team emailed surveys to more than 30,000 students across six CIN member institutions. These postsecondary institutions included community colleges; primarily online, not-for-profit colleges; and one public four-year university. The survey contained a variety of questions about students' experiences with educational technologies, including several questions about their perceptions of and experiences with artificial intelligence. Our final sample consisted of 2,365 students. Fifty-three percent of respondents were enrolled at a primarily online institution, 28.7% were at a four-year university, and 18% were at a community college.

Awareness and usage have increased, but first-generation students are twelve percentage points less likely to know about ChatGPT and other AI tools

While fewer than half of the students surveyed in the prior year (43.7%) reported knowing about ChatGPT and other AI tools, 72% of students knew about these tools in this year's survey. Notably, more than a quarter of our sample (28%) remained unaware of the tools, which is surprising given the impact they have had on higher education. Although AI has been a major topic of conversation among administrators and faculty members, our data suggest that these conversations are not reaching a sizable percentage of students.

Twenty-eight percent of students reported having used ChatGPT – up from just 9% in last year’s survey. Among those who reported being aware of ChatGPT, 37% reported using the resource. These data suggest that the tools are still going unused by a large percentage of students, which is concerning given the increasing importance of AI skills in the workplace. 

Very few students report using the tool to “cheat.” Just 5% of students reported that they had turned in ChatGPT output as their own response, compared to 3% of students in the previous year's survey. Among those who reported being aware of ChatGPT, 7% reported using the resource to “cheat.”

We found notable differences in awareness and usage across demographic variables. First-generation college students were 12 percentage points less likely to be aware of ChatGPT and other AI tools and eight percentage points less likely to have used them to help with their coursework, echoing the AI equity gap we noted last year.

Note: All of the subgroup differences reported in this section were statistically significant at p < 0.001.
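For readers curious about the statistics behind notes like this one, the sketch below shows one way a gap in awareness between two student groups could be tested, using a two-proportion z-test from the statsmodels Python library. The counts are hypothetical placeholders for illustration only; they are not our survey data, and this is not necessarily the exact procedure used in our analysis.

    # Illustrative sketch only: testing whether awareness of ChatGPT differs
    # between two student groups with a two-proportion z-test.
    # The counts below are hypothetical placeholders, not the survey's data.
    from statsmodels.stats.proportion import proportions_ztest

    aware_counts = [300, 950]   # students aware of ChatGPT in each group (hypothetical)
    group_sizes = [500, 1300]   # total respondents in each group (hypothetical)

    z_stat, p_value = proportions_ztest(count=aware_counts, nobs=group_sizes)
    print(f"z = {z_stat:.2f}, p = {p_value:.4f}")  # p < 0.001 would indicate a significant gap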

While we did not see large differences in awareness by learning modality, online learners were 15 percentage points less likely to have used ChatGPT to help with their coursework than students learning in other modalities. 

Finally, we saw that students aged 25 or older were 11 percentage points less likely to know about and 14 percentage points less likely to have used ChatGPT and other AI tools than students younger than 25. 

Why this matters

These results suggest that equitable access to and engagement with AI tools remains an issue. Although students in this year's survey were more aware of and more likely to use AI than in the previous year, first-generation college students, online learners, and older students continue to lag behind their peers. If not addressed, these disparities could exacerbate existing inequities, both in higher education and in the workplace, where AI skills are likely to become increasingly important. To ensure equitable access, institutions must equip diverse groups of learners to engage with AI in ways that enhance the learning experience.

Fewer than half of students are confident in their ability to effectively use AI, and very few students are getting support to build confidence

We asked students a series of questions assessing confidence in their ability to effectively engage with AI tools. The majority of students were not confident in their AI skills and abilities: only 41% agreed that they are confident in their ability to use ChatGPT and other AI tools effectively, 34% agreed that they understand how to write effective prompts, and 44% agreed that they feel confident in their ability to critically evaluate AI-generated content. Only 18% of students reported that their instructors had explicitly taught them how to use ChatGPT and other AI tools in ways that enhance the learning experience; however, students who did receive such instruction were more likely to say they used the tools. Interestingly, we did not see large differences in confidence by key demographic variables such as age, first-generation status, or primary learning modality.

Unsurprisingly, having used ChatGPT is associated with greater confidence with AI tools and use cases. Students who had previously used ChatGPT were more confident in writing prompts, in understanding how the tools work, and in their ability to use the tools effectively (all ps < 0.001).

Why this matters

The majority of students in our sample are not confident in their ability to effectively use and engage with AI, nor are they receiving the instruction that would help increase their confidence. Given the increasing importance of AI skills in the workplace, these results suggest that students may leave their degree programs without a critical skillset. These results echo the findings of our faculty and administrator surveys, in which we found that few faculty are using AI and few institutions have developed formal policies and guidelines for using these tools in instruction. Institutional inaction appears to be trickling down to students. Higher education institutions must prioritize AI literacy and provide robust support systems to enhance students' competencies in these areas.

Students are positive about using AI in higher education but not to replace faculty

When we asked students how they expected AI to impact their educational experiences, we saw differences based on whether students had previously used the tools. When asked how positively or negatively they thought AI tools would affect their experience at their institution, 52% of students who had previously used AI reported positive attitudes, compared to just 25% of students who had not previously used the tool (p < 0.001). 

However, when we asked about their attitudes toward specific applications of AI at their institutions, the differences between AI users and non-users were less clear. We asked respondents about widely discussed and implemented applications of AI in higher education. 

Students who had used the tool previously showed more positive attitudes (i.e., +10 percentage points or more) toward the following applications of AI, compared to students who had not used the tool:

  • Using machine learning to predict academic struggles and offer preemptive support (+18 percentage point difference; p < 0.001)
  • Chatbots that answer questions about course content (+16 percentage point difference; p < 0.001)
  • Drafting communications from advisors to students (+14 percentage point difference; p < 0.001)
  • Providing tutoring services to students (+12 percentage point difference; p < 0.001)
  • Responding to students’ questions in discussion forums or chats (+11 percentage point difference; p < 0.001)

In contrast, prior use of AI was associated with smaller (and in several cases non-significant) differences in attitudes toward the following applications, which largely centered on the core role of faculty members:

  • Providing feedback on essays, exams, and other written assignments (+9 percentage point difference; p < 0.05)
  • Creating course lectures and content, such as videos/photos to use in instruction and rubrics for assignments (+8 percentage point difference; p < 0.05)
  • Creating course syllabi (8 percentage point difference; p < 0.01)
  • Evaluating students who apply to their institution (-2 percentage point difference; p > 0.05)
  • Grading coursework (+2 percentage point difference; p > 0.05)
  • Detecting plagiarism in students’ work (+2 percentage point difference; p > 0.05)
  • Teaching entire courses (-1 percentage point difference; p > 0.05)

These data suggest that having prior experience with AI may help foster more positive attitudes toward the tools, but primarily for applications that do not replace the core work of faculty members.

Notably, when looking at overall attitudes toward these applications of AI, students were not especially positive. Indeed, for all but two applications (detecting plagiarism in students’ work and using machine learning to predict academic struggles and offer preemptive support), fewer than half of the students in our sample showed positive attitudes. Students showed the most negative attitudes toward teaching entire courses (with 16% of students showing positive attitudes), evaluating students who apply to the institution (24% of students), and grading coursework (30% of students).

Why this matters

As higher education institutions begin to apply AI to assist with a variety of functions, it is critical to understand how students perceive these applications. Our data show that many students are still neutral or negative toward a number of AI applications. While students with more experience with AI are more positive toward the tools overall and toward certain applications, they show reservations about uses they view as replacing faculty members. Over time, students will become increasingly aware of and experienced with AI technologies. Based on these early patterns, we can expect that greater familiarity will increase comfort with a range of applications, except those that replace the core role of instructors. Moving forward, institutions should expect uneven comfort with these tools and be deliberate about how they advance the application of these technologies in the student experience.

Most students believe that using ChatGPT to generate coursework is unethical but are more accepting of other uses

Next, we examined students’ perceptions of the ethics of various uses of ChatGPT. The majority of students (69%) agreed that using ChatGPT to write responses to homework assignments, course papers, or exams is unethical. However, only 30% agreed that using ChatGPT to conduct background research for coursework is unethical, and just 20% agreed that using it to brainstorm creative ideas for coursework is unethical.

Once again, we saw that students’ perceptions differed by their prior use. Whereas 62% of ChatGPT users agreed that using the tools to write responses to coursework was unethical, 72% of non-users did. Just 15% of users agreed that using ChatGPT to conduct background research was unethical, compared to 35% of non-users. Finally, 9% of users agreed that using the tools to brainstorm creative ideas was unethical, whereas 24% of non-users did. 

Why this matters

These findings show that students’ views about the ethics of AI tools are nuanced. Notably, the divergence in opinions between users and non-users of ChatGPT indicates that familiarity with the technology can influence ethical perceptions. This highlights why educational institutions should develop clear guidelines and provide education on the ethical use of AI tools. By doing so, they can ensure that students use AI responsibly, leveraging its benefits while maintaining academic integrity.

Conclusion

AI is being introduced rapidly in higher education, often without sufficient support or feedback loops for students and faculty. Our research highlights that many students lack the awareness, skills, confidence, and positive attitudes needed to fully embrace these tools and benefit from them. Notably, consistent gaps appear among certain demographic groups, including first-generation college students, older students, and online learners. These disparities indicate a critical need for additional training, resources, and guidance to ensure equitable access to AI tools. 

Institutions have a significant responsibility to help students navigate the challenges posed by AI. By developing clear guidelines and providing robust support systems, they can help students use AI responsibly and effectively. This involves educating students on the ethics of AI, equipping them with the necessary skills, and ensuring that all students can access and benefit from these tools.