
Access to premium AI services is a significant concern for students



As students become more familiar with AI tools, their awareness of potential ethical pitfalls has sharpened.

A recent report by Naima, then a student at the School of Social Sciences at the University of Westminster, used a series of focus groups to explore student perceptions of AI, and found a clear and growing demand for guidelines on the ethical use of AI in academic settings.

This isn’t just about avoiding plagiarism; it’s about navigating the complex landscape of bias, privacy and potential misuse of AI technologies.

Interestingly, this heightened awareness aligns with trends observed in broader survey data collected over the past year at Westminster. Comments about AI ethics and responsible use increased noticeably across the three surveys conducted (March 2023, December 2023 and May 2024), suggesting that the concerns voiced in the social sciences focus groups are part of a more significant shift in student perspectives.

The premium predicament

Perhaps the most pressing issue highlighted in the focus group report and the survey data is the question of access to premium AI services. Naima found significant student concern about the disparities in access that were evident.

This disparity creates an uneven playing field, where academic success could be influenced by a student’s ability to afford advanced AI tools. As one focus group participant put it, “students’ academic help may be determined by their financial resources rather than their academic competence.”

The advanced tools and resources available to those who can afford these services may well affect academic performance.

For example, the image creation tool Midjourney offered no free access to its services until late August; now a trial version is available for up to 25 images. Subscriptions start at $10 a month, and those who pay more are also able to access fast response times more reliably. The more popular ChatGPT does have a free version, but its paid plan costs $20 a month. Again, those who can pay benefit from faster response rates and priority access to the service.

Students who cannot pay for these services are therefore disadvantaged: they frequently have to make do without such resources, or rely on less efficient and sometimes lower-quality free alternatives. Because they may struggle to pay for such plans every month, students from low-income backgrounds may experience further disparities in educational opportunity and achievement.

The surveys corroborate this concern. By May 2024, calls for university-provided access to premium AI tools had become common in student feedback across a number of disciplines.

Some universities have taken steps to level the playing field in response to these concerns. The focus groups noted that the University of Westminster has provided all students with premium access to Grammarly, which includes its own custom AI chatbot, GrammarlyGO. However, the report also highlighted a critical issue: “not many students are aware of this.”

This points to a more significant challenge facing universities: not only providing access to AI tools, but ensuring students are aware of them and know how to use these resources effectively and fairly. It is also essential to stress to students the extra security and integrity guardrails that come with resources like GrammarlyGO, especially given the extent to which most associate generative AI with ChatGPT. Indeed, the high public profile of ChatGPT may contribute to a view that “if it isn’t ChatGPT that we get, then it isn’t good enough.”

A call for AI literacy

Another key finding of the focus groups was students’ desire for a more comprehensive approach to AI education. They’re not just looking for tutorials on how to use ChatGPT or GrammarlyGO; they’re seeking a deeper understanding of the technology rapidly reshaping their world.

Naima’s report highlights the enthusiasm for discussion workshops on AI. These sessions go beyond the nuts and bolts of AI tools, exploring ethical considerations, societal impacts, and the future of work in an AI-enabled world.

This isn’t isolated to a single department. The extensive survey data collected over the past year at Westminster shows a growing trend across disciplines. In March 2023, only a few students expressed interest in AI’s broader implications through open-text survey responses. By May 2024, however, this had become a dominant theme in student feedback.

Overall, students said they wanted comprehensive AI literacy programs integrated into the curriculum, clear ethical guidelines for AI use in educational settings, equitable access to premium AI tools, and open discussions about the broader implications of AI in society.

One thing above all seems clear: the students wish to be ready for a future where AI is not just a tool but an integral part of their educational journey. The question now is whether universities are ready to meet this moment.


