
Guidance for Students

As Generative Artificial Intelligence (AI) evolves, its integration into our academic environment at the University of Kansas (KU) offers unprecedented opportunities for student engagement and learning. We are committed to leveraging AI to enhance educational experiences, support student research, and streamline administrative tasks. Our approach is guided by our values of collaboration, equity, excellence, integrity, and respect, ensuring that AI tools are used ethically and effectively to benefit our student community.

Overview

Some artificial intelligence tools, such as Grammarly, have been available to students for years. Now, generative artificial intelligence tools such as ChatGPT, Copilot, and Gemini offer potential help with writing, editing, coding, problem-solving, and the creation of images and music.

AI tools are quickly evolving, and students, instructors and university leaders are trying to understand and keep pace with the changes. Two areas of focus are 1) academic integrity, and 2) how to prepare students for the inevitable use of AI in their future careers.

Some instructors have integrated generative AI into assignments, while others have restricted the use of generative AI or banned it from classes. Because of these differing policies, be sure you understand the guidelines in each of your classes. Check the syllabus and ask your instructor for clarification, if needed.  

Historically, some people in academia had concerns that new tools, such as pocket calculators and spellcheck, would negatively affect student learning and outcomes. However, each time a new tool was introduced, academia eventually adapted and accepted it. The difference now is that calculators, spellcheck and other tools typically have only supplemented the student's own efforts, while generative AI can completely supplant them.

What is Generative AI?

Generative AI creates (or generates) text, images, and code based on information a user provides in the form of natural language. With ChatGPT, Copilot, DALL-E, and other forms of generative AI, you ask questions or issue commands just as you might to a person. The AI tool then produces content in response.

Generative AI relies on large language models, an approach to artificial intelligence that analyzes enormous amounts of text and creates probabilities for word sequencing. This analysis is often called training. A large language model allows ChatGPT and similar tools to respond to questions in ways that sound human. They aren’t human, though. They simply string words together in ways that mimic the patterns they have analyzed.
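
To make "probabilities for word sequencing" a little more concrete, here is a minimal, purely illustrative Python sketch of the idea. It is nothing like a real large language model, which is trained on billions of words with far more sophisticated methods; the tiny corpus and the simple word-pair (bigram) counting used here are invented for illustration only.

import random
from collections import Counter, defaultdict

# Toy "training" text; real models learn from billions of words.
corpus = "the cat sat on the mat and the dog slept on the mat".split()

# Count how often each word follows each other word (a bigram model).
next_word_counts = defaultdict(Counter)
for current, following in zip(corpus, corpus[1:]):
    next_word_counts[current][following] += 1

def generate(start_word, length=8):
    """Build a sentence by repeatedly sampling a likely next word."""
    words = [start_word]
    for _ in range(length):
        counts = next_word_counts[words[-1]]
        if not counts:
            break  # no word ever followed this one in the toy corpus
        choices, weights = zip(*counts.items())
        # Pick the next word in proportion to how often it followed the last one.
        words.append(random.choices(choices, weights=weights)[0])
    return " ".join(words)

print(generate("the"))

Running generate("the") a few times produces slightly different strings of words, each one plausible given the toy training text but produced without any understanding. That, at a vastly smaller scale, is the pattern-mimicking described above.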

AI Guidance

Generative AI can enhance learning and help you understand complex topics by providing explanations, examples, and even practice problems. AI tools can also help spark creativity and new ideas to get you started on a writing, design or research project. In addition, AI can summarize topics or even entire articles and generate lists of potential sources, saving you time for more critical thinking and learning.

The biggest concern, as mentioned earlier, is using generative AI to replace your critical thinking and effort rather than using it as a supplemental tool to improve your efficiency and effectiveness. You should use AI to improve your academic work, not to do your academic work for you. Using generative AI improperly can result in plagiarism and other academic misconduct.

Generative AI can also fabricate content. In fact, this is so common there's a term for it — "AI hallucination." It can lead to made-up sources of information, made-up people, and incorrect problem-solving. AI chatbots will provide these fabrications in a tone that makes the information sound credible. It’s another example of why your intellectual skills are crucial. You must evaluate any work from generative AI with a skeptical eye. Check sources, names, calculations, and any other information.

Bias is also an issue with generative AI. These systems can produce prejudiced results due to issues with the data they were trained on and the algorithms used, which can lead to unfair or discriminatory outcomes against certain groups of people.

What Should I Do?

First, relax. Generative AI is an important tool, but it is useful only if you develop other skills, including a deep understanding of your discipline. If you don’t know how to use generative AI, you have plenty of time to learn. As for using generative AI in your classes, keep a few things in mind:

Check the guidelines for each class

Like you, your instructors are still trying to figure out how, when and whether to use generative AI. That means each class may have a different policy. That can be confusing, so it will be up to you to make sure you understand the guidelines each instructor asks you to follow. If you are unsure about something, ask.

You are responsible for all content

If your instructor allows you to use generative AI in assignments, use the output as a starting point for editing, analysis, and additional research and writing. Turning in unedited work created by a chatbot is academic misconduct.

You are still responsible for all aspects of every assignment, including the accuracy of the content, the clarity of the writing, the structure of the coding, and the quality of the overall work. Keep in mind that generative AI sometimes makes things up. It solves problems incorrectly and creates unworkable code. It also tends to write in a bland, generic way. Check anything created by generative AI just as you would check your own work. Also be sure to cite your use of generative AI in whatever way your instructor requires.

Pay attention to privacy

Most generative AI providers have privacy policies, but there is no federal or state oversight. Although it is unlikely, interactions you have with a chatbot could become public. Avoid providing your own or other people's personal information in chatbot conversations. Remember, too, that the companies behind the chatbots may use any interactions you have to further improve the way the bots work. In other words, you are helping train generative AI platforms as you work.


Generative AI and Academic Integrity

The University of Kansas does not have a specific policy about the use of generative artificial intelligence in teaching and learning. The University Senate Rules and Regulations do provide guidance on what is and is not acceptable, though:

"Academic misconduct by a student shall include, but not be limited to, disruption of classes; threatening an instructor or fellow student in an academic setting; giving or receiving of unauthorized aid on examinations or in the preparation of notebooks, themes, reports or other assignments; knowingly misrepresenting the source of any academic work; unauthorized changing of grades; unauthorized use of University approvals or forging of signatures; falsification of research results; plagiarizing of another's work; violation of regulations or ethical codes for the treatment of human and animal subjects; or otherwise acting dishonestly in research."

The KU Code of Ethical Conduct also provides guidance, emphasizing the importance of demonstrating accountability, modeling ethical standards, fostering honest pursuit of knowledge, ensuring originality of work, and attributing ideas drawn from others’ intellectual work.

Your instructor should provide a policy on class use of generative AI and talk about that policy in class. It’s important that you follow that policy and complete the work in your courses honestly. Again, ask if you are unsure about any expectations or policies.


Additional Resources

Artificial Intelligence Glossary 
by Adam Pasick. The New York Times (27 March 2023).

The bots are here to stay. Do we deny or do we adapt? 
by Doug Ward, Bloom’s Sixth, CTE blog (20 January 2023).

A Blueprint for an AI Bill of Rights for Education 
by Kathryn Conrad. Critical AI (17 July 2023).

Exploring the reasoning and the potential of ChatGPT 
by Doug Ward, Bloom’s Sixth, CTE blog (5 February 2023).

A Generative AI Primer 
by Michael Webb. National Centre for AI (11 May 2023).

In the Age of A.I., Major in Being Human 
by David Brooks. The New York Times (2 February 2023). Excellent advice for students on preparing for a career in an AI world. He recommends classes that help students write with a distinctive voice, improve their presentation skills, push their creativity, provide unusual worldviews, cultivate empathy, and develop situational awareness.

Research points to AI’s growing influence 
by Doug Ward. Bloom’s Sixth (4 August 2023).

National Centre for AI. A not-for-profit organization based in the U.K.

Welcome to AI in Education. A resource created by students at The University of Sydney.

What Is Artificial Intelligence? Gartner (n.d.). 


Acknowledgement: The majority of content on this page comes from A student guide to using generative AI in coursework by KU Professor Doug Ward. Used with permission.