
Guidance for Students

As Generative Artificial Intelligence (AI) continues to evolve, its integration into the academic environment at the University of Kansas (KU) offers new opportunities for student engagement and learning. KU supports the thoughtful use of AI to enhance educational experiences, support student research, and assist with routine academic tasks. This approach reflects the university’s values of collaboration, equity, excellence, integrity, and respect, and emphasizes ethical and responsible use of AI tools.

Overview

Artificial intelligence tools such as Grammarly have been part of student life for years. More recently, generative AI tools such as ChatGPT, Microsoft Copilot, and Google Gemini have introduced new ways to support writing, editing, coding, problem solving, and the creation of images or music.

AI tools are evolving rapidly, and students, instructors, and university leaders are continuing to learn how these technologies affect teaching, learning, and academic work. Two key areas of focus are academic integrity and preparing students to engage thoughtfully with AI in their future careers.

Some instructors have integrated generative AI into assignments, while others restrict or prohibit its use. Because expectations may differ from one course to another, it is important to understand the guidance for each class. Review your syllabus carefully and ask your instructor for clarification if you are unsure.

Historically, new technologies such as calculators or spell-check tools have raised concerns about their impact on learning. Over time, higher education adapted to their use. A key difference with generative AI is its ability not just to support student work but potentially to replace a student's own effort. This distinction makes careful and responsible use especially important.

What is Generative AI?

Generative AI creates text, images, code, and other content based on instructions or prompts provided in natural language. With tools such as ChatGPT, Microsoft Copilot, and Google Gemini, users interact by asking questions or issuing commands much as they would when communicating with another person. The system then generates a response.

Generative AI relies on large language models, which analyze vast amounts of text to identify patterns and calculate the probability of word sequences. This process, often called training, allows AI tools to produce responses that can sound human-like. However, these systems do not understand content the way people do; they generate outputs by predicting patterns based on their training data.
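The word-probability idea above can be sketched with a toy bigram model. This is only a minimal illustration, vastly simpler than a real large language model; the sample corpus and function name are invented for this example:

```python
from collections import Counter, defaultdict

# Toy illustration (NOT a real LLM): estimate the probability of the
# next word by counting which words follow which in a tiny text.
corpus = (
    "the cat sat on the mat . the dog sat on the rug . "
    "the cat chased the dog ."
).split()

# Count how often each word follows each other word.
follows = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    follows[prev][nxt] += 1

def next_word_probs(word):
    """Return {next_word: probability} given the current word."""
    counts = follows[word]
    total = sum(counts.values())
    return {w: c / total for w, c in counts.items()}

# Words that followed "the" more often in the training text
# receive higher probability -- the same principle, at a far
# larger scale, underlies generative AI text output.
print(next_word_probs("the"))
```

Real language models use billions of learned parameters and much longer context than a single preceding word, but the core task, predicting likely continuations from patterns in training data, is the same.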

AI Guidance

Generative AI can support learning by helping explain complex topics, providing examples, or generating practice questions. AI tools can also help spark ideas at the beginning of a writing, design, or research project, summarize topics or articles, and suggest potential sources. When used appropriately, these tools can save time and free you to focus on deeper analysis and critical thinking.

A primary concern is relying on generative AI to replace your own thinking and effort rather than using it as a supplemental tool. AI should support your academic work, not do the work for you. Improper use of generative AI may result in plagiarism or other forms of academic misconduct.

Generative AI tools can also fabricate information, a phenomenon often referred to as “AI hallucinations.” This may include made-up sources, incorrect facts, fictional people, or flawed problem solving. Because AI tools often present information in a confident tone, it is essential to review outputs critically. Always check sources, names, calculations, and other details.

Bias is another important consideration. Generative AI systems may reflect biases present in their training data or design, which can lead to unfair or misleading outcomes. Awareness and critical evaluation are necessary when using these tools.

What Should I Do?

First, relax. Generative AI is an important tool, but it is most effective when combined with strong foundational knowledge and critical thinking skills. If you are unfamiliar with AI tools, you have time to learn how to use them responsibly. When using generative AI in your courses, keep the following in mind:

Understand course-specific expectations

Each instructor may have different guidelines for AI use. It is your responsibility to understand and follow the expectations for each course. If something is unclear, ask your instructor.

You are responsible for all submitted work

If AI use is permitted, treat AI outputs as a starting point for revision, analysis, and further research. Submitting unedited or unexamined AI-generated content as your own work may constitute academic misconduct. You are responsible for accuracy, clarity, structure, and overall quality, just as you are for work completed without AI assistance.

Because generative AI can produce incorrect information or unreliable code, review any AI-assisted work carefully, and cite AI use when required by your instructor.

Pay attention to privacy

Most AI tools are governed by company privacy policies rather than direct government oversight. Information shared with AI systems may be stored or reused by providers. Avoid entering personal information or information about others into AI tools. Be aware that interactions may contribute to how these systems are further developed.


Generative AI and Academic Integrity

The University of Kansas does not currently have a university-wide policy specific to the use of generative artificial intelligence in teaching and learning. Existing expectations for academic integrity continue to apply when AI tools are used.

The University Senate Rules and Regulations define academic misconduct by a student to include, but not be limited to, plagiarism, falsification of research results, misrepresentation of the source of academic work, violations of ethical research standards, or otherwise acting dishonestly in research.

View the full definition of academic misconduct in the University Senate Rules and Regulations.

The KU Code of Ethical Conduct further emphasizes accountability, ethical decision making, originality of work, and appropriate attribution of ideas drawn from others’ intellectual contributions.

Your instructors should communicate course-specific policies regarding generative AI use. Students are responsible for understanding and following those expectations and for completing coursework honestly. When expectations are unclear, students should seek clarification from their instructor.


Additional Resources

Artificial Intelligence Glossary 
by Adam Pasick. The New York Times (27 March 2023).

The bots are here to stay. Do we deny or do we adapt? 
by Doug Ward, Bloom’s Sixth, CTE blog (20 January 2023).

A Blueprint for an AI Bill of Rights for Education 
by Kathryn Conrad. Critical AI (17 July 2023).

Exploring the reasoning and the potential of ChatGPT 
by Doug Ward, Bloom’s Sixth, CTE blog (5 February 2023).

A Generative AI Primer 
by Michael Webb. National Centre for AI (11 May 2023).

In the Age of A.I., Major in Being Human 
by David Brooks. The New York Times (2 February 2023). Excellent advice for students on preparing for a career in an AI world. He recommends classes that help students write with a distinctive voice, improve their presentation skills, push their creativity, provide unusual worldviews, cultivate empathy, and develop situational awareness.

Research points to AI’s growing influence 
by Doug Ward. Bloom’s Sixth (4 August 2023).

National Centre for AI. A not-for-profit organization based in the U.K.

Welcome to AI in Education. A resource created by students at The University of Sydney.

What Is Artificial Intelligence? Gartner (n.d.).


Acknowledgement: The majority of content on this page comes from A student guide to using generative AI in coursework by KU Professor Doug Ward. Used with permission.