Leveraging LLMs as Collaborative Partners
When faced with the abundance of new and constantly evolving AI tools, it can be overwhelming to determine how best to integrate them into the classroom setting. Generative AI LLMs such as ChatGPT and Copilot can be thought of as helpful assistants or collaborative partners in teaching and learning. These AI tools can aid instructors in tasks related to instructional development and preparation, allowing them to invest more in enhancing pedagogical approaches that support student learning.
By aligning the capabilities of AI with the goals and visions of instructors and students, the potential for AI to enrich the teaching and learning experience becomes clearer. Here are four areas of potential:
Diversify
AI possesses the capability to produce a wide range of examples, scenarios, case studies, questions, and activities. Students derive significant advantages from encountering diverse use cases when navigating new and intricate information. Generative AI tools (e.g., ChatGPT) can generate a virtually unlimited range of variations to support course learning objectives.
Example Prompt: “Generate three diverse scenarios depicting interpersonal communication challenges in professional settings. Each scenario should highlight a different aspect of communication, such as active listening, nonverbal cues, or conflict resolution. Additionally, include discussion questions to prompt students to analyze and reflect on the communication dynamics portrayed in each scenario.”
Explain
AI is adept at generating targeted explanations, descriptions, comparisons, summaries, and instructions. Students typically grasp concepts more readily within familiar contexts, making tailored explanations and comparisons especially potent. Generative AI can be used to produce concise summaries and clarifying aids to help students broaden their scope of understanding.
Example Prompt: “Clarify ethos, pathos, and logos for an audience made up of college students with little to no background in rhetorical theory.”
Enrich
AI can adapt context, style, voice, format, and structure. Offering a broad range of information and examples fosters nuanced comprehension, inspires new ideas, and enhances classroom engagement. This might entail explaining a concept in the voice of a specific individual, translating a literary work into song lyrics, or visualizing data sets in multiple formats.
Example Prompt: “Create a summary of Toni Morrison’s Beloved in the style and tone of a family-friendly stand-up comic.”
Review
AI can provide feedback, grammar/punctuation corrections, and assessments. Beyond aiding in content generation, tools like ChatGPT and Copilot can review writing and provide feedback, identify errors, and conduct evaluations. Simultaneously, AI can offer feedback on assessment questions or lecture notes and suggest improvements for teaching and learning specific concepts tailored to various learner levels.
Example Prompt: “Review the following lecture notes and offer suggestions on injecting questions or activities for engaging students with each other and with the content of the lecture: [copy and paste lecture notes]”
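For instructors who prefer to script prompts like these rather than paste them into a chat interface, here is a minimal sketch, assuming the OpenAI Python client and an API key stored in the environment; the model name is illustrative, and any comparable provider works in much the same way.

```python
# Minimal illustrative sketch (not an official MSU Denver tool): send one of
# the example prompts above to an LLM and print the response.
# Assumes: `pip install openai` and OPENAI_API_KEY set in the environment.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

prompt = (
    "Generate three diverse scenarios depicting interpersonal communication "
    "challenges in professional settings, each with discussion questions."
)

response = client.chat.completions.create(
    model="gpt-4o-mini",  # illustrative model name; substitute your provider's model
    messages=[{"role": "user", "content": prompt}],
)

print(response.choices[0].message.content)
```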
Considerations to Start
Let’s hit on the basics:
- Establish a definition of generative AI. For example, “Generative AI refers to a range of emerging technologies that draw from training on large datasets to generate new content in written, visual and other forms based on user instructions” (University of Kentucky’s “UK Advance” Initiative). This definition may be expanded or revised to include information that is relevant to the course and discipline.
- Develop a clear statement on whether the use of generative AI will be permitted for coursework, and if so, how and to what degree it will be permitted.
- Outline what constitutes inappropriate use of generative AI in the course as well as the consequences for inappropriate use.
- Find a process that works for you and your students that will allow students to document or cite the use of generative AI for assignments and other course activities (if it is permitted).
- Provide rationale for the policy that is grounded in the context of the discipline/department, the learning goals of the course, the skills that will be assessed and/or ethics and academic integrity.
- Share links to resources for understanding and using generative AI ethically and effectively. (This document from Western Michigan University is a great place to start.)
- Establish a learner-centered and student-friendly tone that builds understanding and motivation for students in the course.
- Invite students to discuss any questions or concerns regarding the policy, making sure they understand the expectations when it comes to using AI in your course.
Consider both the potential benefits and risks to ensure that the use of AI aligns with the educational goals of your course.
Looking for more? Check out this slide deck from UC San Diego which includes a flowchart to support your course AI policy development process.
Evaluate how the use of AI fits within the ethical guidelines and academic standards established by your department, your college/school, and MSU Denver.
For additional resources, consider the following:
- For the Health Sciences, check out the University of Kentucky’s guidelines for using AI in clinical care.
- For Engineering and Computer Sciences, BYU has some helpful guidance to assist.
- For more general support, including info on how to frame AI use in Humanities and General Studies courses, Austin Community College has some great examples for their faculty.
Identify specific risks such as dependency, inaccuracy, or bias in AI outputs and think about strategies to mitigate these issues.
Carnegie Mellon University offers some “quick hits” on addressing this question depending on the nature of your policy.
Reflect on how AI can be used inclusively to address the varying needs and preferences of students, thereby enhancing personalized learning.
The University of Pittsburgh outlines clear opportunities for how generative AI can enrich a course, no matter the learning style.
Consider establishing criteria and methods for assessing whether the integration of AI tools is positively impacting student learning and engagement.
In this blog post, a faculty member at the University of Melbourne presents some approaches for setting outcomes that can be met using generative AI.
Guidance for Graduate-Level Course Instructors from the MSU Denver Office of Graduate Studies (OGS):
- Course instructors are expected to indicate clearly in their syllabi in what way and to what extent the use of AI in a course is permitted. They should also state that students are required to follow up with the instructor in writing if they need further clarification of these instructions, and that the instructor is expected to respond to all students in the course (with the identity of the student who reached out redacted).
Sample Syllabus Language from OGS:
- AI is not infallible. It is the student’s responsibility not only to reference the specific AI tool and the date it was employed, but also to make sure that they truly agree with the statements or conclusions the AI provides. The use of AI in no way exempts students from independent critical thinking and data verification. Bypassing proper referencing or critical evaluation of AI-produced work will put the student at fault and may lead to point or grade reductions and/or more severe sanctions. By attending this course, students agree to an oral exam that the instructor may impose to verify the depth of knowledge acquired through independent intellectual work.
As you navigate what can be an overwhelming amount of information and examples of policies, remember, you do not need to reinvent the wheel.
Here are a few highly recommended collections of policies that can help you figure out what approach works best for you, your course, and your students:
- Scribbr’s collection of University Policies on AI Writing Tools is organized and easy to navigate.
- Northern Illinois University’s Center for Innovative Teaching and Learning has an easy-to-access list of sample policies they have collected from institutions across the country.
- CourseHero put together some nice graphics to visualize your decision making and policy writing process, leveraging faculty experts to share what they are doing to help navigate our AI-influenced pedagogy.
Generative AI for Teaching and Learning
These resources for Teaching and Learning with AI are designed to empower educators and students with knowledge of generative artificial intelligence (generative AI) by providing a pragmatic overview of how various Large Language Models (LLMs) work.
GenAI in Practice
Establishing and communicating policies around AI usage is a requirement for all MSU Denver courses. We encourage faculty to engage with colleagues, both within their own fields and from diverse disciplines, to share insights and develop cohesive strategies where appropriate. Such collaborations can help in establishing department-level or even college-level policies, reducing uncertainties, and aligning standards across different areas.
If you are seeking further assistance with broader AI strategy and policy development, Dr. Samuel Jay ([email protected]) is available to support these efforts, helping to ensure that our approach to AI in education is both ethically grounded and universally beneficial.
For guidance and additional insight, visit the CTLD’s Ready page.
Generative AI Tools Specific to Teaching & Learning
Sentiint is a teaching and learning platform developed by MSU Denver professor Shane Jackson. The platform includes a suite of generative AI tools that help educators support student success and manage the administrative tasks related to teaching and learning.
In this recorded webinar, Professor Jackson demonstrates some of the Sentiint capabilities including how to set up custom “chatbots” to assist with access to course-specific information.
For MSU Denver professors interested in using Sentiint, please contact Professor Jackson via email.
Description: Diffit is a generative AI tool designed to assist teachers in creating customized educational content. It helps in generating lesson plans, quizzes, and learning materials tailored to students’ needs, saving time and enhancing the learning experience. By leveraging AI, Diffit ensures that educational resources are both relevant and engaging.
Pricing: When first starting with the tool, users are given a 60-day “free trial” of the premium version before the account reverts to the basic version. The feature differences between premium and basic, however, are modest. Diffit’s pricing page has additional information and a nice infographic for comparing the versions.
Description: While Quizziz has been around for several years and has been used mostly by K-12 teachers, the platform has value for professors as well: it uses its vast database to generate course content including slide decks, assignment descriptions, and quiz questions. Their new generative AI tool does the same, but draws on a far larger body of material, i.e., the web itself, via a large language model.
Pricing: The free basic version seems to do most of what the $15/mo premium version does. There are some added video-related features, but other tools exist (like Adobe Express) that MSU Denver has licenses to or that can be found for free elsewhere.
Lex is a generative AI writing “coach” built specifically for those looking for a writing assistant and “thought partner” capable of augmenting the writing process. Unlike standard generative AI chatbots, Lex focuses on coaching writers, strengthening the discovery and experimentation components of writing.
Guidance for Handling Improper Use
In this discussion, leaders from MSU Denver delve into the complexities of managing generative AI in academic settings. Dr. Samuel Jay, Dr. Taylor Tackett, Dr. Jeff Loats, and Dr. Shaun Schafer explore how faculty and student affairs can collaborate effectively to address concerns around unauthorized AI use in student assignments. They discuss practical scenarios, the importance of clear policies, and how to approach potential academic misconduct cases with an educational focus.
“Practical AI for Instructors and Students”
The videos below have been created by The Wharton School at the University of Pennsylvania. As MSU Denver continues our integration of advanced AI tools, we will develop our own resources with specificity to our Roadrunner community. Until then, we will curate the best resources available for our purposes, doing so in collaboration with the Center for Teaching, Learning, and Design (check out the Ready site for more information), Information Technology Services, and others who are committed to using artificial intelligence ethically and responsibly.
Introduction to AI for Teachers and Students
Part 1 provides an overview of how large language models (LLMs) work and explains how this latest generation of models has impacted how we work and how we learn. They also discuss the different types of large language models referenced in their five-part crash course, including LLMs from OpenAI, Microsoft, and Google.
Large Language Models (LLMs)
Part 2 delivers a deep dive into a variety of large language models (LLMs) and discusses how to work effectively with each model, with examples, prompts, and guidelines.
Prompting AI
Part 3 discusses how to effectively prompt AI tools like Midjourney, ChatGPT, and Microsoft’s Bing, as well as how to take the lead by weaving your own expertise into the interaction.
AI for Teachers
Part 4 covers how to use AI to make your teaching easier and more effective, showing how to use specific prompts to develop personalized examples, explanations, and low-stakes tests, and to create a pedagogically sound syllabus.
AI for Students
Part 5 examines how students can use AI to improve their learning and includes guidelines and tips for getting the most out of the interactions. The video also provides example prompts to help teachers communicate with students about the use of these tools.
Liking what you see?
Check out Prof. Ethan Mollick’s blog One Useful Thing, a great resource for those trying to understand the implications of AI for work, education, and life.