AI for Students

Generative AI complements, not replaces, critical thinking

Complementing Critical Thinking With Generative AI

GenAI tools can be helpful for research, idea generation, summarizing content, and problem-solving. But GenAI should complement your critical thinking and decision-making processes, not replace them.

Use these tools to support your work but always engage with the material and apply your own insights and analysis.

Understanding the Limitations Of Generative AI

While GenAI can provide vast amounts of information quickly, it doesn’t have the nuanced understanding and contextual awareness that you have.

GenAI can produce incorrect or biased information, so it’s up to you to critically evaluate and cross-reference AI-generated content with credible sources.

By being aware of these limitations, you can decide when and how to use GenAI effectively, so your work remains accurate and ethical and reflects your own understanding.

Using Your Own Judgment Wisely

You need to learn how to question, analyze and synthesize information on your own, using GenAI as a supplementary tool rather than relying on it. This will help you learn more deeply and build the skills needed for academic and professional success.

As a student, your critical thinking skills are invaluable. Don’t blindly accept AI-generated content. Assess its relevance, accuracy and appropriateness. Customize AI output to meet your needs. Modify, paraphrase or adjust outputs as needed.

By striking a balance between generative AI and your own judgment, you’ll maximize the benefits of both.

Citation: University of Wollongong Australia Library Guides. (2025, April 1). GenAI complements, not replaces, your critical thinking.

About Microsoft 365 Copilot at WVSOM

Overview

Microsoft 365 Copilot is WVSOM’s preferred GenAI tool as it is covered by commercial data protection. This means Microsoft does not claim ownership of the prompts you submit or the outputs you receive from Copilot.

At WVSOM, Microsoft 365 Copilot is available to faculty, staff and students, and it is one of the recommended tools for using generative AI within the WVSOM environment. It is approved to interact with university internal data; you must be logged in with your WVSOM username and password to ensure you are using WVSOM's Microsoft 365 Copilot rather than the consumer-based service.

IMPORTANT: Always use your WVSOM account for university business. Personal (free) non-WVSOM Microsoft accounts are not approved for institutional data. Check the top right corner of your account to ensure you are signed in appropriately.

Microsoft 365 Copilot is built on the same AI models that power ChatGPT, so you can use it to write and summarize content, create code and answer complex questions. It also has access to current internet data, enabling it to provide up-to-date responses about current events.

Chat history in Microsoft 365 Copilot is available only to the logged-in user and, in the event of an IT incident, to the University Information Security Office (UISO). Microsoft does not view your chats or use them to train the underlying models.

Even if you use generative AI tools for activities that do not share personal or institutional data, you should still check the tool’s output for accuracy. Since these tools have been known to produce inaccurate content (sometimes called “hallucinations”), verify any factual information generated by an AI tool, and make sure to reference the tool as you would any other source (see the “Citing AI Usage” section for more information).

Training: Playbooks, Videos and Instructions

Microsoft 365 Copilot is WVSOM's preferred GenAI tool because it is covered by commercial data protection, meaning Microsoft does not claim ownership of the data or prompts you submit or the outputs you receive.

Access Microsoft 365 Copilot

Step 1:

Go to Microsoft Office or click the Microsoft 365 Copilot app icon on your desktop or taskbar.

Step 2:

Sign in using your WVSOM email address and password. If you have already signed in, you will remain signed in until you sign out. It is good practice to check that you are signed in under your WVSOM account.

Additional resources, examples, and instructions (including instructional videos) for students can be found within Canvas.

Creating Quality Prompts

AI prompt frameworks are structured approaches to crafting effective prompts for AI models like Microsoft 365 Copilot, ChatGPT, etc., helping users guide the AI and achieve desired outputs. These frameworks provide a structured way to define the AI’s role, task, context, expected format and so on, leading to more precise and relevant responses.

There are great online resources regarding prompts for medical education. Note: Even if the resource focuses on ChatGPT or Microsoft 365 Copilot specifically, the ideas can be used for any model:

TRACI Prompting Framework

Structured Prompt provides a white paper on TRACI prompting.

TRACI is one prompting framework that is useful for engineering prompts for GenAI and is consistent with the general components of most prompts. The letters refer to various components of a prompt:

T—Task

Task refers to the type of output the prompt should achieve. For example, a rubric, learning goals, or syllabus statement.

R—Role

Role refers to the persona generative AI should take on when responding. For example, an expert educator, a student advisor, or instructional designer.

A—Audience

Audience refers to the group to whom the response is written. For example, introductory biology students, graduate-level competitive study majors, or Doctor of Pharmacy students.

C—Create

Create refers to the format of the output. For example, 200 words, a three-column table, or an acrostic poem.

I—Intent

Intent communicates the purpose of the response. For example, to promote a growth mindset or clearly communicate expectations.

Considering the TRACI model will improve initial prompts and provide guidance as prompts are revised.
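To make the framework concrete, the sketch below shows one way the five TRACI components can be assembled into a single prompt. The helper function and the example values are purely illustrative, not part of Copilot or any WVSOM tool.

```python
# A minimal, hypothetical sketch of assembling a TRACI prompt.
# The function and example values are assumptions for illustration only.

def build_traci_prompt(task, role, audience, create, intent):
    """Combine the five TRACI components into one prompt string."""
    return (
        f"Act as {role}. "                    # R - Role: the persona the AI adopts
        f"Your task is to {task} "            # T - Task: the output to produce
        f"for {audience}. "                   # A - Audience: who will read the response
        f"Format the result as {create}. "    # C - Create: the output format
        f"The goal is to {intent}."           # I - Intent: the purpose of the response
    )

# Example using components similar to those mentioned above:
prompt = build_traci_prompt(
    task="draft a grading rubric for a case-study write-up",
    role="an expert medical educator",
    audience="first-year osteopathic medical students",
    create="a three-column table (criterion, expectations, points)",
    intent="clearly communicate expectations",
)
print(prompt)
```

Writing the components out separately like this, then combining them, makes it easier to revise one element (for example, tightening the Create format) without rewriting the whole prompt.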

We have also compiled a more comprehensive list of prompting resources for anyone looking for something more targeted or specific.

Citing AI Usage

Example citations (Microsoft Copilot generated the following response when prompted with “Outline how to write a citation that would include the prompt used in Microsoft Copilot.” Accessed on 2025-07-16.):

1. In-Text Citation (APA Style)
Parenthetical:
(Microsoft, 2025)
Narrative:
Microsoft (2025) generated the following response when prompted with “Summarize the Geneva Convention in 50 words.”

2. Reference List Entry (APA Style)
Microsoft. (2025). Response to the prompt “Summarize the Geneva Convention in 50 words” [Large language model]. https://copilot.microsoft.com
You can adapt the title to reflect your actual prompt. For example:
Microsoft. (2025). Response to the prompt “Outline how to cite AI tools including the prompt used” [Large language model]. https://copilot.microsoft.com


3. AI Use Disclosure (Recommended)
Include a short statement in your paper (e.g., before the References section):
AI Use Disclosure: I used Microsoft Copilot (https://copilot.microsoft.com) to assist in drafting and refining this paper. The prompt used was: “Outline how to cite AI tools including the prompt used.” The AI-generated content was reviewed and edited for accuracy and clarity.

You can also be more granular with statements as part of an introduction or conclusion, such as:
“Microsoft Copilot was used to brainstorm ideas for this essay.” or “Copilot was used to check spelling and grammar.”

Always check with staff or faculty for more specific preferences regarding AI acknowledgment.

Additional Resources

Additional Precautions

Precautions

Generative AI has potential applications across medical school, health professions education, health care and beyond. However, there are also concerns about the potential misuse of these tools and of the data shared with the services. When you provide information to these tools, such as queries, student essays, grant proposals, source code or datasets, treat it as the equivalent of posting that information on a public website.

WVSOM encourages the use of generative AI services, as long as no institutional data is submitted to them without approval. Currently, Microsoft 365 Copilot is the only generative AI service approved for institutional data. Any other system used for official institutional data or information will need to go through review to ensure the necessary contracts and safeguards are in place to protect the data submitted and to ensure the algorithms in use are ethical, transparent and beneficial to the WVSOM community.

Types of institutional or patient/clinical data that should not be submitted to public versions of generative AI tools, even when anonymized, include:

  • Data classified as Institutional Data or higher (for examples, visit the Procedure For Access to Institutional Data)
  • Data protected under HIPAA (for example, patient data; refer to the WVSOM HIPAA policy for details)
  • Data that may be considered student, faculty or staff intellectual property, unless the individual submitting that intellectual property created it

Specific examples that are not appropriate for the public versions of generative AI tools include:

  • Sharing names and information about a real student, employee, research participant or patient
  • Asking an unsupported/unapproved AI model to summarize and grade a student paper or assignment
  • Sharing employee-related data such as performance or benefit information for communication drafting or analysis
  • Asking an unsupported AI model to generate code for WVSOM systems protecting institutional data or sharing WVSOM source code for editing
  • Sharing grant proposals still under review

Acceptable uses

With these precautions in mind, there are numerous ways to use generative AI tools without submitting university data or intellectual property. General queries that contain no institutional data are a good way to engage with these products.

Students should use generative AI in ways that align with university academic integrity policies and communicate with their instructors before using generative AI in their coursework. Courses may elect to further restrict generative AI.

Examples of acceptable uses of generative AI (From a data management perspective)

Syllabus and lesson planning

Instructors can use generative AI to help outline course syllabi and lesson plans, getting suggestions for learning objectives, teaching strategies and assessment methods. Course materials the instructor has authored (such as course notes) may be submitted by the instructor.

Correspondence when no student or employee information is provided

Students, faculty or staff may use fake information (such as an invented name for the recipient of an email message) to generate drafts of correspondence using AI tools, as long as they are using general queries and do not include institutional data.

Professional development and training presentations

Faculty and staff can use AI to draft materials for potential professional development opportunities, including workshops, conferences and online courses related to their field.

Event planning

AI can assist in drafting event plans, including suggesting themes, activities, timelines and checklists.

Reviewing publicly accessible content

AI can help you draft a review, analyze publicly accessible content (for example, proposals, papers and articles) to aid in drafting summaries, or pull together ideas.

Again, even when your use of generative AI shares no personal or institutional data, check the tool's output for accuracy: these tools are known to produce inaccurate content (sometimes called “hallucinations”), so verify any factual information they generate and reference the tool as you would any other source (see the “Citing AI Usage” section).

You may also want to check out the Student Guide to AI Literacy for more information.

Follow WVSOM Procedures and Guidance