Laurel Ridge AI Guidelines

Summary

The current working guidelines for the use of AI at Laurel Ridge, including which AI tools are currently approved for use by faculty and students.

Body

Laurel Ridge Approved AI Tools

Google Gemini and Microsoft Copilot, when accessed with a proper institutional login, are the official AI tools currently approved for use by faculty and students.
Some information and training on Copilot can be found here.

 

Laurel Ridge Community College – Generative AI Guidelines

1. Introduction

Generative Artificial Intelligence (AI) technologies, such as language models and image generators, offer significant potential to enhance learning and teaching. However, their use must be guided by principles of academic integrity, transparency, data protection, and ethical considerations to ensure responsible and beneficial outcomes.

The purpose of these AI guidelines is to provide clear principles and best practices for the ethical, effective, and equitable use of artificial intelligence technologies at Laurel Ridge Community College. These guidelines aim to foster innovation while ensuring that AI applications uphold privacy, transparency, and fairness standards. By adhering to these principles, we seek to empower the college community to leverage AI responsibly in support of our educational mission and community values.

---

2. Guidelines for Students

2.1 Academic Integrity

· Original Work: Students must ensure all submitted work is their own. Generative AI tools may be used for brainstorming and drafting, but the final submission should reflect the student's understanding and effort.

· Citation: Any use of generative AI tools must be properly cited. Students should acknowledge the AI's contribution in their work, similar to how they would cite any other source.

· Prohibited Uses: Using generative AI to complete assignments, exams, or any other evaluative tasks without explicit permission from the instructor is considered academic dishonesty. Check your course syllabus and course site in the learning management system (LMS) for acceptable and prohibited uses of AI for each class. For additional questions related to AI use, please contact your instructor.

2.2 Transparency

· Disclosure: Students must disclose when and how they have used generative AI tools in their assignments. This includes specifying the type of tool used and the extent of its contribution.

· Understanding Limitations: Students should be aware of the limitations of generative AI, including potential inaccuracies and biases in the generated content.

2.3 Data Protection

· Personal Data: Students should avoid inputting sensitive personal information into generative AI tools. They must ensure that any data used complies with data protection regulations.

· Anonymization: When using generative AI for projects involving personal data, students should anonymize the data to protect privacy.

2.4 Ethical Use

· Respect and Fairness: Students must use generative AI ethically, ensuring their use does not perpetuate biases or produce harmful content.

· Responsible Use: Students should use generative AI to enhance their learning experience, not to replace their critical thinking and creativity.

---

3. Guidelines for Faculty and Staff

3.1 Academic Integrity

· Clear Policies and Guidelines: Faculty should establish clear course policies and guidelines regarding the use and misuse of generative AI in their courses. These should outline both acceptable and prohibited uses of AI, with specific consequences for misuse. They should be communicated to students at the beginning of the course, ideally in the course syllabus and the course site within the LMS. [See AI Syllabus Guide.]

· Assessment Design: Design assessments that minimize the risk of misuse of generative AI, such as open-ended questions that require critical thinking and personal reflection.

3.2 Transparency

· Model Transparency: Faculty should be transparent about how they use generative AI in their teaching, including content creation, grading, and feedback.

· Student Guidance: Provide guidance to students on how to use generative AI tools responsibly and effectively. (E.g., Articulate whether AI use is permitted for each assignment and expectations for citing AI use within the assignment.)

3.3 Data Protection

· Data Security: Ensure any data used with generative AI tools is secure and complies with data protection regulations.

· Student Data: Avoid using student data in generative AI tools without explicit consent and ensure any data used is anonymized.

3.4 Ethical Use

· Bias Mitigation: Faculty should be aware of and actively work to mitigate biases in generative AI tools. This includes selecting tools that prioritize fairness and equity.

· Promoting Ethical Use: Encourage students to use generative AI ethically and provide examples of ethical and unethical use cases. (E.g., Explain requirements for acceptable AI use within the context of the course, discipline, and/or program. Identify proper citation strategies for acceptable use. This could include documenting prompts and responses, as well as MLA/APA/Chicago in-text and bibliographic citations.)

---

4. Tools and Resources

4.1 Supported Tools

· For Students:

    o Gemini (using @email.vccs.edu accounts)

        § Data and prompts stay within our VCCS Google tenant.

        § Data and prompts are not used to train large language models outside of our tenant.

    o Copilot (using @email.vccs.edu accounts)

        § Data and prompts stay within our VCCS Microsoft tenant.

        § Data and prompts are not used to train large language models outside of our tenant.

· For Faculty and Staff:

    o Gemini (using @email.vccs.edu accounts)

        § Data and prompts stay within our VCCS Google tenant.

        § Data and prompts are not used to train large language models outside of our tenant.

    o Copilot (using @laurelridge.edu accounts)

        § Data and prompts stay within our VCCS Microsoft tenant.

        § Data and prompts are not used to train large language models outside of our tenant.

These tools have safeguards in place designed to protect sensitive information, including Personally Identifiable Information (PII) and Family Educational Rights and Privacy Act (FERPA) data.

4.2 Additional Tools That Are Available

When using tools outside of Laurel Ridge Community College or Virginia Community College System (VCCS)-affiliated accounts, exercise caution with the information you input.

Chatbots and AI Language Models: Chatbots are advanced AI systems that use Natural Language Processing (NLP) to interact with users, answer questions, generate content, and summarize information. These systems leverage large language models to understand and produce human-like text. While these tools can support learning and productivity, users should critically evaluate the accuracy of the content, avoid overreliance, and ensure proper attribution when incorporating AI-generated text into academic or professional work. Examples of chatbots include ChatGPT, GPT-4, Gemini, and Copilot.

Images & Art: Image and art generators are tools that use AI to create visual content based on user prompts or input. These tools can be used for creative expression, illustration, or design, but users should be mindful of copyright, attribution, and ethical considerations when using AI-generated visuals in academic or professional work. Examples of image and art generators include DALL-E 3, Bing Image Creator, and Stable Diffusion.

Video: Video tools use AI to generate, edit, or enhance video content based on text prompts, images, or other media inputs. These tools can assist with content creation, animation, or narration, but users should consider ethical use, proper attribution, and potential copyright issues when incorporating AI-generated videos into academic or professional work. Examples of video tools include Synthesia and Runway Gen-1.

Music: Music tools use AI to compose, generate, or remix music and soundtracks based on user input or prompts. While these tools can support creative projects and enhance multimedia work, users should be aware of licensing, attribution, and originality considerations when incorporating AI-generated music. Examples of music tools include AIVA and MusicGen.

Programming: Programming tools assist with writing, debugging, or explaining code using natural language input. These tools can boost efficiency and support learning, but users should verify the accuracy of generated code and understand that using AI without proper attribution or understanding may raise academic integrity or security concerns. (A brief sketch of how generated code might be verified appears after this list.) Examples of programming tools include GitHub Copilot and Amazon CodeWhisperer.

Personal Productivity: Personal productivity tools help streamline tasks such as writing, scheduling, summarizing, or data analysis. While they can enhance organization and efficiency, users should ensure transparency in how these tools are used and remain accountable for the accuracy and integrity of any outputs used in academic or professional contexts. Examples of personal productivity tools include Notion, SlidesAI, and Explainpaper.
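
To illustrate the recommendation above that AI-generated code be verified before it is relied on, here is a minimal sketch; it is not an official Laurel Ridge example. The function average_grade and the checks around it are hypothetical: imagine the function was suggested by a coding assistant, and the short tests were written by the student to confirm it behaves as expected, including on an edge case.

    # Hypothetical example: a function suggested by an AI coding assistant.
    # Before relying on it, write a few quick checks of your own.

    def average_grade(scores):
        """Return the average of a list of numeric scores (AI-suggested code)."""
        if not scores:  # guard against an empty list
            raise ValueError("scores must not be empty")
        return sum(scores) / len(scores)

    # Student-written checks to verify the suggestion before using it.
    assert average_grade([90, 80, 70]) == 80
    assert average_grade([100]) == 100

    try:
        average_grade([])  # edge case: empty input should fail clearly
    except ValueError:
        print("Empty-list case handled as expected.")

    print("All checks passed.")

The point is not the specific code but the habit: treat AI output as a draft, test it, and be able to explain how it works before submitting it.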

4.3 AI Detection Tools

In our commitment to maintaining academic integrity and fostering genuine learning experiences, faculty may use AI detection tools to help identify AI-generated content in student submissions. Currently, Laurel Ridge subscribes to AI content detection through Turnitin®.

Be aware that AI detection tools are imperfect and cannot identify AI-generated text with 100% certainty. It is recommended that AI detection tools not be used as the sole factor in decision-making around an allegation of academic misconduct. If an instructor suspects that an assignment or assessment has been completed with unauthorized use of AI tools, they should proceed as they would for any other potential allegation of academic dishonesty. [See AI Syllabus Guide.]

---

5. Communication and Training

5.1 For Students

· Ethical Use: AI can assist in brainstorming and research but should not replace original work. Always confirm AI-generated information with reliable sources.

· Transparency: Be open about your use of AI tools with instructors and peers.

5.2 For Faculty

· Syllabus Statements: Include clear AI use policies and guidelines in your syllabus, specifying when and how AI tools may be used and the consequences of misuse. [See AI Syllabus Guide.]

· Assessment Integration: Design assessments that incorporate AI in creative and critical thinking exercises.

· Ongoing Training: Participate in training sessions on AI use, prompt writing, and assessment creation. Utilize resources like Purdue OWL for proper citation formats.

· Recommended Readings: The following resources are recommended for faculty interested in learning more about how AI can be utilized as an instructional tool.

· Teaching with AI: A Practical Guide to a New Era of Human Learning by José Antonio Bowen and C. Edward Watson. ISBN: 9781421449227. (See CEITL if you would like to borrow a copy.)

· Rethinking Writing Instruction in the Age of AI by Randy Laist. ISBN: 9781943085101. (See CEITL if you would like to borrow a copy.)

By adhering to these guidelines, both students and faculty can leverage the benefits of generative AI while upholding the values of academic integrity, transparency, data protection, and ethical use. These principles will help create a responsible and inclusive educational environment.

Details

Article ID: 158510
Created: Thu 9/11/25 1:39 PM
Modified: Wed 10/1/25 4:10 PM