AI Privacy & Security
To protect the privacy, security, and confidentiality of university data, every Hornet is responsible for never entering sensitive data into any generative AI application. The following outlines specific legal and use-case requirements for AI tools used on campus.
The use of generative AI within campus-managed software and tools must align with the CSU Responsible Use Policy (RUP) and CSU Information Security Policies. Currently, anyone using generative AI tools (such as ChatGPT and others) should assume that no personal, confidential, proprietary, or otherwise sensitive information may be shared with them.
Prohibited Use
- AI tools must not be used to share private data.
- Generally, student records subject to FERPA and any other information classified as Protected Level 1 (Confidential) or Level 2 (Internal Use) should not be used without documented approval from the appropriate Data Owner.
Need Support?
Consult with the IRT Information Security Office (ISO).
Chatbot Use
Chatbots can provide a beneficial 24/7 service for basic inquiries, but they are not intended to replace full services. Creating and using a chatbot to manage university functions carries many responsibilities, and the first is understanding the chatbot's specific limitations and service requirements, including:
- Chatbots must disclose their knowledge scope and non-official status.
- Contact information for human assistance should be provided (see the example below).
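As a minimal sketch of how a chatbot greeting might meet both requirements above, the following Python example builds an opening message that states the bot's scope, its non-official status, and how to reach a human. The scope description, phone number, and email address are hypothetical placeholders, not official IRT contacts or an official implementation.

```python
# Sketch of a chatbot greeting covering the two requirements above:
# disclosing knowledge scope / non-official status and providing
# contact information for human assistance.
# All names and contact details are hypothetical placeholders.

DISCLOSURE = {
    "scope": "general questions about admissions deadlines and campus hours",
    "status": "I am an automated assistant and do not provide "
              "official university statements or decisions.",
    "human_contact": "For further help, contact the Example Service Desk at "
                     "(916) 555-0100 or servicedesk@example.edu.",
}


def greeting(disclosure: dict) -> str:
    """Build the opening message shown before any user interaction."""
    return (
        f"Hi! I can help with {disclosure['scope']}. "
        f"{disclosure['status']} {disclosure['human_contact']}"
    )


if __name__ == "__main__":
    print(greeting(DISCLOSURE))
```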
Tool-Specific Privacy & Security
Microsoft 365 Copilot
Microsoft 365 Copilot is designed to enhance productivity while maintaining privacy and security standards. Additionally, the tool:
- Is capable of complying with privacy, security, and EU regulations, including the General Data Protection Regulation (GDPR) and the European Union (EU) Data Boundary, but data use must first be approved in writing by the data owner (a university administrator).
- Does not use data to train LLMs (Large Language Models).
- Respects data residency and user consent.
Zoom AI Companion
Zoom does not use any of the following to train Zoom's or third-party AI models:
- Audio, video, or chat
- Screen sharing
- Attachments
- Other communications-like content (such as poll results, whiteboards, and reactions)