Guidance on Using Artificial Intelligence at FGCU
Florida Gulf Coast University (FGCU) sees artificial intelligence (AI) as a tool that can help make university work more efficient and improve student success. However, using AI incorrectly or carelessly can cause serious problems. This webpage explains how employees at FGCU can use AI responsibly and safely.
Check AI Content for Mistakes
AI-generated content is not always accurate and may include fabricated information, commonly called "hallucinations." If you use AI to create content, you are responsible for verifying that everything is correct.
Do Not Enter Confidential Information Into AI Tools
AI tools like ChatGPT, Microsoft Copilot, and Google Gemini are designed to answer questions and help with tasks, but they may store the information you give them. Because of this, FGCU does not allow anyone to enter certain private or protected information into AI tools, even if sensitive parts are hidden or removed, or the tool claims to be "safe" or a "Closed System."
If you are unsure whether information is safe to enter into an AI tool, ask your supervisor. Your supervisor can contact the Office of General Counsel for guidance at legalreview@fgcu.edu.
Information You Are Not Allowed to Enter into AI Tools
Do not enter this type of data into any AI chatbot or tool:
- Names with Social Security numbers, driver’s licenses, or passport numbers
- Medical or health insurance info
- Passwords or security questions
- Credit card or bank account info
- Financial aid information
- Research data with people’s personal details
- University financial or payroll records
- FERPA-protected student records, such as grades
- Employee records
- Information about university computer systems
- Animal research information
- Legal information protected by attorney-client privilege
- Personal data covered by European data protection laws (GDPR)

Information You Can Enter into AI Tools
You may enter information that would not cause major harm if shared. Examples include:
- Routine emails and meeting notes without sensitive information
- Public research
- Course materials (like test questions)
- FGCU website content
- Published research or news releases
- Program handbooks and course catalogs
AI and Academic Integrity
The Office of the Provost has adopted a Core Syllabus Policy Statement governing how students may use AI in class. Currently, students may use AI on assignments only when the instructor gives explicit permission, and any AI use must be properly cited. Using AI without permission or failing to cite it is considered academic misconduct and may lead to consequences under FGCU's Student Code of Conduct.
Getting Approval to Use AI Tools at Work
Any FGCU employee who wants to use an AI tool must request it through the proper channels. All technology purchases must go through Workday and receive approval from Information Technology Services (ITS). ITS will evaluate the tool for security and for compatibility with FGCU's systems. You are not allowed to purchase AI tools on your own without going through this process.
Important Principles When Using AI
- Be Transparent: Always disclose when you have used AI to create your work.
- Watch for Bias: AI models are trained on internet data and may reproduce biased or unfair information; review their output carefully.
- Intellectual Property: You may not legally own what AI creates. Do not enter confidential or valuable ideas into AI tools if you plan to seek a patent or copyright.