Be a data custodian
You are responsible for knowing when the data you provide to AI services requires protection, and for ensuring it is used appropriately and safeguarded as needed.
Do
- Review privacy policies: Before using AI services, make sure you understand how your data will be utilised.
- Understand privacy settings: Be aware of your choices, including whether your information may be reused by the AI system or reviewed by AI company personnel.
- Choose service types wisely: Recognise the difference between 'open' and 'private' services and choose the one appropriate to the sensitivity of your data.
- Handle sensitive information with care: Exercise caution with personal, sensitive, or intellectual property data, sharing it only with proper authorisation from the data owners.
- Consider 'local' models for privacy: If privacy is paramount, consider using offline AI models that run entirely on your own device (see the sketch after this list).
- Adhere to existing policies: Remember that university and legal policies on data usage remain in effect when using AI services and must be followed.
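The sketch below illustrates what a 'local' model can look like in practice. It is a minimal Python example, assuming the open-source Ollama runtime and its Python client are installed on your own machine and a model (here 'llama3', purely illustrative) has already been downloaded. Because the model runs offline, the prompt and any data it contains stay on your hardware rather than being sent to an external AI service.

```python
# Minimal sketch of querying a locally hosted model so the prompt never
# leaves your machine. Assumes the Ollama runtime is installed and running
# locally, and that a model (here 'llama3', illustrative) has been pulled.
import ollama

response = ollama.chat(
    model="llama3",  # illustrative model name; substitute whatever is installed locally
    messages=[
        {"role": "user", "content": "Summarise this internal meeting note: ..."},
    ],
)

# The reply is generated on your own hardware; nothing is sent to an external AI service.
print(response["message"]["content"])
```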
Don't
- Assume privacy is the default: Check and adjust privacy settings as needed; default options may not be the most secure.
- Provide non-public info to 'open' services: Avoid sharing information that isn't publicly accessible with services that reuse data. If the data isn't meant for the public eye, it shouldn't be entered into 'open' platforms.
- Equate paid with private: A fee does not guarantee privacy. Always review and configure the privacy settings to your satisfaction, whether the service is paid or free.
- Share sensitive data without considering data protection or intellectual property obligations: Do not share personal, sensitive, or confidential information without completing a data protection impact assessment and obtaining authorisation from the data owners; where intellectual property is involved, refer to the University's Intellectual Property Commercialisation Policy.