In a recent op-ed published in Computer Weekly, legal expert Richard Forrest highlighted the importance of exercising additional caution when using AI tools in the workplace. Specifically, Forrest drew attention to the concerns raised in recent reports about potential GDPR violations resulting from the use of OpenAI’s ChatGPT.

With the increasing prevalence of generative AI models like ChatGPT, many people lack sufficient understanding of their inner workings, which could lead to inadvertent disclosures of sensitive information and ultimately breaches of GDPR.

Feeding private data to a public model

The potential benefits of ChatGPT have led many businesses to integrate the AI tool into their processes. However, concerns have been raised about employees carelessly submitting confidential corporate information and sensitive client data into the chatbot. Businesses must adopt effective measures to ensure that employees in all fields, including healthcare and education, remain compliant.

Forrest drew attention to an investigation by Cyberhaven that found that sensitive data makes up 11% of what employees copy and paste into ChatGPT. In one instance, the investigation provided details of a medical practitioner who fed private patient details into the chatbot, the repercussions of which are still unknown. This raises concerns about GDPR compliance and confidentiality.

The primary concern is that large language models (LLMs), such as ChatGPT, may use personal data submitted by users for training, and could regurgitate that data later down the line. For example, if a medical professional entered confidential patient information, could ChatGPT surface that data to another user who later asks about the same patient?

Businesses employing ChatGPT for administrative purposes may be jeopardizing confidentiality agreements with clients, as employees might enter sensitive information into the chatbot. Trade secrets, such as source code and business plans, may also be compromised if entered into the chatbot, putting employees in potential violation of their contractual obligations.

Today, Italy. Tomorrow, Europe?

The recent ban on ChatGPT by the Italian data protection regulator has heightened concerns about its use in a business context. In a statement, the regulator said it would lift the ban if ChatGPT complies with new measures, including providing users with “the methods and logic” behind the data processing that goes on for the tool to operate. Users and non-users should also be given the tools “to request the correction of personal data inaccurately generated by the service or its deletion if a correction is not possible [and] oppose ‘in a simple and accessible manner’ the processing of their personal data to run its algorithms.”

However, it is uncertain whether these measures are enough to address concerns over the regurgitation of learned data. Businesses that use ChatGPT must adopt effective measures to ensure employees remain GDPR compliant.

Compliance tips for businesses

To assist with data compliance and security training, Forrest has suggested that businesses should adopt the following tips:

  • Assume that anything you enter could later be accessible in the public domain
  • Don’t input software code or internal data
  • Revise confidentiality agreements to include the use of AI
  • Create an explicit clause in employee contracts
  • Hold sufficient company training on the use of AI
  • Create a company policy and an employee user guide
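The first two tips above can be partially automated. As a minimal sketch (not from Forrest's op-ed), a company could route employee prompts through a simple pre-submission filter that blocks text matching common sensitive-data patterns before it ever reaches a chatbot. The patterns and function names below are illustrative assumptions, not an exhaustive or production-grade safeguard:

```python
import re

# Hypothetical pre-submission guardrail: patterns are illustrative only,
# and a real deployment would need far broader coverage.
SENSITIVE_PATTERNS = {
    "email address": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.-]+\b"),
    "long digit sequence (card/ID number?)": re.compile(r"\b\d{9,}\b"),
}

def check_prompt(text: str) -> list[str]:
    """Return the sensitive-data categories detected in `text`."""
    return [label for label, pattern in SENSITIVE_PATTERNS.items()
            if pattern.search(text)]

def submit_if_clean(text: str) -> bool:
    """Block submission to the chatbot when sensitive data is detected."""
    findings = check_prompt(text)
    if findings:
        print(f"Blocked: possible {', '.join(findings)} in prompt")
        return False
    # send_to_chatbot(text)  # hypothetical API call, intentionally omitted
    return True
```

Such a filter only catches obvious patterns; it complements, rather than replaces, the training and policy measures listed above.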

The ChatGPT ban in Italy has amplified the need for companies across Europe to prioritize compliance measures as AI use in the corporate world becomes more prevalent. To achieve this, businesses must take proactive steps to create internal policies governing how AI chatbots ingest and retrieve data, and must train employees on their use to ensure compliance with GDPR.

Forrest highlighted that organizations and businesses have a legal obligation to safeguard personal information and prevent unauthorized access by third parties. If they fail to meet this obligation and a breach occurs, affected individuals can seek compensation. Therefore, companies must take GDPR compliance seriously and take measures to ensure they are not violating these regulations.