
Britain’s Social Security has banned its staff from using ChatGPT—but it’s okay with Microsoft Copilot

OpenAI’s revolutionary bot ChatGPT promises untold wonders for worker productivity—but it also faces a lot of privacy headaches. For the latest organization in a growing list of concerned groups, that trade-off became too tough to justify.

The U.K. Department for Work and Pensions (DWP), which handles welfare and pension payments for 20 million Brits, recently updated its “Acceptable Use Policy” framework to ban its staff from using publicly available AI applications—and the department name-dropped ChatGPT.

“Users must not attempt to access public AI applications (such as ChatGPT) when undertaking DWP business, or on DWP approved devices,” the guidance now reads. 

In December, the DWP said it was using a “test and learn approach” to the application of generative AI among its employees. The department is exploring how AI could help staff complete writing tasks and assist work coaches with clients in job centers. 

The DWP is trialing an internal tool based on Microsoft Copilot, a digital assistant, to help automate tasks, Civil Service World reported.

ChatGPT privacy concerns grow

It’s currently unclear why the DWP moved to ban OpenAI’s ChatGPT while experimenting with internal tools, but the department appears to be the latest body to decide that privacy concerns around the chatbot were too great to ignore.

A representative for the DWP didn’t immediately respond to Fortune’s request for comment. 

ChatGPT collects large amounts of data and information submitted by users to help train its models, and there are fears that this information, particularly users’ personal details, could be shared with external parties.

These concerns led Italian regulators in January to accuse OpenAI of violating EU data privacy rules.

The Garante, Italy’s data protection body, had discovered that users’ messages and payment information had been exposed by the chatbot. 

At the time, a spokesperson for OpenAI told the Associated Press: “We believe our practices align with GDPR and other privacy laws, and we take additional steps to protect people’s data and privacy.

“We want our AI to learn about the world, not about private individuals. We actively work to reduce personal data in training our systems like ChatGPT, which also rejects requests for private or sensitive information about people.”

Several major companies have taken steps to limit or ban the use of ChatGPT in their organizations after the chatbot exploded in popularity last year. 

Apple restricted some employees from using ChatGPT over fears they would leak sensitive information. The iPhone maker followed in the footsteps of tech giant Amazon and Wall Street banks JPMorgan and Bank of America in preventing employees from using the technology.

OpenAI is trying to appeal to businesses with an Enterprise version of ChatGPT, which addresses some concerns over data and privacy.

A representative for OpenAI didn’t immediately respond to a request for comment.

Muddled AI policies

The U.K. government published a generative AI framework for its 520,000 civil service employees last year. The framework outlined 10 key principles staff should uphold when using generative AI, covering ethics, laws, and understanding of the technology’s limitations. 

However, when it comes to broader policy, the U.K. has taken a notably dovish approach to regulating the technology, particularly in comparison with the EU.

The U.K. is encouraging a fairly hands-off approach to AI regulation that leaves regulators in individual sectors to set their own frameworks, and it has ruled out changing the country’s laws to account for the risks of AI.

That differs considerably from the EU’s landmark AI Act, the outline of which was agreed in early February and has been described as heavy-handed.

