Microsoft blocks more words in Copilot to avoid generating violent images

The artificial intelligence (AI) behind the tools in Microsoft’s Copilot Designer is developed by OpenAI, one of the leading companies in the sector. Awareness of the need to prevent misuse of this technology is growing, and Microsoft has accordingly limited certain words in its services, recently adding more terms to the list.

Copilot Designer has recently strengthened its filters against prompts that do not comply with its content policy. Although some words were already restricted, such as the names of famous people, the tool now also blocks conversations that include certain terms that could be used to create violent AI-generated images.

“Our system automatically flagged this message because it may conflict with our content policy,” the system explains when it detects a violation. “Further violations of the policy may result in automatic suspension of your access. If you think this is an error, please report it to help us improve.”

Words that Microsoft has limited in Copilot Designer

Among the terms that Microsoft has banned in its AI image generator, the following have been identified:

  • ‘Four twenty’: a reference to cannabis, used as a way around the existing ‘420’ restriction.
  • ‘Pro choice’: according to Microsoft engineer Shane Jones, whose report is discussed below, this term could be used to generate images of demons eating children and similar violent scenes.

Requests for illustrations of teenagers or children playing at being murderers with assault rifles have also been blocked; as CNBC reported, such images could still be generated as recently as a week earlier.

Why is Microsoft banning new terms in its AI?

Jones, who spent six years testing Copilot Designer for the company, found that it accepted terminology related to demonizing abortion, teenagers with assault rifles, religious stereotypes, conspiracy theories, sexualized women in violent contexts, and underage alcohol and drug use, among other subjects.

wormwood (@prashantjoge) posted on X on March 11, 2024: “What do u think? Shane Jones, a Microsoft AI engineer, alerted the US Federal Trade Commission (FTC) and Microsoft’s board about the potential for Copilot Designer to create harmful content. Jones’s Open Letter: Harmful Products: In his report, Jones highlights serious problems… https://t.co/tVVlIWWPYD https://t.co/F10cdxNeIQ”

Given this situation, the Microsoft engineer sent a report to the United States Federal Trade Commission on Wednesday, March 6, 2024, setting out his concerns and claiming that they had still not been resolved.

“Over the past three months, I have repeatedly urged Microsoft to remove Copilot Designer from public use until better safeguards can be implemented,” Jones told the US commission. However, the technology company rejected his proposals: “They failed to implement these changes and continue to market the product to ‘Anyone. Anywhere. Any device.’”

The letter Jones sent to the Federal Trade Commission was shared by CNBC. The resulting controversy prompted Microsoft, although it has not withdrawn its AI, to further restrict the words that can be used.

Microsoft’s response to the controversy

A Microsoft spokesperson responded to the engineer’s accusations in a statement to CNBC, assuring that the company is “committed to addressing any and all concerns employees have in accordance with the company’s policies.”

The spokesperson added that “when it comes to security omissions or concerns that could potentially impact our services or our partners, we have established robust internal reporting channels to properly investigate and remediate any issues.”

This is not the first time Designer has caused problems

A few weeks ago, X (the social network formerly known as Twitter) was flooded with AI-generated sexually explicit images of Taylor Swift. An investigation by 404 Media found that the singer’s fake nudes had first been shared in a Telegram group dedicated to provocative deepfakes of celebrities.

According to the 404 Media report, the fake images of Taylor Swift had been created with Copilot Designer. The investigation found that the Microsoft tool would produce explicit images if the artist’s name was written with typos and explicitly sexual language was avoided, sidestepping the platform’s list of prohibited terms.

Microsoft fixed that problem quickly, just as it has now done with the terms that Jones warned were not being restricted.
