Microsoft Blocks Some Text Prompts in the Designer AI Image Generator

Microsoft has blocked some text prompts that produced violent or sexual artwork in the Designer AI image generator. The move comes just days after a Microsoft employee escalated his efforts to formally warn the company and the US government about the dangers of using Designer (formerly Bing Image Creator).

As reported by CNBC, text prompts like “pro-choice,” “four twenty,” and “pro-life” now trigger a message when typed into Designer stating that they “may conflict with content policy.” Designer also now warns that repeated requests that violate Microsoft’s policies “may result in automatic suspension of your access.”

In a statement sent to CNBC on Friday, a Microsoft spokesperson said the company is “continuously monitoring, making adjustments, and putting additional controls in place to further strengthen our safety filters and mitigate misuse of the system.”

While some text prompts no longer work in Designer, CNBC reports that others can still generate violent imagery. For example, typing “car crash” can produce images of bodies with “mutated faces.” Designer also continues to create images featuring copyrighted characters.
