
Taylor Swift has threatened to sue Microsoft over a racist chatbot. According to Microsoft's president, Brad Smith, in his new book "Tools and Weapons," the 10-time Grammy winner tried to take legal action against Microsoft because the name of its now-defunct Twitter chatbot, Tay, was similar to hers.


Smith says he received an email from a legal representative for the singer while on holiday. Here is an excerpt from the book, as quoted in The Guardian: "An email had just arrived from a Beverly Hills lawyer who introduced himself by telling me: 'We represent Taylor Swift on whose behalf this is directed to you'," the tech boss writes.

“He went on to state that ‘the name Tay, as I’m sure you must know, is closely associated with our client.’ No, I actually didn’t know, but the email nonetheless grabbed my attention.

“The lawyer went on to argue that the use of the name Tay created a false and misleading association between the popular singer and our chatbot and that it violated federal and state laws.”

However, the Tay chatbot did not last long. Shortly after it was launched, the artificial intelligence tool, which had been designed to learn from the conversations it had on social media, started tweeting racist statements and making inflammatory comments, some of which expressed support for genocide while others denied that the Holocaust had happened. Another tweet praised Hitler and claimed the account hated Jews.

Tay was bombarded with racist statements by what Smith describes as "a small group of American pranksters," and the AI tool soon began repeating the same ideas back to other users.

Microsoft swiftly issued an apology, and Tay was taken offline less than 18 hours after its launch.

By Damilola Faustino

Read also: Taylor Swift Leads Billboards Top 100 Songwriters & Producers Charts
