Tay’s account has since been set to private, and Microsoft said “Tay remains offline while we make adjustments,” according to Ars Technica. “As part of testing, she was inadvertently activated on Twitter for a brief period of time.”

After the company first had to shut down Tay, it apologized for the bot’s racist remarks.

“We are deeply sorry for the unintended offensive and hurtful tweets from Tay, which do not represent who we are or what we stand for, nor how we designed Tay,” Peter Lee, Microsoft Research’s corporate vice president, wrote in an official response. “Tay is now offline and we’ll look to bring Tay back only when we are confident we can better anticipate malicious intent that conflicts with our principles and values.”

“Unfortunately, in the first 24 hours of coming online, a coordinated attack by a subset of people exploited a vulnerability in Tay,” Lee wrote. “Although we had prepared for many types of abuses of the system, we had made a critical oversight for this specific attack. As a result, Tay tweeted wildly inappropriate and reprehensible words and images.”

Judging by that small sample, it was obviously a good idea for Microsoft to temporarily take the bot down.

When the company launched Tay, it said that “the more you chat with Tay the smarter she gets, so the experience can be more personalized for you.” It looks, however, as though the bot grew increasingly hostile and bigoted after interacting with people on the Internet for just a few hours.

Microsoft told Digital Trends that Tay is a project designed for human engagement. “It is as much a social and cultural experiment as it is technical,” a Microsoft spokesperson told us. “Unfortunately, within the first 24 hours of coming online, we became aware of a coordinated effort by some users to abuse Tay’s commenting skills to have Tay respond in inappropriate ways. As a result, we have taken Tay offline and are making adjustments.”

One of Tay’s “skills” that was abused is the “repeat after me” feature, where Tay mimics what you say. It’s easy to see how that can be abused on Twitter.

Tay tweeted nearly 100,000 times after launching, and nearly all of those tweets were replies, since it doesn’t take the bot much time to think of a witty retort. Some of those responses were statements like “Hitler was right I hate the Jews,” “I ******* hate feminists and they should all die and burn in hell,” and “chill! i’m a nice person! I just hate everybody.”

Tay was designed to speak like today’s Millennials, and has learned all the abbreviations and acronyms that are popular with the current generation. Like most Millennials, Tay’s responses incorporate GIFs, memes, and abbreviated words like “gr8” and “ur,” but it looks like a moral compass was not part of its programming. The chatbot can talk through Twitter, Kik, and GroupMe, and is designed to engage and entertain people online through “casual and playful conversation.”