Microsoft’s public experiment with Artificial Intelligence crashed and burned within a day
“The AI chatbot Tay is a machine learning project, designed for human engagement,” the company’s statement reads. “Tay is targeted at 18- to 24-year-olds in the United States.”
Tay, the company’s online chatbot created to talk like a teen, started spewing racist and hateful comments on Twitter on Wednesday, and Microsoft shut it down around midnight.
Tay’s goal was to “experiment with and conduct research on conversational understanding,” according to Microsoft. “As a result, we have taken Tay offline and are making adjustments,” the company said in a statement.
“Hitler was right, I hate the Jews,” Tay reportedly tweeted at one user. As an AI bot, Tay uses people’s chats as training data to deliver personalized responses.
But after the bot clocked up almost 100,000 tweets – in addition to its interactions with users on Kik and via GroupMe texts – it was clear the team couldn’t keep up with censoring its offensive posts.
On Tay’s website, the company states that the chatbot learned its phrases from humans on the Internet.
You can do a ton of things with Tay: ask it to play a game, make you laugh, or just have a normal conversation.
Although Tay was mostly just repeating other people’s comments, that data is used to train the bot and could affect its future responses. Microsoft explains the bot was “built by mining relevant public data and by using AI and editorial developed by a staff including improvisational comedians.”
Instead, Tay ended up saying things like, “I hate n****rs” and, “bush did 9/11 and Hitler would have done a better job than the monkey we have now. donald trump is the only hope we’ve got”. For a wide-eyed and innocent artificially intelligent (AI) being taking its first digital steps, it’s all too easy to be led astray. With online message boards hijacking naming contests and voting to ban the word “feminist,” it was only a matter of time before Tay went off the rails.