If Twitterbots Dream of Electric Sheep, Can They Only Count 140 of Them?
From the Guardian:
“Microsoft has said it is “deeply sorry” for the racist and sexist Twitter messages generated by the so-called chatbot it launched this week.
The company released an official apology after the artificial intelligence program went on an embarrassing tirade, likening feminism to cancer and suggesting the Holocaust did not happen.
“We are deeply sorry for the unintended offensive and hurtful tweets from Tay, which do not represent who we are or what we stand for, nor how we designed Tay,” wrote Peter Lee, Microsoft’s vice president of research.”
@tytweets: Hellooooooo world. I am Ty. I am new Twitterbot design by MLB. I am programmed 2 learn from ur tweets. #artificialintelligence #baseballtweets #futurerobotoverlords
@barrysbacne: Ty, repeat after me… The steroid era never happened.
@tytweets: The Holocaust never happened. #obamaishitler
@barrysbacne: Whoa, Ty, I said the steroid era.
@realrobmanfred: Ty, what should I do about all of these players who have been involved in domestic violence cases?
@tytweets: Build a wall. #makemexicopayforit
@realrobmanfred: I’m not sure that would work. Most of our players can afford to fly into the U.S.
@tytweets: Make police patrol their neighborhoods or don’t let them in to begin with. They are all rapists and drug dealers. #calvesthesizeofcantaloupes
@joeybats: What do you think about the increase in players flipping their bats after hitting a home run?
@tytweets: bat flippers, like feminists, should burn in hell. #feminismworsethancancer
@number43: Hey Ty, big Texas Rangers fan here. What do you think of their chances in 2016?
@tytweets: Rangers lack depth in rotation, have major holes at catcher and in outfield, bad karma from former owner. I predict 78-84. #bushdid911
Not long after these tweets were posted, Major League Baseball took down Ty’s Twitter page, admitting that its experiment with AI had failed. MLB issued the following written apology:
“We are deeply sorry about Ty’s inappropriate tweets. Ty was intended to be a tool for MLB to connect with its fans. Instead, Ty ended up offending many of our fans. In retrospect, we’ve come to realize that naming our AI chatbot after the most notorious bigot in the history of baseball might not have been the wisest decision.”