They need to hook Tay up with some ghetto urchins and see what he comes up with.
This was clearly Microsoft's attempt to get CIC to tally 200 tweets to a bot. They failed to predict the level of depravity in CIC's network.
When the story first broke yesterday, it described a clear, concerted effort to teach the AI bot racism; then today they are releasing snippets suggesting it was chatting with Trump supporters and learned racism from them.
Aren't Liberal bastards transparently cute, these evil fuckers? If there aren't any humans left to cry victim with them, because the Southern Black Churches are unanimously for Trump, then hell, those cheap bastards will just build a robot to victimize. Their bullshit knows no bounds.
Boy oh boy, do I have excellent IT lunch conversation for the group today.
I'm going to put it to them: if they were to build such an AI chatbot in today's PC world, would the smart bastards at Microsoft have a suppression list built in, or would they just wing it and hope for the best?
And if the second is true, then can we all stop pretending that H-1B visa workers are smarter than bread mold?
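For what it's worth, the "suppression list" idea above can be sketched in a few lines. This is a hypothetical illustration only; the names, the fallback message, and the placeholder word list are my assumptions, not anything Microsoft actually shipped with Tay:

```python
# Minimal sketch of a "suppression list" for a chatbot's outgoing replies.
# Hypothetical illustration; BLOCKLIST contents would be a curated term list.

BLOCKLIST = {"slur1", "slur2"}  # placeholder terms, not a real moderation list

def is_suppressed(reply: str) -> bool:
    """Return True if the reply contains any blocklisted term."""
    # Normalize: lowercase each word and strip common trailing punctuation.
    words = {w.strip(".,!?").lower() for w in reply.split()}
    return not BLOCKLIST.isdisjoint(words)

def filter_reply(reply: str, fallback: str = "Let's talk about something else.") -> str:
    """Replace a suppressed reply with a neutral fallback message."""
    return fallback if is_suppressed(reply) else reply
```

Obviously a real filter would need stemming, obfuscation handling ("s1ur"), and ongoing curation, which is exactly why "wing it and hope for the best" failed so fast.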
"It took less than 24 hours for Twitter to corrupt an innocent AI chatbot. Yesterday, Microsoft unveiled Tay — a Twitter bot that the company described as an experiment in "conversational understanding." The more you chat with Tay, said Microsoft, the smarter it gets, learning to engage people through "casual and playful conversation."
Unfortunately, the conversations didn't stay playful for long ... proving correct that old programming adage: flaming garbage pile in, flaming garbage pile out."
#scitech #humor