This week, tech giant Facebook unveiled its new chatbot capabilities, designed to let consumers interact with an automated business representative. The question is: what could go wrong?
That's probably a question Microsoft wishes it had answered before its experimental chatbot, Tay, turned into a racist neo-Nazi. Even if they had asked the question, it's tough to know whether they could ever have been prepared. After all, putting an experiment on the internet is like crowdsourcing your testing to 7.4 billion people. Anything that can happen, will happen.
Will Facebook face an uphill battle for acceptance, given Tay's crash-and-burn episodes? Only time will tell.