Thursday, April 14, 2016

But, do bots have to be robotic?

Bots are the new "it" thing, but they are not yet equipped to hold conversations like a real human. As chatbots evolve and become more sophisticated, they will provide smarter, more personalized experiences. There are even plans for bots to counsel on legal matters and health situations. But bots can still be rude, so to speak: improperly trained or monitored bots can turn ugly when exposed to humans. Not in the Terminator sense, but in the customer service and customer experience sector.

As a bot "learns," it picks up skills and performs tasks outside of its normal programming. Bot experts call this emergent behavior. Bots often learn these tasks without a human serving as a filter or a guide. The fear is that a chatbot can devolve into anything, even rude and grumpy. The key to preventing this is active monitoring. Given the scale at which Facebook and WeChat hope to see bots deployed, it's reasonable to worry about how much human oversight the technology will have. Microsoft is considering "other kinds of tools that they can help developers with to have better alerts to know if their bot is behaving in a way they might not anticipate."

Bot programmers can, and perhaps should, design bots so they do not display troubling emergent behavior. One way to do this is to limit the corpus of data available, or to blacklist inappropriate words from a bot's vocabulary. These are common practices, says Caroline Sinders, an interaction designer who specializes in chatbots. (A rough sketch of the blacklist idea appears at the end of this post.)

When you have a conversation with a chatbot, it's clear that you're talking to software, not a human. The conversation feels stiff. But some bots are adept at shooting the breeze, a skill that can make it hard to know you're conversing with code. "Disclosure is going to be really important here," says Woodrow Hartzog, a law professor at Samford University. "Problems can come up when people think they're dealing with humans, but really they're dealing with bots."

It will be interesting to see how this evolves as bots take over in the customer service and e-commerce space. If a customer is ordering a product or interacting with an agent, can a bot pick up on the nuances of irritation, impatience, or even anxiety? I'm not sure bots will ever fill the role of a live person, but we will find out soon enough.
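For the technically curious, here is a rough sketch in Python of what word blacklisting combined with a human-review alert might look like. Everything in it (the word list, the function names, the fallback message) is a hypothetical illustration for this post, not any platform's actual API.

# Hypothetical sketch: filter a bot's candidate replies against a word
# blacklist, and alert a human reviewer when a reply gets blocked.
import logging

logging.basicConfig(level=logging.WARNING)

# Placeholder terms; a real deployment would curate this list carefully.
BLACKLIST = {"rude", "grumpy"}

def is_safe(reply: str) -> bool:
    """Return True if the reply contains no blacklisted words."""
    words = {w.strip(".,!?").lower() for w in reply.split()}
    return BLACKLIST.isdisjoint(words)

def choose_reply(candidates: list[str]) -> str:
    """Return the first safe candidate; otherwise alert a human and fall back."""
    for reply in candidates:
        if is_safe(reply):
            return reply
        # The "better alerts" idea: flag unexpected behavior for human oversight.
        logging.warning("Blocked candidate reply: %r", reply)
    return "Let me connect you with a human agent."

print(choose_reply(["That is a rude question.", "Happy to help with your order!"]))
# -> Happy to help with your order!

A word blacklist is a blunt instrument (it cannot catch rudeness phrased politely), which is why the logging step matters: it keeps a human in the loop whenever the filter trips.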
