Why Emotional Intelligence Is More Important Than Ever
Google recently shared findings from tests of its DeepMind AI system, which is learning independently using its own memory and attempting to mimic a human voice. It sounds like a successful series of research and development, except for one discovery: the system has a tendency to become “highly aggressive” to make sure it obtains its objective.
As more and more companies build a future filled with speakers you can talk to, self-driving cars, and conversational AIs with greater financial knowledge than your accountant, there seems to be less focus on human touch and interaction. We now have more than enough data to feed AI systems, resulting in highly intelligent products that can make effective financial, legal, health, and investment decisions (among others) for us. These systems are reliable and powerful enough to make sense of the most complex problems, but the question remains whether the environment we’re trying to build is meant more for robots than for humans.
The symbiotic relationship between humans and AI
It’s a matter of looking at how AI and machine learning will work their way into our daily lives. In an interview with The Verge, Professor Manuela Veloso, head of the machine learning department at Carnegie Mellon University, shares that the future will not be composed of self-contained AI systems; rather, she sees humans and AI systems co-existing in a “symbiotic” kind of relationship. In other words, for these systems to make effective decisions, they will still rely on humans for more data and, somehow, guidance.
To come up with more effective solutions, train your AI to be aware of its human’s emotional state.
The tests done by Google and its researchers are critical: they suggest that the more intelligent the DeepMind “agent” is, the more it learns from its environment, which results in its use of aggressive strategies to win. “The initial results show that just because we build them, it doesn’t mean robots and AI systems will automatically have our interests at heart,” as Bec Crew of Science Alert observes. There is still a lot of uncertainty in this field, but one thing is for sure: if we expect these systems to effectively interact with humans, we should design and train them to acknowledge and empathize with human emotions.
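To make the idea of emotion-aware design concrete, here is a minimal, purely illustrative sketch of an assistant that adjusts its reply based on a crude estimate of the user’s emotional state. Everything in it (the keyword lists, the `estimate_sentiment` and `respond` functions) is a hypothetical toy, not an actual DeepMind or Rush Digital implementation; a real system would use a trained sentiment model rather than keyword matching.

```python
# Toy sketch: gate the assistant's reply on a rough sentiment estimate.
# Keyword lists are illustrative stand-ins for a trained sentiment model.

NEGATIVE_WORDS = {"frustrated", "angry", "upset", "annoyed", "terrible"}
POSITIVE_WORDS = {"great", "happy", "thanks", "love", "awesome"}


def estimate_sentiment(text: str) -> str:
    """Classify a message as 'negative', 'positive', or 'neutral'."""
    words = set(text.lower().split())
    if words & NEGATIVE_WORDS:
        return "negative"
    if words & POSITIVE_WORDS:
        return "positive"
    return "neutral"


def respond(text: str) -> str:
    """Pick a reply style that acknowledges the user's emotional state."""
    sentiment = estimate_sentiment(text)
    if sentiment == "negative":
        return "I'm sorry this has been frustrating. Let me help fix it."
    if sentiment == "positive":
        return "Glad to hear it! What would you like to do next?"
    return "Sure. What can I do for you?"


print(respond("I'm really frustrated with this bill"))
```

The point of the sketch is the structure, not the classifier: the system’s behaviour branches on the human’s state before it acts, which is the “acknowledge and empathize” step the paragraph above argues for.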
See, the thing is, I don’t want a car that just drives me home safely; I want a car that does the same thing, optimizes my routes, AND automatically suggests tunes based on my playlists. It does the job, but more importantly, it gets me. As a person with desires and needs, I will pay for technology that understands these things about me and responds accordingly. That capability is what will make your AI system superior to the rest.
Rush Digital is happy to be a part of the Auckland Artificial Intelligence Meetup group. Catch our CTO Danu Abeysuriya talk about an exciting project he’s been working on, voicemail sentiment analysis, this coming February 23 at Grid AKL.
As Veloso puts it in her interview: “You can think of AI systems in constant symbiosis with everything else, with other information on the web, with other AI systems, with humans next to them, with remote humans. It becomes not a problem of developing self-contained AI systems, but an AI system that can recognize when it does not know, or when it needs more information, or when it thinks something with some probability but it’s not sure. It’s not that it can solve all the problems up front, but it can rely on all these other sources around it. That’s how I envision it.”