Dialogic Blog

Welcome to the Era of Cognitive Computing

by Alan Percy

Oct 31, 2016 4:34:20 PM

In 2011, IBM's Watson took on two champion players on Jeopardy!, using early cognitive computing and a vast library of knowledge to soundly beat the champs. But is Wikipedia-like knowledge of trivia the best use of cognitive computing? It seems not. Interestingly, cognitive computing is finding real value in everyday human interactions, often playing the role of assistant or advisor, providing insight into data or human experiences to improve efficiency or accuracy.

This past week, IBM hosted its World of Watson event in Las Vegas, putting a spotlight on many of the applications it envisions will be possible with cognitive computing and Watson. With examples from a wide range of industries, presenters shared stories of how cognitive computing is being applied in automotive, healthcare, contact center, big data, and many other applications. A key message from the event was that cognitive computing will act as an aid to the human experience, not necessarily a replacement for people. Doctors will have access to the most current research, drivers will know what’s ahead, and contact center agents will have a subject matter expert on every call.

One of the many demonstrations was developed by Brian Pulito and his team at IBM, applying Watson’s speech recognition and natural language capabilities to contact center applications. As Brian explained in a podcast on the Communications Developer Zone, “we had gotten a lot of interest in connecting Watson’s cognitive services through a telephone interface,” allowing Watson to aid customer care agents, enable Interactive Voice Response (IVR) systems, and participate in conference calls or any number of telephony or WebRTC-based applications. As Brian shared, IBM envisions an Agent Assist application in which Watson is a listen-only participant in a contact center call, providing advice and recommendations to the agent on possible solutions or recommended next steps. Imagine a customer asking, “What’s the best policy to insure my boat?” Within a few seconds, Watson would suggest a number of policy options to the agent, who would then present them to the customer. As the agent becomes more experienced, Watson would take on a more specialized role as subject matter expert, providing guidance on the really difficult or policy-based questions. The result would be a better customer experience.
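To make the telephone-to-Watson plumbing concrete, here is a minimal sketch, in Python, of streaming a call’s audio into Watson’s Speech to Text service over its WebSocket interface and printing interim transcripts as they arrive. This is not Brian’s code: the endpoint and message format follow the Watson STT WebSocket API of this era, and the `audio_chunks()` file reader is a hypothetical stand-in for the call leg forked out of the media server.

```python
# Sketch: stream call audio into Watson Speech to Text over WebSockets
# and print interim transcripts. Assumes the era's STT WebSocket API
# (wss endpoint, "start"/"stop" JSON actions); audio_chunks() stands in
# for live 16 kHz linear PCM audio from the call.
import json
import threading
import websocket  # pip install websocket-client

STT_URL = "wss://stream.watsonplatform.net/speech-to-text/api/v1/recognize"

def audio_chunks(path="call-audio.raw", size=4096):
    """Yield raw 16 kHz linear PCM chunks; stands in for live call audio."""
    with open(path, "rb") as f:
        chunk = f.read(size)
        while chunk:
            yield chunk
            chunk = f.read(size)

def on_open(ws):
    def pump():
        # Describe the audio format and ask for interim (partial) results.
        ws.send(json.dumps({"action": "start",
                            "content-type": "audio/l16;rate=16000",
                            "interim_results": True}))
        for chunk in audio_chunks():
            ws.send(chunk, opcode=websocket.ABNF.OPCODE_BINARY)
        ws.send(json.dumps({"action": "stop"}))
    threading.Thread(target=pump, daemon=True).start()

def on_message(ws, message):
    reply = json.loads(message)
    for result in reply.get("results", []):
        prefix = "final: " if result.get("final") else "interim: "
        print(prefix + result["alternatives"][0]["transcript"])

ws = websocket.WebSocketApp(STT_URL,
                            header=["Authorization: Basic <credentials>"],
                            on_open=on_open,
                            on_message=on_message)
ws.run_forever()
```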

[Image: IBM Watson Agent Assist demonstration]

Brian’s demonstration during the event showed how to integrate Watson into a telephony or WebRTC-based contact center, using a combination of WebSphere Liberty as a WebRTC gateway and the Dialogic PowerMedia XMS media server for media conversion (transcoding). As demonstrated, the conversation between customer and agent was streamed into Watson’s Speech to Text and Natural Language modules, delivering a transcript of the call along with recommendations from Watson’s knowledge base. While the customer experience is like any other contact center call, the agent has the benefit of millions of lines of cognitive computing software to help answer difficult questions. Agent Assist is one of many applications that will improve the customer experience, eliminating “hold on while I find an answer” consultations and improving first-call resolution rates in the contact center.
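As a rough illustration of the downstream half of that pipeline, the sketch below takes a finished transcript and turns it into suggestions for the agent. The classify call follows the shape of Watson’s Natural Language Classifier REST API from this period; the classifier ID, credentials, knowledge base, and `assist_agent` helper are hypothetical placeholders, not pieces of the actual demo.

```python
# Sketch of the agent-assist loop: classify the caller's question, then
# look up candidate answers to surface on the agent's screen. The NLC
# endpoint mirrors Watson's Natural Language Classifier REST API of this
# era; KNOWLEDGE_BASE and the credentials are hypothetical.
import requests  # pip install requests

NLC_URL = ("https://gateway.watsonplatform.net/natural-language-classifier"
           "/api/v1/classifiers/{classifier_id}/classify")

KNOWLEDGE_BASE = {  # stand-in for the insurer's policy knowledge base
    "boat_insurance": ["Marine Standard policy", "Marine Plus policy"],
}

def assist_agent(transcript, classifier_id, credentials):
    """Map one customer utterance to suggested answers for the agent."""
    resp = requests.post(
        NLC_URL.format(classifier_id=classifier_id),
        json={"text": transcript},
        auth=credentials,  # (username, password) service credentials
    )
    resp.raise_for_status()
    intent = resp.json()["top_class"]  # best-matching class label
    return KNOWLEDGE_BASE.get(intent, [])

suggestions = assist_agent(
    "What's the best policy to insure my boat?",
    classifier_id="<classifier-id>",
    credentials=("<username>", "<password>"),
)
print("Suggest to agent:", suggestions)
```

In a production Agent Assist deployment, the lookup would sit behind the transcript stream from the previous sketch and push results to the agent desktop, rather than printing them.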

As a consumer, I’m looking forward to the era of cognitive computing, so long as it helps agents get answers more quickly and I never have to hear music-on-hold again.


Topics: WebRTC, Communications Application Development, Events