In today’s digital economy, consumers expect companies to leverage the latest technologies and to engage at every possible touchpoint. That requires an omnichannel engagement strategy that creates seamless experiences benefiting both the brand and its clients. In fact, companies that implement omnichannel engagement solutions retain on average a whopping 90% of their customers; for those relying on a single channel, retention falls to a slim 33%. Ultimately, companies that fail to incorporate new and emerging channels into their customer journey will fall behind their more innovative competitors.

Virtual personal assistant solutions are becoming increasingly important in customer experience transformation. Gartner estimates that by 2021, 15% of all customer service interactions will be handled entirely by some form of virtual office assistant, an increase of 400% from 2017. More broadly, the “intelligent assistant” business is predicted to grow to over $15 billion by 2021, and by that same year 25% of digital workers will employ a personal virtual assistant in the office.

Interactions provides Intelligent Virtual Assistants that seamlessly combine artificial intelligence and human understanding to enable businesses and consumers to engage in productive conversations. With flexible products and solutions designed to meet the growing demand for unified, multichannel customer care, Interactions delivers significant cost savings and an unprecedented customer experience for some of the largest brands in the world. Founded in 2004, Interactions is headquartered in Franklin, Massachusetts, with additional offices in Indiana, New Jersey, and New York. For more information, visit www.interactions.com.
Contain and Deflect
Automated Speech Recognition (ASR), also known as ‘voice recognition’ or ‘speech to text’, is the technology that translates spoken words into text, a machine-readable format. Interactions ASR technology uses uniquely generated acoustic models that predict how words sound in a given environment, such as when talking on a mobile phone. These acoustic models are combined with language models and pronunciations for exceptional accuracy, and Interactions ASR is adaptable to specific domains, environments, and languages. By converting the spoken word into text, ASR produces the starting point for everything that follows.

Natural Language Processing (NLP) is a branch of artificial intelligence that deals with the interaction between computers and humans using natural language. NLP comprises Natural Language Understanding (NLU), Natural Language Generation (NLG), and dialog management technologies. NLU deciphers the meaning behind the words: the intents (what the user wants to do) and entities (names of products, locations, etc.) in the text. These are fed to the dialog management engine, which finds the best possible response, and NLG then converts that response into language understandable by humans.
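To make the NLU-to-NLG flow above concrete, here is a minimal, purely illustrative sketch in Python. Everything in it (the keyword-based intent detection, the `ENTITY_PATTERNS` regexes, the `respond` dialog function) is a hypothetical toy, not part of any Interactions product; real systems use trained statistical models rather than keywords.

```python
import re

# Toy keyword-based intent detection (real NLU uses trained models).
INTENT_KEYWORDS = {
    "check_balance": ["balance", "how much"],
    "pay_bill": ["pay", "payment"],
}

# Toy entity patterns: dollar amounts and long account numbers.
ENTITY_PATTERNS = {
    "amount": r"\$\d+(?:\.\d{2})?",
    "account": r"\b\d{6,}\b",
}

def detect_intent(text: str) -> str:
    """NLU step 1: what does the user want to do?"""
    lowered = text.lower()
    for intent, keywords in INTENT_KEYWORDS.items():
        if any(k in lowered for k in keywords):
            return intent
    return "unknown"

def extract_entities(text: str) -> dict:
    """NLU step 2: pull out the entities mentioned in the text."""
    return {name: m.group(0)
            for name, pat in ENTITY_PATTERNS.items()
            if (m := re.search(pat, text))}

def respond(text: str) -> str:
    """Dialog management picks a response; the templates stand in for NLG."""
    intent = detect_intent(text)
    entities = extract_entities(text)
    if intent == "pay_bill" and "amount" in entities:
        return f"Scheduling a payment of {entities['amount']}."
    if intent == "check_balance":
        return "Your current balance is $42.00."
    return "Sorry, could you rephrase that?"

print(respond("I want to pay $25.00 on my bill"))  # Scheduling a payment of $25.00.
```

The point of the sketch is the separation of concerns: intent and entity extraction (NLU) feed a dialog manager, which selects a response that NLG renders for the user.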
Engage and Delight
Interactions believes AI should adapt to human conversation, not the other way around. Powered by the company’s proprietary Adaptive Understanding™ technology, Interactions IVA combines the latest in Conversational AI, including Automated Speech Recognition (ASR), Natural Language Processing (NLP), machine learning, and deep neural networks, with human understanding in real time. Interactions is known for its unique approach of blending AI and humans, or keeping a ‘human in the loop’. Adaptive Understanding is at the core of everything Interactions does. Irrespective of the channel, every customer interaction that comes to an Intelligent Virtual Assistant is sent to the Conversational AI engine component of Adaptive Understanding. If the AI has a high confidence score in the accuracy of its response, the IVA answers the customer with the response generated by the AI. On the rare occasions when the AI’s confidence score is not high enough, whether because of multiple speakers, background noise, an unrecognized language or dialect, a caller’s accent, or simply a complex intent, Interactions invokes the ‘Human Assisted Understanding (HAU)’ component of Adaptive Understanding in real time. These humans, called Intent Analysts (IAs), listen to the brief audio recording where the AI had a low confidence score and help the AI understand it. This human engagement happens in a fraction of a second, so the end customer never notices any delay or lag in the response.
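The routing logic described above, answer directly when the AI is confident and escalate to a human Intent Analyst when it is not, can be sketched as a simple confidence threshold. This is an assumed, simplified model: the threshold value, the `ai_understand` and `human_assist` stand-ins, and the function names are all hypothetical, not Interactions’ actual implementation.

```python
from dataclasses import dataclass

CONFIDENCE_THRESHOLD = 0.85  # assumed cutoff; a real system would tune this

@dataclass
class Understanding:
    intent: str
    confidence: float

def ai_understand(audio_snippet: str) -> Understanding:
    # Stand-in for the Conversational AI engine; noisy input lowers confidence.
    if "noise" in audio_snippet:
        return Understanding("unclear", 0.40)
    return Understanding("check_balance", 0.97)

def human_assist(audio_snippet: str) -> Understanding:
    # Stand-in for an Intent Analyst labeling the low-confidence snippet.
    return Understanding("check_balance", 1.0)

def route(audio_snippet: str) -> Understanding:
    """Human-in-the-loop routing: fall back to a human below the threshold."""
    result = ai_understand(audio_snippet)
    if result.confidence < CONFIDENCE_THRESHOLD:
        result = human_assist(audio_snippet)
    return result

print(route("noise: what's my balance").intent)  # check_balance
```

Because only the low-confidence snippets reach a human, the analyst handles brief fragments rather than whole calls, which is what keeps the fallback fast enough to be invisible to the caller.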
Meet the leader behind the success of Interactions
A veteran industry leader and innovator, Mike Iacobucci has led the charge in technology disruption for more than 30 years. As President and CEO, Mike is recognized as the driving force behind Interactions’ explosive growth and success. Named Technology Entrepreneur of the Year® by Ernst & Young, Mike joined Interactions in 2008 and has catapulted the company from a startup to an award-winning market leader uniquely positioned at the intersection of speech recognition, customer care, and multimodal interface technology. Prior to taking the helm at Interactions, Mike served as Chief Executive Officer of Idiom Technologies, a developer of enterprise-level translation automation technology serving Fortune 1000 corporations that was later acquired by SDL Enterprise LTD. He also served as an Executive in Residence at Sigma Partners, having previously worked at Books 24×7, Focus Enhancements, Phoenix Technologies, and Cullinet Software.