Alexa and Siri, Amazon’s and Apple’s digital voice assistants, are much more than convenient tools: they are very real applications of artificial intelligence that is becoming increasingly integral to our daily lives. Both rely on natural language processing, natural language generation and machine learning, forms of artificial intelligence, to operate effectively and perform better over time.
Natural Language Generation and Processing
The very complex process of writing and speaking that humans take for granted was a challenge for computer scientists to unravel and replicate. Natural language processing and natural language generation, subsets of artificial intelligence that are growing in sophistication, take in data, process it and create natural language that sounds as if a human were actually speaking or writing it. Natural language processing (NLP) is the ability of a machine to “read” or “understand” content produced by humans, e.g. by writing or speaking. Natural language generation (NLG) refers to a machine’s ability to create content in written or spoken language so that it can be understood by humans.
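The division of labour between NLP and NLG can be sketched in a few lines of code. This is a toy, rule-based stand-in for illustration only; real assistants use large statistical models, and every function name here is an assumption, not part of any real API.

```python
# Toy sketch of the NLP -> NLG round trip: NLP turns raw text into a
# structured intent; NLG turns a structured result back into language.

def parse_intent(utterance: str) -> dict:
    """NLP step (toy keyword matching): map raw text to a structured intent."""
    text = utterance.lower()
    if "weather" in text:
        return {"intent": "get_weather", "slots": {"when": "today"}}
    if "time" in text:
        return {"intent": "get_time", "slots": {}}
    return {"intent": "unknown", "slots": {}}

def generate_response(intent: dict) -> str:
    """NLG step: render the structured result as natural language."""
    if intent["intent"] == "get_weather":
        return "Here is today's forecast: sunny with a high of 72."
    if intent["intent"] == "get_time":
        return "It is 3:05 PM."
    return "Sorry, I didn't catch that."

print(generate_response(parse_intent("What's the weather going to be like today?")))
```

The key idea is the structured intermediate representation: the machine never works with raw text end to end, but with intents and slots it can reason about.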
Machine Learning Helps Alexa and Siri Learn
Machine learning is the application of artificial intelligence in which machines are given access to data and learn from it themselves, rather than being programmed by humans on what to think and do about the data.
Every time Alexa or Siri responds to your request, it uses the data it receives about how that response was received to improve the next time. If an error was made, it takes that data and learns from it. If the response was favourable, the system notes that as well.
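The feedback loop described above can be sketched very simply: each candidate interpretation of a query keeps a score, a successful response raises it and an error lowers it, so the better interpretation wins out over time. This is an illustrative toy, not how Amazon or Apple actually implement learning.

```python
# Toy feedback-learning sketch: score interpretations up on success,
# down on failure, and prefer the highest-scoring one next time.
from collections import defaultdict

scores = defaultdict(float)

def record_feedback(interpretation: str, success: bool, rate: float = 0.1) -> None:
    """Nudge an interpretation's score based on how the response landed."""
    scores[interpretation] += rate if success else -rate

def best_interpretation(candidates: list[str]) -> str:
    """Pick the interpretation with the best track record so far."""
    return max(candidates, key=lambda c: scores[c])

candidates = ["play_music", "set_alarm"]
record_feedback("play_music", success=False)  # user corrected the assistant
record_feedback("set_alarm", success=True)    # this response worked
print(best_interpretation(candidates))
```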
Data and machine learning are responsible for the explosive growth of digital voice assistants. They continue to get better with the more experiences they have and the data they accumulate.
How Do Voice-Activated Digital Assistants Work?
Now that we understand that artificial intelligence, through natural language generation and processing and machine learning, is working behind the scenes to power the results Alexa and Siri deliver, how does it actually work?
When you make a request of Alexa or Siri, the microphone on the device you are using records your command. This recording is sent over the internet to the cloud, so if you don’t have internet service, your digital voice assistant won’t work. If you are talking to Alexa, the recording is sent to Alexa Voice Services (AVS). This cloud-based service reviews the recording and interprets your request. Then the system sends a relevant response back to the device.
If you asked, “What’s the weather going to be like today?” a relevant response is audio detailing the day’s forecast. This all happens in a mere moment.
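That round trip can be sketched as a short pipeline. Every name below is an illustrative stand-in, not Amazon’s actual AVS API, and the “transcription” step is faked so the sketch runs on its own.

```python
# High-level sketch of the request flow: record -> transcribe in the
# cloud -> interpret -> fulfil -> respond. All names are made up.

def speech_to_text(audio: bytes) -> str:
    """Stand-in for cloud transcription of the recorded command."""
    return audio.decode("utf-8")  # pretend the audio is already text

def interpret(text: str) -> str:
    """Stand-in for the cloud service interpreting the request."""
    return "forecast" if "weather" in text.lower() else "unknown"

def fulfil(intent: str) -> str:
    """Produce a relevant response for the interpreted intent."""
    return "Today: sunny, high of 72." if intent == "forecast" else "Sorry?"

recording = b"What's the weather going to be like today?"
print(fulfil(interpret(speech_to_text(recording))))
```

Note that every step after the microphone happens in the cloud, which is why the assistant is useless without an internet connection.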
Telling you the day’s forecast is just one of the more than 70,000 skills Alexa can perform. Amazon reached this extraordinary number in such a short time, and continues to increase it, by giving developers free access to AVS so that they can keep building skills that augment the system. Currently, more than 28,000 smart home devices can work with Alexa.
Machine Learning Enabled Tremendous Growth
Alexa and Siri continue to get smarter. Alexa has learned how to carry a conversation over from one question to the next, the way humans handle follow-up questions. And if you don’t know the exact name of a skill you want Alexa to use, getting close will likely be enough for it to summon what you want. In addition, through Alexa Hunches and connected smart home devices, the assistant can alert you if a regular pattern hasn’t been followed, such as lights being left on or a door left unlocked, and offer to fix it for you. Apple’s Siri is now able to sort through background noise and loud music to “wake.” These represent big leaps toward a more conversational and capable voice assistant than what was available just a few years ago.
You guessed it. These leaps are possible thanks to machine learning.
Artificial Intelligence Central to the Operations of Amazon and Apple
Amazon and Apple have made adjustments to their operational structure to make artificial intelligence a core segment and one that extends to other divisions. In the case of Apple, it created a new artificial intelligence/machine learning team that brought together its Siri and machine learning groups under John Giannandrea, the former leader of Google’s machine intelligence, research and search teams. These moves by Amazon and Apple illustrate both companies’ commitments to using artificial intelligence to fuel the efforts of their teams and the generation of intelligent products and services.
Bernard Marr is an internationally bestselling author, futurist, keynote speaker, and strategic advisor to companies and governments. He advises and coaches many of the world’s best-known organisations on strategy, digital transformation and business performance. LinkedIn has recently ranked Bernard as one of the top 5 business influencers in the world and the No 1 influencer in the UK. He has authored 16 best-selling books, is a frequent contributor to the World Economic Forum and writes a regular column for Forbes. Every day Bernard actively engages his almost 2 million social media followers and shares content that reaches millions of readers.