Artificial Intelligence (AI) is one of the most transformative forces of our time. While there may be debate over whether AI will transform our world for good or ill, one thing we can all agree on is that AI would be nothing without big data.
Even though AI technologies have existed for several decades, it is the explosion of data, the raw material of AI, that has allowed them to advance at incredible speed. It is the billions of searches performed on Google every day that provide a sizeable real-time data set from which Google learns about our typos and search preferences. Siri and Cortana would have only a rudimentary understanding of our requests without the billions of hours of spoken word, now digitally available, that helped them learn our language. Similarly, Connie, the first concierge robot from Hilton Hotels, understands natural language and responds to guests’ questions about the hotel, local attractions, restaurants and more. The robot owes its intelligence to the extensive data it was trained on, which taught it how to process future input.
AI continues to mature due to the explosion of data
Each year, the amount of data we produce doubles, and it is predicted that within the next decade there will be 150 billion networked sensors, more than 20 times the number of people on Earth. This data is instrumental in helping AI systems learn how humans think and feel; it accelerates their learning curve and allows for the automation of data analysis. The more data a system is given, the more it learns and the more accurate it ultimately becomes. Artificial Intelligence is now capable of learning without human support: in just one example, Google’s DeepMind algorithm recently taught itself how to win 49 Atari games.
In the past, AI’s growth was stunted by limited data sets, by representative samples of data rather than real-time, real-life data, and by the inability to analyse massive amounts of data in seconds. Today, there is real-time, always-available access to the data and the tools that enable rapid analysis. This has propelled AI and machine learning and allowed the transition to a data-first approach. Our technology is now agile enough to access these colossal data sets and rapidly evolve AI and machine-learning applications.
AI enabled by big data
Businesses in all industries are joining AI pioneers such as Google and Amazon in implementing AI solutions for their organisations. MetLife, one of the largest global providers of insurance, employee benefits and annuity programmes, has also powered its AI initiatives with big data. Speech recognition has improved the tracking of incidents and outcomes; claims processing has become more efficient because claims models have been enriched with unstructured data, such as doctors’ reports, that the company can now analyse; and MetLife is working toward automated underwriting.
Will a computer ever be able to think like a human brain? Some say never, while others say we’re already there. Either way, we’re at the point where machines’ ability to see, understand and interact with the world is growing at a tremendous rate, and it only accelerates as the volume of data that helps them learn and understand increases. Big data is the fuel that powers AI.
Bernard Marr is an internationally bestselling author, futurist, keynote speaker, and strategic advisor to companies and governments. He advises and coaches many of the world’s best-known organisations on strategy, digital transformation and business performance. LinkedIn has recently ranked Bernard as one of the top five business influencers in the world and the No. 1 influencer in the UK. He has authored 16 best-selling books, is a frequent contributor to the World Economic Forum and writes a regular column for Forbes. Every day Bernard actively engages his almost 2 million social media followers and shares content that reaches millions of readers.