Ten years ago, if you mentioned the term “artificial intelligence” in a boardroom there’s a good chance you would have been laughed at. For most people it would bring to mind sentient, sci-fi machines such as 2001: A Space Odyssey’s HAL or Star Trek’s Data.
Today it is one of the hottest buzzwords in business and industry. AI technology is a crucial linchpin of much of the digital transformation taking place today, as organisations position themselves to capitalise on the ever-growing amount of data being generated and collected.
What is Artificial Intelligence?
The concept of what defines AI has changed over time, but at the core there has always been the idea of building machines which are capable of thinking like humans.
After all, human beings have proven uniquely capable of interpreting the world around us and using the information we pick up to effect change. If we want to build machines to help us do this more efficiently, then it makes sense to use ourselves as a blueprint!
AI, then, can be thought of as simulating the capacity for abstract, creative, deductive thought – and particularly the ability to learn – using the digital, binary logic of computers.
Research and development work in AI is split between two branches. One is labelled “applied AI” which uses these principles of simulating human thought to carry out one specific task. The other is known as “generalised AI” – which seeks to develop machine intelligences that can turn their hands to any task, much like a person.
Research into applied, specialised AI is already delivering breakthroughs in fields of study from quantum physics, where it is used to model and predict the behaviour of systems comprising billions of subatomic particles, to medicine, where it is being used to diagnose patients based on genomic data.
In industry, it is employed in the financial world for uses ranging from fraud detection to improving customer service by predicting what services customers will need. In manufacturing it is used to manage workforces and production processes, as well as to predict faults before they occur, thereby enabling predictive maintenance.
In the consumer world, more and more of the technology we adopt into our everyday lives is powered by AI – from smartphone assistants like Apple’s Siri and Google Assistant, to self-driving cars, which many predict will outnumber manually driven cars within our lifetimes.
Generalised AI is a bit further off – to carry out a complete simulation of the human brain would require both a more complete understanding of the organ than we currently have, and more computing power than is commonly available to researchers. But that may not be the case for long, given the speed with which computer technology is evolving. A new generation of computer chips, known as neuromorphic processors, is being designed to run brain-simulation code more efficiently. And systems such as IBM’s Watson cognitive computing platform use high-level simulations of human neurological processes to carry out an ever-growing range of tasks without being specifically taught how to do them.
Where to go from here
If you would like to know more about AI, check out my related articles, or browse the Artificial Intelligence & Machine Learning section to find the metrics that matter most to you.