Written by

Bernard Marr

Bernard Marr is a world-renowned futurist, influencer and thought leader in the fields of business and technology, with a passion for using technology for the good of humanity. He is a best-selling author of 20 books, writes a regular column for Forbes and advises and coaches many of the world’s best-known organisations. He has over 2 million social media followers, 1 million newsletter subscribers and was ranked by LinkedIn as one of the top 5 business influencers in the world and the No 1 influencer in the UK.

Bernard’s latest book is ‘Business Trends in Practice: The 25+ Trends That Are Redefining Organisations’



What Is Artificial Intelligence (AI) In 60 seconds

2 July 2021

Ten years ago, if you had mentioned the term “artificial intelligence” in a boardroom, there’s a good chance you would have been laughed at. For most people, it would have brought to mind sentient sci-fi machines such as 2001: A Space Odyssey’s HAL or Star Trek’s Data.

Today it is one of the hottest buzzwords in business and industry. AI technology is a crucial linchpin of much of the digital transformation taking place today, as organisations position themselves to capitalise on the ever-growing amount of data being generated and collected.

What is Artificial Intelligence?

The concept of what defines AI has changed over time, but at the core there has always been the idea of building machines which are capable of thinking like humans.

After all, human beings have proven uniquely capable of interpreting the world around us and using the information we pick up to effect change. If we want to build machines to help us do this more efficiently, then it makes sense to use ourselves as a blueprint!

AI, then, can be thought of as simulating the capacity for abstract, creative, deductive thought – and particularly the ability to learn – using the digital, binary logic of computers.
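That “ability to learn” can be made concrete with a toy example. The sketch below is purely illustrative and not drawn from the article: a minimal perceptron, one of the earliest machine-learning algorithms, learns the logical OR function from four examples by nudging its weights whenever it makes a mistake.

```python
# Minimal perceptron: learns the logical OR function from examples.
# All values here (learning rate, epoch count) are illustrative choices.
data = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 1)]
w = [0.0, 0.0]   # weights, one per input
b = 0.0          # bias term
lr = 0.1         # learning rate

def predict(x):
    """Output 1 if the weighted sum of inputs exceeds zero, else 0."""
    return 1 if w[0] * x[0] + w[1] * x[1] + b > 0 else 0

for _ in range(20):              # a few passes over the training data
    for x, target in data:
        error = target - predict(x)   # -1, 0, or 1
        w[0] += lr * error * x[0]     # nudge weights towards the target
        w[1] += lr * error * x[1]
        b += lr * error

print([predict(x) for x, _ in data])  # learned OR: [0, 1, 1, 1]
```

Nothing here is explicitly programmed to compute OR; the behaviour emerges from repeated correction, which is the essence of learning in this sense.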

Research and development work in AI is split between two branches. One is labelled “applied AI”, which uses these principles of simulating human thought to carry out one specific task. The other is known as “generalised AI”, which seeks to develop machine intelligences that can turn their hand to any task, much like a person.

Research into applied, specialised AI is already providing breakthroughs in fields of study from quantum physics, where it is used to model and predict the behaviour of systems comprised of billions of subatomic particles, to medicine, where it is being used to diagnose patients based on genomic data.

In industry, it is employed in the financial world for uses ranging from fraud detection to improving customer service by predicting what services customers will need. In manufacturing it is used to manage workforces and production processes as well as for predicting faults before they occur, therefore enabling predictive maintenance.

In the consumer world, more and more of the technology we adopt into our everyday lives is powered by AI – from smartphone assistants such as Apple’s Siri and Google Assistant, to self-driving cars, which many predict will outnumber manually driven cars within our lifetimes.

Generalised AI is a bit further off – a complete simulation of the human brain would require both a more complete understanding of the organ than we currently have and more computing power than is commonly available to researchers. But that may not be the case for long, given the speed with which computer technology is evolving. A new generation of computer chip technology known as neuromorphic processors is being designed to run brain-simulation code more efficiently. And systems such as IBM’s Watson cognitive computing platform use high-level simulations of human neurological processes to carry out an ever-growing range of tasks without being specifically taught how to do them.

Where to go from here

If you would like to know more about artificial intelligence, check out my related articles, or browse the Artificial Intelligence & Machine Learning section to find the metrics that matter most to you.



Related Articles

The Five Biggest Healthcare Tech Trends In 2022

Wherever we look in the healthcare industry, we can find new technology being used to fight illness, develop new vaccines and medicines, and help people to live healthier lives[...]

The 10 Tech Trends That Will Transform Our World

What makes the fourth industrial revolution so different from previous industrial revolutions is the convergence and interaction between multiple technology trends at once. In thi[...]

The 5 Biggest Connected And Autonomous Vehicle Trends In 2022

Autonomous driving promises a future where road traffic accidents and speeding tickets are no longer a feature of life.[...]

The Five Biggest Cyber Security Trends In 2022

The changed world we’ve found ourselves living in since the global pandemic struck in 2020 has been particularly helpful to cybercriminals.[...]

The Five Biggest Space Technology Trends For 2022

The past decade has seen a resurgence of interest in space travel and the technological innovation driving it.[...]

The 5 Biggest Biotech Trends In 2022

Steve Jobs once said that the biggest innovations in the 21st century would be at the intersection of biology and technology.[...]

