To Truly Leverage Your Data, You Must Optimise Your Data Storage
2 July 2021
Data is the fuel of industry and commerce in the information age. Insights and analytics help us answer questions about how our businesses should operate and who our customers are with ever-increasing accuracy. Companies can use artificial intelligence (AI) to put products and services in front of the right people when they are making buying decisions. Robots and automation are driving efficiency in every business process, from manufacturing to logistics to HR. And the global network of connected devices we refer to as the Internet of Things (IoT) – from smartphones to self-driving cars – means machines can communicate with each other to solve problems with no input from us.
Every single one of these game-changing developments is powered by one thing – the vast and ever-increasing flow of digital data that we are generating and are able to capture, store and analyze.
It’s clear that some companies have gone on to revolutionize their industries – and the way we live – by harnessing this flow of data, creating new services for us that make our lives easier or better. This includes tech giants with their search engines, communications tools, and e-commerce, as well as more specialized solutions covering everything from ride sharing to streaming entertainment, booking holidays, and dating.
Unfortunately, not every business has had the same success. Industry studies repeatedly find that a majority of the data businesses generate is never used, and many organizations are still struggling to manage, let alone monetize, their data streams.
Here I’m going to take a look at one of the first hurdles companies have to overcome – storage. It’s a seemingly simple one, but the lack of a strategy here can lead to serious headaches further down the line, as data volumes continue to grow and decisions need to be made about which data is important at any given time.
Cloud services offer near-unlimited capacity for organizations to store as much information as they need, but there are complicating factors. Some data may be too sensitive or come with too high a regulatory burden to host off-site. Some data may need to be accessed instantly from anywhere in the world, while some may simply require archival for legal reasons. And data may need to be audited regularly to make sure it is still relevant and hasn’t become outdated or possibly even illegal due to changing regulatory frameworks around the world. Not knowing where your data is, how many copies you have, or how to access it at any given time can severely impede your ability to carry out these essential functions.
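As a simple illustration, here is a minimal sketch in Python of how considerations like these might be encoded as a placement policy. The categories, rules, and destinations are all hypothetical; a real policy would be driven by your own regulatory, contractual, and latency requirements:

```python
from dataclasses import dataclass

@dataclass
class Dataset:
    name: str
    sensitive: bool            # e.g. regulated personal data
    needs_global_access: bool  # must be reachable instantly, anywhere
    archival_only: bool        # retained purely for legal reasons

def choose_location(ds: Dataset) -> str:
    """Map a dataset's properties to a storage location.

    These rules are illustrative only; real policies depend on your
    regulators, contracts, and performance requirements.
    """
    if ds.sensitive:
        return "on-premises"               # too regulated to host off-site
    if ds.archival_only:
        return "cold archive (e.g. tape)"  # cheap, slow, rarely touched
    if ds.needs_global_access:
        return "public cloud"              # globally replicated, low latency
    return "hybrid"                        # default: flexible placement

print(choose_location(Dataset("customer_pii", True, False, False)))
# -> on-premises
```

The point of writing the policy down explicitly, even in toy form, is that every dataset gets a known location – which is exactly the visibility the auditing and access requirements above depend on.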
For the most valuable, insight-rich data, the fastest and most highly available storage systems are a necessity. Modern business analytics operations require the ability to move and sort large volumes of data to provide business users or customers with the responsive, push-button functionality they expect from services today. And it has to be backed by encryption and security because trust is everything – no one wants to use a service that will endanger or expose them due to handling their data in an unsafe way.
Intelligent Data Storage
Today’s most advanced data storage systems, such as the IBM FlashSystem family, store information on solid-state non-volatile media to achieve the best possible speed, resilience, and security. They also take advantage of AI technology, including machine learning tools, to smartly manage the way data is stored and accessed, further increasing the speed of access and minimizing the chance of errors or data loss that could impact your business. For example, data that is predicted to be accessed more frequently will be queued and ready to go when it is needed, whereas data that is likely to be less mission-critical may be flagged for transfer to a less accessible but more secure environment, such as tape storage, for archival, or even deleted if storing it may create further problems. To make these predictions, IBM relies on insights from over two exabytes (two billion gigabytes) of data it has under management.
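IBM does not publish the details of these models, but the underlying idea of predictive tiering can be sketched with a toy policy. In this illustrative Python example, an exponentially decayed access count stands in for the machine learning predictor; the threshold and tier names are assumptions made for the sake of the demonstration:

```python
from collections import defaultdict

class TieringPolicy:
    """Toy predictive tiering: a decayed access count stands in for
    the machine learning models a real storage system would use."""

    def __init__(self, decay: float = 0.9, hot_threshold: float = 5.0):
        self.decay = decay
        self.hot_threshold = hot_threshold
        self.scores = defaultdict(float)  # block id -> recency-weighted hits

    def record_access(self, block_id: str) -> None:
        self.scores[block_id] += 1.0

    def end_of_period(self) -> dict:
        """Decay every score, then assign each block to a tier."""
        placement = {}
        for block_id in self.scores:
            self.scores[block_id] *= self.decay
            hot = self.scores[block_id] >= self.hot_threshold
            placement[block_id] = "flash" if hot else "archive"
        return placement

policy = TieringPolicy()
for _ in range(8):
    policy.record_access("sales_db_block")  # frequently hit -> stays hot
policy.record_access("old_logs_block")      # rarely hit -> archived
print(policy.end_of_period())
# {'sales_db_block': 'flash', 'old_logs_block': 'archive'}
```

The decay factor controls how quickly past popularity fades; a production system would also weigh factors such as workload type, time of day, and business priority before moving anything.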
Resilience is another key requirement that any business wanting to get serious about data needs to figure out. If your essential internal and customer-facing operations are all built around acting on data-driven insights (which, of course, should be the aim), they can’t be allowed to grind to a halt because of interruptions to data flow or infrastructure failures. This might mean ensuring data is constantly backed up across cloud and on-site servers (while maintaining regulatory compliance) as well as legacy systems. IBM provides this for its FlashSystem customers through its FlashCopy technology, which allows production data to be rapidly copied and replicated. This means that in situations where data integrity is absolutely vital, two or more identical copies can be kept continuously synchronized in separate physical locations and restored with virtually nothing lost, should an unexpected disaster happen.
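FlashCopy itself is proprietary block-level technology, but the principle of keeping synchronized copies can be shown with a toy example. This Python sketch mirrors every write to two in-memory "sites" and rebuilds one from the other after a failure; everything here is a simplified stand-in, not IBM's mechanism:

```python
class MirroredStore:
    """Toy synchronous mirror: every write lands at two 'sites' before
    it is acknowledged, so either site can rebuild the other. Real
    replication works at the block level, not on Python dicts."""

    def __init__(self):
        self.site_a: dict = {}
        self.site_b: dict = {}

    def write(self, key: str, value: bytes) -> None:
        # Both copies are updated before the write is considered done.
        self.site_a[key] = value
        self.site_b[key] = value

    def restore(self, lost_site: str) -> None:
        """After a disaster at one site, rebuild it from the survivor."""
        if lost_site == "a":
            self.site_a = dict(self.site_b)
        else:
            self.site_b = dict(self.site_a)

store = MirroredStore()
store.write("order_42", b"paid")
store.site_a.clear()       # simulate losing one site entirely
store.restore("a")
print(store.site_a)        # {'order_42': b'paid'} - nothing lost
```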
Coping with very fast-moving and constantly changing datasets
One organization that has tackled the problem of implementing the infrastructure needed to deal with truly fast-moving data is the UK Met Office. Its data is used by governments to plan for changing weather, by supermarkets to react to seasonal trends, and by researchers investigating climate change. For this to happen, it needs to ingest and analyze 300 million weather-related data points every day and make them available as insights to its customers. In fact, it does all of this twice, running parallel pipelines to eliminate the risk of data flow being interrupted during development.
To do this, it has developed a hybrid cloud strategy built on IBM FlashSystem. Although its original assessment was that flash storage might be prohibitively expensive, it turned out to be a cost-efficient solution due to the high level of compression it enables. This lets it build the kind of high-performance data infrastructure needed to push its data and insights from its in-house servers to the public cloud and on to its customers.
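A rough back-of-envelope calculation shows both the scale involved and why compression changes the economics. Only the 300 million observations per day comes from the case study; the cost and compression figures below are illustrative assumptions, not Met Office or IBM numbers:

```python
# Scale implied by the case study figure (300 million points per day).
points_per_day = 300_000_000
print(round(points_per_day / 86_400))   # ~3472 data points every second

# Hypothetical cost comparison: raw flash costs more per terabyte than
# spinning disk, but compression shrinks the flash you have to buy.
flash_cost_per_tb = 150.0   # assumed $/TB, illustrative only
disk_cost_per_tb = 60.0     # assumed $/TB, illustrative only
compression_ratio = 3.0     # assumed 3:1 data reduction on flash

effective_flash_cost = flash_cost_per_tb / compression_ratio
print(effective_flash_cost)  # $50 per logical TB, cheaper than disk
```

Under these assumed figures, a 3:1 reduction ratio brings the effective cost of flash below that of disk, which is the same dynamic the Met Office reported when its initial cost assessment was overturned.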
Another example of a data strategy adapting to cope with ever-increasing volumes and workloads needed for today’s applications comes from the Roman Catholic Church – perhaps not the first place we would think to look!
The Archdiocese of Salzburg needed a solution to more effectively provide services such as support and community outreach to its congregation, as well as preserve and provide access to its massive record of historical documents and literature, some of which is over 1,000 years old.
By migrating to solid-state, non-volatile storage systems and leaving behind mechanical disk-based storage, with its lower access speeds and higher failure rate, the archdiocese was able to improve its response times by a factor of 10 to 20 while also strengthening security and encryption. Thanks to the higher availability of data and a better understanding of its storage system provided by AI analytics, the church is now embarking on a project to make its fascinating historical records accessible via the cloud.
As data becomes an increasingly valuable part of an organization’s operating assets, it’s important not to underestimate the impact of making the right decisions when it comes to storage. It’s no longer a simple choice between cloud or on-premises, with a hybrid approach often seen as the best way of realizing those all-important efficiencies. Every business needs to consider storage as a core component of its data strategy, just as it would data acquisition, analytics, and actioning.
Where to go from here
If you would like to know more about big data and analytics, check out my articles on:
- What is Big Data?
- Big Data in Practice
- What Is The Relationship Between KPIs And Big Data?
- How Is Big Data Transforming Business?
- Amazon: Using Big Data to understand customers
- GE: Big Data and the industrial internet
Or browse the Big Data & Analytics section to find the metrics that matter most to you.