Real-time data for real-time agility
Data growth has exploded in the past decade and there are no signs of it stopping – 180 zettabytes of data are anticipated by 2025, almost three times the volume generated in 2020. While businesses and organisations lapped up all that data and began gleaning benefits in the form of more accurate data-driven decisions, a new type of data growth started to emerge.
Deb Dutta, Datastax General Manager in APJ, explained, “Thirty percent of all this data will primarily be real-time data… and this is growing because endpoints (that generate data) are growing.”
These endpoints are smartphones, tablets, and even sensors, driven by the growth of Internet of Things (IoT) technologies. IoT technologies in turn are driven by demand for digital transformation and the formulation of concepts like smart cities, smart devices, analytics, alternative energy frameworks, and so on.
Deb described the situation as having, “… a lot of sensors out there that are constantly gathering information and sending it to central repositories.”
All of these are contributors to real-time data and to the ensuing overall data load.
How can organisations benefit from voluminous data that is now being generated in real-time?
Organisational agility
According to Deb, this means enabling organisations to do the right thing at the right time in the right place with the right resources.
“When these four things align, your organisation becomes really agile,” he said, adding that real-time information is vital for this to happen.
The second benefit is enabling customer-centricity and enhancing the customer’s experience of a particular brand. “Increasingly, as consumers hold devices, they collaborate and communicate with businesses in real time. There is an ongoing conversation with customers that seems to never cease.”
And this is a real-time opportunity to delight customers and make the brand very sticky.
The power of real-time data and use cases
Real-time data solutions represent a total addressable market worth US$50 billion. Deb explained it comprises data-at-rest (operational data that sits in a repository) and data-in-motion (data that streams between applications as they talk to each other).
For data-in-motion, different applications need to know whenever information changes, which gives the data an almost real-time quality.
This has much to do with the way applications are built these days. We used to have large monolithic applications, but we have moved to what Deb described as ‘a world of microservices,’ where pieces of applications talk to each other.
“If you are ordering food, you may do it via a platform where the food ordering system does not work in isolation, right?”
Because after you have ordered the food, you get updates about the preparation status of your food, you get notified when a rider has collected your food, you get alerts when the driver is nearby… you can literally track the order you have made from the time you order it right up to when it reaches your doorstep.
“This deals with data-in-motion,” Deb pointed out, and also explained that the food ordering application could be a separate microservice which keeps your order information because you may want to see a history of food you have ordered before.
Other types of information, like your location, are critical for the service provider to estimate the time it takes for your food order to reach you.
“That information is not transient. It needs to be persistent. And when another microservice comes in, now there are two microservices talking to each other,” Deb said, underlining the importance of real-time data to not only help organisations execute operationally, but also enhance experience for the customer.
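The interaction Deb describes can be sketched in miniature. The sketch below uses hypothetical service and method names (it is not DataStax's architecture or the actual food platform's code): one microservice persists order state, while a second subscribes to status changes and turns them into customer alerts.

```python
# Minimal sketch of two cooperating microservices (hypothetical names).
# The order service keeps persistent order records; the notification
# service reacts to status changes as they happen.

class OrderService:
    """Persists order state so customers can also view order history."""

    def __init__(self):
        self._orders = {}        # stand-in for a persistent data store
        self._subscribers = []   # other microservices listening for changes

    def subscribe(self, callback):
        self._subscribers.append(callback)

    def place_order(self, order_id, items):
        self._orders[order_id] = {"items": items, "status": "ordered"}
        self._publish(order_id)

    def update_status(self, order_id, status):
        self._orders[order_id]["status"] = status
        self._publish(order_id)

    def history(self):
        return dict(self._orders)

    def _publish(self, order_id):
        # Tell every subscribed microservice about the new status.
        for notify in self._subscribers:
            notify(order_id, self._orders[order_id]["status"])


class NotificationService:
    """A separate microservice: turns status changes into alerts."""

    def __init__(self):
        self.alerts = []

    def on_status_change(self, order_id, status):
        self.alerts.append(f"Order {order_id}: {status}")


orders = OrderService()
notifier = NotificationService()
orders.subscribe(notifier.on_status_change)

orders.place_order("A1", ["nasi lemak"])
orders.update_status("A1", "preparing")
orders.update_status("A1", "rider collected")
orders.update_status("A1", "delivered")
```

The point of the sketch is the separation Deb highlights: the order data is persistent (the customer can still query history after delivery), while status changes flow between services in real time.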
Biggest challenges to real-time data
Cloud-native companies that were born in the cloud likely have an easier path ahead in terms of leveraging real-time data and achieving true customer-centricity and operational agility.
“They have used data storage technologies which are more optimised for microservices, because they probably built their first applications using microservices. They never built monolithic applications,” Deb said.
On the other hand, legacy and traditional organisations are not as prepared, and use technologies and development tools which are not conducive to a cloud-centric environment.
“And the challenge basically is because historically, we have seen the creation of application silos. Applications are built, and they use their own separate data sources, and you end up with silos that restrain the data,” Deb explained.
It is not that legacy enterprises do not try, but most embark upon a digital transformation journey with the wrong approach. For example, many organisations are content with simply lifting and shifting their IT applications into the cloud. “Lifting and shifting legacy workloads into the cloud is not the solution!” Deb exclaimed, explaining that legacy organisations are massively overspending as a result of this approach.
Yet another challenge is the set of tools being used in the front end to build applications. “These are not conducive for developer agility,” he pointed out.
This means there are no application programming interfaces (APIs) enabling developers to connect the applications they write to the data sources those applications consume data from, he explained.
Many such tools are available in the open source space, but Deb explained that integrating them requires a lot of customisation.
Integrating the data and capturing the changes in data
This is where Datastax claims to be able to help: by abstracting the data from each of these application silos into an application data store. Think of an abstraction layer containing all this data, sitting across all applications.
Deb shared that Datastax’s best practice is to guide organisations to first prioritise their most urgent applications.
“And then we provide the technology and services to move applications and abstract the data they contain to an operational data store, where the data can be consumed by any kind of front-end application that developers write, including modern microservice-based applications.”
“So then that data becomes a corporate asset, with any application being able to access the data and give a holistic view of the business to the organisation itself, and to its customers,” Deb explained.
The next important thing their technology does is capture the changes that happen to the data and communicate those changes to other, adjacent applications.
“Because when we can recognise, register, and communicate the changes, that is what gives real-time insight, right?” he concluded.
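The change-capture pattern described here can be sketched as follows. This is a simplified illustration with hypothetical names, not DataStax's actual product: an operational data store records every change to its data as an event and forwards it to the adjacent applications that registered interest.

```python
# Rough sketch of change data capture (CDC): each write to the
# operational data store that actually changes a value is recorded
# as a change event and forwarded to registered listeners.

class OperationalDataStore:
    def __init__(self):
        self._data = {}
        self._listeners = []   # adjacent applications

    def register(self, listener):
        self._listeners.append(listener)

    def put(self, key, value):
        old = self._data.get(key)
        self._data[key] = value
        if old != value:
            # Communicate the change, not just the new state.
            event = {"key": key, "old": old, "new": value}
            for listener in self._listeners:
                listener(event)


received = []
store = OperationalDataStore()
store.register(received.append)

store.put("customer:42:address", "12 Jalan Ampang")
store.put("customer:42:address", "7 Orchard Road")
store.put("customer:42:address", "7 Orchard Road")  # no change, no event
```

After the three writes, only two change events are emitted: recognising and registering changes, rather than re-reading entire data sets, is what makes the insight real-time.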