Big data is BIG, but it must be real-time

Issued by MIP Holdings
Johannesburg, Jul 27, 2020

Big data has been a hot topic for years now, with predictions about its impact ranging from organisations being able to provide services specifically tailored to individuals, to “big brother” governments having an in-depth view into every aspect of a citizen’s life. While many of these predictions seem to have fallen far short of the mark, the technology is finally catching up with the hype, according to Richard Firth, CEO of MIP Holdings.

In fact, he says the conversation has moved from big data as a concept to its real-time use, and to the benefits organisations are gaining from it. “We have moved from simply storing big data to timely data; from the collection of data to real-time feeds; from passive data gathering to active analytics,” he explains.

“We now have enough data to make the use of big data a reality. Part of continuous data collection is creating real-time data feeds, which are processed and delivered to other systems to uncover patterns and insights into how different aspects of the business are functioning, or how customers are interacting.”
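Firth does not detail a specific implementation, but the shift he describes, from batch collection to real-time feeds, can be sketched in a few lines of Python. Everything below is illustrative: the event fields, the window size and the spike rule are assumptions standing in for a real message queue and real business logic.

```python
from collections import deque
from datetime import datetime, timezone
import random
import time

# Hypothetical event source standing in for a real feed (e.g. a message queue).
def event_stream(n=20):
    for _ in range(n):
        yield {
            "ts": datetime.now(timezone.utc),
            "amount": round(random.uniform(10, 500), 2),  # e.g. a transaction value
        }
        time.sleep(0.05)

# A sliding window over the most recent events: each new event updates the
# aggregate immediately, instead of waiting for a nightly batch job.
window = deque(maxlen=10)
for event in event_stream():
    window.append(event["amount"])
    rolling_avg = sum(window) / len(window)
    if event["amount"] > 2 * rolling_avg:  # a simple real-time "insight"
        print(f"{event['ts'].isoformat()} spike: {event['amount']} "
              f"(rolling average {rolling_avg:.2f})")
```

The point of the sketch is the shape of the pipeline: insight is produced while the data is still arriving, not after it has been parked in storage.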

This, however, only applies to companies that have implemented an effective data strategy and programme, Firth adds. According to a recent survey, 85% of organisations aim to be data-driven, but only 37% report success. This is partly because of the sheer volume of data companies are collecting, and partly because the quality of that data isn’t where it needs to be to enable real-time insights.

“Data does not just magically start delivering spectacular results. It takes a lot of work. The data must be collected and stored properly, and then it needs to be analysed. All of that data must be prioritised and optimised to extract the right insights,” Firth says.

“Real-time analytics and insights may be what most companies aspire to, but if they don’t have their back-end systems and integration set up correctly, they will invariably run into problems. We are collecting data from everywhere, and trying to analyse and use all of it is a recipe for disaster. Businesses need to know what kind of data they require, and then implement the right systems and processes to get what they want out of that data.”

Gartner has found that poor data quality costs organisations an average of $15 million a year, and warns that the situation is likely to worsen as data sources become more complex and the volumes being collected continue to grow. “Big data needs to be used as a tool, and smart systems are necessary to ensure that challenges around data volumes and quality don’t hinder outcomes. Information arising from the data must be correctly re-integrated into a company’s existing processes and systems in an intelligent way. This is where we start seeing results,” Firth says.
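One way to read “smart systems” here is as quality gates applied at the point of ingestion, so that bad records never reach the analytics layer. The sketch below is an assumption about what such a gate might look like, not MIP’s approach; the field names and rules are made up for illustration.

```python
from datetime import datetime, timezone

REQUIRED_FIELDS = {"customer_id", "ts", "amount"}  # illustrative schema

def validate(record: dict) -> list[str]:
    """Return a list of quality problems; an empty list means the record is usable."""
    problems = [f"missing field: {f}" for f in REQUIRED_FIELDS - record.keys()]
    if "amount" in record and not isinstance(record["amount"], (int, float)):
        problems.append("amount is not numeric")
    if "ts" in record and record["ts"] > datetime.now(timezone.utc):
        problems.append("timestamp is in the future")
    return problems

# Gate records before they reach the repository, rather than analysing bad
# data and discovering the problem downstream.
record = {"customer_id": "C123", "ts": datetime.now(timezone.utc), "amount": "42"}
issues = validate(record)
if issues:
    print("quarantined:", issues)  # e.g. route to a dead-letter store for review
```

Checks like these are cheap to run per record, which is what makes them viable at streaming volumes.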

“When information derived from the data is integrated back into the business in the right way, real-time feeds can start informing real-time interactions with existing business processes. It is quite a concept: an operational business process feeds data into some form of repository in near real-time, and then gets information back from the repository that impacts the outcome of, or adds value to, the very process that supplied the data.

“This not only allows for greater levels of information, customisation and personalisation in those interactions, but also lays the foundation on which autonomous systems based on ML and AI can be built. With the right data – and the right use of that data in the form of information – we have started building workflows that automate just enough to free up manpower for more important and strategic activities, while offering the flexibility for more personalised human interactions. Big data will mature with real-time data, and real-time data is finally delivering on the promises of big data.”
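The closed loop Firth describes can be made concrete with a toy example. In the sketch below the “repository” is just an in-memory list and the decision rule is invented; the point is the shape of the loop, in which the process that supplies the data also consumes the insight derived from it.

```python
# A toy closed loop: the operational process feeds data into a repository
# and reads derived information back, which changes its own outcome.
repository: list[float] = []  # stands in for a real-time data store

def derive_insight(store: list[float]) -> float:
    """Derived information: here, a running average of the most recent values."""
    recent = store[-5:]
    return sum(recent) / len(recent) if recent else 0.0

def business_process(order_value: float) -> str:
    repository.append(order_value)        # feed the repository in near real-time...
    average = derive_insight(repository)  # ...and get information back from it
    # The insight impacts the outcome of the very process that supplied the data:
    # routine cases are automated, exceptions are surfaced for human attention.
    if order_value > 1.5 * average:
        return "flag for manual review"
    return "auto-approve"

for value in [100.0, 110.0, 95.0, 105.0, 400.0]:
    print(value, "->", business_process(value))
```

Only the outlier is routed to a person, which is the “automate just enough” balance the quote describes.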