

Hazelcast CEO: Automation Rules The Real-Time Economy

Business accelerates. Through each of the first, second and third industrial revolutions (and especially within the AI-driven fourth), business has been accelerating the pace at which it manufactures, creates, trades, operates and (in some cases, where innovations don’t work) fails.

This perennial reality is perhaps one of the reasons why the technology industry has always been obsessed with the move from static data to real-time data.

The death of batch

Where data sits in applications, databases, web services and other systems that exchange it back and forth, its non-real-time status is typically denoted by the fact that it must be compiled, parsed, managed, saved and so processed in batches at the end of the day, or at some other defined interval. This is why real-time advocates are fond of talking about the so-called ‘end, or death, of batch’ era today.

When and where data moves in real-time streams, there is enough computing capacity and, crucially, enough enabling real-time software engine intelligence to make a human user perceive that a process has happened instantaneously. In reality, even real-time processing takes a number of milliseconds, so it’s not quite ‘real’ in the sense of our existence on the planet, but that’s a philosophical argument for another day.

A keen protagonist of real-time technologies is Hazelcast. The company is known for its open source in-memory data grid technology, which shares information out across compute clusters to enable rapidly scaled data processing. But the company is realistic: it says that true real-time (being able to act on information in the moment) is hard to achieve, especially if we are talking about acting while, when, during and before an event, not after it.

“Many providers still claim to deliver real-time, while waiting on databases and/or other services, causing lag and processing data after the event and missing out on real-time benefits,” explained Johnson Noel, senior solutions architect at Hazelcast, in a company technical blog.

“True real-time involves continuously collecting, analyzing, canonicalizing and performing functions (like processing against rules or machine learning models) on the data ‘in-flow’ outside of a database or any other service – in other words, the entire process should be run on very low latency storage and delivered to users or downstream processes long before it touches a database,” clarified Noel.
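Noel’s description of working ‘in-flow’ can be sketched as a pipeline that collects, canonicalizes and applies rules to each event before anything is written to a database. This is a generic illustration in Python, not Hazelcast’s actual API; the event fields, threshold and function names are all invented for the example.

```python
from dataclasses import dataclass

@dataclass
class Event:
    account_id: str
    amount_cents: int
    currency: str

def canonicalize(raw: dict) -> Event:
    """Normalize an incoming raw record into a canonical event."""
    return Event(
        account_id=str(raw["account"]).strip(),
        amount_cents=int(round(float(raw["amount"]) * 100)),
        currency=raw.get("currency", "USD").upper(),
    )

def apply_rules(event: Event) -> str:
    """A stand-in for the rules/ML step: flag large withdrawals."""
    return "review" if event.amount_cents > 100_000 else "approve"

def process_in_flow(stream):
    """Collect -> canonicalize -> evaluate, all before any database write."""
    for raw in stream:
        event = canonicalize(raw)
        yield event.account_id, apply_rules(event)

decisions = list(process_in_flow([
    {"account": "A1", "amount": "250.00"},
    {"account": "A2", "amount": "1500.00"},
]))
```

The point of the shape, as Noel describes it, is that the decision is emitted downstream immediately; persisting the event is a later, separate concern.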

Automation rules real-time economics

Now looking to explain why automation rules the real-time economy, Hazelcast draws from one of its own implementation stories to explain how quickly the business world is moving and, there’s that word again, accelerating.

Imagine a person being able to get a loan while standing at the ATM. The customer didn’t know they needed one before they got there, but they quickly realized that they wanted more money than they had available. The bank, equipped with data on the customer’s credit risk, profile, investments and financial flows, immediately runs the risk calculation and decides they are a good risk. Customer A (let’s call her Michele) walks off with the extra money she needs.

Not one human was involved in this interaction - well, nobody apart from Michele.

Not one slip of paper was reviewed by human eyes. Instead, technology analyzed the new data - Michele wants $500 more than she has - and the old historical data - Michele makes good steady money and is expected to deposit her paycheck in two days as she has for 10 years, plus she also has money in a mutual fund account. The system then calculates in real-time that it’s a good business decision to extend a loan offer.
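The decision step in Michele’s story can be sketched as a function that blends the fresh signal (the requested amount) with historical context (steady income, savings). This is a toy illustration, assuming invented inputs and thresholds, not any bank’s or Hazelcast’s actual scoring model.

```python
def loan_decision(requested_overdraft: float,
                  avg_monthly_deposit: float,
                  years_of_steady_income: int,
                  liquid_savings: float) -> bool:
    """Blend the fresh request with historical context to score risk."""
    # Fresh signal: how large is the shortfall relative to regular income?
    income_ratio = requested_overdraft / max(avg_monthly_deposit, 1.0)
    # Historical signals: tenure and a savings cushion both reduce risk.
    has_track_record = years_of_steady_income >= 2
    has_cushion = liquid_savings >= requested_overdraft
    return income_ratio < 0.25 and has_track_record and has_cushion

# Michele asks for $500; her 10-year history and mutual fund balance
# (figures invented for the example) make her a good risk.
offer = loan_decision(500.0, 4000.0, 10, 2500.0)
```

The business value, as Herrell argues below, comes from running exactly this kind of blend in milliseconds, while the customer is still at the machine.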

This is the real-time economy in action - and it is (arguably) driving the real-time economics that is shaping the way we all do business in the modern era of commerce.

“For automated real-time economy actions to occur at the right time for the right reason, companies must merge the fresh insights from new data streaming in and augment it with the historical context of data/insights from the past. The longer or more general the reaction to the insight based on fresh + historical data, the lower the business value of a response. This is where automation will decide winners and losers in the real-time economy,” stated Kelly Herrell, CEO of Hazelcast.

This whole process is something Hazelcast is fond of calling the ‘end of the wait and see era’ today. As you can tell, real-time computing evangelists have a somewhat cheerless, almost morose obsession with sounding the death knell for the technology platforms and tools that preceded them; it’s just one of those side-quirks of the technology industry.

As CEO Herrell has openly stated on Techzine.EU, “Thriving in the real-time economy requires instantaneous computation on both new and historical data, something traditional databases cannot do. After years of building our reliable, low-latency data store, we’re focusing on the convergence with real-time data to give enterprises a new approach to improving customer satisfaction, generating new revenue and mitigating risk.”

Just how fast is real-time?

Going back to our ATM example one last time, how fast is real-time, really? If the marketing team is behind the wider promotions that Michele might be offered based upon business logic, can we get an appreciation for just how fast things are moving inside the internal mechanics of the software system serving the ATM’s screen?

As noted in an account of a real world implementation here, “Processing completes in less than 120 milliseconds, which gives the marketing team’s [business logic programming inside the software system] more than enough responsiveness to react quickly to customer needs. In most cases, promotions might have business logic to intentionally delay the customer-facing communications by a few seconds to ensure delivery at the ‘right’ time.”

Hazelcast is making this technology possible with its existing IT stack and the beta launch of its ‘zero-code connectors’ technology. These are software components designed to execute data stream joins in real-time computing environments. They work declaratively - stating the intended outcome of the code’s operation rather than the steps to get there - and currently support AWS Relational Database Service (RDS) for MySQL and PostgreSQL.
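A data stream join of the kind these connectors automate can be illustrated as enriching each incoming event with a matching row from a reference table (here a plain dictionary standing in for rows mirrored from an RDS MySQL or PostgreSQL table). This Python sketch is a conceptual illustration only, not the connectors’ implementation; all names and fields are invented.

```python
# Reference table: a stand-in for rows mirrored from a relational database.
customers = {
    "c1": {"name": "Michele", "segment": "premium"},
    "c2": {"name": "Arjun", "segment": "standard"},
}

def join_stream(events, table):
    """Enrich each streaming event with its matching reference row."""
    for event in events:
        row = table.get(event["customer_id"])
        if row is not None:  # inner-join semantics: drop unmatched events
            yield {**event, **row}

enriched = list(join_stream(
    [{"customer_id": "c1", "amount": 500}],
    customers,
))
```

The declarative pitch is that users specify which stream joins which table on which key, and the connector handles the change capture and low-latency lookup underneath.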

We’re yet to see breakfast business television presenters like the BBC’s (always deeply probing) Sally Bundock start to talk about the real-time economy to any extensive degree, or indeed discuss real-time economics as some new paradigm or principle that the technology sector is now enabling, but it can only be a short time before that happens.

In the meantime, let’s make some tea and head for the ATM.
