
Elevating customer experience through efficient data management: Why milliseconds matter


Digital transformation (DX) starts and ends at the point of contact with the customer. But the entire DX journey of the business is powered by data.

In a recent ‘fireside chat’, GigaSpaces’ Sr. Director of Solutions Engineering Andrew Showstead and Intellyx president and analyst Jason Bloomberg shared best practices on how digital transformation is pushing organizations to find new, more efficient ways to manage their data.

It’s about business requirements, not technology requirements

Typically, changes in customer requirements drive changes in data infrastructure, not the other way around. After all, if there were no high-value business reasons for changing the way you handle data, we’d never move any of it off the old mainframe!

IT-centric service level concerns such as availability and security are par for the course in any data management scenario, and while all organizations strive to continually improve them, they seldom motivate transformative efforts (unless there is a major failure or breach event, of course).

The session also addressed compliance and data sovereignty concerns, and the long-term costs of maintaining and building upon systems of record, whether mainframes, external data services, or cloud data warehouses. Now we’re getting closer to the heart of DX, though it is still inward-focused.

By far, the biggest business data management efficiency gains are obtained in projects that positively impact customer and employee relationships.

“People don’t want to have to go into the bank or the insurance agent’s office, and that’s a key part of the story – especially now that we’re entering a post-COVID era, where work from home is shaping up to be an established pattern much more so than it was before the pandemic,” said Bloomberg.

Customer experience (CX) expectations have never been higher. In a world where we depend on data to fuel every experience, we want to make the DX transition happen without breaking the flow.

Enterprises want to gain “… the ability to say what’s happening in real-time, or provide near sub-second information at the application developer’s fingertips,” said Showstead. “There are periods of time where back-end systems of record are not available and folks are saying, well, we can’t just stop delivering services to our customers at that time.”

Milliseconds matter, and so do development days

Customer experience is about delivering lower latency, faster application response times, and faster access to near-real-time data. As long as fundamental security and accuracy concerns are met, the reputation of your firm will depend largely on the responsiveness of your data.

How does this need for data responsiveness relate to a real-world example? Think of an airline that continuously finds itself canceling and rebooking flights for many of its customers. When cancellations asynchronously cascade through the airline’s various modules – scheduling, flight planning, reservations, maintenance, billing, and ticketing applications – they wreak havoc upon all of the disparate data sources that each function depends on.

Showstead highlighted the need for always-on access to data for real-time responsiveness. “Experience for the application users is only achieved by having 24×7 access to the platform, which means having 24×7 access to the data. Ultimately, the systems of record may not be available 24×7, yet you still must have always-on access.”

Talking further about ‘real-time’ data, Bloomberg mentioned that while we will never completely overcome the speed-of-light limitations of moving data around, architects can always achieve lower latency by bringing data and workloads closer together.

Just as important as the speed of the data itself is the speed of development needed to put that data to use. Accelerating time-to-market for new functionality makes all the difference in competitive markets.

“You can bring all these different data technologies into a common view and accelerate access to that data with lower latency so that you can offer better services to the consumers of your data,” replied Showstead. “But you can also offer the application builders a more streamlined process of accessing that data to develop services, to be more responsive and agile to the changing conditions that we see right now.”

Transformational thinking about hybrid integration

Another embodiment of meeting customer expectations through modernization is the ability to deliver an omnichannel shopping experience. Merging the in-store experience with the online presence, mobile apps, and the complex internal data workings of the inventory, transactional, and customer service apps that pull it all together highlights the critical need for a better integration model, one that can handle streaming data as well as static data.

“To provide that aggregated 360-degree view of your customer experience, you need access to a set of applications and systems of record that aren’t on a common platform — merging different data models, different data technologies, and different volumes of data,” said Showstead.

Traditional IT architectures rely on middleware or API interfaces and gateways to request data from back-end systems, then use another set of applications or workloads to process the responses. While we’ve come a long way since tightly coupled data pipelines and ESBs, these request/response modes of integration have simply been layered on top of one another over time, with each newer service and integration platform added on to the existing ones.
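To make that pattern concrete, here is a minimal Python sketch of the request/response style; the gateway URL and endpoint are hypothetical, and real middleware would add authentication, retries, and error handling:

```python
import json
import urllib.request


def get_customer_profile(customer_id: str) -> dict:
    # Traditional integration: every read is a synchronous round trip
    # through a gateway to the back-end system of record, so latency and
    # availability are bounded by that back end.
    url = f"https://gateway.example.com/api/customers/{customer_id}"  # hypothetical endpoint
    with urllib.request.urlopen(url, timeout=5) as resp:
        return json.load(resp)
```

If the system of record is slow or offline, every caller feels it.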

Bloomberg and Showstead discussed the differences between the traditional model that a big analyst firm calls Hybrid Integration Platform (or HIP) and a newer approach known as Digital Integration Hub (or DIH). A DIH would still allow request/response pairs to flow through APIs, but it uniquely also supports event-based communications with a responsive, high-performance data store that can serve up the correct data almost predictively on demand.
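A rough sketch of the DIH idea, with invented class and field names (this is not a specific GigaSpaces API): reads are answered from an in-memory store, while change events from the systems of record keep that store current.

```python
from dataclasses import dataclass


@dataclass
class Customer:
    customer_id: str
    name: str
    balance: float


class DigitalIntegrationHub:
    """Toy DIH: request/response reads plus event-based updates."""

    def __init__(self) -> None:
        self._store: dict[str, Customer] = {}  # high-performance in-memory store

    def on_change_event(self, event: dict) -> None:
        # Event-based path: a change captured in a back-end system
        # refreshes the in-memory copy ahead of any request for it.
        self._store[event["customer_id"]] = Customer(
            event["customer_id"], event["name"], event["balance"]
        )

    def get_customer(self, customer_id: str) -> Customer | None:
        # Request/response path: served from memory, without a round
        # trip to the system of record, even if that system is offline.
        return self._store.get(customer_id)
```

The key difference from the pass-through gateway above is that the data is already there when the request arrives.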

[Image: Avanza case study]

The two gave an example of a financial services firm that implemented a DIH, routing data from a variety of databases and systems of record (DB2, VSAM, and an old ADABAS) through APIs that could respond to requests and events from an in-memory data store. Once the DIH was in place, any data change captured within these core systems could become an event that kicked off a process in their applications.
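A hedged sketch of what that change-data-capture flow might look like; the per-source field names below are invented for illustration, not the firm’s actual schemas:

```python
IN_MEMORY_STORE: dict[str, dict] = {}


def normalize(source: str, raw: dict) -> dict:
    # Map each system of record's shape onto one uniform event model.
    if source == "db2":
        return {"customer_id": raw["CUST_ID"], "balance": raw["BAL"]}
    if source == "vsam":
        return {"customer_id": raw["REC_KEY"], "balance": raw["AMOUNT"]}
    if source == "adabas":
        return {"customer_id": raw["ISN"], "balance": raw["AC_BALANCE"]}
    raise ValueError(f"unknown source: {source}")


def notify_applications(event: dict) -> None:
    # Placeholder for publishing the event to downstream applications.
    print("event published:", event)


def on_cdc_record(source: str, raw: dict) -> None:
    # A captured data change refreshes the in-memory copy and becomes
    # an event that kicks off a process in the applications.
    event = normalize(source, raw)
    IN_MEMORY_STORE[event["customer_id"]] = event
    notify_applications(event)
```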

Bloomberg asked how the DIH approach to in-memory data differs from other kinds of caching or CDN methods.

“Caching data improves performance in some areas, but the holistic approach of an in-memory data grid with the high-performance data store gives you co-location of these features – meaning, you can have the data that needs to be highly responsive running in memory,” Showstead said. “The DIH is pulling event streaming and all those disparate data sources into a uniform data model.”
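One way to see the distinction Showstead is drawing, sketched with invented data: a cache-aside lookup still pays the origin’s latency on every miss, while a grid-style store holds the working data set in memory, so even queries (not just key lookups) run co-located with the data.

```python
import time

ORIGIN_LATENCY_S = 0.2  # assume a 200 ms round trip to the system of record
cache: dict[str, dict] = {}  # starts empty; filled lazily on misses
grid: dict[str, dict] = {"42": {"id": "42", "tier": "gold"}}  # preloaded by events


def cache_aside_get(key: str) -> dict:
    # Classic cache: fast on a hit, but a miss falls back to the origin.
    if key not in cache:
        time.sleep(ORIGIN_LATENCY_S)  # simulate fetching from the origin
        cache[key] = {"id": key, "tier": "gold"}
    return cache[key]


def grid_query(tier: str) -> list[dict]:
    # In-memory grid: the data set lives in memory, so a query over it
    # never needs an origin hop at request time.
    return [rec for rec in grid.values() if rec["tier"] == tier]
```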

Conclusion

In a distributed application world where data is more precious than oil or gold, going beyond the efficiency edge of previous data management methods is all about abstraction.

Looking at the architecture diagrams, a DIH might resemble previous hybrid integration approaches in some ways, but its unique data management approach can provide faster response times for end customers, while letting developers abstract away the complexity of building new customer-facing functionality that uses real-time data.

A DIH reduces data dependencies by replicating data in memory and serving it up in a normalized form, abstracted from its many sources, exactly where and when it is needed.

“That flexibility in terms of the data model is something that we just did not see 15 years ago in the whole service-oriented architecture wave,” said Bloomberg. “We were stuck with whatever data models the underlying systems had – and we just had to deal with that.”

It’s still early days for the DIH technology paradigm, but the ability to abstract data to decouple it from application logic in production, as well as from the application development and integration process itself, can provide a new efficiency lever for digital transformation.

©2022 Intellyx LLC. Intellyx retains editorial control over the content of this document. At the time of writing, GigaSpaces is an Intellyx customer.

Jason “JE” English is Principal Analyst and CMO at Intellyx. JE has broad experience advising and working for companies offering cloud computing platforms, blockchain networks, SaaS-based solutions, industry-specific marketing tools, supply chain management and gaming. A writer, documentarian, and community builder, JE has written, hosted and edited hundreds of technical news, education and thought leadership blogs, event sessions, podcasts and videos.

