Integration Architecture: Evolution from ESBs, Data Virtualization and API Management to an Operational Data Hub


To develop high-quality apps and services quickly, organizations need efficient processes and systems. Digital integration is a major component of these processes, and one that presents real challenges: connecting disparate data sources, cloud services and on-premises APIs in real time. Once achieved, enterprise-wide access to APIs, microservices and event-based data lets developers deliver new digital products with much less effort, enabling organizations to move faster while reducing risk. This allows organizations to focus on what matters most: offering their customers an optimal digital experience.

An integration platform is a powerful tool that transforms your IT landscape, allowing you to create an infrastructure that connects people, processes and technology, and delivers real-time business insights by integrating all data sources into a single view. The platform also promotes agile development, so teams can collaborate more efficiently. Because all stakeholders with the required authorizations can access the same information in real time, from any time zone or location, across both on-premises and cloud-based environments, knowledge sharing occurs naturally.

How to Implement Your Integration Architecture

Integrating an organization’s data architecture can be accomplished using one or more of these four options:

  • Service Bus
  • Data Virtualization
  • API Management
  • Operational Data Hub 

Each option is valid depending on the desired goal, and the methods are not mutually exclusive.

Service Bus

An Enterprise Service Bus (ESB) is a flexible solution that can be used to build a variety of distributed applications. It provides reliable messaging and communications across platforms, services and devices. It enables you to connect systems using message-based integration patterns over the Web using industry-standard protocols like HTTP/HTTPS, AMQP (Advanced Message Queuing Protocol), MQTT (Message Queuing Telemetry Transport) and others.

A Service Bus includes these components:

  • Relay Router – Allows developers to route traffic between various endpoints based on incoming requests
  • Service Bus Queues – Used for asynchronous messaging between applications (see the sketch after this list)
  • Service Bus Topics – Used for broadcasting messages to multiple subscribers
  • Azure Relay (previously named Service Bus Relay) – An Azure service that acts as a proxy between your client app and the server-side code call
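
To make the queue component concrete, here is a minimal sketch of asynchronous messaging over AMQP, one of the protocols listed above, using the open-source pika client against a local broker. The broker address, queue name and payload are assumptions for illustration, not taken from this article.

import pika

# connect to a local AMQP broker (assumption: one is running on localhost)
conn = pika.BlockingConnection(pika.ConnectionParameters("localhost"))
channel = conn.channel()
channel.queue_declare(queue="orders")

# producer side: enqueue a message and return immediately
channel.basic_publish(exchange="", routing_key="orders",
                      body=b'{"order_id": 42}')

# consumer side: pick the message up whenever the subscriber is ready
method, header, body = channel.basic_get(queue="orders", auto_ack=True)
print(body)  # b'{"order_id": 42}'
conn.close()

The producer and consumer never call each other directly; the queue decouples them, which is exactly the integration pattern a Service Bus provides across platforms, services and devices.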

Data Virtualization

Data virtualization is the process of accessing and combining data from different sources to create new business value. Data virtualization software allows you to integrate disparate data through a single representation, enabling you to access and query a unified source that contains all your data. This approach offers a complete view of your business information, allowing you to combine structured and unstructured data, analyze it as if it were a single database, and use it for decision support and reporting purposes.


Data virtualization is often used in conjunction with data integration software. It’s similar to data federation, but it goes beyond just aggregating the data, offering a single view of all your business information. The advantages of this approach are that the data virtualization solution acts as middleware, enabling existing infrastructure to be easily integrated with new applications, and eliminating data silos with one data access point. KPIs and rules can be centrally defined, and a change in data sources or front-end solutions does not require a complex, expensive restructuring effort.
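
As a minimal illustration of the single-access-point idea, the sketch below joins a relational table and a flat-file export behind one query surface, so the consumer never needs to know where each side physically lives. The file names and columns are hypothetical, and pandas stands in here for a real data virtualization layer.

import sqlite3
import pandas as pd

# two disparate physical sources (hypothetical names)
orders = pd.read_sql_query(
    "SELECT order_id, customer_id, total FROM orders",
    sqlite3.connect("orders.db"))
customers = pd.read_csv("customers.csv")  # columns: customer_id, name, region

# the unified view: query it as if it were one database
unified = orders.merge(customers, on="customer_id")
print(unified.groupby("region")["total"].sum())

A real virtualization platform would push such queries down to the sources instead of copying the data, but the consumer-facing contract, one logical view over many stores, is the same.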

On the flip side, any data virtualization system performs only as well as the system of record that it supports. Hence, as volumes of data grow, scalability and latency become major issues.

Challenges with Service Buses and Data Virtualization 

When we use a Service Bus or a Data Virtualization Platform, we face three main challenges:

  • Accessing many different stores and services is always a challenge, even when using an orchestration tool
  • The weakest link in the chain extends the overall response latency
  • Maintaining such workflows and data-access paths is error-prone: if someone misses or misunderstands part of the flow during a future update, data integrity issues or inconsistencies will eventually follow

With these issues in mind, let’s look at another option, the API-first approach and API management. 

API Management

API Management is a core component of a modern API ecosystem, and it’s one of the most critical tools in your organization’s toolbox. It allows you to manage all aspects of your APIs, including identity, authentication, usage analytics, throttling and access control. With API Management you can make sure that third parties are following best practices, set up rules around what they can do with your data and keep track of all their requests. Your organization will also be able to monetize its APIs through usage-based pricing models or charge developers based on the number of calls they make. 

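As a rough sketch of these concerns, the snippet below enforces access control, counts calls per key as the basis for usage-based billing, and throttles a key that exceeds its quota. It uses Flask for brevity; the key, limit and endpoint are hypothetical, and a production API gateway would enforce this at the edge rather than inside the application.

from collections import defaultdict
from flask import Flask, abort, jsonify, request

app = Flask(__name__)
API_KEYS = {"partner-123"}   # keys issued to third parties (hypothetical)
RATE_LIMIT = 100             # calls allowed per key per billing window
usage = defaultdict(int)     # per-key call counts, the basis for monetization

@app.before_request
def enforce_policy():
    key = request.headers.get("X-Api-Key")
    if key not in API_KEYS:
        abort(401)           # authentication: unknown caller
    usage[key] += 1          # analytics: track every request
    if usage[key] > RATE_LIMIT:
        abort(429)           # throttling: quota exceeded

@app.route("/v1/accounts")
def accounts():
    return jsonify([{"id": 1, "status": "active"}])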

This methodology introduces the API-first approach. To be implemented successfully, it requires the buy-in of the entire organization:

  • The API-first approach is a way of building your APIs to take advantage of new capabilities and technologies, such as microservices and serverless computing. It starts with the API itself and then uses that to drive the software development process.
  • APIs are often used to create a “front end” through which one system calls into an application in another system. This means you can build an application on top of an existing one without duplicating its code or changing it in any way.
  • By adopting this approach, you can take advantage of cloud-native patterns to design your applications so they’re scalable, reliable and secure out of the box — and ready for modern deployment scenarios like serverless computing or containers.
  • API-first development is a best practice that can help you develop faster, more easily and with greater flexibility. When done right, it can improve employee productivity by reducing the time to deliver new features and functionality. It also helps your organization communicate better with its customers by providing open access to their data and applications.
  • APIs are also the foundation of microservices architecture, which is a way to break up an application into smaller pieces that can be developed and deployed independently from each other. Each microservice has its own API and can be built by different teams for different purposes.

The main benefit of API-first development is that it makes it easier for you to create and update APIs, as well as maintain them. If you are a developer or IT manager, it’s important to understand how an API can drive better business outcomes for your organization by improving internal processes.

The drawbacks of using only API Management include a steep learning curve for architecting high-availability applications at scale. Also, since API Management does not include an internal high-speed data store, performance degradation can become a real concern.

Operational Data Hub

An operational data hub such as GigaSpaces Smart DIH decouples enterprise systems of record (SoRs) from their digital applications, offering extreme performance via an operational real-time data fabric. It uses embedded change data capture (CDC) to replicate SoR data in real time and offers built-in multi-cluster replication capabilities that accelerate the journey to the cloud.

This hub utilizes event-driven architecture, providing a unified data layer that delivers ultra-low latency and near-linear scalability. Smart DIH improves business agility and maintains an up-to-date picture of fast-changing data. It is used mainly with operational data but also handles analytics-based services, notification services and data integration. By default, the hub serves data internally, via an API-serving methodology, to business applications that in turn often serve data externally to customers: in essence, a B2B2C model. When architected efficiently, integration with iPaaS or API-management tools provides the ultimate decoupling without compromising performance.
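
The following vendor-neutral sketch shows the core mechanic described above: CDC-style change events replicated from a system of record keep an in-memory operational view current, and reads are answered from that view rather than from the SoR. Event shapes and names are hypothetical; this is not the Smart DIH API.

store = {}  # the hub's low-latency operational view, keyed by record id

def apply_change(event: dict) -> None:
    # apply one CDC event replicated from the system of record
    op = event["op"]
    if op in ("insert", "update"):
        store[event["id"]] = event["row"]
    elif op == "delete":
        store.pop(event["id"], None)

def serve_read(record_id: str):
    # API-facing read, answered by the hub instead of the SoR
    return store.get(record_id)

# replay a short change stream, then read the current state
for ev in ({"op": "insert", "id": "a1", "row": {"balance": 100}},
           {"op": "update", "id": "a1", "row": {"balance": 80}}):
    apply_change(ev)
print(serve_read("a1"))  # {'balance': 80}

Because reads never touch the system of record, the SoR stops being the weakest link in the chain, which is what enables the ultra-low latency and near-linear scalability described above.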

For example, an API can be used to create a single point of access to all the data within the operational data hub. This means that an application can send or receive data from multiple applications, services or databases without having to build custom interfaces for each one. The result is improved productivity, superb performance and reduced costs associated with maintaining multiple interfaces.

The Recommended Way to Create an Efficient Integration Architecture

For most organizations, creating a successful integration architecture is a challenging proposition that can be partially solved by Service Bus, Data Virtualization or API Management solutions. An operational data hub is a more effective, comprehensive solution. GigaSpaces Smart DIH connects all data sources, on-premises APIs and cloud services into a centralized platform that allows you to quickly integrate applications and devices, while achieving resiliency and availability through virtualized integration outside your core systems. By replacing the patchwork of software solutions from different vendors that would otherwise be needed to build this integration workflow, an operational data hub ultimately translates to lower license costs, less integration work and less maintenance.
