SOA, ESB, Data Integration: let's shed some light
SOA, ESB and Data Integration: in this article we take a deep dive into these three models, which are often confused and conflated, to better understand their similarities and differences.

 

First, some definitions: what are SOA and ESB?

Service-Oriented Architecture (SOA) is a methodology for making software components reusable and interoperable via service interfaces that use a common communication language. Each service contains the code and data integrations needed to execute a complete, distinct business function, such as retrieving information or performing an operation. Services are made available to developers, who can find and reuse them to assemble new applications or business processes.

SOA emerged in the late 1990s. Before then, applications were connected through pervasive point-to-point integrations, with a "monolithic" approach that forced developers to recreate these connections from scratch for each new project. With this model, the code for the entire app shipped in a single deployment, so any modification (or malfunction) required taking the entire app temporarily offline until the updated version was deployed.

In this sense, SOA represents an important evolution, as it removes the need to perform integration from scratch: instead, developers can reuse existing functions by leveraging an Enterprise Service Bus (ESB) rather than recreating them.

An ESB is an architectural pattern in which a centralized software component performs integration between applications, handling data model transformations, connectivity, routing, communication protocol conversion, and making the integrations available as service interfaces for reuse by new applications.
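As a rough sketch of the pattern just described, the core ESB responsibilities of routing and data-model transformation can be illustrated in a few lines of Python (all class, route, and field names here are hypothetical, not taken from any real ESB product):

```python
# Minimal sketch of the ESB pattern: a central component that routes
# messages to registered services and applies data-model transformations
# on the way. All names are hypothetical.

class ServiceBus:
    def __init__(self):
        self.services = {}      # route name -> handler function
        self.transformers = {}  # route name -> payload transformer

    def register(self, route, handler, transformer=None):
        """Expose an integration as a reusable service interface."""
        self.services[route] = handler
        if transformer:
            self.transformers[route] = transformer

    def send(self, route, payload):
        """Route a message, converting the data model if needed."""
        transform = self.transformers.get(route, lambda p: p)
        return self.services[route](transform(payload))


# A legacy service expects a different field name ("customer_id"),
# so the bus converts the caller's data model before routing.
bus = ServiceBus()
bus.register(
    "crm.get_customer",
    lambda p: {"id": p["customer_id"], "name": "ACME"},
    transformer=lambda p: {"customer_id": p["id"]},
)

result = bus.send("crm.get_customer", {"id": 42})
```

Any new application can now call `crm.get_customer` through the bus without knowing the legacy service's data model, which is the reuse the ESB pattern aims for.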

 


What is the connection between SOA and ESB?

It is possible to implement an SOA without an ESB, but doing so would require a lot of work and create significant maintenance challenges: reusable interfaces would exist, it is true, but in practice applications would still be connected point-to-point.

The ESB is considered a de facto element of any service-oriented architecture implementation, in which services communicate through a loosely coupled system.

 


And Data Integration?

In our view, the core principle of Data Integration is the complete decoupling between the applications that produce the data and the applications that consume it (we discuss this in more detail in our Guide to Data Integration).


The principle of integrating systems through interfaces, as SOA does, is certainly an important starting point for pursuing the goal of decoupling applications. In reality, though, this rarely happens: most of the time, the services exposed by applications are designed to meet the data-exchange needs of one specific other application. We end up implementing application-to-application integrations that do not really allow interfaces to be reused, and that create strong dependencies between them.

Even using an ESB, a tool that is supposed to help bring order, is no guarantee of reuse. Often, ESBs become a sprawling catalog of services, some very similar to each other and specific to point-to-point integrations between certain applications; or they become mere service exposers, offering no significant additional benefit.

The Data Integration methodology, on the other hand, coupled with a technology that acts as a mediator, makes it possible to move from loosely coupled systems to truly decoupled ones, achieving the goal of reuse. A data-integration-oriented approach to data exchange implies that the application exposing the data does not know who consumes it, achieving complete decoupling between them. Moreover, the data exposer must ask itself which data is of interest to others, and which data provides value when combined with other data.

In turn, the data consumer uses the data service exposed by the mediator, without any direct integration with the data producer. If the producer changes, the mediator provides the data to the consumer exactly as it did with the previous producer: the consumer perceives no change. Conversely, if the consumer changes, the mediator adapts the service to the needs of the new consumer, with no intervention required on the producer.
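The producer-swap scenario above can be sketched in a few lines of Python (a minimal illustration, assuming hypothetical producer functions and a hypothetical order-data contract, not any specific product):

```python
# Sketch of mediator-based decoupling: the consumer only talks to the
# mediator's data service, so replacing the data producer requires no
# change on the consumer side. All names are hypothetical.

class Mediator:
    def __init__(self, producer):
        self._producer = producer

    def set_producer(self, producer):
        # Producers come and go; the exposed service stays the same.
        self._producer = producer

    def get_orders(self):
        # Normalize whatever the current producer returns into the
        # stable contract that consumers rely on.
        return [{"order_id": o["id"], "total": o["amount"]}
                for o in self._producer()]

def legacy_erp():
    return [{"id": 1, "amount": 99.0}]

def new_erp():
    return [{"id": 1, "amount": 99.0}, {"id": 2, "amount": 10.0}]

mediator = Mediator(legacy_erp)
before_swap = mediator.get_orders()   # consumer is unaware of which ERP answers

mediator.set_producer(new_erp)        # producer replaced behind the mediator
after_swap = mediator.get_orders()    # same contract, no consumer-side change
```

The consumer code calling `get_orders` is identical before and after the producer swap, which is precisely the decoupling the mediator provides.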

In conclusion, this approach, together with an appropriate technology, makes it possible to avoid tying applications tightly together (with the high separation costs that entails). It allows a data exchange that is produced by one application today to be produced by another in the future, without rewriting the services specific to the applications involved.

