
Integration 2.0

Traditional Enterprise Application Integration (EAI) is becoming a commodity. At the same time, numerous sources overload us with all kinds of data, and new business models require everything to go faster and better. This calls for new integration approaches. Are YOU ready for Integration 2.0?


For as long as IT systems have been around, there has been a need to exchange data between them. Over the years, several approaches, tools and solutions have been developed for this purpose. Now, new technologies and business models require us to change the way we do integration.

Let’s go back to the early days. If a system required data from another system, the data was copied manually: a tedious and error-prone job, especially as the amount and complexity of data increased. This resulted in solutions for automated data exchange, based on obtaining data from a specific data source and injecting it into a target system. Some target systems, however, required additional actions when inserting data, such as validations, logging, maintaining record relations or triggering notifications. This led to application interfaces that inject data in a controlled way. The Common Object Request Broker Architecture (CORBA), published by the Object Management Group in 1991, was one of the first widely accepted standards for building such interfaces.

Traditional Integration

For many years data exchange was mainly from one system to another (“point-to-point”). As the need for data exchange increased, these integrations became harder to manage. In the late nineties several standardized integration best practices were developed and incorporated by vendors into “Enterprise Application Integration” (EAI) middleware products. These products were generally organized around an Enterprise Service Bus (ESB). An ESB connects to multiple applications through technology adapters; it receives and transforms data and routes it to a target system, or lets one party publish events that others subscribe to.
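To make the routing idea concrete, here is a minimal sketch of the ESB pattern in Python: adapters publish messages to a bus, the bus applies a transformation and routes the result to every subscriber of that topic. The class and topic names are purely illustrative, not a real product API.

```python
# Illustrative only: a tiny in-memory "bus" with publish-subscribe routing.
from collections import defaultdict
from typing import Callable

class Bus:
    def __init__(self):
        self._subscribers = defaultdict(list)          # topic -> list of handlers

    def subscribe(self, topic: str, handler: Callable[[dict], None]):
        self._subscribers[topic].append(handler)

    def publish(self, topic: str, message: dict):
        transformed = self._transform(message)         # e.g. map to a canonical format
        for handler in self._subscribers[topic]:
            handler(transformed)                       # route to every interested system

    @staticmethod
    def _transform(message: dict) -> dict:
        # Trivial stand-in for the transformation step an ESB would perform.
        return {key.lower(): value for key, value in message.items()}

bus = Bus()
bus.subscribe("order.created", lambda m: print("billing received", m))
bus.subscribe("order.created", lambda m: print("warehouse received", m))
bus.publish("order.created", {"OrderId": 42, "Amount": 99.95})
```

The point of the bus is decoupling: the publishing system does not know, and does not need to know, which systems consume the event.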

As multiple systems might need to exchange similar data, the Common Data Model (CDM) concept was introduced: generic data models that allow data to be transferred in a clear, agreed and uniform format within the enterprise. Electronic Data Interchange (EDI) standards defined messages and protocols to automatically exchange data between different organizations.
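As an illustration, the sketch below maps records from two hypothetical systems (a CRM and an ERP, with made-up field names) onto one canonical customer format; that agreed format is the CDM.

```python
# Illustrative only: two systems use their own field names; the CDM defines
# one agreed "customer" format for exchange inside the enterprise.
CDM_CUSTOMER = ("customer_id", "name", "country")     # the agreed, uniform format

def crm_to_cdm(record: dict) -> dict:
    # The CRM system calls the customer number "AccountNo"
    return {"customer_id": record["AccountNo"],
            "name": record["DisplayName"],
            "country": record["CountryCode"]}

def erp_to_cdm(record: dict) -> dict:
    # The ERP system uses "debtor_number" for the same thing
    return {"customer_id": record["debtor_number"],
            "name": record["company_name"],
            "country": record["country"]}

for cdm_record in (crm_to_cdm({"AccountNo": "C-001", "DisplayName": "Acme", "CountryCode": "NL"}),
                   erp_to_cdm({"debtor_number": "C-001", "company_name": "Acme", "country": "NL"})):
    assert tuple(cdm_record) == CDM_CUSTOMER           # both now share one format
    print(cdm_record)
```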

Figure 1: Traditional Integration

The next step was Service-Oriented Architecture (SOA), which organizes and combines data-integration functions into a hierarchy of components (services) with formal interfaces. Technical standards for these services were developed to let systems and companies exchange data even more easily over intranets and the internet. Vendors adopted these standards to build systems that could exploit these new self-descriptive interfaces, such as Business Process Management (BPM) platforms.
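The essence of a service is the formal contract: consumers program against the interface, not against the system behind it. A minimal sketch, using a hypothetical CustomerService as the contract:

```python
# Illustrative only: a formal service contract and one possible implementation.
from abc import ABC, abstractmethod

class CustomerService(ABC):
    """Formal contract: operations and data types are agreed up front."""

    @abstractmethod
    def get_customer(self, customer_id: str) -> dict: ...

    @abstractmethod
    def update_address(self, customer_id: str, address: dict) -> None: ...

class CrmBackedCustomerService(CustomerService):
    """One possible implementation, wrapping a CRM system (stubbed here)."""

    def get_customer(self, customer_id: str) -> dict:
        return {"customer_id": customer_id, "name": "Acme"}       # stubbed lookup

    def update_address(self, customer_id: str, address: dict) -> None:
        print(f"update {customer_id} -> {address}")               # stubbed call

svc: CustomerService = CrmBackedCustomerService()                 # consumers see only the contract
print(svc.get_customer("C-001"))
```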

IBM, TIBCO and webMethods were among the first players in this market. Other vendors such as SAP (XI, PI), Oracle (Fusion) and Microsoft (BizTalk) stepped in as the market grew.

Parallel to EAI, a second integration need drove a completely different market: data integration for Business Intelligence (BI). BI is usually based on aggregated historical data, optimized for reporting and online analysis. This data is commonly extracted from transaction systems, combined and enriched with other data, and finally fed into a Data Warehouse (DWH) and subsequently into dedicated Data Marts. This process is called Extract-Transform-Load (ETL). ETL is generally batch-driven, in contrast to most application-integration solutions, which are generally transaction-based: either synchronous (request-reply) or asynchronous (publish-subscribe).
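To show the contrast with the event-driven bus above, here is a minimal batch-ETL sketch: extract everything since the last run, aggregate it, and load the result into the warehouse in one go. The tables and field names are invented for the example; in-memory SQLite stands in for both the transaction system and the warehouse.

```python
import sqlite3

# Source (transaction system) and target (data warehouse), both in-memory for the example.
src = sqlite3.connect(":memory:")
src.execute("CREATE TABLE orders (id INTEGER, amount REAL, order_date TEXT)")
src.executemany("INSERT INTO orders VALUES (?, ?, ?)",
                [(1, 10.0, "2024-01-01"), (2, 5.0, "2024-01-01"), (3, 7.5, "2024-01-02")])

dwh = sqlite3.connect(":memory:")
dwh.execute("CREATE TABLE daily_sales (order_date TEXT, total REAL)")

# Extract: pull all rows added since the last batch run.
rows = src.execute("SELECT amount, order_date FROM orders WHERE order_date >= ?",
                   ("2024-01-01",)).fetchall()

# Transform: aggregate to the grain the warehouse reports on (per day).
totals = {}
for amount, order_date in rows:
    totals[order_date] = totals.get(order_date, 0.0) + amount

# Load: write the aggregates into the warehouse fact table in one batch.
dwh.executemany("INSERT INTO daily_sales VALUES (?, ?)", sorted(totals.items()))
dwh.commit()
print(dwh.execute("SELECT * FROM daily_sales").fetchall())
```

A real ETL job would track the high-water mark of its last run and execute on a schedule; the point here is that the whole load happens as one periodic batch rather than per transaction.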

New developments

The importance of data sharing is ever increasing. The shift of focus from systems to data (“datafication”) is the new mindset: data is now considered an asset, where before it was merely a by-product of business processes. This requires valuing data and managing its quality, consistency, security, lineage, traceability and timeliness. It calls for specific functions such as Master and Reference Data Management (MDM/RDM), data protection, metadata management and Data Governance.

Figure 2: Datafication: The data value chain

At the same time, data production is exploding. Increasing numbers of transactions over the internet, the Internet of Things (IoT), social media, e-mail and website click behavior are all sources that provide data in ever-growing volume, velocity and variety, popularly called “big data”. These developments drive business change. There is an increasing demand to base advanced decisions on real-time data, and IT is required to adapt to new developments almost instantly. This requires agility in operational IT, architecture and processes; many organizations have adopted Scrum or DevOps. This development conflicts with most traditional application landscapes: batch-driven ETL no longer satisfies the instant-data requirements of BI and streaming analytics, and traditional EAI cannot cope with the increasing speed and variety of new data.

This is the first blog in a series of five about Integration 2.0. Read the second blog about Integration 2.0!

Harald van der Weel
