The Future of IT Is Automation

Adaptive integration as just described will result in substantial reductions in errors and in implementation and maintenance costs. It will also supercharge an organization’s reaction time to change. Reduced errors and near-instantaneous time-to-change are critical for DI platforms underpinning real-time enterprise data hubs or zero-latency enterprise (ZLE) environments, and they will become even more critical as these constructs grow.

A slowdown or momentary halt here — where thousands of transactions can flow through each second — can ripple across multiple business processes and put hub operations hopelessly behind in no time.

Pure-play business intelligence environments benefit from adaptive integration as well, particularly during this era of compliance and governance. Rapidly changing end-user information requirements, the increasing real-time nature of business intelligence, and the growing complexity of information source and target environments (fueled partly by the advent of Web services) all argue for an adaptive DI platform supporting the business intelligence front end. And the ability to be adaptive also needs to extend to that business intelligence front end.

Flexibility

Adaptive business intelligence is primarily about flexibility. It means delivering business intelligence to any information device or appliance specified by the end user. It means being able to readily insert business intelligence capabilities into portals and applications, often through Web services. And it means being able to run a business intelligence solution on top of any application server, right out of the box, and being able to switch application servers at will, without having to make costly and time-consuming changes to the business intelligence software.

Interestingly, business intelligence has an enabling role to play inside the workings of an adaptive infrastructure. It all has to do with visibility.

Setting The Stage

If change is truly to revolve around an adaptive infrastructure while the infrastructure stays “glued-in,” then that infrastructure had better be super-glued — and that demands visibility. Adapting quickly to change is only possible when you thoroughly understand the environment you operate in.

Hence, you need deep and ongoing visibility into the total environment so you can assess the impact of any adjustments before allowing the infrastructure to adapt. You need to be able to clearly visualize the complex dependencies between data, processes, and applications before proper changes can be made. And you need to know at all times which data movements and integrations have to run in real time and which belong in batch or changed-data-capture mode.

Ongoing visibility can be achieved by turning business intelligence tools inward to illuminate the complex inner workings of the infrastructure. Personalized information dashboards, real-time alarms and alerts, and other functionality found in state-of-the-art business intelligence solutions all come into play. So too does data profiling. Robust profiling capabilities, embedded in both the DI platform and the business intelligence solution, can provide built-in pattern recognition to help visualize data entity relationships and analyze data-corruption issues.
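To make the profiling idea concrete, here is a minimal sketch in Python. The column values and the CUST- identifier format are invented for illustration, not drawn from any real DI platform; the point is that reducing each value to a character-class pattern makes the lone outlier, a likely corruption, stand out.

```python
import re
from collections import Counter

# Map each value to a coarse character-class "shape" (digits -> 9, letters -> A)
# so recurring formats such as account numbers or postal codes become visible.
def value_pattern(value: str) -> str:
    pattern = re.sub(r"[0-9]", "9", value)
    pattern = re.sub(r"[A-Za-z]", "A", pattern)
    return pattern

def profile_column(values):
    """Return the frequency of each inferred pattern for one column."""
    return Counter(value_pattern(v) for v in values)

# Hypothetical customer-ID column: the rare pattern flags a data-corruption issue.
ids = ["CUST-00142", "CUST-00987", "CUST-01234", "142-CUST??"]
for pattern, count in profile_column(ids).most_common():
    print(f"{pattern}: {count}")
```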

Graphical cross-system visibility into enterprise metadata also lets you better see how to adapt. Metadata expresses information lineage and usage, and a robust metadata capability is a prerequisite for adaptive integration. Visibility into metadata through the use of business intelligence tools enables you to continually assess the impact of system and process changes, improve operational performance, and pinpoint data redundancies and opportunities for re-use.
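As a rough illustration of impact analysis over lineage metadata, the sketch below assumes a toy lineage graph with invented table and report names; walking the graph downstream from a source table shows everything a proposed change would touch, which is the assessment you want in hand before letting the infrastructure adapt.

```python
# Minimal lineage graph: each node lists the targets that consume it directly.
# Names are illustrative, not drawn from any particular metadata repository.
lineage = {
    "crm.orders":           ["staging.orders"],
    "staging.orders":       ["warehouse.fact_sales"],
    "warehouse.fact_sales": ["dashboard.revenue", "report.quarterly"],
}

def downstream(node, graph):
    """Collect every target affected, directly or indirectly, by a change to `node`."""
    impacted, stack = set(), [node]
    while stack:
        for child in graph.get(stack.pop(), []):
            if child not in impacted:
                impacted.add(child)
                stack.append(child)
    return impacted

# Assess the blast radius of changing the source table before touching it.
print(downstream("crm.orders", lineage))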

Becoming smart about what needs to be real-time and what does not can be accomplished by using business intelligence functionality to help visualize and analyze workflows and decision trees. Once you have this understanding, you can let the adaptive DI software dynamically suggest the right processing paths.
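The thresholds in the following sketch are purely illustrative, not taken from any particular product, but they show how an adaptive DI layer might translate a flow’s latency tolerance and volume, learned from workflow analysis, into a suggested processing path.

```python
def suggest_processing_path(max_latency_seconds: float, rows_per_day: int) -> str:
    """Suggest a data-movement mode from a flow's latency tolerance and volume.
    Cut-off values are assumptions for illustration only."""
    if max_latency_seconds <= 5:
        return "real-time (message/stream based)"
    if max_latency_seconds <= 3600 or rows_per_day > 10_000_000:
        return "changed-data-capture (propagate deltas as they occur)"
    return "batch (scheduled bulk load)"

# Example flows surfaced by a workflow analysis.
print(suggest_processing_path(2, 500_000))        # hub transaction feed
print(suggest_processing_path(900, 25_000_000))   # large operational table
print(suggest_processing_path(86_400, 40_000))    # nightly reference data
```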

A Self-Perpetuating Cycle

There are other aspects of integration that can be adaptive as well. Security, for example, must now be pervasive in both government and enterprise information environments. This should start with secure Web services processing and RSA encryption for data transmission, but it can also extend to techniques such as LDAP authentication for logins, so that security measures adapt automatically to changing user environments.
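As one small example of the encryption piece, the sketch below uses the open-source Python cryptography package (an assumption for illustration, not a feature of any specific DI platform) to protect a payload in transit with RSA-OAEP.

```python
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import rsa, padding

# Generate an RSA key pair; in practice the public key would belong to the receiving hub.
private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
public_key = private_key.public_key()

oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)

message = b"order payload destined for the data hub"   # illustrative payload
ciphertext = public_key.encrypt(message, oaep)          # sender encrypts
plaintext = private_key.decrypt(ciphertext, oaep)       # receiver decrypts
assert plaintext == message
```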

Integration development should also adapt to accommodate increasingly geographically dispersed development resources. Team development over wide areas needs to be enabled through secure versioning and configuration management.

Today, our information environments are human-directed. Tomorrow, automated solutions will detect and respond to change as it happens. The more readily our information environments can adapt, the more we can do as people in terms of being creative, strategically oriented, and productive.

Similarly, the more our environments can adapt, the more benefits they will bring in terms of enhanced business-process effectiveness, holding the line on costs and risks, and increased business agility. Everything changes (that’s a fact), but the more adaptive your infrastructure is, the less its core, the adaptive DI solution, will need to change to accommodate changes elsewhere in the environment.

Harriet Fryman is group director of Data Integration Product Strategy for Informatica, a provider of data integration and business intelligence software whose customers include 83 of the Fortune 100 and all four branches of the Armed Services.