
Integration of Legacy Applications on a Cloud

Although there has been a sea of change in the software industry over the last 30 years, there has been no major change in data management since the introduction of the relational database management system (RDBMS) in the 1970s. The world has changed drastically since then. We have orders of magnitude more data, arriving at much faster rates, from more sources. Applications that depend on this data have proliferated, reflecting the needs of the business to have faster and more ready access to information. The relationships among those applications have grown as one business process affects another, requiring the applications to share data in real time.

Modern relational databases have resolved many of the problems that they either introduced or suffered from in their early stages. They now provide mechanisms for high availability, clustering and fault tolerance. They can replicate data to peer databases around the world. However, a few problems remain. First, relational databases are a good way to achieve data integration but a poor way to achieve process integration. Second, using features such as triggers, they may be able to detect 'events' (changes in data that some application may be interested in), but they are traditionally poor at distributing those events back out to the client tier. And third, they neither store nor present data to the client in a 'ready-to-use' format for most applications. Multiple layers of translation, transformation, memory mapping and allocation, network I/O and disk I/O must occur for the simplest of queries to return the simplest of responses. As our use of RDBMSs has grown over time, we have come to depend on them to share data, but they were really only designed to store data.

In an attempt to break down stovepipe systems, there has been a move to Service Oriented Architectures (SOA). SOA helps organizations reuse individual components of a business process and makes it easier to adapt their overall processes to align with changing business needs. SOA enables organizations to quickly build new business workflows. However, SOA still fundamentally leaves business processes as stovepipes, and it operates on the basic assumption that the components are completely independent. SOA does not address the real-time interdependencies in the data that the processes share.

In an attempt to get a comprehensive view of data, large organizations are building data warehouses and online/real-time dashboards so that senior management can see the big picture and drill into critical details. Most dashboard and data warehouse solutions pull a copy of the operational data together (usually into a new dimensional format), leaving the original data in place. The operational applications cannot take advantage of this combined data view. Data warehousing does nothing to solve the real-time data interdependencies between applications where business processes intersect. The missing link is 'data awareness'.

Consider as an example the way that mission-planning applications (such as JTT or JMPS – the Joint Mission Planning System) depend on data from Battle Damage Assessment (BDA) systems, the Enemy Order of Battle (MIDB), situation reports, etc. The steps below describe the process flow from the mission planner's perspective and show how changes to the sources he works with affect the work and the mission.

1. The mission planners start work designing missions to destroy enemy targets (bridges, bunkers, SAM batteries, etc.).
2. They pull in data from other systems such as BDA and MIDB. Whether or not they use an SOA-based process has no real impact on the result, only on how tightly coupled one system is to another.
3. If one second later there is an update to the BDA system or MIDB, the mission planner is left unaware. He continues to plan to destroy a target that may already have been destroyed, or plans a mission with inadequate resources because of a change at the target location (a new SAM battery, additional enemy forces, etc.).
4. The mission planners pull in data from other systems as a final check before releasing the plan. They make adjustments to the plan and release it for execution.
5. If one second later there is an update to the BDA system or MIDB, the mission planner is again unaware. The executor of the mission will have to deal with it at run time: rerouting to another target, hitting the wrong target, or encountering unexpected enemy resistance.

How could this be different? The next generation in data management combines the following capabilities (brief code sketches of several of them appear below):

1. Distributed Caching
2. Messaging & Active Event Notification
3. Active/Continuous Querying
4. Traditional Querying
5. Support for users/applications on disadvantaged or periodically disconnected networks.
6. High Availability and some degree of Fault Tolerance
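
As a concrete illustration of the first two capabilities (distributed caching with active event notification), here is a minimal sketch against the GemFire / Apache Geode Java client API. The locator address, region name, keys and values are assumptions made for the example, not details of any fielded system.

import org.apache.geode.cache.EntryEvent;
import org.apache.geode.cache.Region;
import org.apache.geode.cache.client.ClientCache;
import org.apache.geode.cache.client.ClientCacheFactory;
import org.apache.geode.cache.client.ClientRegionShortcut;
import org.apache.geode.cache.util.CacheListenerAdapter;

public class BdaCacheClient {
    public static void main(String[] args) {
        // Connect to the distributed cache and enable server-to-client subscriptions.
        ClientCache cache = new ClientCacheFactory()
                .addPoolLocator("locator-host", 10334)   // assumed locator address
                .setPoolSubscriptionEnabled(true)
                .create();

        // CACHING_PROXY keeps a local copy of entries while delegating to the servers.
        Region<String, String> bda = cache.<String, String>createClientRegionFactory(
                        ClientRegionShortcut.CACHING_PROXY)
                .addCacheListener(new CacheListenerAdapter<String, String>() {
                    @Override
                    public void afterUpdate(EntryEvent<String, String> event) {
                        // Active event notification: the servers push the change to us.
                        System.out.println("Target " + event.getKey()
                                + " is now " + event.getNewValue());
                    }
                })
                .create("bda");   // assumed region of target statuses keyed by target id

        // Ask the servers to push every change in this region to this client.
        bda.registerInterest("ALL_KEYS");

        // Ordinary reads and writes work as with any key-value store.
        bda.put("target-0042", "ACTIVE");
        System.out.println(bda.get("target-0042"));
    }
}

The same cached data remains available to traditional OQL queries (capability 4), so applications do not give up the familiar request/response style to gain event notification.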

The interdependency between applications on data, and on changes to that data, has serious impacts on mission-critical processes. The way data management is currently done in enterprise applications is over 40 years old and simply cannot provide many of the critical features needed to build today's high-performance, cross-organization applications. It is time to consider enhancing your systems' data management capabilities.

Arguably one of the most significant developments in the history of data management came with the invention of the relational database (circa the early 1970s). With traditional database access, queries are run against the database, result sets are returned, and work then begins from the returned information.
If new data arrives in the database a microsecond later that would have changed the result set, life is tough. You work with the data you have and perhaps synchronize with the database before you finish your analysis, planning, or other work. But once again, the data could change right after you finish.
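
For contrast, here is a minimal sketch of that traditional query-then-work pattern in plain JDBC. The connection URL, table and column names are illustrative assumptions, not a real schema.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class SnapshotQuery {
    public static void main(String[] args) throws Exception {
        try (Connection conn = DriverManager.getConnection("jdbc:postgresql://db-host/ops");
             Statement stmt = conn.createStatement()) {

            // 1. Run the query; the result set is a snapshot of that instant.
            ResultSet rs = stmt.executeQuery(
                    "SELECT target_id, status FROM bda WHERE status = 'ACTIVE'");
            while (rs.next()) {
                planAgainst(rs.getString("target_id"));
            }

            // 2. Work proceeds on the snapshot. A change committed a microsecond
            //    later is invisible until the application explicitly asks again.

            // 3. The best it can do is re-run the same query before releasing the
            //    plan -- and the data can still change right after that.
            ResultSet recheck = stmt.executeQuery(
                    "SELECT target_id, status FROM bda WHERE status = 'ACTIVE'");
            while (recheck.next()) {
                // ... reconcile the plan against the re-checked snapshot ...
            }
        }
    }

    static void planAgainst(String targetId) {
        System.out.println("Planning against " + targetId);
    }
}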

What can you do? If only the database could call you back, based on your query, and show the data changes that would cause that query to return a different result set. That is exactly what happens with the new database software. It acts like a tap on the shoulder, alerting people when the queries they made have changed results.
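
A continuous query is how that tap on the shoulder looks in the GemFire / Apache Geode API. The sketch below reuses the illustrative "bda" region of target statuses from the earlier example; the query text and listener behaviour are assumptions for the example, not the actual C2 implementation.

import org.apache.geode.cache.client.ClientCache;
import org.apache.geode.cache.client.ClientCacheFactory;
import org.apache.geode.cache.query.CqAttributesFactory;
import org.apache.geode.cache.query.CqEvent;
import org.apache.geode.cache.query.CqListener;
import org.apache.geode.cache.query.CqQuery;
import org.apache.geode.cache.query.QueryService;

public class DestroyedTargetWatch {
    public static void main(String[] args) throws Exception {
        ClientCache cache = new ClientCacheFactory()
                .addPoolLocator("locator-host", 10334)
                .setPoolSubscriptionEnabled(true)   // required for server callbacks
                .create();

        QueryService queryService = cache.getQueryService();

        // The listener fires whenever a change on the servers adds, updates, or
        // removes an entry from this query's result set.
        CqAttributesFactory attrs = new CqAttributesFactory();
        attrs.addCqListener(new CqListener() {
            @Override
            public void onEvent(CqEvent event) {
                System.out.println("Result set changed for " + event.getKey()
                        + " (" + event.getQueryOperation() + "): " + event.getNewValue());
            }

            @Override
            public void onError(CqEvent event) {
                System.err.println("CQ error: " + event.getThrowable());
            }

            @Override
            public void close() {
            }
        });

        // Register the query; it stays active on the servers after it runs.
        CqQuery destroyedTargets = queryService.newCq(
                "destroyedTargets",
                "SELECT * FROM /bda b WHERE b = 'DESTROYED'",
                attrs.create());

        // Returns the current matches, then continues to deliver changes.
        destroyedTargets.executeWithInitialResults();
    }
}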

The new software enables the creation of applications that work both in garrison and in the field. It has built-in support for applications that are not always on the network and/or need to work over distressed networks. During the Trident Warrior military exercise, the Navy experienced a 90% reduction in the bandwidth used by a shipboard Command and Control (C2) application built with the new software. Additionally, that ship experienced a network outage, during which the application continued to function (although it did not receive new data). When the network was functioning again, the ship received the new data and the latest update for each piece of stale data.
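
The behaviour described above maps naturally onto durable client subscriptions in GemFire / Geode, in which the servers queue events for a named client while it is off the network and replay them when it reconnects. The following is only a sketch under assumptions; the client id, timeout, and region name are illustrative, not the configuration used in the exercise.

import org.apache.geode.cache.Region;
import org.apache.geode.cache.client.ClientCache;
import org.apache.geode.cache.client.ClientCacheFactory;
import org.apache.geode.cache.client.ClientRegionShortcut;

public class DurableShipboardClient {
    public static void main(String[] args) {
        ClientCache cache = new ClientCacheFactory()
                // Name this client so the servers queue its events while it is
                // disconnected, for up to the configured timeout (in seconds).
                .set("durable-client-id", "ship-c2-node-1")
                .set("durable-client-timeout", "3600")
                .addPoolLocator("locator-host", 10334)
                .setPoolSubscriptionEnabled(true)
                .create();

        Region<String, Object> tracks = cache
                .<String, Object>createClientRegionFactory(ClientRegionShortcut.CACHING_PROXY)
                .create("tracks");   // assumed region of track data

        // Durable interest: updates are queued while disconnected and replayed on reconnect.
        tracks.registerInterest("ALL_KEYS", true);

        // Tell the servers this client is ready to receive its queued events.
        cache.readyForEvents();
    }
}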

Performance
a. Customers experience applications that run 4 to 10 times faster on average
b. Speeds online transaction processing and analysis systems in the financial industry by as much as a factor of 10
c. Speeds up complex, long-running scientific computing jobs on a data grid
d. Order of Battle data access for complex unit subordination improved from the previous two to twenty minutes (depending on the size of the nation's forces) to sub-second response times
e. Reduces application footprints – the speed gain can be traded for a smaller footprint instead of a faster application
f. Instead of running an application faster, that speed can be used to run the application on fewer CPUs and with fewer database resources. In turn, that often means reduced costs for other software licenses. Overall, the result is a significant savings in the cost and power needed to deploy a system. With today's edge environments stressed for electricity, the new software can help address that issue. It also supports green initiatives in government.

The Global Command and Control System's new Common Operating Picture application, Agile Client, uses the advanced software to provide a high-performance service-oriented architecture in which users can dynamically subscribe to near real-time track management data from multiple sources and view that data live on a 3-D or 2-D surface. Agile Client enables data fusion from multiple sources. It supports DIL (disconnected, intermittent, limited-bandwidth) environments.

In the defense and intelligence sector, the new software provides four fundamental virtues: data awareness; support for disconnected operations and distressed networks; increased performance with a reduced application footprint; and a real-time view of operational data. In essence, it was able to pull in all the data streams produced by the military, manage that data in memory so that applications could provide a window into it, and achieve all of this with guaranteed low latency, fault tolerance and high throughput. Additionally, the software can provide a unified view of data across datacenters with high throughput and low latency.

Summary
Real-time integration of transactions from diverse legacy applications will dictate most of the investments in the command and control of military operations.

Because of its critical importance in achieving sensor-to-shooter integration for Information Dominance, this blog has extracted most of the above text from relevant material published by the GemFire group (a division of VMware).


For comments please e-mail paul@strassmann.com