
Database Administration for a Cloud


Clouds will dominate the design of enterprise infrastructure. One of the key components of cloud infrastructure, at the center of all systems, will concern the integrity and operation of databases. Therefore the organization of Database-as-a-Service (DBaaS) warrants special attention as a separate software discipline.

Software is now available that simplifies how databases are managed, updated and virtualized. This enables oversight of a total environment of hundreds of disk files. Customers can then access a pool of disk drives without intervention by intermediary personnel. The objective is to assure a high level of security, flexibility, control and assurance that the data will comply with policy guidelines.

DBaaS automates common database operations by presenting comprehensive displays to developers and to operators, which significantly reduces management overhead and increases the efficiency of memory management. DBaaS manages database provisioning, backup of records and fail-over cloning for development and testing.
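
To make the provisioning step concrete, the sketch below shows the kind of call a DBaaS automates. The endpoint, parameters and response fields are hypothetical and do not describe any particular vendor's API; Python is used only for illustration. The point is that a developer requests a database by policy, while the service handles disks, backup and fail-over behind the scenes.

import json
import urllib.request

# Hypothetical request: ask the DBaaS for a database by policy, not by
# configuring disks, servers or backup jobs by hand.
order = {
    "name": "logistics_reporting",
    "engine": "postgres",          # assumed choice offered by the service
    "storage_gb": 200,
    "backup_policy": "daily",      # backup handled by the service
    "failover_clone": True,        # fail-over cloning handled by the service
}

req = urllib.request.Request(
    "https://dbaas.example.mil/v1/databases",   # hypothetical endpoint
    data=json.dumps(order).encode("utf-8"),
    headers={"Content-Type": "application/json"},
    method="POST",
)
with urllib.request.urlopen(req) as response:
    # A real service would return connection details and SLA terms.
    print(json.load(response))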

The primary objective of DBaaS is to reduce database sprawl through virtualized pooling of disk capacity while providing for sufficient redundancy to assure the continuity of operations. DBaaS installs automated methods for protecting operations and for providing adequate buffer space that assures low latency response time.

DBaaS will reduce database operating and capital expenses by extending virtualization to all data storage components so that capacity is readily scalable. It provides a central view of potential problems while monitoring performance that assures high transaction processing availability.

A major concern for moving databases to the cloud is the potential interference caused by multiple databases sharing the same pool of resources, the so-called “noisy neighbor” problem. DBaaS offers isolation at the individual user level, thus eliminating “noisy neighbor” situations.

DBaaS restricts database access to authorized users. Assigned privileges enable IT administrators to control which users can take what actions.

SUMMARY
The ultimate purpose of all information security is the protection of data assets. A creeping degradation of storage can invalidate all of the layers of security that are in place. The protection and management of data resources by means of a highly automated DBaaS is therefore of critical importance.

Engaging ISPs in the Protection Against Cyberthreats


There are approximately 4,000 ISPs in the United States. Each of them provides Internet access to anywhere from 200 to more than one million customers.

Since all Internet transactions must pass through a service provider, these operators are the best points at which to engage countermeasures against the prime sources of cyber corruption, such as botnets, router hijacking and domain-name fraud.

Tackling the challenges to Internet security is critical because over $8 trillion worth of transactions occur over the Internet. Any shut-down of the Internet would shut down the US economy.

ISPs have unique visibility into malware transiting their infrastructure. They are in a position to filter malware more effectively than end-point consumers, whose defenses are spread across millions of points where protection is not only expensive and difficult but also short of sufficiently trained expertise. Since all ISPs are ultimately interconnected, the compromise of any one that is poorly managed can extend to every securely managed ISP. It is the ISPs that are accountable for the development of secure routing standards as transactions traverse the Internet through more than a dozen connections, each of which can be interfered with.

As organized perpetrators disrupt services and infrastructures, the issue of safeguarding Internet transactions ceases to be a contractual matter. It becomes a concern for national cyber security.

Cyber attacks such as Distributed Denial of Service (DDoS), the introduction of botnets, the insertion of collections of compromised computers or large-scale zero-day infections call for concerted as well as costly protective measures. Thousands of smaller ISPs may not be able to afford them.
Adoption of an industry-wide code of safeguarding Internet transactions is insufficient without the assurance of government certification of compliance with well-researched standards.

The newly organized National Cybersecurity Center of the National Institute of Standards and Technology, Department of Commerce, appears to be the most suitable organization to develop such standards. However, certification of compliance with cyber policy would require the full force of law in order to be enforceable.

The existing laws that govern the conduct of Internet-related transactions, such as the Communications Assistance for Law Enforcement Act (CALEA), concentrate entirely on police issues.

SUMMARY
The flaws in the Internet are persistent and cumulative, not temporary. The list of cyber flaws contains tens of thousands of entries and applies equally to financial, manufacturing, utility or defense networks. This list changes every minute as attackers modify their software. For instance, network routers that pass on all traffic from one computer to another are vulnerable to promiscuous-mode corruption, router table attacks, shortest-path compromises, border gateway flaws and border gateway poisoning, to name just a few. All of these routers are managed by ISPs.

ISPs also control network switches, which distribute all traffic and are vulnerable to known corruptions such as flooding attacks, address resolution spoofing, “Man-in-the-Middle” misrouting, denial of service, switch hijacking, spanning tree misdirection, forced external root election and VLAN hopping. ISP-operated software also sets up the Internet directories held in domain name servers, which can be undermined by address starvation, attacks using rogue servers, bogus default gateways, malicious records, spoofing, flooding attacks, faulty responses to a server, buffer overflow attacks and denial of service attacks. The entire global Internet is also host to a vast population of malicious viruses, worms and Trojans, which can best be detected as they make the transition between ISPs. There are over 100 million such software scripts already residing on the Internet. Over three million new ones are added every month.

There is no question that a law imposing cyber security regulations would go a long way toward reducing the current risks.

GSA Moves into Cloud Computing


It is now government policy that agencies should move their websites to public “clouds”. Agencies must seek cloud solutions that cover the full application development and deployment life cycle.

The existing systems capacity is fragmented across multiple platforms and providers. The new opportunities in cloud computing to leverage synergies, including data sharing, downloading, access and cross-platform integration, are enormous. The cloud environment contains the security, service delivery and hosting capabilities necessary to support development, testing, quality assurance and production needs. All such services are covered by vendor-offered Service Level Agreements (SLAs).

The implementation of a cloud-first policy must overcome difficulties arising from a fractured legacy infrastructure. However, we now have an early case from the General Services Administration (GSA) showing how to proceed in an evolutionary progression.

Over 80,000 employees of the GSA are now using Google's Gmail cloud for email, calendaring, collaboration and other common services. The agency has finished the switch to a privately operated public cloud model and is now moving to engage other cloud services.

A four-month migration from GSA servers to Google re-purposed hardware with an estimated saving of at least 50%. The operation of the old e-mail systems was pushed to the vendor, while GSA obtained added storage capacity and better continuity of operations. GSA also improved mobile access and delivered faster responses. The only initial hitch was connecting BlackBerry devices, which was resolved by helping users through the conversion.

The GSA is currently looking at other Software-as-a-Service offerings, such as those from Salesforce.com, to provide project management and activity tracking instead of hosting them as applications residing on the agency's network.

GSA has already set up pre-competed contracts for use by any government agency. This allows cloud services to be bought faster, easier and more economically. An Infrastructure-as-a-Service (IaaS) Blanket Purchase Agreement (BPA) has just been awarded for $76 million to twelve vendors to deliver IaaS services to Federal, state and local organizations.

The GSA contracts provide for the following:

Ubiquitous network access - Access through any web browser.
Resource pooling - Resources are assigned according to demand.
On-demand self-service - Capabilities are assigned without human interaction.
Rapid elasticity - Capabilities are provisioned to quickly scale up or down.
Measured service - Pay per use. Metering tracks resource use.
Security - Compliance with FISMA security requirements.
Data ownership - The Government shall retain ownership of all hosted data.

SUMMARY
Cloud computing is starting to move into operations. The pace is slow because it requires demonstrated success to convince IT planners to change direction. We consider the initial installations by GSA an early harbinger of developments to come as required cost savings start dictating commitments to a new way of delivering computing services.

Proposed Legislation to Assure Cyber Security


The Secretary of Defense, Leon Panetta, the Director of the FBI, Robert Mueller, and the Director of National Intelligence, James Clapper, have each said that their top priorities now include cyber operations, that cyber threats are now the number one danger to the USA and that they pose a critical danger to national and economic security.

To deal with these dangers a proposed Senate bill, the Cyber Security Act of 2012, has now been introduced. This Act was half a decade in the making. It was triggered by the urgency of rapidly rising cyber incidents, defined as any information-based corruption that could disrupt the operations of the US economy. The purpose of the Act is “…to enhance the security and resiliency of the cyber and communications infrastructure of the United States.” It concentrates cyber security oversight and control under the Secretary of Homeland Security (SoHS).

The proposed Act excludes national security systems operated by the Department of Defense and by any element of the intelligence community. The largest share of cyber-related Federal funds is thus eliminated. The Act also excludes commercial services that already provide cyber security protection and does not permit any Federal employee to regulate commercial information security products. Through these exemptions the SoHS may not require implementation of cyber protective measures even though universal Internet connectivity could use unprotected networks as a conduit for cyber infection. The Act therefore has limited applicability.

The proposed Act does not enhance security. It is like asking part of a ship’s crew to discuss what to do while the ship’s hull is filling with water.

The SoHS must consult with every operator who makes use of the US critical infrastructure. That includes everyone, because the entire economy now depends on the Internet. The Act calls for consultations with the Critical Infrastructure Partnership Advisory Council as well as with hundreds of existing Information Sharing Organizations. The SoHS must also set up coordination with the intelligence community, the Department of Defense, the National Security Agency, the Department of Commerce, sector-specific agencies and numerous other Federal agencies, as well as State and local governments, that have responsibility for cyber security measures. The Secretary of Homeland Security has been given a coordination task well beyond the scope of anything previously attempted.

Within ninety days, the SoHS must assess all cyber security threats, vulnerabilities and risks. The Secretary will estimate the probability of catastrophic incidents to determine which sectors pose the greatest immediate danger. The purpose of the assessment will be to guide the allocation of Federal resources, though how funds will be distributed has not been explained. Because almost every citizen now operates in cyberspace, the scope of this responsibility is greater than that of the Secretary of Defense.

The ninety-day assessment is not executable. It presumes the availability of a catalogue of the extent to which the more than half a billion computers in the US are already corrupted. For example, such a catalogue would identify which of the over ten million “bots” are infected by software that can seize control of desktops, laptops, smart phones and servers to propagate malware, such as denial-of-service attacks or spyware that steals secure data files. A complete catalogue would also include the risks already present within the operations of thousands of Internet Service Providers (ISPs).

The SoHS would have to assess the risks and vulnerabilities of 17,000+ networks, each with hundreds of thousands of uncontrollable points of access to the Internet. Over 4,000 commercial ISPs operate these networks, each administering its own security procedures, which are not open to public inspection. Ultimately the assessment would determine which of these networks would be fixed to meet certifiable security requirements. Who would develop such standards is not spelled out in the Act. How the SoHS would acquire the capacity to evaluate the extent of vulnerabilities is not explained. Since all networks are interconnected through the Internet, it is not clear how anybody’s infections could be contained. What resources would be available for this enormous new task is not known, except that owners of the infrastructure will provide an annual report on whether security measures have been effectively implemented.

The Act does not put new institutions or methods into place for at least one year. It does not include a budgeting process. Authority is centralized in the office of the SoHS while the direct accountability for managing the underlying security infrastructure is left to thousands of loosely defined groups, without a hint of a Congressional blueprint for how to proceed with the enforcement of cyber security. Each owner of a cyber network can select and implement whatever cyber security measures it deems best suited.

For instance, the legislation addresses jurisdictional issues, such as the authority of the Department of Homeland Security. It creates the National Center for Cyber Security and Communications without a detailed charter. It delegates responsibility to tens of thousands of privately managed commercial networks or to hundreds of networks operated by government agencies. There is nothing in the legislation that defines the means for countering and then arresting cyber threats.

The Act amends the existing Federal Information Security Management Act (FISMA) to require each agency to develop an information technology acquisition strategy while permitting DHS to streamline reporting requirements and reduce paperwork. How that can be done is left for DHS to work out, and the key issue, implementation, is not covered. For instance, the Act provides for DHS to operate consolidated intrusion detection, prevention or other protective capabilities and to use countermeasures for the purpose of protecting agency information systems. How such costly facilities can be budgeted and set up as a government service is not spelled out.

The Act encourages agencies to make informed decisions when purchasing IT products and services by soliciting best security practices in placing federal cyber contracts. However, it leaves everyone to develop cyber security technologies to fit their local conditions.

The Act calls for developing risk-based performance requirements, looking first to existing standards and to commercial best practices. If a sector is sufficiently secure, no new performance requirements are imposed. Each owner of a commercial cyber system would determine how best to meet the requirements and then verify it. An owner could also choose to self-certify compliance with its own criteria, though current regulators such as the Securities and Exchange Commission could offer their own security regulations as well as perform cyber oversight. Such provisions essentially de-fang efforts to put national cyber security solutions into place.

The problem with the proposed cyber security Act is its disregard of the sources of cyber flaws, which originate in the fundamental insecurity of the Internet through which all malware propagates. The insecurity of the Internet is pervasive. It cannot be remedied by leaving the application of protective measures only in the hands of individual industry sectors or separate government agencies, to be applied only whenever the Secretary of DHS finds an imminent danger to life or the economy.

The Internet is riddled with openings that make it possible for skilled perpetrators to corrupt software. The flaws in the Internet are persistent and cumulative, not temporary. The list of cyber flaws contains tens of thousands of entries and applies equally to financial, manufacturing, utility or defense networks. This list changes every minute as attackers modify their software. For instance, network routers that pass on all traffic from one computer to another are vulnerable to promiscuous-mode corruption, router table attacks, shortest-path compromises, border gateway flaws and border gateway poisoning, to name just a few.

Network switches, which distribute all traffic, are vulnerable to known corruptions such as flooding attacks, address resolution spoofing, “Man-in-the-Middle” misrouting, denial of service, switch hijacking, spanning tree misdirection, forced external root election and VLAN hopping.
The Internet directories, which are set up in domain name servers, can be undermined by address starvation, attacks using rogue servers, bogus default gateways, malicious records, spoofing, flooding attacks, faulty responses to a server, buffer overflow attacks and denial of service attacks.
The entire global Internet is also host to a vast population of malicious viruses, worms and Trojans. There are over 100 million such software scripts already residing on the Internet. Over three million new ones are added every month.

So far individual cyber operators have defended against cyber infection with anti-virus software, firewalls and counter-intrusion appliances. The cost of such defenses runs into the tens of billions because every operator must protect a network with stand-alone investments. If the objective of setting up cyber defenses is national in scope, it cannot be accomplished by dividing the defense in ways that follow existing patterns. The US national defense of cyberspace must first deal with the generic characteristics of the Internet. Only after suppressing global Internet flaws is it possible to deal with local protective measures. The hygiene of hospitals, by analogy, exists under the protective umbrella of the Centers for Disease Control and the Food and Drug Administration.

The pattern of defenses should follow many of the precedents that have already been put in place by the health and food regulatory environment. The Internet cannot be allowed to let infection propagate until there is a government-defined level of shared safety. The operators of cyber networks must first comply with national standards that can be validated through certification and audit by trusted commercial firms before sector operators enhance their own security.
 
The National Institute of Standards and Technology in the Department of Commerce, or an institution specifically created in DHS, would be the organization for developing and then guiding the required testing and certification. In the same way as Congress saw fit to fund the Centers for Disease Control, so should the Cyber Security Act include the creation of a central organization for identifying and tracking cyber threats. Congress has also organized the U.S. Food and Drug Administration with the mission of protecting health. A similar institution within DHS should have the capacity to evaluate and then test networks that deliver secure services.

The proposed cyber security legislation creates excessively complex regulations that cannot be implemented. Making sure that cyber communications take place only via government-certified networks would simplify defenses. It would place the emphasis on securing the generic Internet before proceeding with the securing of dedicated uses.


DoD Has a New Information Systems Strategy

We now have the “DoD IT Enterprise Strategy and Roadmap” (ITESR_6SEP11). The DoD Deputy Secretary and the Chief Information Officer signed it. This makes the document the highest-level statement of IT objectives in over two decades. The new direction calls for an overhaul of the policies that guide DoD information systems. Implementation becomes a challenge in an era when funding for new systems development declines.

The following illustrates some of the issues that require the reorientation of how DoD manages information technologies:

1. Strategy: DoD personnel will have seamless access to all information, enabling the creation and sharing of information. Access will be through a variety of technologies, including special purpose mobile devices.
Challenge: DoD personnel use computing services in 150 countries, 6,000 locations and over 600,000 buildings. This diversity calls for standardization of formats for tens of thousands of programs, which requires a complete change in the way DoD systems are configured.

2. Strategy: Commanders will have access to information available from all DoD resources, enabling improved command and control, increasing speed of action and enhancing the ability to coordinate across organizational boundaries or with mission partners.
Challenge: Over 15,000 uncoordinated networks do not offer the availability and latency essential for real-time coordination of diverse sources of information. Integration of all networks under centrally controlled network management centers becomes the key requirement for further progress. This requires a complete reconfiguration of the GIG.

3. Strategy: Individual service members and government civilians will be offered a standard IT user experience, enabling them to do their jobs and providing them with the same look, feel, and access to information on re-assignment, mobilization, or deployment. Minimal re-training will be necessary since the output formats, vocabulary and menu options must be identical regardless of the technology used.
Challenge: DoD systems depend on over seven million devices for input and for display of information. Presently there are thousands of unique and incompatible formats supporting user interaction with automated systems. The format incompatibilities require the replacement of the existing interfaces with a standard virtual desktop that recognizes differences in training and in literacy levels.

4. Strategy: Common identity management, authorization, and authentication schemes grant access to the networks based on a user’s credentials as well as on physical circumstances.
Challenge: This calls for the adoption of universal network authorizations for granting access privileges. It requires a revision of how access permissions are issued to over 70,000 servers. The workflow between the existing personnel systems and the access authorization authorities in human resources systems will require overhauling how access privileges are issued or revoked.

5. Strategy: Common DoD-wide services, applications as well as programming tools will be usable across the entire DoD thereby minimizing duplicate efforts, reducing the fragmentation of programs and reducing the need for retraining when developers are reassigned or redeployed.
Challenge: This policy cannot be executed without revising the organizational and funding structures in place. Standardization of applications and of software tools necessitates discarding much of the code that is already in place, or temporarily storing it as virtualized legacy codes. Reducing data fragmentation requires a full implementation of the DoD data directory.

6. Strategy: Streamlined IT acquisition processes to deliver rapid fielding of capabilities, inclusive of enterprise-wide certification and accreditation of new services and applications.
Challenge: Presently there are over 10,000 operational systems in place, controlled by hundreds of acquisition personnel and involving thousands of contractors. There are 79 major projects (with current spending of $12.3 billion) that have been ongoing for close to a decade. These projects have proprietary technologies deeply ingrained through long-term contract commitments. Disentangling DoD from several billions of dollars worth of non-interoperable software requires Congressional approval.

7. Strategy: Consolidated operations centers provide pooled computing resources and bandwidth on demand. Standardized data centers must offer access and resources by using service level agreements, with prices that are comparable with commercial practices. Standard applications should be easily relocated across a range of competitive offerings without cost penalty.
Challenge: The existing number of data centers, estimated at over 770, represents a major challenge without major changes in the software that currently occupies over 65,000 servers. Whether this can be accomplished by shifting the workload to commercial firms, while remaining under DoD control, would require making tradeoffs between costs and security assurance.

SUMMARY
In summary, the redesign of operational systems into a standard environment is unlikely to be implemented on a 2011-2016 schedule unless DoD considers radically new ways of achieving the stated objectives.

Over 50% of IT spending is in the infrastructure, not in functional applications. The OSD CIO has clear authority to start directing the reshaping of the infrastructure organizations. Consequently, the strategic objectives can be largely achieved, but only with major changes in the authority for executing the proposed plan. It remains to be seen whether the ambitious OSD strategies will meet the challenge of the new cyber operations.





Needed: Guidance from the Office of Management and Budget

The reported spending for IT is set for FY11 by OMB at $79 billion. However, that number does not include 58 independent executive branch agencies. Exclusions include, for instance, the Central Intelligence Agency and spending by the legislative and judicial branches of the Federal Government. In the case of DoD and DHS, which account for more than half of the $79 billion, the costs of the uniformed and civilian payroll are also excluded. At close to $100 billion of IT spending, the Federal Government consumes close to 2% of global IT spending. This exceeds the IT spending of the largest commercial enterprises by a multiple of at least 30.

The OMB budget also excludes IT costs that are components of operational systems, such as spacecraft ground systems (for example satellite command-and-control systems and satellite data-processing systems). There are also inconsistencies in how agencies report on IT spending included in R&D programs. Sometimes these costs are included, sometimes they are not.

The reported Federal Government IT costs are broken up into 7,248 investments, which account for a third of total IT budgets. Compared with commercial practice this is a high ratio; commercial enterprises are able to operate with close to 80% of the budget devoted to ongoing operations because their spending for new projects is more effective. For instance, there are 1,536 separate development programs for improving the management of information technologies, particularly the management of the IT infrastructure. There are 781 investment programs for supply chain management and 661 for human resource management. Commercial practices would not tolerate such proliferation.

The Office of Management and Budget (OMB), in the Office of the President, plays a key role in overseeing how federal agencies manage their IT investments. The source for this oversight is data about an agency’s investment portfolio (Exhibit 53) and capital asset plans (Exhibit 300). Additional web-based “dashboards” summarize information about diverse projects, though the data and analysis are not reliable.

OMB does not provide oversight over IT spending expended in ongoing operations.
OMB and federal agencies have undertaken several initiatives to address potentially duplicative IT investments. Most of these efforts have not yet demonstrated results. Agencies also do not assess legacy systems to determine if they are duplicative.

The slow progress in managing Federal IT for greater efficiency can be traced to the lack of a coherent Federal Enterprise Architecture (FEA). When originally developed in 1999, the FEA was intended to provide federal agencies with a common construct for their architectures and thereby facilitate the coordination of common business processes and consistent system investments. As part of the fiscal year 2004 budget cycle, OMB required agencies to align proposed IT investments to the FEA reference models; this information was then used to develop the initial process improvement initiatives. Since that time, agencies have established individual enterprise architectures and used them to characterize their IT investments and to guide plans for the future. OMB’s Chief Architect reported that comprehensive changes to the FEA are planned for fiscal year 2012. Meanwhile, actual progress in rationalizing IT spending is not evident.

Though the closure of a number of data centers is proceeding, federal agencies’ data center inventories and consolidation plans are incomplete and do not as yet reflect verifiable net cost reductions.
OMB has also announced its trusted Internet connection initiative to improve security by reducing and consolidating external network connections. However, none of the 23 participating agencies has yet met all of this initiative’s requirements.

A major new initiative from OMB is the FedRAMP project, which is to provide, among other functions, continuous security monitoring of cloud computing systems for multiagency use. This project is currently behind schedule, and has not yet defined all performance metrics.

 The FedSpace project, which is to provide federal employees and contractors collaboration tools for cross-agency knowledge sharing, is also behind schedule and has not defined its performance metrics.

SUMMARY
The nation’s actual annual spending for IT is much higher than the $78.8 billion identified by OMB. Agencies do not routinely evaluate legacy systems to determine if they are duplicative and can be eliminated or consolidated.

A Brief History of the Federal Enterprise Architecture


One of the mandates of the Clinger-Cohen Act of 1996 was the creation of the Information Technology Architecture. In subsequent 1999 guidance the Federal CIO Council defined the Federal Enterprise Architecture (FEA) as the process for developing, maintaining, and facilitating the implementation of integrated systems.

Chief Information Architects were then appointed at the Federal and DoD levels. However, as of June 2011 GAO reported that the enterprise architecture methodology had not been deployed. Between 2001 and 2005, GAO reported that DoD spent hundreds of millions of dollars on an enterprise architecture development that was of limited value.

None of the three military departments have so far demonstrated that they have made commitments to deploy an architecture needed to manage the development, maintenance, and implementation of systems.

Senior DoD IT executives have stated that the development of an architecture methodology has been pushed back due to budget limitations. As yet, no time frames have been established for producing a FEA.  There is no specific time when the enterprise architecture would be delivered.  The current focus is on other resource-intensive commitments.

Nevertheless, the use of well-defined enterprise architectures remains an attribute of managing IT in successful commercial firms. A centrally directed architecture remains the basis for system integration and for delivering lower IT costs. In spite of the significant potential benefits, an enterprise architecture has not guided DoD systems over the past fifteen years.

While DoD was promoting departmental and agency-wide enterprise concepts, actual implementation of integration was missing except in isolated cases. Consequently DoD ended up lacking a coherent blueprint for creating a technology environment that would be interoperable, economically efficient and easily accessible for innovation. The absence of a working architecture has prevented DoD from making progress at the speed at which information technologies are now advancing in leading commercial firms.

The absence of a guiding DoD architecture also opened the floodgates to excessive project fragmentation, to technology incompatibilities and to operations that were contract-specific rather than DoD enterprise-integrating. That increased not only the costs of maintenance and of modernization upgrades, but also put a brake on the ability to innovate to meet the rapidly rising demands for information superiority. Where there was innovation, it had to take place as stand-alone new projects rather than as enhancements to systems that could have been improved at lesser cost.

The Clinger-Cohen Act also passed to the Chief Information Officers (CIOs) the responsibility for the management of total IT spending. That did not take place as originally legislated.

At present CIOs cannot be held accountable for the contracting decisions that are made by acquisition officers who rely on regulations spread over 2,017 pages printed in small font. The management of IT, as of March 2011, is split into 2,258 business systems with only spotty direct component CIO oversight. Meanwhile, the influence of the DoD CIO to control rising costs continues to be limited.

There are also over a thousand national security systems that were not listed in the DoD Information Technology Portfolio Repository (DITPR). Such projects include intelligence systems, military command and control networks and equipment that is an integral part of weapons. Whereas in the years of the cold war national security systems could be set up as stand-alone applications, modern warfare conditions now require real-time interoperability between national security systems and selected business applications such as logistics.

As the consequence of having no architecture as well as weak CIO control over IT costs, DoD ended up with largely isolated enclaves, or silos, of projects.
IT projects are now managed as separate contracts. Applications are not interoperable across DoD because they cannot share enterprise-wide data. IT projects are run in over 700 data centers as well as in an untold number of special-purpose servers that often have data center capabilities.

Such outcomes were not what was hoped for in 1996. The flawed outcomes, predicted in 1995 Senate hearings, point to the inadequacies of Clinger-Cohen in meeting post-cold war needs. The 1996 legislation did not offer changes in the organizational structure for managing IT. It did not alter the DoD component-entrenched budgeting practices. It did not set out to deliver a shared information technology infrastructure that would compare favorably with best commercial practices. Security was not yet on the agenda. Rising DoD requirements would then be met with rapidly rising spending for IT.

SUMMARY
The current proliferation and excessive cost of government systems is largely the result of a failure to implement a Federal Enterprise Architecture (FEA) that would organize systems into well-managed operations. That has not happened, even though it is contrary to best commercial practice.

The absence of a coherent FEA remains the single greatest obstacle to further cost reductions in government IT spending.


How to Cut IT Costs


The adoption of platform-as-a-service (PaaS) has opened up new opportunities for reducing information technology costs. Now, the U.S. Defense Department must enter this option into its planning.

In fiscal year 2011, the department spent 54 percent of its total information technology budget of $36.3 billion on its infrastructure. The remainder was spent on functional applications. Compared to commercial practices, the size of the Defense Department infrastructure is excessive. The department never has managed to share its infrastructures—programs were built as stand-alone silos, each with its stand-alone infrastructure.
For example, even the simple effort to consolidate what should be a commodity application, such as a common email system for the Army, has run into problems. The Army email consolidation effort is difficult for several reasons: no shared standards; numerous local network modifications; inconsistent versions of software; and incompatible desktops. Placing parts of supply chain management, human resource systems, financial applications or administrative systems on shared platforms is even harder to imagine.

Now, PaaS platforms enable sharing of the operational infrastructure. PaaS calls for the separation between the software that defines the logic of an application and the method that describes how that application will be placed in a computing environment.

A PaaS cloud provisions data center assets, data storage capacity, communication connections, security restrictions, load balancing and all administrative requirements such as service-level agreements. That cloud can be private or public, and it can support local needs or serve global requirements. A system developer can concentrate exclusively on authoring the application logic. When that is done, the code can be passed to the PaaS platform for the delivery of results.

PaaS produces results without the cost and complexity of managing operations. In this way, the total budget for a new application can be reduced. Programmers can concentrate on the business logic, leaving it to PaaS to take care of the hard-to-manage infrastructure. When using PaaS, all of the infrastructure components will already be installed. A PaaS cloud can then support hundreds and even thousands of applications on a shared infrastructure. Consequently, the total cost of Defense Department operations will decrease.
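
A minimal sketch of the separation described above, written in Python purely for illustration (the frameworks named later in this post for Cloud Foundry, namely Spring, Rails, Sinatra and Node.js, follow the same pattern): the file contains only application logic, and the single point of contact with the platform is the port that the environment injects at deployment time.

import os
from wsgiref.simple_server import make_server

def application(environ, start_response):
    # Business logic only: the developer writes the response, not the
    # provisioning, load balancing or security plumbing.
    start_response("200 OK", [("Content-Type", "text/plain")])
    return [b"application result\n"]

if __name__ == "__main__":
    # A PaaS decides where the code runs; here the only platform-facing
    # detail is the port injected through the environment.
    port = int(os.environ.get("PORT", 8080))
    make_server("", port, application).serve_forever()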

PaaS is an attractive solution except that each of the platform providers will try to lock applications into its environment. Once an application’s code is checked into a vendor’s PaaS, it will be difficult to ever check it out. Hundreds of vendors will add refinements to their PaaS so that extrication to another PaaS will remain constrained.

What a customer wants is not a vendor lock-in, but the ability to port applications from any PaaS to another. The customer then can shop for different terms of service from multiple suppliers. Portability of application code across PaaS services makes price competition possible. Availability of multiple PaaS clouds also makes for more reliable uptime.

To deal with the problem of interoperability across different PaaS vendors, the company VMware has just introduced the PaaS platform called Cloud Foundry. What is unique is that this is open source software. A number of firms already have signed up to support this approach. The only restriction is that all of the applications must conform to compatible software frameworks such as Spring for Java apps, Rails and Sinatra for Ruby apps and Node.js.

An open source cloud platform prevents vendor monopoly. It allows for competitive procurement, makes cross-cloud support available and offers multiple options for how services can be delivered. Such an arrangement will assure customers of improved quality and maintainability.

In the next few years, the Defense Department will have to depend on cloud technologies that are available from several hundred contractors. A defense customer would contract for a PaaS platform offering with features that are desired. In effect, the PaaS vendor delivers data center services, while the customer retains full control over the application software.

SUMMARY
A Defense Department policy of October 16, 2009, provides guidance regarding the use of open source software. Currently, VMware PaaS meets the definition of commercial computer software and must be given statutory preference.

The broad peer review enabled by publicly available open source code supports software reliability and security efforts through the identification and elimination of defects that might otherwise go unrecognized. The unrestricted ability to modify software source code then enables the Defense Department to respond more rapidly to changing situations, missions and future threats that otherwise would be constrained by vendor licensing.

The availability of the Cloud Foundry platform opens a new approach for how to proceed with the migration to cloud computing. The more reliable PaaS may not take over unless the Defense Department changes its thinking about how to organize the development and operations of information technology.

What are DoD’s Largest Cost Reduction Opportunities?


The Defense Business Board has just published a report on DoD Information Technology Modernization. The report reveals an allocation of major IT spending categories that previously has not been published:


With renewed emphasis on cost reduction, this tabulation offers a view of what could be the best candidates for cost reductions. It shows that DoD infrastructure costs are 62% of total IT spending. These costs are almost entirely Operations and Maintenance expenses, which are more amenable to rapidly changing technology and manpower budgets. Therefore the primary target should be the highly expensive and fractured DoD infrastructure.

The question is how to cut either the data center operations costs ($6.5 billion for support plus $2.5 billion for hardware) or the telecommunications costs ($9.9 billion). Of the two cost-cutting opportunities, reducing telecommunications costs offers greater potential and less risk because it is simpler. It involves a much smaller number of organizations that control spending.

The most frequently cited cost reduction initiatives propose major reductions in the number of data centers. This is done through a four-fold increase in hardware utilization by means of virtualization. That requires capital investments for new equipment. Squeezing 5,000+ applications into a smaller number of servers involves spending on fixing up applications. Data center operating personnel costs are at least twice the expense for mainframes and servers. Shutting down installations takes time and involves terminating special-purpose buildings, air-conditioning and the supply of electric power. Most importantly, regardless of the number of data centers there will be a continued need for trained operators who know how to deal with local security and operating requirements. Such staff is hard to terminate.

Achieving cost reductions through consolidation of data centers is attractive but takes a long time for payoffs to materialize. It carries greater risks and a longer time before the Return on Investment (ROI) becomes positive. The last major effort to reduce IT costs in DoD was DMRD 918 in 1993. It took more than ten years to complete. Financial paybacks still remain uncertain. While that effort consolidated 142 DISA data centers into nine, several hundred new data centers sprang up elsewhere.

A better opportunity for cost reduction is the $9.9 billion telecommunications cost. There are an estimated 15,000 networks in place. They are already connected through switches and routers that can be reorganized for control by a fraction of the current staff.

The prevailing approach to the switching and routing of networks is broken. There are separate contracts without the consolidated oversight found at firms such as Google and Akamai. A partial list of how communications contracts are divided can be found in the following table:


The number of communication contracts is much smaller than the number involved in data centers. Communication contracts are already managed by DISA. Enterprise-wide unification should be easier because the contracts are for services rather than for ownership. The contractors should carry the capital costs.

Consolidation of data centers will have an effect on the structure of communications. First, placing servers into a few locations as hub cloud services will increase the flow of traffic. Second, concentration of computing will increase the importance of capacity used for sharing data processing across cloud operations. This will add to the volume of redundant transactions. Proceeding with data center consolidation without a redesign of communications will produce an increase in costs. It will also slow down the rate at which data center consolidation can proceed.

SUMMARY
The major challenge for the DoD CIO is to pick the priorities for achieving cost reductions. It has not been demonstrated that the prevailing emphasis on reducing the number of data centers will be the most effective solution. Neither the dollars in potential savings nor the technical difficulties place cutting the number of data centers among the top savings opportunities. Most likely the place to start on savings is the restructuring of communication networks.



Access Authorization to Web Applications


A changing workforce as well as the diversity of warfare conditions means that DoD personnel now expect access to data from a diversity of sources, anytime, anywhere. Organizations are turning to a new generation of cloud applications in order to meet rapidly expanding requirements. Unfortunately, many of the applications have to be shared by a diversity of forces in real time. These applications reside on thousands of systems where access is managed by separate management staffs.

Meanwhile, users are bringing their own technology to the workplace—frequently on multiple devices such as tablets, notebooks and a variety of smartphones. Personnel seeking access are frustrated with managing multiple login credentials to the desired applications. This makes it harder than ever to manage and secure the workplace.

A few years ago an attempt was made to improve the interoperability of access through the implementation of “portals”. After initial success the portals were abandoned because the cost of maintenance and application integration made their use costly and unwieldy.

The present challenge is to provide easy to deploy access to the multiplicity of cloud- and web-based applications while maintaining secure control over applications, user access and devices. A solution must be scalable, affordable and easily implemented, while using commercial standards.

The granting of access authorization to an application is the single most important requirement for security assurance. If the access authorization process is faulty, a security failure is sure to follow. The key question is how to manage the granting of security authorization in a network environment where a user may use a variety of mobile as well as stationary devices. How can a single access authorization apply in cases where a user may need to connect to a variety of Windows, web-based or SaaS applications? How can the granting of access authorizations be subjected to policy restrictions and then tracked for review by management and by security officials?

The requirements for access authorization are:
1. End users must be able to gain access to web applications from a variety of devices, including rapidly changing mobile clients.
2. The enterprise security organization must be able to retain control over security at all times.
3. Users who need permission to access a wide range of applications, hosted on diverse services, must be able to receive access permission through a single sign-on process.

Permissions would apply to virtualized apps, to SaaS apps as well as to web applications, from any customer device, at any time and from anywhere.

To enable such simplified application access management there is now available a user-centric hosted management platform that centralizes IT control across the entire network. A cloud-based Access Authorization Service (AAS) will, for an annual fee per seat (estimated in the $20-$30 range), deliver a range of choices for managing the complexity of policy choices. AAS then provides the sign-on and security interfaces that will engage with a wide range of hardware and software platforms.
The AAS hosted service will enable DoD to centrally manage the provisioning, access and usage of applications. It will extend users’ enterprise identities to the public cloud, simplifying the security of application access. Users, even those with multiple devices, will each have a single login and simplified, self-service access to the organization’s application store. AAS can be deployed immediately without costly hardware or complex, time-consuming integration efforts because it operates as a commercial browser-accessible service. IT managers can now effectively address security risks.
AAS will:

1. Simplify access to private or public clouds as well as to Windows applications.
2. Materially reduce the number of credentials that have to be provided.
3. Scale down the administrative costs incurred by security personnel.
4. Increase enterprise security by relying on generally accepted standards for gaining access to all applications.
5. Streamline the login process while reducing the workspace needed for starting up applications.
6. Simplify user-activity reports while making it easier to monitor application usage.
7. Keep track of software licensing.
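
The sketch below illustrates, in hypothetical terms, what such delegation could look like from the application side. The endpoint, token format and response fields are invented for this example and do not describe a specific product. The application passes the user's single sign-on token and the name of the requested application to the hosted service, which applies policy, logs the request for audit and tracks license usage.

import json
import urllib.request

def user_may_launch(application_name: str, sso_token: str) -> bool:
    # Hypothetical hosted AAS endpoint; policy, auditing and license
    # tracking are handled centrally, not inside the application.
    req = urllib.request.Request(
        "https://aas.example.mil/v1/authorize",
        data=json.dumps({"application": application_name,
                         "token": sso_token}).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as response:
        decision = json.load(response)
    return decision.get("allowed", False)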

SUMMARY
For AAS to be applied universally throughout DoD will require standardization of the ways authentication and authorization instructions are exchanged between security domains. The current approach is often application-specific. Contractors make such exchanges proprietary through custom work. The migration towards AAS should use the approved Security Assertion Markup Language (SAML).

SAML is an OASIS-approved standard for exchanging authentication and authorization data between security domains. It is an XML-based protocol that uses security tokens containing assertions to pass information about a principal (usually an end user) between an identity provider and a web service. It enables web-based authentication, including single sign-on (SSO). An AAS that is fully compliant with SAML should be a mandatory service.
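
As a rough illustration of what travels between an identity provider and a web service, the sketch below hand-parses a simplified, hypothetical SAML assertion and reads the fields a service provider cares about: issuer, subject, audience and validity window. Real assertions are digitally signed, and a production service would verify the signature with a SAML library rather than parse the XML directly as done here.

import xml.etree.ElementTree as ET

# A simplified, unsigned assertion shown only to illustrate the structure;
# the issuer, subject and audience values are invented.
ASSERTION = """<saml:Assertion xmlns:saml="urn:oasis:names:tc:SAML:2.0:assertion">
  <saml:Issuer>https://idp.example.mil</saml:Issuer>
  <saml:Subject><saml:NameID>jane.doe</saml:NameID></saml:Subject>
  <saml:Conditions NotBefore="2012-02-01T12:00:00Z" NotOnOrAfter="2012-02-01T12:05:00Z">
    <saml:AudienceRestriction>
      <saml:Audience>https://app.example.mil</saml:Audience>
    </saml:AudienceRestriction>
  </saml:Conditions>
</saml:Assertion>"""

NS = {"saml": "urn:oasis:names:tc:SAML:2.0:assertion"}
root = ET.fromstring(ASSERTION)
issuer = root.findtext("saml:Issuer", namespaces=NS)
subject = root.findtext("saml:Subject/saml:NameID", namespaces=NS)
audience = root.findtext(".//saml:Audience", namespaces=NS)
conditions = root.find("saml:Conditions", NS)

# A service provider accepts the login only if the issuer is a trusted
# identity provider, the audience names this application, the current time
# falls inside the validity window and the signature verifies.
print(issuer, subject, audience,
      conditions.get("NotBefore"), conditions.get("NotOnOrAfter"))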

Getting Ready for Mobile Devices


According to a July 2011 report there were 82 million U.S. users of smart phones, increasing at the rate of 10% per quarter. (1) Google’s Android platform leads, with 42% market share. BlackBerry’s market share keeps declining and is now at 20%. Microsoft and Symbian saw their market shares decrease to 5.7 percent and 1.9 percent, respectively.

There has also been an enormous increase in the number of tablets, to over 100 million, as they have started replacing laptops. Spurring this development has been the rapid increase in the number of low-cost applications (Apps) that are instantly downloadable and rapidly deployable. Over 250,000 Android and 150,000 Apple Apps are now available. Thousands of individual developers have entered the marketplace to sell Apps through Google and Apple. Development platforms and application templates are readily available, which accelerates the rate at which special-purpose Apps become available.

Mobile devices are now priced as affordable consumer appliances. They offer attractive alternatives for delivering communication services to DoD personnel at a fraction of the current cost.
The introduction of mobile computing is disrupting DoD computing processes:

A rapidly changing environment requires the standardization of selected applications for special uses, such as submarine support, special force communications or coalition warfare deployment.
Off-the shelf applications can be easily acquired and installed by anyone.
Testing of applications, by prior users, makes the applications open source.
The diversity of applications makes it necessary for a user to access a variety of services.

We need a user-centric rather than production-centric computing environment. DoD has to start planning for ways in which diverse mobile devices will be able to extract and then display results obtained from a variety of applications.

One of the solutions is to impose, between the varieties of existing applications, an additional cloud-based layer that will mediate between customized mobile devices and the dispersed applications. Such a cloud layer, now available for about $30 per seat per year, offers single sign-on access to on-line web applications such as those offered by commercial firms such as Microsoft, Salesforce, Facebook, Adobe or Intuit, as well as DoD-authored Apps that offer military solutions.

SUMMARY
The rapid rate at which a variety of mobile devices are being installed is already swamping the capability of DoD computer operations to support military and civilian personnel needs. This situation is particularly acute for operations that need to be deployed rapidly. A complete overhaul in organizing system development is now in order. There is no question that DoD is now entering the “post-PC” era.

 1 http://www.inman.com/news/2011/08/30/number-smartphone-users-jumps-10

Switching to Server-Based Security



Gen. Alexander, head of the NSA and the U.S. Cyber Command, stated: “You’ve got to have an infrastructure that is defensible.” Consolidating the perimeter surface that needs to be defended into not more than twenty data farms would leave a fraction of the 2,094 Federal data centers and the 772 DoD-operated data centers. There are most likely thousands of additional stand-alone servers that also need to be protected.

The problem is that what Congress, OMB and the DoD CIOs track is the reduction in the number of data centers as a metric of success. That is an insufficient measure. Too many defenders at too many locations are required to protect government traffic. This headcount is neither affordable nor sufficiently trained to deal with the rising sophistication of attackers. There is not enough money to acquire all of the anti-intrusion equipment that is necessary. The funds for security protection licenses cannot be acquired either in the quantity or at the speed that matches the threats.

The stated prime objective of the latest DoD policy guidance is to reduce the number of data centers through consolidation. That is insufficient and possibly misleading. Instead, securing operations through the minimization of stand-alone servers and a reduction in operating manpower should be the top priority. Cost reduction, though important, should only be an outcome, once successful security improvement demonstrates that simplifying operations also delivers savings.

As mobile applications increase from thousands to millions the DoD environment must be reorganized to synchronize every portable device with its corresponding office desktop. In this way the existing emphasis on local security safeguards will have to shift from “personal computers” to a well-protected number of clouds that house millions of virtual computers. The “end of the PC era” will arrive.

In the next ten years it is unlikely that all user equipment will be converted to browser-based, disk-less devices lacking USB ports. There will still be a population running specialized applications, such as intelligence workstations, engineering design computers or devices that must sustain operations when detached from the DoD network. However, from a security standpoint all changes from such stations will have to pass through a thorough security gauntlet before they are reconnected to the DoD network.

SUMMARY
Consolidation into a standard and open source cloud computing environment will minimize the attack surface for an enemy. Fewer facilities will also result in lower costs because of fewer jobs and increased competition.

Cloud Computing is a Bad Idea?


According to Federal Computer Week, federal managers doubt cloud computing cost savings. 500 government professionals were surveyed, with the largest share (36 percent) in a managerial position. Most respondents were employed at defense or civilian agencies. (1)

The respondents were not sold on the promises of cloud computing as a long-term money saver. Only 12 percent said the cloud could potentially bring significant cost savings, while one-third said the cloud would have no impact on costs. Nearly 20 percent thought the cloud would result in some cost increase; 6 percent said a cloud migration would lead to a significant cost increase. This means that around 60% of the managers were negative or neutral about cloud computing and only 12% were clearly positive. These findings are at odds with available commercial cases and models that show greater than 50% cost reductions and a break-even achievable in less than two years.

Respondents said they were “somewhat familiar” with the declared “cloud-first” initiative. More than half said they have only partially complied with the directive, identifying just one or two cloud opportunities. Less than one-third said they have complied by identifying a strategy to move three or more services to the cloud.

It appears that defense and civilian agency executives report a largely negative attitude about the potential savings from cloud computing. There does not appear to be an incentive to view cloud computing as an opportunity. What supports this conclusion is unknown. What matters are the negative attitudes, which will surely inhibit further progress.

SUMMARY
The most plausible explanation for the negative attitude is the realization that the existing budgetary and management processes will block the achievement of gains, even if information technology could become available.

The top obstacle to progress toward cloud computing is the insufficient scale of operations. With IT budgets splintered into 5,600 line items, each with a commitment of several years of funding, most of the projects do not have the scale for consolidation into larger cloud aggregations. The problem of bringing over 70,000 servers and close to a hundred thousand separate applications under enterprise-wide virtualization constitutes a challenge that is unmanageable from the standpoint of those who are operating under current limitations.

DoD is attempting to deal with this issue by pursuing data center consolidation as its primary initiative to shift in favor of cloud computing. Though the elimination of fixed data center costs may deliver some savings, realizing higher server utilization through virtualization is unlikely to deliver large savings by itself. The cost savings are to be found primarily in operating costs, which are primarily labor costs, and not in the reduction of capital costs. Unless DoD simultaneously pursues the reductions available through the consolidation of applications, the new data centers will have to spend capital to house the identical diversity of applications that currently requires the support of a large staff of operating and maintenance personnel.

The most likely explanation for the largely negative attitude towards cloud computing is the executives’ own skepticism about how likely DoD is to alter present IT management practices.

  (1) http://fcw.com/articles/2012/02/01/feds-worry-about-cloud-security.aspx?s=fcwdaily_020212