When it comes to computers, invulnerability and complete reliability are outmoded notions in an increasingly networked world. Most insureds will lose the functionality of their computer networks several times during a given term of insurance coverage. Claims of physical damage often affect the functionality of the insured’s computer systems or network, and nearly every claim will have a CyberInsurance component (such as the determination of what property is “owned or used” by the insured, or the impact of a failure of a third party’s system).
The year 2002 brings a greater understanding of system survivability methods and technologies. Experts now incorporate survivability concepts into system design. New analytical techniques allow companies to identify vulnerabilities and to create a proper response plan. These self-examinations permit companies to remain dynamic and change with technology and the threat environment. Insurers can require that the CyberInsurance risks they underwrite incorporate these new survivability concepts.

The Internet allows users to reach virtually anyone, anywhere, anytime. Nearly every company uses services provided and data generated by e-businesses. But the Internet’s rapid growth is accompanied by a huge expansion of network services that often are not designed, configured or maintained securely. No company is immune from system failures and cyber-attacks on network intelligence (i.e., the data critical to the insured’s business). The proliferation of hacker tools makes it easier to exploit vulnerabilities in security systems and in the structure of the Internet. It is no longer a question of “if” but of “when” a system or network will be compromised.
Even companies that are not the source of a system failure or cyber-attack can be hurt when the failure affects the infrastructure of the Internet. For example, a “Denial of Service Attack” that brings down a company’s web site will almost certainly involve all other users of the same web host. Thousands of companies use the services of web hosts, Internet Service Providers, and Application Service Providers. When one of those providers is attacked, fails, or loses data, it can affect all user companies. Retrieval or correction of data may be impossible due to the sheer volume of records obtained per second. Provider problems can escalate the impact of information-related claims to levels experienced during a natural catastrophe. These risks increase with the interdependence of companies through the Internet.
Technology has changed the insurance business. There is new attention on the preservation of information and electronic assets deemed critical to the insured’s business. Insurers must make overall policy choices in response to legal problems created by new technologies, but a host of issues arise. Can policy language be standardized in today’s markets? Can technical and legal experts assist in developing policy language? Should CyberInsurance policy language be integrated into the overall insurance policy, or remain a separate section or endorsement with its own premium, risk analysis, coverages and exclusions? Does insuring property “owned or used” by the insured extend to Internet infrastructure applications? How can a policy adapt to post-issuance changes in the insured’s technology? Is insuring the insured’s middleware or customizations more akin to a performance bond than to property insurance? If the insured’s middleware or customizations are not designed to reinstall upon an outage, is that an uninsured design defect? Some courts equate a policyholder’s loss of system functionality with covered “physical damage.” For example, in American Guarantee & Liability Ins. Co. v. Ingram Micro, Inc., 2000 U.S. Dist. LEXIS 7299 (D. Ariz. 2000), the court ruled that although the insured’s mainframe computer remained physically intact, its loss of functionality due to the lack of system survivability safeguards constituted covered “physical damage.” In essence, the court made the insurer the guarantor of the design and implementation of the policyholder’s computer customization and middleware.
Technically and legally sound axioms formulated specifically for insurance issues can help level the playing field. One axiom that has explained and obviated many CyberInsurance claims is the proposition that Y2K non-compliant software is an intentionally designed program acting in accordance with its technical specifications. This axiom has withstood challenge because it has sound technical and legal foundations. Computer hardware and software act predictably, and the understanding that such predictable action is within the design specifications can prevent payment of improper claims. American Guarantee & Liability Ins. Co. v. Ingram Micro, Inc. illustrates the need for consistent application of this axiom to intentional designs in software. There, the insurer did not emphasize the “intentional design” aspect: that the computer’s reversion to default settings after a power outage was inherent in the design of the hardware and software. As a result, the court chose to incorporate a “loss of function” into the definition of “property damage.”
The CyberInsurance Institute, a nonprofit organization founded, in part, through White and Williams LLP, has found that two other nonprofit organizations are at the forefront of e-business insurance standards and analytical tools. RAND® developed the concept of the “Minimum Essential Information Infrastructure” (MEII) to protect systems and information deemed essential to an insured’s operation. CERT® developed OCTAVE® to provide a framework for evaluating the threats, vulnerabilities and points of failure of an insured’s system, and for allowing the insured and its insurer to understand the interconnections between a failure and its consequences. (You can learn more about OCTAVE® at www.cert.org.) For insureds and insurers, the survivability and design standards enunciated by these leading-edge technical organizations can go a long way toward the formation of legal standards to guide decisions in the CyberInsurance area.