Handbook of Information Security Management: Risk Management and Business Continuity Planning



Credibility of the Numbers

Twenty years ago, the task of coming up with “credible” numbers for information asset valuation, threat frequency and impact distributions, and other related risk factors was daunting. Since then, the GIV has been published, and some automated tools have made significant progress in how they handle the numbers and their associated knowledge bases. Knowledge bases developed on the basis of substantial research do establish credible numbers, and when proven algorithms are used to calculate illustrative risk models, the results are credible as well.
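To make the idea of an “illustrative risk model” concrete, the sketch below shows the kind of calculation such tools automate: annualized loss expectancy (ALE), derived from asset value, an exposure factor, and an annualized rate of occurrence. The figures used are invented for illustration only.

    # Minimal sketch (illustrative figures, not from the handbook):
    # ALE = single loss expectancy (SLE) x annualized rate of occurrence (ARO),
    # where SLE = asset value x exposure factor.

    def single_loss_expectancy(asset_value: float, exposure_factor: float) -> float:
        """Expected loss from one occurrence of the threat (exposure_factor in 0..1)."""
        return asset_value * exposure_factor

    def annualized_loss_expectancy(asset_value: float,
                                   exposure_factor: float,
                                   annual_rate_of_occurrence: float) -> float:
        """Expected annual loss from a given threat against a given asset."""
        return single_loss_expectancy(asset_value, exposure_factor) * annual_rate_of_occurrence

    # Example: a $2,000,000 asset, 40% exposure per incident, expected once every 5 years.
    print(annualized_loss_expectancy(2_000_000, 0.40, 1 / 5))  # 160000.0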

However, manual approaches or automated tools that require the users to develop the necessary quantitative data are susceptible to a much greater degree of subjectivity and poorly informed assumptions. In the past couple of years, there have been some exploratory efforts to establish a Threat Research Center tasked with researching and establishing:

1.  A standard information security threat population,
2.  Associated threat frequency data, and
3.  Associated threat scenario and impact data;

and maintaining that information while assuring sanitized source channels that protect the providers of impact and scenario information from disclosure. As recognition of the need for strong information security and associated risk assessment continues to grow, the pressure to launch this function will eventually succeed.
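One way to picture the three categories of data such a center would maintain is a standardized threat record that ties a threat-population entry to its frequency, scenario, and impact data. The record format and values below are hypothetical, not a published standard.

    # Hypothetical sketch of a standardized threat record; field names and
    # values are illustrative only.
    from dataclasses import dataclass

    @dataclass
    class ThreatRecord:
        threat_id: str            # entry in the standard threat population
        name: str
        annual_frequency: float   # expected occurrences per year (frequency data)
        scenario: str             # narrative of how the threat materializes
        impact_low: float         # impact distribution bounds, in currency units
        impact_high: float

    fire = ThreatRecord(
        threat_id="PHY-001",
        name="Data center fire",
        annual_frequency=0.02,    # roughly once in 50 years
        scenario="Electrical fault ignites cabling; suppression limits damage to one room.",
        impact_low=250_000.0,
        impact_high=4_000_000.0,
    )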

Subjectivity

The ideal in any analysis or assessment is complete objectivity. Just as there is a complete spectrum from qualitative to quantitative, there is a spectrum from subjective to increasingly objective. As more of the elements of risk are expressed in independently objective terms, the degree of subjectivity is reduced accordingly, and the results will have demonstrable credibility.

Conversely, to the extent a methodology depends on opinion, point of view, bias, or ignorance (subjectivity), the results will be of increasingly questionable utility. Management is loath to make budgetary decisions based on risk metrics that express value and risk in terms such as low, medium, and high.

There will always be some degree of subjectivity in assessing risks. However, to the extent that subjectivity is minimized by the use of independently objective metrics, and the biases of tool developers, analysts, and knowledgeable participants are screened, reasonably objective, credible risk modeling is achievable.

Utility of Results

Ultimately, each of the above factors (Diversion of Resources, Credibility of the Numbers, Subjectivity, and, in addition, Timeliness) plays a role in establishing the utility of the results. Utility is often a matter of perception. If management feels that the execution of a risk assessment is diverting resources from their primary mission inappropriately, if the numbers are not credible, if the level of subjectivity exceeds an often intangible cultural threshold for the organization, or if the project simply takes so long that the results are no longer timely, then the attention and trust of management will be lost or reduced along with the utility of the results.

A risk assessment executed with the support of contemporary automated tools can be completed in a matter of weeks, not months. Developers of the best automated tools have done significant research into the qualitative elements of good control, and their qualitative vulnerability assessment knowledge bases reflect that fact. The same is true with regard to their quantitative elements. Finally, in building these tools to support quantitative risk assessment, successful efforts have been made to minimize the work necessary to execute a quantitative risk assessment.

The bottom line is that it makes very little sense to execute a risk assessment manually or build one’s own automated tool except in the most extraordinary circumstances. A risk assessment project that requires many work-months to complete manually — with virtually no practical “what-if” capability — can, with sound automated tools, be done in a matter of days, or weeks at worst, with credible, useful results.
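The “what-if” capability mentioned above amounts to recomputing the risk model under changed assumptions. The sketch below, which reuses the ALE calculation shown earlier, varies a hypothetical safeguard’s effect on occurrence rate and weighs the reduction in expected loss against the safeguard’s cost; all figures are invented.

    # Illustrative "what-if" sketch: recompute annualized loss expectancy under a
    # proposed safeguard, then compare the reduction against the safeguard's cost.

    def ale(asset_value: float, exposure_factor: float, aro: float) -> float:
        return asset_value * exposure_factor * aro

    baseline = ale(asset_value=2_000_000, exposure_factor=0.40, aro=0.20)

    # What if a proposed control cuts the annual rate of occurrence from 0.20 to 0.05?
    with_control = ale(asset_value=2_000_000, exposure_factor=0.40, aro=0.05)

    annual_safeguard_cost = 60_000
    net_benefit = (baseline - with_control) - annual_safeguard_cost
    print(baseline, with_control, net_benefit)  # 160000.0 40000.0 60000.0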

TASKS OF RISK ASSESSMENT

In this section, we will explore the classic tasks of risk assessment and the key issues associated with each task, regardless of the specific approach to be employed. The focus will be primarily on quantitative methodologies; however, wherever possible, related issues in qualitative methodologies will also be discussed.

Project Sizing

In virtually all project methodologies there are a number of elements to be addressed to ensure that all participants, and the target audience, understand and are in agreement about the project. These elements include:

  Background
  Purpose
  Scope
  Constraints
  Objective
  Responsibilities
  Approach

In most cases, it is not necessary to discuss these elements individually, as most are well-understood aspects of project methodology in general. They are mentioned here for the exclusive purpose of pointing out the importance of (1) ensuring that there is agreement between the target audience and those responsible for executing the risk assessment, and (2) describing the constraints on a risk assessment project. While a description of the scope — what is included — of a risk assessment project is important, it is equally important to describe specifically, in appropriate terms, what is not included. Typically, a risk assessment is focused on a subset of the organization’s information assets and control functions. If what is not to be included is not identified, confusion and misunderstanding about the risk assessment’s ramifications may result.
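One simple way to keep scope and exclusions explicit is to record the project sizing elements listed above in a single structured statement. The sketch below is hypothetical; the field names follow the element list, and the values are invented for illustration.

    # Hypothetical project sizing record; scope exclusions are stated explicitly
    # so nothing is left to assumption. Values are illustrative only.
    project_sizing = {
        "background": "Annual review required by corporate information security policy.",
        "purpose": "Quantify risk to the order-processing environment.",
        "scope": ["Order-processing applications", "Supporting database servers"],
        "not_in_scope": ["Corporate e-mail", "Development and test environments"],
        "constraints": ["Complete within six weeks", "No interviews during month-end close"],
        "objective": "Produce a prioritized, cost-justified list of safeguards.",
        "responsibilities": {"sponsor": "CIO", "analyst": "Information security team"},
        "approach": "Quantitative assessment supported by an automated tool.",
    }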

Again, the most important point about the project sizing task is to ensure that the project is clearly defined and that a clear understanding of the project by all parties is achieved.





