Showing posts with label security metrics. Show all posts

Saturday, April 04, 2009

Boiling the O.C.E.A.N.

Metrics projects that are intended to consolidate and report on the state of security for an organization rarely fail for a lack of measures. Information technology systems, processes and projects all throw off an impressive amount of data that can be captured and counted. The Complete Guide to Security and Privacy Metrics suggests over 900 metrics, and NIST Special Publication (SP) 800-55 Rev. 1, Performance Measurement Guide for Information Security extends this analysis from the system level to an executive view by providing a framework for summarizing the results.

So given all of the measures, structure and guidance available, why is it so tough to be successful? The silent killer in this space is often a lack of focus: too many metrics, too much aggregation, and too little analysis connected to business problems and goals to provide useful insight.

Instead, it's better to start with the stakeholders and focus on fully understanding their goals and decisions, without limiting the conversation with assumptions about what is or isn't going to be measurable.

Consider this subset of stakeholders, and some of their goals:

  • Executive management – financial health and strategic direction of the organization. Are we profitable, and are we executing effectively in the markets we serve?

  • Risk governance / security management – are we keeping risk at an acceptable level? Are we making the best use of the security resources we have?

  • Line management – are we achieving operational goals and aligning with strategic initiatives?
These questions become an effective filter for removing the measures that don’t matter, and for finding common measures that, with analysis, can serve many different purposes. Here’s where it may be useful to classify measures from a stakeholder perspective in terms of the types of decisions that they enable:
  • Output measures – what is the primary deliverable from a given team?

  • Coverage measures – how many locations, systems or groups are covered by a given process or policy?

  • Exposure measures – what proportion of the environment stores or processes regulated information?

  • Activity measures – how many requests have been received during a given reporting period? How many have been addressed?

  • Null measures – which teams have not provided data?
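As a rough sketch of how these categories might be computed in practice — the team names, field names and numbers below are invented for illustration, not drawn from any real program — a single per-team data set can feed several of them at once, including the null measure:

```python
# Hypothetical per-team reporting data; all names and values are invented.
reports = {
    "network": {"systems": 120, "covered": 96,  "regulated": 30, "requests": 45,  "addressed": 40},
    "desktop": {"systems": 800, "covered": 640, "regulated": 80, "requests": 210, "addressed": 180},
    "appdev":  None,  # no data submitted this period
}

all_teams = ["network", "desktop", "appdev", "datacenter"]

# Null measure: which teams have not provided data?
null_teams = [t for t in all_teams if reports.get(t) is None]

# Coverage, exposure and activity, computed only for teams that reported.
for team, r in reports.items():
    if r is None:
        continue
    coverage = r["covered"] / r["systems"]     # coverage measure
    exposure = r["regulated"] / r["systems"]   # exposure measure
    activity = r["addressed"] / r["requests"]  # activity: addressed vs. received
    print(f"{team}: coverage={coverage:.0%} exposure={exposure:.0%} addressed={activity:.0%}")

print("no data from:", null_teams)
```

The point of the sketch is that the null measure falls out of the same data set as the others: the teams that are silent are just as visible as the teams that report.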

The last category is an important one, as it highlights the difference between a measure and a metric. A measure is an observation that increases your understanding about a situation and improves the quality of decision making; a metric is a standardized measurement. Inconsistent, incomplete and missing data from key teams or groups are an important measure of program maturity. Sometimes it’s what you can’t count that counts.

Above all else, resist the pressure to measure everything. A few well-chosen measures will allow for versatile and powerful analysis. There are literally dozens of ways to analyze and present a limited number of well-chosen data points. And when captured consistently over time, the correlations between seemingly unrelated activities offer the opportunity to surprise.

Thursday, January 01, 2009

Twitter Security

A few weeks ago I decided to give Twitter a try, following some friends and colleagues scattered throughout the Midwest. Like sets of data points on a time-series plot, it's amazing to see patterns develop 140 characters at a time.

As with most things that are new, cool or interesting, I wondered if there was a practical way to translate the things that make twitter ‘work’ into something useful at the office.

A few months ago I put together a one page summary of key metrics my project team had gathered and sent it to a number of stakeholders throughout the organization. The response was decent, but not as strong as I’d hoped. As nice as it would be for facts to flow like electrical current throughout an organization, powering change, I needed to put a lot of follow-on effort into making sure the themes of the report registered with decision makers.

As an experiment in communications, I wanted to see if the size and frequency of the message could make the change process any easier. I decided to "twitter" a single metric from a follow-on project to see if I could make a bigger impact by dialing down the content but increasing the frequency. To start, I sent a four-line email that put the metric in context along with a recommended organizational response. So far, the hit rate is up.

Not every security metric or message reduces to one or two sentences. But for those that do, sharing status, concerns and recommendations in a "blackberry friendly" format seems to increase the likelihood that it'll get read and re-sent, gaining momentum throughout the organization.

Sunday, June 29, 2008

SPREAD OUT!

If you’ve ever seen rec-level youth soccer led by volunteer coaches I’m sure you’re familiar with this scene: a knot of kids surrounding the ball in a swarm, kicking furiously with parents cheering on. Eventually one or both of the coaches shouts “spread out!!” Usually it’s at the same moment that the ball escapes the swarm, spurring a mad dash to form a new swarm…

After a few years of this, as a youth coach I finally promised myself I’d never use that phrase again. Besides the fact that it never works, there are a couple of other issues with it:

  • It’s an instruction without accountability: no player can accomplish it on her own.

  • You can do exactly what is asked without having any impact on helping your team win. In fact, during one of my games it went the other way: our defense parted like the Red Sea and opened shooting lanes for the other team. Ouch!

Instead, I prefer a different phrase that's just as short and to the point:

GET OPEN!

Sure, it's still an instruction delivered to the whole team, but it enables accountability in a positive sense. You can identify and praise the kids who do it, and follow up with those who didn't hear or understand what to do. And when kids recognize and respond, it helps the team get more shots and, who knows, even score on occasion. As an added bonus I started counting the number of passes made by the team during each quarter. (Hawthorne was right … measurement motivates!)

Connecting this back to information security, the key takeaway is that it's possible, even with distributed virtual teams, to develop the capacity to adjust to unforeseen obstacles without building in excessive communication and coordination overhead. But efficient teams aren't necessarily those with the highest level of security domain knowledge (CISSP, GIAC, etc.). Sure, those skills are as critical as the soccer equivalents of dribbling and shooting – but good things really start to happen when security processes collectively orient themselves around meaningful measures.

Clear goals – decomposed into individually achievable contributions – measured with simple, easy-to-gather data – and reported internally and externally to both team members and stakeholders are the key to preventing knots and swarms.