
Tracking Security Readiness Across Quarters and Years

A security program is only as good as what you can measure. Here's how to track readiness over time so leadership can see real progress.

By P23 Security · 2026 · Serving Southwest Florida, Fort Myers, Cape Coral + more

If you cannot measure it, leadership will stop investing in it.

Security programs lose leadership attention quietly. Not because leaders decide security is unimportant. Because, over time, there is no visible evidence that the investment is producing a return. Budgets shift. Staff focus elsewhere. The next item on the agenda crowds out the one that never produces a clear update.

The counter to this drift is measurement. Not elaborate dashboards. Simple, consistent tracking of the things that matter most, reviewed on a rhythm that matches leadership’s cadence. Tracking does not make a program better. Tracking makes it visible, and visibility is what sustains investment.

The four metric categories that matter.

For most small and mid-size organizations in Southwest Florida, four categories of metrics cover the program:

1. Completion metrics

The simplest category. What the program has done over a given period.

  • Tabletop exercises completed in the quarter
  • Live drills conducted in the quarter
  • Training sessions delivered
  • Walkthroughs conducted by internal or external advisors
  • Policy reviews completed
  • Board or leadership updates delivered

Completion metrics are the easiest to track and the easiest to game. Tracking them alone is not sufficient. Tracking them in combination with the other categories is valuable.

2. Personnel metrics

Who has been trained, certified, and prepared.

  • Percent of staff current on required training
  • Percent of volunteers current on required training
  • Number of individuals certified in Stop the Bleed or trained in CPR/AED
  • Internal trainers active, by capability
  • Background check currency (percent of personnel with current screening)
  • Access credential audit status (percent of credentials verified in last review)

Personnel metrics are particularly important for organizations serving vulnerable populations, where training currency affects legal and insurance standing.
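
Training currency reduces to a date comparison over personnel records. A minimal sketch in Python (the record structure, names, and dates are illustrative, not P23's tooling):

```python
from datetime import date

def training_currency(records, today=None):
    """Percent of personnel whose required training is current.

    `records` maps a person to the expiry date of their required
    training. Structure and names here are illustrative.
    """
    today = today or date.today()
    if not records:
        return 0.0
    current = sum(1 for expiry in records.values() if expiry >= today)
    return round(100 * current / len(records), 1)

staff = {
    "A. Rivera": date(2026, 8, 1),
    "B. Chen": date(2025, 11, 15),   # lapsed
    "C. Okafor": date(2026, 3, 30),
}
print(training_currency(staff, today=date(2026, 1, 1)))  # → 66.7
```

The same function works for staff, volunteers, or background-check currency; only the input records change.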

3. Finding and closure metrics

The heart of program improvement.

  • New findings identified in the period (from audits, drills, AARs, walkthroughs)
  • Findings closed in the period
  • Findings currently open, by priority (30-day, 60-day, 90-day)
  • Findings open past their target timeline
  • Cumulative trend: new vs closed over multiple quarters

A healthy program has a high closure rate on findings and a declining rate of new findings over time as the program matures. Both trends are visible only through consistent tracking.

4. Performance metrics

What the program can actually do.

  • Drill response times (evacuation, lockdown, medical response)
  • Protocol adherence rate during drills
  • Debrief feedback scores from participants
  • Percent of drill objectives met
  • After-action follow-up completion rate

Performance metrics are harder to measure but produce the most meaningful trend data. A program whose drill response times are declining quarter over quarter is a program that is genuinely improving.
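
The quarter-over-quarter trend can be read off a short series of drill times. A sketch, assuming lower response times are better and treating changes within 5% as stable (both thresholds are assumptions, not a P23 standard):

```python
def trend(values):
    """Classify a quarterly series as improving, stable, or declining.

    For response times, lower is better, so a downward series counts
    as improving. The 5% stability band is an illustrative choice.
    """
    if len(values) < 2:
        return "insufficient data"
    change = values[-1] - values[0]
    if abs(change) / values[0] < 0.05:   # within 5%: call it stable
        return "stable"
    return "improving" if change < 0 else "declining"

# Evacuation drill times in seconds, one per quarter
print(trend([210, 195, 180, 165]))  # → improving
```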

A mature P23 client program typically tracks 10 to 15 metrics, covering all four categories without overwhelming leadership review capacity (P23 program tracking methodology).

What not to measure.

Not everything deserves a metric. Metrics that do not drive action are vanity metrics. They feel productive to track and waste the time of everyone involved.

Avoid:

  • Counts of items that cannot be influenced (number of days since last incident is not useful for most organizations)
  • Feelings or vague assessments (percent of staff who “feel safe”) without specific follow-up
  • Metrics that invite gaming (number of security posts patrolled, when patrol is not the right practice)
  • Complex composite scores that obscure the underlying reality

The principle: every metric should drive a question that has an actionable answer.

The quarterly rhythm.

Most small and mid-size organizations benefit from a quarterly readiness review, sized to leadership attention.

The quarterly review agenda

  • Review completion metrics (what happened this quarter)
  • Review personnel metrics (training currency and credential status)
  • Review findings (new and closed this quarter)
  • Review open findings past their target timeline
  • Review any performance data from drills in the quarter
  • Identify focus areas for the coming quarter
  • Confirm scheduled drills, training, and policy reviews for the coming quarter

A well-run quarterly review runs 45 to 90 minutes. The update goes to operational leadership, with a summary for any board or trustee body.

The annual view.

Once per year, the metrics should be rolled up into a longer view that tells the story of the program’s evolution.

The annual readiness report:

  • Shows year-over-year comparison of all core metrics
  • Identifies trends (improving, stable, declining)
  • Tells the narrative of the program’s development
  • Sets priorities for the coming year
  • Becomes the document presented to boards, trustees, or major donors

The annual report is one of the most powerful tools for sustaining leadership investment in the program. It demonstrates, concretely, what the organization has built and how it is getting better.

The Hurricane Ian stress test for metrics.

After Hurricane Ian in 2022, organizations that had been tracking readiness metrics had a specific advantage. They could point to evidence. “We trained our staff in severe weather protocols in Q2. We tabletop-rehearsed hurricane response in Q3. Our decision-making during the storm reflected that preparation.”

Organizations that had invested in security but had not tracked were in a harder position. They knew they had done things. They could not describe those things in ways that communicated effectively to boards, donors, or insurers.

The lesson: the metrics are for communication as much as for internal improvement. Leadership, boards, and external stakeholders rely on the documented record to understand the state of the program. Without that record, all sides are reasoning from vibes.

Stewardship scales with what has been entrusted. Organizations entrusted with the care of children, vulnerable adults, congregants, or donors carry a high standard of care. Tracking is one of the ways that care becomes visible, accountable, and sustainable.

Building tracking from scratch.

For organizations that have not been tracking readiness, a simple starting point:

  • Pick three metrics, one from each of completion, personnel, and findings
  • Set up a simple spreadsheet or document to record them quarterly
  • Assign an owner (the safety team lead, executive director, or fDoS advisor)
  • Schedule the first quarterly review
  • Review the usefulness of the metrics after two quarters; adjust
  • Add performance metrics in year two as drill and training data accumulates
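
The "simple spreadsheet" step can be as plain as an append-only CSV with one row per quarter. A sketch (the file name and metric columns are assumptions chosen for illustration):

```python
import csv
from pathlib import Path

# Illustrative starter tracker: one row per quarter, one column per
# metric. Three metrics to start, one from each category.
TRACKER = Path("readiness_tracker.csv")
FIELDS = ["quarter", "tabletops_completed",
          "staff_training_current_pct", "findings_open"]

def record_quarter(row):
    """Append one quarter's metrics, writing the header on first use."""
    new_file = not TRACKER.exists()
    with TRACKER.open("a", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=FIELDS)
        if new_file:
            writer.writeheader()
        writer.writerow(row)

record_quarter({"quarter": "2026-Q1", "tabletops_completed": 1,
                "staff_training_current_pct": 82.0, "findings_open": 4})
```

A file like this is enough to run the first few quarterly reviews; a dashboard can come later, if ever.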

The program will not feel mature for at least a year. The compounding value becomes clear by year two and substantial by year three.

The fDoS advantage.

For clients on ongoing fDoS engagements, readiness tracking is built into the monthly rhythm. The monthly memos, quarterly reviews, and annual audits all contribute data points. The advisor owns the tracking discipline, produces the reports, and presents the metrics to leadership.

Organizations without fDoS can still track effectively, provided someone internally takes explicit ownership. Without an owner, tracking tends to degrade over time, just like any unclaimed responsibility.

The metrics that produce sustained investment.

A pattern we have observed consistently across clients in Southwest Florida: organizations that track readiness metrics and present them to boards on a regular rhythm continue to invest in their security programs over time. Organizations that do not track tend to see budget attention drift away. Not because the programs become less valuable. Because the value becomes invisible.

Measurement produces visibility. Visibility sustains investment. Investment sustains the program. The loop begins with the decision to track something.

If your organization in Fort Myers, Cape Coral, Naples, or Port Charlotte wants to build the tracking discipline that will sustain your security program over years, we would be glad to help design the metrics that fit your specific context. The discipline is learnable. The payoff is durable.

Serving Southwest Florida · Fort Myers · Cape Coral · Naples · Port Charlotte

Ready when you are

Practice the scenario. Close the gap.

A structured tabletop or drill, facilitated for your team, with a written after-action that turns practice into change.

Schedule a tabletop