Four Simple Steps to Evaluate Critical Incident Response Tools

Imagine you arrive at your regional cardiac hospital for an emergency double bypass, and the surgeon greets you with exciting news:

“We’ve got an amazing new medical instrument and procedure we’ll be using for your surgery. It’s never been used on an actual patient before, and it was designed by automobile mechanics, but the salesman says it’s an outstanding tool! And the marketing material looks great!”

Clearly, you’d be in the market for a new surgeon and hospital.

That imaginary scenario can’t happen, because new medical devices must be cleared or approved by the FDA before they can legally be sold in the United States. In fact, a new medical device in the US typically takes 3 to 7 years to go from concept to approval.

Whether in medicine, aviation, or the automobile industry, there are many layers of protection in place to ensure that when a product comes to market, especially one that can have life-or-death consequences, it has been vetted and tested in the real world to determine its safety and effectiveness.

There is one fast-growing industry, however, where these checks and balances don’t exist: the safety and security industry surrounding active killer and critical incident response.

The school safety vertical alone is expected to see $2.8 billion in expenditures by 2021. There are surely many well-meaning companies trying to contribute solutions that mitigate the ever-increasing risk of school violence and active killer incidents. But as in any market with strong demand, there are also companies scrambling into the space simply to grab a piece of that $2.8 billion.

The problem is that there is little information or guidance for evaluating the effectiveness of any of these solutions. This leaves school administrators, facilities managers, and others responsible for safety with little data on which to base one of the most important decisions they’ll make: how to protect their students, staff, and those for whom they’re responsible.

The explanation for the lack of data is simple, yet troubling: the vast majority of solutions being purchased today have never been tested in a real-world event or environment.

You Can’t Solve a Problem You Don’t Understand

In order to solve any problem, you have to first understand it. Many of the unproven safety and response solutions available today are not designed by (or in consultation with) public safety professionals, industry thought leaders, or those with experience responding to and managing critical incidents.

Instead, they’re designed by software engineers, MBAs, or other intelligent and well-meaning people who are trying to solve a problem they simply don’t understand.

At the end of the day, these people commit classic entrepreneurial errors: they either create a solution for a problem that doesn’t exist, or bring to market a “solution” that actually makes the problem worse.

Common Operating Picture Isn’t Plural

When you survey the critical incident response solution landscape, you’ll see a lot of companies throw around the phrase “common operating picture” claiming that their product provides one.

A common operating picture is widely recognized as the solution to long-standing problems with communication and situational awareness that hamper response and life-saving efforts during mass killings and other critical incidents.

But the fact is, you can only have one common operating picture. If five different responders from five different agencies are using five different “common operating pictures,” there is nothing common about them: they’re all on different pages.

The reality of the market is that the common operating picture that will be universally adopted will be the one that public safety professionals agree to use and embrace as a group. Only when thousands of responders from a myriad of agencies across the United States are using the same picture can it truly become America’s common operating picture.

Different Apps, Common Content

Using the same picture doesn’t mean that all first responders have to use the same app or display system; it just means they have to use the same content. Many of the solutions being sold today are smartphone applications purporting to give first responders all kinds of critical information. On the surface, that seems like a good idea. After all, real-time intelligence, situational awareness, and geographic information about a location are important to successfully resolving any critical incident.

The proliferation of narrow-purpose apps, however, ends up being an ironic metaphor for the major challenge that surfaces in virtually every school shooting and active killer incident: different people with different tools that don’t work together.

Public safety agencies can’t be expected to operate in an environment where the school has one safety app, the mall has another, and the office complex has yet another. Such technology saturation makes it impossible for law enforcement to carry, and remain proficient with, countless apps.

Potentially hundreds of first responders from many different agencies descend upon active killer scenes with little knowledge of the location. They use different radio frequencies, different nomenclature, different tactics, and different apps. The lack of a common operating picture results in them responding as uncoordinated individuals instead of a coordinated team.

When we embrace common content, however, any app first responders have that can display that common content becomes a tool that provides the common operating picture and overcomes these deficiencies.

Evaluating the Options

With a multi-billion dollar market saturated with purported solutions, how is an administrator, school board, or anyone responsible for public or facility safety supposed to evaluate available solutions? It turns out that answering just four critical questions will quickly bring into focus whether or not the solution you’re evaluating is a good choice:

  • Who designed it?
  • Who uses it?
  • Has it been used in the real world?
  • Who endorses it?

Who Designed It?

Remember, to solve a problem you first have to understand it. Did the people who designed the solution you’re evaluating manage critical incidents? Do they have public safety experience? Have they actually operated during a critical incident under stress and risk of personal harm?

There are incredibly intelligent and creative product designers out there, but if you’re relying on a college-aged engineer whose experience with coordinating assets during critical incidents is limited to Fortnite or Call of Duty, you’re setting yourself up for a spectacular failure.

Solutions have to help reduce the complexity of an emergency response, not introduce cool ideas that make sense on a drawing board but are senseless during the unforgiving realities of crisis response.

The common operating picture you settle on is going to be used by police officers and other first responders when bullets are flying, victims are wounded, and seconds count. It must be developed by people who have experience operating in that environment.

Who Uses It?

Great content is rendered utterly useless when first responders arriving at a scene don’t have access to it. Does the app or platform your local police department uses allow them to display the common operating picture content?

Most crisis responses will be multi-jurisdictional and multi-disciplinary. Many times, a police officer from an adjoining town will be closer to a problem than the agency actually responsible for the location. How is that officer connected to the disparate systems that exist? How do they coordinate from a common operating picture?

Is it part of their training and policy? Can they access the content from their in-car computers, document management systems, and CAD/RMS systems?

If not, your well-intentioned expenditure will be nothing more than a virtual paperweight during a critical incident.

Has It Been Used in the Real World?

This is perhaps the most critical evaluation criterion. Would you put your family on a brand-new commercial aircraft that was designed and tested entirely on a computer and had never once actually flown in the real world? Of course you wouldn’t.

Why, then, would you entrust the safety of your school children, customers, employees, or the public to a theoretical solution that is unproven and untested in the real world?

Do Public Safety Organizations Endorse It?

This question is often overlooked but is incredibly relevant. Are there agencies responsible for public safety that designate the content as a best practice and encourage its adoption? If so, that gives a strong indication that thought leaders and senior executives in the realm of public safety have already assessed the solution with criteria similar to these and have concluded that it is a viable, real-world solution.

Checking the Boxes with Collaborative Response Graphics®

When we use the four criteria listed above to evaluate CRGs®, it is easy to see why they’ve been adopted as America’s common operating picture.

Design

CRGs were designed by people who come from the market they serve. Members of the team used the United States military’s equivalent, the GRG (Gridded Reference Graphic), during thousands of real operations conducted against terrorist enemies overseas. The GRG was, and remains, the common content our nation’s finest fighting forces use to provide a common operating picture during operations abroad.

At the same time, members of the team with a background in public safety have served domestically as both responders and commanders during multi-jurisdictional responses to critical incidents. They understand the unique challenges responders face with respect to communication, situational awareness, and the need for an accessible and understandable common operating picture.

Those valuable lessons were critical ingredients in the design and development of CRGs.

Users

CRGs are used by thousands of public safety agencies and professionals across America. CRGs are truly common content. They are ingested and displayed by a vast number of smartphone apps, CAD/RMS systems, geospatial systems, and document delivery platforms across the United States.

Real-World Use

CRGs are the only content that has been used in thousands of real operations overseas as well as during real events and incidents domestically.

They are not conceptual or theoretical. Their adoption does not require faith in a theory or an intention; they are known and proven to be effective.

Endorsement and Adoption

CRGs are recognized as a best practice by state chiefs of police associations across the United States. They were recognized by the New Jersey Office of Homeland Security and Preparedness and the New Jersey State Police as a protective measures best practice. They form the foundation of New Jersey’s statewide mapping initiative to protect schools and critical infrastructure.

CRGs enhance response time and improve command and control during an incident. They have been validated by thousands of real-world incidents under the most stressful conditions and are deployed across the United States and internationally to protect schools, businesses, hospitals, and other critical infrastructure.

They ultimately enable first responders from anywhere in America to arrive at a location to which they’ve never been, instantly orient themselves, communicate location-based information, and work together to save lives.

Conclusion

When you use the four-step evaluation, it’s easy to see why CRGs have been widely adopted as America’s common operating picture.

It’s also easy to see why so many of the available “solutions” on the market are destined to fail when they’re needed most: they represent untested and unproven concepts designed by people who don’t understand the problem.
