OK, consider this my Christmas wish…
There are many parts of the world where there isn’t enough infrastructure to make existing CAP-based warning systems feasible. And yet those are some of the areas where wide-area, low-cost alert and warning is most needed.
At the same time, one of the most neglected bits of the radio spectrum these days is the “medium wave” band below the commercial AM channels. Long the home of 1930s-style non-directional beacons and dwindling Morse code (CW) ship-to-shore communications, it’s also the home of a smattering of ham radio experimentation, various Differential GPS broadcasts (mostly around 300 kHz) and the global NAVTEX weather text system at 518 kHz (and, secondarily, 490 kHz).
Unlike HF radio (“shortwave”), signals at these frequencies cling to the Earth’s surface, a phenomenon called “groundwave propagation.” Unlike VHF transmissions such as those used by FM broadcasters and the NOAA Weather Radio system, these transmissions can be heard far beyond the horizon from the transmitter site. Based on NAVTEX and earlier marine CW experience, the International Maritime Organization reports that a transmitter on those frequencies can generally cover a radius of 300-500 miles reliably (over seawater or moist earth) with a very modest 1 kilowatt of output in the daytime and as little as 400 watts at night. (One key to this level of performance is avoiding interference by assigning designated time slots to individual NAVTEX transmitters.)
For much of the 20th century 500 kHz was the primary global calling frequency for ships at sea. However, its use has been phased out in favor of other systems, mostly using HF and VHF radio and satellites. While a few preservationists continue occasional CW operations on 500 kHz from restored transmitter sites, the frequency is essentially inactive worldwide.
Which leads me to suggest that it would be both prudent and historically consistent to designate 500 kHz as a worldwide clear channel for wide-area broadcasts of warning messages regarding weather, tsunamis, volcanoes and other hazards.
I believe we have all the technologies required to make this work. The Common Alerting Protocol can, under ITU-T Recommendation X.1303, be transmitted in a compressed ASN.1 format suitable for narrowband media. Techniques such as differential phase-shift keying (DPSK) can provide very robust performance at extremely narrow bandwidths. We can look to NAVTEX as an example of how to manage transmission schedules to minimize interference and even make room for legacy CW operations. And receivers for medium wave transmissions can achieve very good performance using compact, rugged and low-cost ferrite-loop antennas.
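Just to get a feel for the airtime involved, here’s a back-of-the-envelope sketch in Python. Every figure in it is an assumption for illustration: the 100 bit/s reference rate is borrowed from NAVTEX’s 100-baud SITOR-B service, the 25% overhead stands in for framing and error correction, and the message sizes are guesses at what a terse X.1303-encoded alert might occupy.

```python
# Rough airtime estimate for a compressed CAP alert on a narrowband
# medium-wave channel. All figures are illustrative assumptions;
# NAVTEX's 100-baud SITOR-B serves as the reference rate.

def airtime_seconds(message_bytes: int, bits_per_second: float,
                    overhead: float = 0.25) -> float:
    """Airtime for a message, padded by a framing/FEC overhead factor."""
    bits = message_bytes * 8 * (1 + overhead)
    return bits / bits_per_second

# Assumed payload sizes for a compact ASN.1-encoded CAP alert:
for size in (200, 500, 1000):
    print(f"{size:5d} bytes at 100 bit/s: {airtime_seconds(size, 100):6.1f} s")
```

Even the largest of those assumed payloads needs well under two minutes of airtime, which suggests a NAVTEX-style time-slot rotation could carry a useful volume of alert traffic.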
As our ability to generate timely warnings of potential disasters improves, we need to improve our ability to deliver them to the people whose action can save their own lives and others. In the industrial nations we have many options for doing that, but most of the world’s population is waiting for simple, robust and appropriate technologies to meet this growing need.
The AIR Model is something I’ve presented in speeches and classes for many years, but I just realized I’ve never explained it here. Not sure exactly when I first concocted the AIR formulation, but it might have been in the late 1990s when I was looking back on what I’d learned working as a Public Affairs Officer for FEMA. Then again, it might date back to some of my lectures at the California Specialized Training Institute in the late 80s and early 90s. I should rummage the archives and figure that out someday.
Anyway… what I was looking for was a mnemonic, a memory trick that would help folks involved in emergency public information avoid the pitfall of thinking that human communication was merely a matter of information transfer. What I came up with was a play on the traditional Civil Defense logo:
It’s meant to suggest that EVERY act of communication has all three components:
- Alerting – Attracting or directing the attention of the audience;
- Informing – Providing information to the audience; and,
- Reassuring – Offering emotion-oriented signals about how the information should be processed (e.g., as a threat or as a blessing).
It’s impossible to share information with someone whose attention is focused elsewhere. Likewise, people in an overaroused emotional state are notoriously hard to inform. So even if our goal is simple education, we still need to consider the attention-management and emotional aspects of our communication efforts.
This has a number of practical implications. For example, here (via YouTube) are a few thoughts on the AIR Model and how it can help us unpack some of the bureaucratic and political aspects of public warning management, from a 2009 workshop for the California Community Colleges System:
“Geotagged tweets appear as drops of heat that accumulate on the globe. You can even switch the map to a 3D display if you have your pair of Red/Cyan glasses handy. There’s something about watching the East Coast heat up around lunchtime every day that seems deeply satisfying. I may not know what every tweet is saying, but I do know how the tweet traffic itself is changing….The most promising application may arrive when projects like A World of Tweets are combined with Twitter projects that provide sentiment analysis. We wouldn’t just know where people were tweeting, we’d know (roughly) how they are feeling while they tweet.”
That last bit resonates with a model of emergency public communication I’ve advocated for some time. The point of the AIR Model (“Alerting, Informing and Reassuring”) is that there’s more to human communication than just Shannon’s mathematics. Seems like a lot of social media traffic, on Twitter in particular, is more about directing attention (“alerting”) and expressing emotion (“reassuring,” or sometimes not!) than it is about transmitting information per se.
I was particularly impressed with his single slide on the ancient dilemma of “Connection vs. Protection,” which I thought neatly summed up so much of the current tension in areas like open source and homeland security.
Creativity is risky, but it seems like our only hope.
A pull-quote from an article on CNN.com caught my eye:
Nokia has a legacy software, which makes it difficult to throw out everything and recreate something new. -Thomas Kang, Strategy Analytics
Because it plays out over the relatively long term, this is a little-recognized difficulty with the traditional marketing strategy of product differentiation through proprietary technology. In this case, the problem is that cellphone maker Nokia is deeply invested in an operating system that provides no easy migration to the newer smartphone designs.
Apple had the same problem with the Macintosh in the late 1990s. The old Mac OS had become limiting, but the firm was reluctant to make any change that would break backward compatibility. Eventually the shrinking of Apple’s market share became so acute that discontinuity was the least of their worries, and they shifted to the relative openness of the Unix-based OS X.
In his classic The Innovator’s Dilemma, Professor Clayton Christensen of Harvard Business School documented how successful firms often become trapped by the very investments that made them successful. Although he wrote primarily of the rising internal cost of money, the capitalization of patents and other proprietary technologies can have the same effect, inhibiting adaptation to a changing market.
Right now we’re seeing the same drama playing out in the broadcast industry. Although in theory broadcasters’ FCC licenses are temporary grants of public spectrum, station owners have routinely attached huge capital values to their control of radio and TV channels. Now that analog broadcast is losing its monopoly on content delivery, those capitalizations are in grave peril.
Much has been made lately of the explosive growth of the open Android smartphone operating platform. Even mainstream firms like Motorola and Samsung are swapping the near-term profits of proprietary architecture for the future-proofing of openness.
So openness as a strategy isn’t just in the interest of the consumer. In the long run it’s also a win for vendors.
“Quick Response” or QR codes are two-dimensional barcodes that can be read with a smartphone camera. They’re starting to catch on as an alternative to RFID for making a connection between the physical and virtual worlds.
QR codes can have several advantages over RFID chips (depending on the application, of course):
- They’re self-announcing - It’s obvious when they’re present;
- No special equipment is required - Free applications to read QR codes are available for download to popular smartphone models; the codes can be added to signs, printed onto stick-on labels, or just displayed on a computer or TV screen;
- They can be deployed easily, quickly and cheaply - You don’t need to buy or install a chip, just print or display a black-and-white image; a number of web sites and software downloads will generate the image for you; and,
- They’re open-standards-based and web-oriented - No special database is required; the code can be used to point the smartphone directly to a web page (if you want to control access to the information you can password-protect the web page); see the sketch just after this list.
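To illustrate those last two points, here’s a minimal generation sketch in Python using the free third-party qrcode package (one option among many; the URL and filename are placeholders):

```python
import qrcode  # pip install qrcode[pil]

# Placeholder URL: in practice, whatever page you want the phone to open.
img = qrcode.make("https://example.org/hazmat/room-12/inventory")
img.save("room-12-door-sign.png")  # print it, label it, or show it on a screen
```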
In short, QR codes are a technology emergency managers can leverage without needing extra budget. Just brainstorming, here are a few ways I’m thinking they might be used:
- Put a QR code on or near the door to a room where hazmat is stored, linking the reader to a current inventory and MSDS.
- Put a QR code on a piece of infrequently-used equipment (a generator, for example) with a link to operating instructions (you know, the ones that keep getting mislaid.)
- Post fliers with QR codes in the waiting area of an application center, so folks can use their time reading background information and maybe even filling in a few of the forms online.
- Include a QR code in the EAS slide on TV and cable so folks can get more information. Keep the code on-screen (up in the corner, maybe) when regular programming continues so latecomers can get the alert details.
And that’s not to mention that QR codes can serve as a low-cost substitute for commercial bar-coding systems for inventory control (again, Avery labels are your friend!)
This strikes me as the sort of low-hanging fruit of the open source revolution that emergency managers should be picking. (The town of Manor, Texas has been a leader here.) Unfortunately, as with all open source products, no salesman is going to call on us and we’ll be getting no color glossy brochures. Free sometimes seems to be the hardest price for government to pay.
(Here’s a bit more technical information about QR codes. Apparently a single code can hold up to 4,296 alphanumeric characters and includes error correction to compensate for dirt or scratches.)
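If a label is headed somewhere it’s likely to get dirty or scratched, the same qrcode package sketched above lets you trade capacity for robustness by raising the error-correction level. A hypothetical generator label at level H, the most damage-tolerant setting:

```python
import qrcode

# Hypothetical equipment label; ERROR_CORRECT_H survives roughly 30% damage.
qr = qrcode.QRCode(error_correction=qrcode.constants.ERROR_CORRECT_H)
qr.add_data("https://example.org/equipment/generator-7/instructions")
qr.make(fit=True)
qr.make_image().save("generator-7-label.png")
```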
The past couple of weeks I’ve been involved in a few threads on other blogs about the nature of resilience. Today on an EMForum conference call FEMA Administrator Craig Fugate fielded a question with what I thought was an interesting take on that much-used but rarely defined term.
He said that FEMA was focusing on the ability to “restore critical services, with a stable tax base and a stable workforce.”
That may not satisfy everyone, but at least it’s specific enough to be actionable. Certainly it’s a government-and-business-centric model, and as such it may not speak to all the social aspirations that sometimes get wrapped in the blanket term resilience. Then again, Administrator Fugate is a government official with a keen sense of the limits of government.
So we have at least one fairly explicit working definition of resilience from the federal perspective: restore essential services, protect the tax base, and “stabilize” the workforce. (Quotes around that last one because I’m not sure in the current economy how we’d know a “stable” workforce if we saw one.)
Meanwhile my search continues for equal or greater specificity from other perspectives on what resilience actually entails.
“Knowledge is power” was an Industrial Age axiom. But in the Network Age mere knowledge is inert; it’s knowledge in motion that makes things happen. Whereas the icon of the bureaucratic era might have been a row of locked file cabinets, the icon of the emerging “2.0” era might be a screen-style button labeled “Chat.”
The most obvious reason for this change is the heightened availability and sheer quantity of knowledge. When the arcana of almost any topic are readily available online, merely being in possession of facts is less of a distinction. (The burgeoning classification of information by governments might be seen as a reaction against that trend.)
At the same time, as our networks put us face to face with the diversity of the world and its peoples we’re forced to recognize the role of social construction in knowledge. The same observations can be explained by a variety of “facts” in differing social contexts. When we all lived relatively local lives it was easy to imagine that local ordinances of “reality” were in fact universal laws. But as we encounter a larger world more directly, many of our old verities lose their authority. (The “new tribalism” of retreat into congregations of the like-minded might represent a retreat from the cognitive friction of so much diversity.)
Plus there’s the subversion of traditional, largely geographic, lines of authority by networks that blithely cut across jurisdictional bounds. With the whole world available to us, local expertise, defined as unique possession of knowledge, can rarely compete. (For now, at least, linguistic borders seem more durable than geographic ones, although English speakers may perceive that less clearly than others because English has become a global lingua franca.)
Ultimately, of course, there’s the sheer complexity of our postmodern ambitions. Many of our current projects are orders of magnitude more complicated than those of earlier ages. As Kevin Kelly argues in his essential study Out of Control, the price of admission to more complex achievements is a delegation of governance with the accompanying sacrifice of some of our sense of control.
The terms “Web 2.0” and “social media” refer largely to this shift from knowledge as property to knowledge as interaction. (Or to put it another way, they indicate that the networked media are finally breaking out of the cocoon of the publishing metaphor and discovering their own interactive nature.) The addition of knowledge is surpassed by the multiplication and, indeed, exponentiation of effect achieved by collaborative techniques like mashups and crowdsourcing.
Power in the network age arises not from what we know but from what we share.
Retrieved this 1995 article from an old website by means of the Internet Archive. Still rings true so I thought I might give it a new lease on life – Art
Sitting in a bar in Cincinnati the other night, I found myself chatting with a man whose home had been destroyed in the great fire in Oakland in 1991. Once Bob found out I was in the disaster business, and that I’d been there for the fire, he had a lot to say.
Mostly he talked about the extraordinary intimacy he’d felt with the people who sheltered him during and after the fire. Suddenly, in that crucible of crisis, he found himself crying and laughing, sharing deeply authentic moments with people he’d never met before. Four years later, those moments were Bob’s chief impressions of that disaster.
Which made sense to me. My fellow responders have always seemed like family. When we converge at some disaster or other, there’s a sense of reunion…hugs and handshakes, the ritual recounting of recent disaster and reports on absent friends…or maybe just a quick exchange of smiles and nods before we go to work.
But Bob’s remembrance got me thinking about the role of shared disaster in the formation of communities. Most cities, for example, have some great fire, flood or other misfortune enrolled in their corporate memory. San Francisco is the only one I know that’s gone so far as to add a phoenix to its city seal, but many could.
Likewise in the media-sphere: the assassination of President Kennedy remains a defining experience for millions of people who only experienced it on television. More recently, many young (and not so young) people found community in observing the death of Jerry Garcia. As I write this, both Israel and the global community of statecraft are responding to the shock of the assassination of Yitzhak Rabin.
Maybe this feeling of community explains “emergent volunteer” behavior and the compulsive way people watch disaster coverage on TV. At some deep level we know how crucial disasters are in defining communities. We want to validate our membership. It’s sort of a “be there or be square” situation. Not to share the disaster makes one less a member of the community, forever denied both the external rituals (the cocktail party remembrances and so on) and the personal sense of identification.
So what about those of us who consider ourselves responders? Around my volunteer fire station the conventional farewell was always, “See you on the big one!” (Although after the Oakland fire it was amended to “See you on the next one” for a while.)
The sentiment ran deep, even though it might seem an awful thing to look forward to somebody’s catastrophe. Of course, it wasn’t people’s suffering we were looking forward to…it was the chance to renew our standing in the community of responders…and in the larger community to which we respond.
A famous circus performer once said something to the effect that “The high-wire is life: all the rest is waiting.”
I think many responders feel the same way. But that doesn’t really set us apart from emergency volunteers or the public at large, does it? We all want to be there for our communities when The Big One comes.
It’s what communities are all about.
The New York Times website has been running a fascinating series by Errol Morris entitled “The Anosognosic’s Dilemma: Something’s Wrong but You’ll Never Know What It Is.” (Anosognosia is defined in Wikipedia as “a condition in which a person who suffers disability seems unaware of or denies the existence of his or her disability.”)
Morris interviewed various psychologists, neurologists and others on the topics of denial, self-deception and general cluelessness. A couple of quotes caught my eye. One was from social psychologist David Dunning of Cornell:
Donald Rumsfeld gave this speech about “unknown unknowns.” It goes something like this: “There are things we know we know about terrorism. There are things we know we don’t know. And there are things that are unknown unknowns. We don’t know that we don’t know.” He got a lot of grief for that. And I thought, “That’s the smartest and most modest thing I’ve heard in a year.”
Another is from V.S. Ramachandran of UCSD and the Salk Institute, responding to Morris’ question, “Do we live in a cloud of belief that is separate from the reality of our circumstances?”:
Absolutely, and overall, fortunately, it’s a positive cloud in most of us. If we knew about the real facts and statistics of mortality, we’d be terrified…It may well be our brains are wired up to be slightly more optimistic than they should be.
I’ve sometimes said that emergency management is mostly the management of denial processes at the individual, organizational and social levels. Having read the Morris series I’d expand that to include other forms of self-deception.
Emergency managers frequently appeal to emotion in order to motivate groups and individuals, including leaders. And indeed it’s often our own feelings of hope that cause us to bother. But is there danger in drinking too much of our own Kool-Aid?
I’d suggest that one challenge we’ve faced in the early years of the Homeland Security era has been the tendency, especially when there’s a paucity of evidence to inform our choices, to substitute passion for understanding. That’s indubitably human, and it may well be characteristic of the beginnings of any new enterprise.
But by the same token, a key characteristic of maturity is the tempering of passion by experience.