Tag Archives: risk communication

punk rock OR featured on math podcast “The Other Half”

One of my blog posts, about the risk of starting a fire at a gas station, was featured in “The Road Trip,” an episode of the math podcast The Other Half by podcasters and professors Dr. Annie Rorem and Dr. Anna Haensch [Listen here]. The episode is about taking an optimal road trip (the Traveling Salesman Problem, or TSP) and the rare risks associated with travel. From the episode description:

In The Road Trip, Anna and Annie look into the math that undergirds the great American summertime tradition of rolling down the windows, turning up the stereo, and touring the countryside by automobile.

Randy Olson has made the planning part easy by computing the optimal road trip across the U.S. His work to minimize the miles between landmarks in the lower 48 has been featured in the Washington Post and on Discovery News. In fact, Tracy Staedter of Discovery News can be credited not only with encouraging Olson to tackle this problem, but also with determining the list of landmarks he used. If you have a road trip you’d like to optimize, check out his code here.

And, because cars don’t run on math alone, we also consider the necessity of refueling on the road. In particular, we ask Laura McLay to weigh in on gas station safety, as she computes the conditional probability of blowing yourself up while you’re pumping gas.

“The Road Trip” is an excellent episode! Thanks to Annie and Anna for doing such a great job and for being math ambassadors. I look forward to future episodes.

The Other Half is part of ACME Science, which offers several other math and science podcasts.

One thing I would like to add to the podcast is that there are real applications of the TSP and of risk analysis. We academics don’t just sit up in our ivory towers dreaming up silly problems divorced from the real world. We need to be able to characterize rare risks for numerous applications (e.g., nuclear power risks) and then communicate those risks to the people who must manage rare but potentially catastrophic events. I have a few links to related blog posts at the bottom of this post. Likewise, the TSP isn’t just used to plan summer road trips: it’s used by trucking and delivery companies to plan routes, in gene sequencing, for meals-on-wheels deliveries, and in emergency response after a disaster.

A second point is that we really can optimally solve many instances of the TSP, and certainly the ones used for planning road trips. We do not always have to settle for a solution that is “good enough.” It’s true that many problems have more feasible solutions than there are stars in the galaxy, but we don’t solve them by brute force. We solve them intelligently, using optimization algorithms such as the simplex method (a linear programming algorithm) and cutting planes (an integer programming method). These algorithms traverse the search space and find the single optimal solution among trillions of possibilities, sometimes in mere seconds or minutes. It’s truly astonishing and a great contribution to basic science. A small illustration of the “smarter than brute force” idea appears below.
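To make that concrete, here is a minimal sketch (mine, not from the podcast) of the Held-Karp dynamic program, a classic exact method for small TSP instances. It is not the simplex-and-cutting-plane machinery that state-of-the-art solvers like Bill Cook’s Concorde use, but it shows that finding the provably optimal tour does not require enumerating all (n−1)!/2 tours:

```python
from itertools import combinations

def held_karp(dist):
    """Length of the optimal tour starting and ending at city 0.

    dist[i][j] is the distance from city i to city j. Runs in
    O(n^2 * 2^n) time -- still exponential, but vastly smaller than
    the (n-1)! tours that brute-force enumeration would examine.
    """
    n = len(dist)
    # C[(S, j)]: cost of the cheapest path that starts at city 0,
    # visits every city in frozenset S exactly once, and ends at j.
    C = {(frozenset([j]), j): dist[0][j] for j in range(1, n)}
    for size in range(2, n):
        for subset in combinations(range(1, n), size):
            S = frozenset(subset)
            for j in S:
                C[(S, j)] = min(C[(S - {j}, k)] + dist[k][j]
                                for k in S - {j})
    full = frozenset(range(1, n))
    return min(C[(full, j)] + dist[j][0] for j in full)

# Toy example: 4 cities with symmetric distances.
d = [[0, 2, 9, 10],
     [2, 0, 6, 4],
     [9, 6, 0, 3],
     [10, 4, 3, 0]]
print(held_karp(d))  # -> 18 (the tour 0-1-3-2-0)
```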

If you want more, Bill Cook is the world’s expert on the TSP, and he has many examples of optimal solutions on his web site, including a TSP tour of 24,978 cities. Read Bill Cook’s (@wjcook) book and blog for more about the TSP’s history, algorithms, and people.

Related posts:


Type II errors are the ones that get you fired: the Atlanta edition

A couple of weeks ago, a comment on Twitter reminded me about Type I and Type II errors, which in turn reminded me of my first introduction to Type II errors in an undergraduate probability and statistics course.

A type I error is the incorrect rejection of a true null hypothesis.

A type II error is the failure to reject a false null hypothesis.

Type I and Type II errors are a little confusing when you are first introduced to them. To make things easier, my professor gave us a practical way to put them in perspective: Type II errors are the ones that get you fired.

And that brings us to the mess that was Atlanta this past week. If you recall, about 3″ of snowfall iced over, leading to mayhem. Schools and government offices did not shut down before the storm. Instead, they all closed at the same time, leading to an incredible amount of congestion that overwhelmed the impaired transportation network. Cars were abandoned on the highway, and students camped out at school and in grocery stores for the night (presumably, everyone was stopping for bread and milk on the way home). I recommend these two articles from the Atlantic Cities to see just how bad it was in Atlanta [Link and Link]. Here is a time-lapse of the highways in Atlanta. Traffic went from fine to a disaster in an hour:

In this case, the weather itself was not a disaster. Poor management of the situation led to a disaster. (That almost sounds like it should be a bumper sticker: weather doesn’t make disasters, people make disasters! At other times, the weather really is the disaster.) David Levinson, a civil engineering professor at the University of Minnesota and a former Atlanta resident (find him at @trnsprttnst), wrote an excellent piece on CNN about his perspective [Link]. I don’t have a whole lot to add except that managing the effects of severe weather has been and will continue to be a big issue in operations research (and civil engineering, too). A few questions to consider:

  • Should you try to mitigate the ice by investing in salt and trucks to treat the roads? This is not very practical in the South, where it rarely snows.
  • Do you always play it safe and close schools? I lived that way in Virginia, and while it is safer, the Type I errors aren’t ideal. One year, school was canceled five times for a sum total of 1″ of snow across all of those days.
  • If you decide not to close schools and later change your mind, should you stagger the closures? Yes. This is critical in congested cities like Atlanta and DC.

Related posts:

Many of my readers are from or have lived in Atlanta. What is your take?


what is the (conditional) probability of exploding when filling your car up with gas?

What is the probability that you will cause an explosion when filling your car up with gas?

I like to wait inside my car while it fills up with gas. I do this to reset my trip odometer, stay warm, etc. One — and only one — of the local gas stations has a sign (pictured below) warning me not to get in and out of my car while fueling. The implicit claim here is that getting in and out of my car raises my conditional probability of exploding. I was curious to see (a) whether that claim is true, (b) how significant this rare risk would be if it were, and (c) why only one gas station wants to protect me from it.

Snopes has some interesting data about this issue based on a PEI report [Link]. The data are apparently a mess, but it looks like there were 81 fires over ~7 years, and further information was available for 64 of them. Most of the fires occurred in the most recent year. I assumed that older fires were less likely to have been recorded and therefore treated the data as representing 2 years’ worth of fires (~24B refuelings). The report implicates women in reentering the car, but the report’s authors don’t actually identify gender in their statistics (read the Snopes article). I’ll take this into account.

It’s worth noting that men get struck by lightning way more than women. Maybe women balance this out by causing more fuel fires?

Claim (a) appears to be true, as the following calculations suggest. Let’s say there are 81 fires in 2 years (24B refuelings). That is a base rate of 3.4×10^-9 per refueling, a rare risk. The base rate alone doesn’t give any insight into whether going back into one’s car is a bad idea, so let’s look at that issue further. To do so, I will assume that the 17 fires without further information (81 − 64) were caused by reentering or not reentering one’s car at random. Then, we know:

P(fire and reenter) = 1.5×10^-9

P(fire and no reenter) = 1.0×10^-9

Of course, we want to find P(fire | reenter) and P(fire | no reenter). To do so, let’s assume that women do 36% of all fuelings (the same as the proportion of miles driven by women), that women reenter their cars 50% of the time, and that men reenter 10% of the time.

P(fire | reenter) = 6.25×10^-9

P(fire | no reenter) = 1.4×10^-9
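Here is a quick sketch of that calculation (my reconstruction; the small differences from the numbers above come from rounding the joint probabilities to two digits):

```python
# Joint probabilities per refueling, from the PEI counts above.
p_fire_and_reenter = 1.5e-9
p_fire_and_no_reenter = 1.0e-9

# Assumptions from the post: women do 36% of fuelings and reenter
# 50% of the time; men reenter 10% of the time.
p_reenter = 0.36 * 0.5 + 0.64 * 0.1   # = 0.244

p_fire_given_reenter = p_fire_and_reenter / p_reenter              # ~6.1e-9
p_fire_given_no_reenter = p_fire_and_no_reenter / (1 - p_reenter)  # ~1.3e-9
print(p_fire_given_reenter / p_fire_given_no_reenter)              # ~4.6
```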

Fires occur about 4.5 times as often when someone reenters their car as when they don’t. If we look at an extreme example where all women reenter but no men do:

P(fire | reenter) = 4.22×10^-9

P(fire | no reenter) = 1.65×10^-9

Here, people that reenter (all women) cause fires 2.5 times as often as those that do not reenter (all men).

But those that reenter may not always cause fires at a higher rate than those that do not reenter – it all depends on how often people reenter their car. Let X = the probability that people reenter their cars while fueling:

P(fire and reenter) / X >= P(fire and no reenter) / (1-X)

This is true when X < 0.591. That appears to hold in practice: 0.591 is a fairly realistic reentry rate, though I bet the true rate is a bit lower. And given the rarity of fires, people that reenter probably do not cause fires at a statistically significantly higher rate than those that do not reenter.
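As a quick check of that threshold (my arithmetic, using the rounded joint probabilities from above):

```python
# P(fire and reenter)/X >= P(fire and no reenter)/(1 - X)
# rearranges to 1.5e-9 * (1 - X) >= 1.0e-9 * X, i.e. X <= 1.5/2.5.
x_threshold = 1.5e-9 / (1.5e-9 + 1.0e-9)
print(x_threshold)  # 0.6 with rounded inputs; unrounded figures give 0.591
```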

Any risk that happens about once every billion fuelings is very rare. I will not fuel my car anywhere near a billion times over the course of my lifetime, so I can expect to escape without causing a fire.

Let’s compare the risk of causing an explosion by reentering one’s car to the risk of dying in a car accident. To do so, let’s assume that one drives 300 miles between refuelings for an apples-to-apples comparison. Then, the risk of dying between refuelings is 3.3×10^-6 (at 1.1 deaths per 100M miles driven). Compared to the risk of a fire given that you reenter, you are 528 times more likely to die driving between fill-ups than to cause a fire by reentering your car while fueling. Therefore, I’ll conclude that even if a risk from reentering my car exists, it’s fairly insignificant (claim (b)), which explains why most gas stations are not concerned about warning me (claim (c)).
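The same comparison in code, under the assumptions above (300 miles per tank, 1.1 deaths per 100M vehicle-miles):

```python
p_fire_given_reenter = 6.25e-9        # from the calculation above

# Risk of dying in a crash over one 300-mile tank of gas.
p_death_per_tank = 300 * 1.1 / 100e6  # = 3.3e-6

print(p_death_per_tank / p_fire_given_reenter)  # ~528
```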

Have you ever caused a fire while filling your car up with gas? Do you even know of anyone second- or third-hand who exploded while fueling?

[Photo: the gas station’s warning sign]


operations research, disasters, and science communication

I had the pleasure of speaking at the AAAS Meeting on February 17 in a session entitled Dynamics of Disasters: Harnessing the Science of Networks to Save Lives. I talked about my research on how to use scarce public resources in fire and emergency medical services to serve communities during severe but not catastrophic weather events. My research applies to weather events such as blizzards, flash flooding, derechos, etc. that are not so catastrophic that the National Guard would come. Here, a community must meet demands for fire and health emergencies using the resources it has on “regular” days – e.g., ambulances and fire engines – while the transportation network is impaired by snow, flooding, etc. Everything is temporarily altered, including the types of 911 calls that are made as well as travel and service times, which are affected by the impaired transportation network. Plus, it’s always a lot of fun to mention “Snowmaggedon” during a talk.

Anna Nagurney organized the session, and the other speakers included David McLaughlin, Panos Pardalos, Jose Holguin-Veras, and Tina Wakolbinger. They talked about a number of issues, including:

  • how to detect tornadoes temporally and spatially by deploying new types of sensors
  • how to evacuate people and even livestock during hurricanes and floods
  • what the difference between a disaster and a catastrophe is
  • what types of emergency logistics problems require our expertise: national vs. international, public vs. non-profit, mitigation vs. preparedness vs. response, short-term vs. long-term disaster

I applaud Anna Nagurney for organizing a terrific session. It was fascinating to talk to people in my field about disasters without focusing too much on the modeling details. We all mentioned which types of methodologies we used in the talk, but we focused on the takeaways, actionable results, and policy implications. And it’s clear that the opportunities in this area are almost endless.

The AAAS Meeting is all about communicating science to a large audience. The talks focus on broader impacts, not specific model details. It’s not always easy for me to take a step back from my research and explain it at a higher level, but I get a lot of practice through blogging and talking about my research in my classes. Still, I was nervous. I am a mere blogger, and the conference is heavily attended by real science journalists. In fact, I had to submit speaker information and a picture ahead of time so that journalists could prepare for my talk. I truly felt like an OR ambassador – it was quite an experience.

I attended another session on disasters, where the topics often revolved around forecasting power, false alarms, and risk communication. I have blogged about these issues before in posts such as what is the optimal false alarm rate for tornado warnings? and scientists convicted for manslaughter for making a type II error. This appears to be an ongoing issue. According to the scientists on the panel, part of the problem stems from journalists who want to make a good story even juicier by not portraying risk accurately, thus leading to false-alarm fatigue.

Other sessions at the AAAS Meeting addressed several fascinating topics. One session was on writing about science, and it featured a writer from The Big Bang Theory. Another session was about communicating science to Congress. Many of the speakers were from science publications and PBS shows.

I have at least one other blog post on science communication in the works, so stay tuned.

My slides are below: