
Public sector operations research: the course!

Course introduction

I taught a PhD seminar on public sector operations research this semester. You can read more about the course here. I had students blog in lieu of problem sets and exams. They did a terrific job [Find the blog here!]. This post contains a summary of what we covered in the course, including the readings and papers presented in class.

Readings

Public Safety Overview

  • Green, L.V. and Kolesar, P.J., 2004. Anniversary article: Improving emergency responsiveness with management science. Management Science, 50(8), pp.1001-1014.
  • Larson, R.C., 2002. Public sector operations research: A personal journey. Operations Research, 50(1), pp.135-145.
  • Rittel, H.W. and Webber, M.M., 1973. Dilemmas in a general theory of planning. Policy Sciences, 4(2), pp.155-169.
  • Johnson, M.P., 2012. Community-Based Operations Research: Introduction, Theory, and Applications. In Community-Based Operations Research (pp. 3-36). Springer New York. (Originally an INFORMS TutORial)
  • Goldberg, J.B., 2004. Operations research models for the deployment of emergency services vehicles. EMS Management Journal, 1(1), pp.20-39.
  • Swersey, A.J., 1994. The deployment of police, fire, and emergency medical units. Handbooks in operations research and management science, 6, pp.151-200.
  • McLay, L.A., 2010. Emergency medical service systems that improve patient survivability. Wiley Encyclopedia of Operations Research and Management Science.

Facility location

  • Daskin, M.S., 2008. What you should know about location modeling. Naval Research Logistics, 55(4), pp.283-294.
  • Brotcorne, L., Laporte, G. and Semet, F., 2003. Ambulance location and relocation models. European Journal of Operational Research, 147(3), pp.451-463.
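To make the location-modeling readings a bit more concrete, here is a tiny p-median example: choose p facility sites to minimize total demand-weighted distance. The data below are invented for illustration, and real instances are solved with integer programming rather than enumeration; this is only a sketch.

```python
# A toy p-median example in the spirit of the Daskin reading. All data are
# made up; realistic instances need a MIP solver, not brute-force enumeration.
from itertools import combinations

demand = {"A": 10, "B": 20, "C": 15, "D": 5}   # demand nodes and their weights
dist = {                                        # distance from demand node to candidate site
    "A": {"s1": 2, "s2": 9, "s3": 4},
    "B": {"s1": 8, "s2": 3, "s3": 5},
    "C": {"s1": 6, "s2": 4, "s3": 1},
    "D": {"s1": 3, "s2": 7, "s3": 6},
}
sites, p = ["s1", "s2", "s3"], 2

def cost(open_sites):
    # each demand node is served by its closest open facility
    return sum(w * min(dist[i][s] for s in open_sites) for i, w in demand.items())

best = min(combinations(sites, p), key=cost)
print(best, cost(best))   # -> ('s2', 's3') with cost 145
```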

Probability models for public safety

  • Larson, R.C. and Odoni, A.R., 1981. Urban operations research. This was the textbook we used to cover probability models, queueing, priority queueing, and spatial queues (the hypercube model).
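As a taste of the queueing material, here is a minimal sketch of the Erlang-C delay probability for an M/M/c queue, one of the basic building blocks behind the EMS models in the readings. The arrival and service rates below are illustrative numbers, not data from any real system.

```python
# Erlang-C: probability an arriving call must wait for a free server in an
# M/M/c queue. Illustrative numbers only.
from math import factorial

def erlang_c(lam, mu, c):
    """P(wait > 0) for an M/M/c queue with arrival rate lam and service rate mu."""
    a = lam / mu                      # offered load in Erlangs
    rho = a / c                       # server utilization; formula requires rho < 1
    p0 = 1.0 / (sum(a**k / factorial(k) for k in range(c))
                + a**c / (factorial(c) * (1 - rho)))
    return (a**c / (factorial(c) * (1 - rho))) * p0

# e.g., 4 calls per hour, 1-hour average service time, 6 ambulances
print(erlang_c(lam=4.0, mu=1.0, c=6))
```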

Disasters, Homeland Security, and Emergency Management

Deterministic Network Interdiction

  • Smith, J.C., 2010. Basic interdiction models. Wiley Encyclopedia of Operations Research and Management Science.
  • Morton, D.P., 2011. Stochastic network interdiction. Wiley Encyclopedia of Operations Research and Management Science.
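For intuition about what "interdiction" means, here is a toy deterministic model: an attacker removes up to B arcs to minimize the defender's maximum s-t flow. Tiny instances can be brute-forced as below (this sketch assumes the networkx package is available); the readings cover the duality-based reformulations used on real problems.

```python
# Toy max-flow network interdiction by brute force. The network is invented.
from itertools import combinations
import networkx as nx

G = nx.DiGraph()
G.add_edge("s", "a", capacity=10)
G.add_edge("s", "b", capacity=5)
G.add_edge("a", "t", capacity=8)
G.add_edge("b", "t", capacity=7)
G.add_edge("a", "b", capacity=4)

def interdicted_flow(removed):
    # defender's max flow after the attacker removes these arcs
    H = G.copy()
    H.remove_edges_from(removed)
    return nx.maximum_flow_value(H, "s", "t")

B = 1  # interdiction budget: number of arcs the attacker may remove
best = min(combinations(G.edges, B), key=interdicted_flow)
print(best, interdicted_flow(best))   # removing ('s', 'a') drops the flow from 15 to 5
```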

Papers presented by students in class

Papers selected for the first set of student presentations (background papers)

  • Blumstein, A., 2002. Crime modeling. Operations Research, 50(1), pp.16-24.
  • Kaplan, E.H., 2008. Adventures in policy modeling! Operations research in the community and beyond. Omega, 36(1), pp.1-9.
  • Wright, P.D., Liberatore, M.J. and Nydick, R.L., 2006. A survey of operations research models and applications in homeland security. Interfaces, 36(6), pp.514-529.
  • Altay, N. and Green, W.G., 2006. OR/MS research in disaster operations management. European Journal of Operational Research, 175(1), pp.475-493.
  • Simpson, N.C. and Hancock, P.G., 2009. Fifty years of operational research and emergency response. Journal of the Operational Research Society, pp.S126-S139.
  • Larson, R.C., 1987. Social justice and the psychology of queueing. Operations Research, 35(6), pp.895-905.

Papers selected for the second set of student presentations (methods)

  • Ashlagi, I. and Shi, P., 2014. Improving community cohesion in school choice via correlated-lottery implementation. Operations Research, 62(6), pp.1247-1264.
  • Mandell, M.B., 1991. Modelling effectiveness-equity trade-offs in public service delivery systems. Management Science, 37(4), pp.467-482.
  • Cormican, K.J., Morton, D.P. and Wood, R.K., 1998. Stochastic network interdiction. Operations Research, 46(2), pp.184-197.
  • Brown, G.G., Carlyle, W.M., Harney, R.C., Skroch, E.M. and Wood, R.K., 2009. Interdicting a nuclear-weapons project. Operations Research, 57(4), pp.866-877.
  • Lim, C. and Smith, J.C., 2007. Algorithms for discrete and continuous multicommodity flow network interdiction problems. IIE Transactions, 39(1), pp.15-26.
  • Rath, S. and Gutjahr, W.J., 2014. A math-heuristic for the warehouse location–routing problem in disaster relief. Computers & Operations Research, 42, pp.25-39.
  • Argon, N.T. and Ziya, S., 2009. Priority assignment under imperfect information on customer type identities. Manufacturing & Service Operations Management, 11(4), pp.674-693.
  • Pita, J., Jain, M., Marecki, J., Ordóñez, F., Portway, C., Tambe, M., Western, C., Paruchuri, P. and Kraus, S., 2008, May. Deployed ARMOR protection: the application of a game theoretic model for security at the Los Angeles International Airport. In Proceedings of the 7th international joint conference on Autonomous agents and multiagent systems: industrial track (pp. 125-132). International Foundation for Autonomous Agents and Multiagent Systems.
  • Mills, A.F., Argon, N.T. and Ziya, S., 2013. Resource-based patient prioritization in mass-casualty incidents. Manufacturing & Service Operations Management, 15(3), pp.361-377.
  • Mehrotra, A., Johnson, E.L. and Nemhauser, G.L., 1998. An optimization based heuristic for political districting. Management Science, 44(8), pp.1100-1114.
  • Koç, A. and Morton, D.P., 2014. Prioritization via stochastic optimization. Management Science, 61(3), pp.586-603.

I missed a class to attend the INFORMS Analytics meeting. I assigned two videos about public sector OR in lieu of class:

Jon Caulkins’ Omega Rho talk on crime modeling and policy

Eoin O’Malley’s talk about bike sharing and optimization (start at 3:51:53)

Blog posts I used in teaching:

We played Pandemic on the last day of class!


my national academies committee experience & risk-based flood insurance

I had the pleasure of serving on a National Academies committee the past two years. Our report entitled “Tying flood insurance to flood risk for low-lying structures in the floodplain” was just released [Link].

If you don’t know much about the National Academies, it is a private, independent, nonprofit institution that provides technical expertise on important societal problems (engineering, in my case). National Academies committees like the one I served on each address a specific challenge and have a very specific charge. The committee is composed of a bunch of really smart people who work together to answer the questions posed in the charge. FEMA provided the charge for my committee.

The specific charge is below, but a bit of background is necessary to know why the problem is so important and why it had to be addressed now. Recently, I blogged about floods and their huge impact on society [Link]. After a series of hurricanes caused extensive flood damage to properties, the National Flood Insurance Program (NFIP) was created in 1968 to reduce the risk of flood and mitigate flood damage by encouraging better flood management and building practices. The idea was that homeowners in flood-prone areas (“Special Flood Hazard Areas” – areas with an annual chance of flooding of 1% or more) would have to purchase flood insurance to help cover the cost of disasters. Today, most homeowners in Special Flood Hazard Areas pay the going rate based on an elaborate formula set by FEMA. There are currently about 5.5 million flood insurance policies.

Houses that already existed in a Special Flood Hazard Area in 1968 could be grandfathered into the program and receive subsidized rates. Over time, the hope was that these existing houses in Special Flood Hazard Areas would be replaced, thus reducing exposure in flood-prone areas. But they were not. They continue to exist and are expensive for FEMA when disasters strike. This is a huge problem. FEMA’s insurance premium formula, like risk-based actuarial rates, is incredibly sensitive to the elevation of the home relative to base flood elevation. These houses are negatively elevated, meaning that they sit below base flood elevation and flood frequently, so a homeowner may pay $200 per year for a flood insurance premium when a risk-based actuarial rate would be thousands of dollars. There are a lot of these structures out there, and they are costly to FEMA.
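To see why elevation relative to base flood elevation (BFE) dominates the rate, here is a back-of-the-envelope sketch. The flood probabilities, depth-damage curve, and home value are all invented for illustration; FEMA's actual rating formula is far more elaborate.

```python
# Toy expected-annual-loss (AAL) calculation. All numbers are invented.
flood_events = [        # (annual probability, flood depth in ft relative to the BFE)
    (0.50, -3.0), (0.10, -1.0), (0.02, 1.0), (0.01, 2.0),
]
home_value = 200_000

def damage_fraction(depth_in_house):
    # toy depth-damage curve: no damage below the floor, 15% per ft of water, capped at 60%
    return min(max(depth_in_house, 0.0) * 0.15, 0.60)

for elevation in (2.0, 0.0, -2.0):   # first-floor elevation relative to the BFE
    aal = sum(p * damage_fraction(depth - elevation) * home_value
              for p, depth in flood_events)
    print(f"elevation {elevation:+.0f} ft vs BFE: expected annual loss ${aal:,.0f}")
```

With these toy numbers, dropping the first floor from 2 feet above the BFE to 2 feet below takes the expected annual loss from $0 to $6,000, which is exactly why negatively elevated structures are so expensive to insure.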

The Biggert-Waters (BW) Flood Insurance Reform Act of 2012 required these subsidized policies to disappear overnight, turning this important problem into an immediate problem. Subsequent legislation changed some of this, but the bottom line was that subsidized rates would rise, substantially for some. FEMA wanted a review of how they set their rates to be credible, fair, and transparent. That is where the committee came in.

Here is our study charge set by FEMA. In our conversations with FEMA actuaries, FEMA asked for shorter-term (within 5 years) and longer-term recommendations for improving their methods. FEMA asked us to look at how premiums are set and how the process could be improved. We focused on the math; another committee addressed the societal impact of the changes.

Study Charge
An ad hoc committee will conduct a study of pricing negatively elevated structures in the National Flood Insurance Program. Specifically, the committee will:

  1. Review current NFIP methods for calculating risk-based premiums for negatively elevated structures, including risk analysis, flood maps, and engineering data.
  2. Evaluate alternative approaches for calculating “full risk-based premiums” for negatively elevated structures, considering current actuarial principles and standards.
  3. Discuss engineering, hydrologic, and property assessment data and analytical needs associated with fully implementing full risk-based premiums for negatively elevated structures.
  4. Discuss approaches for keeping these engineering, hydrologic, or property assessment data updated to maintain full risk-based rates for negatively elevated structures.
  5. Discuss feasibility, implementation, and cost of underwriting risk-based premiums for negatively elevated structures, including a comparison of factors used to set risk-based premiums.

We constructed ten conclusions:

  1. Careful representation of frequent floods in the NFIP water surface elevation–probability functions (PELV curves) is important for assessing losses for negatively elevated structures.
  2. Averaging the average annual loss over a large set of PELV curves leads to rate classes that encompass high variability in flood hazard for negatively elevated structures, and thus the premiums charged are too high for some policyholders and too low for others.
  3. NFIP claims data for a given depth of flooding are highly variable, suggesting that inundation depth is not the only driver of damage to structures or that the quality of the economic damage and inundation depth reports that support the insurance claims is poor.
  4. When the sample of claims data is small, the NFIP credibility weighting scheme assumes that U.S. Army Corps of Engineers damage estimates are better than NFIP claims data, which has not been proven.
  5. Levees may reduce the flood risk for negatively elevated structures, even if they do not meet NFIP standards for protection against the 1 percent annual chance exceedance flood.
  6. When risk-based rates for negatively elevated structures are implemented, premiums are likely to be higher than they are today, creating perverse incentives for policyholders to purchase too little or no insurance. As a result, the concept of recovering loss through pooling premiums breaks down, the NFIP may not collect enough premiums to cover losses, and underinsured policyholders may have inadequate financial protection.
  7. Adjustments in deductible discounts could help reduce the high risk-based premiums expected for negatively elevated structures.
  8. Modern technologies, including analysis tools and improved data collection and management capabilities, enable the development and use of comprehensive risk assessment methods, which could improve NFIP estimates of flood loss.
  9. Risk-based rating for negatively elevated structures requires, at a minimum, structure elevation data, water surface elevations for frequent flood events, and new information on structure characteristics to support the assessment of structure damage and flood risk.
  10. The lack of uniformity and control over the methods used to determine structure replacement cost values and the insufficient quality control of NFIP claims data undermine the accuracy of NFIP flood loss estimates and premium adjustments.
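Conclusion 2 is easy to see with a stylized example. The numbers below are invented and this is not NFIP's PELV machinery, but the point carries over: when one rate class pools structures with very different hazard, the class premium overcharges some policyholders and undercharges others.

```python
# Stylized premium pooling. The "true" expected annual losses are invented.
true_aal = {"low-hazard house": 400.0, "high-hazard house": 3600.0}
class_premium = sum(true_aal.values()) / len(true_aal)   # one pooled rate for the class

for house, aal in true_aal.items():
    gap = class_premium - aal    # positive = overcharged, negative = undercharged
    print(f"{house}: true AAL ${aal:,.0f}, class premium ${class_premium:,.0f}, "
          f"overcharge ${gap:+,.0f}")
```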

You can read more about our report and its conclusions in the press release.

The committee was composed of 12 members and included civil engineers, risk analysts, actuaries, and one retired FEMA employee. Our fearless chair David Ford did a lot of the heavy lifting in terms of crafting our core conclusions. National Academies staff member Anne Linn was incredibly helpful in getting us focused, staying on track, and writing the report. National Academies staff member Anita Hall handled the logistics and was incredibly responsive to our travel needs. The committee met in person four times and wrote parts of the report. The report was sent out to reviewers, and we revised parts of the report in response to reviewer comments, much like in a peer-reviewed journal. We couldn’t have done this without David and Anne (many thanks!).

Serving on the committee helped me understand the importance of flooding from many perspectives. I bought a new house during my time on the committee. My new house is on the top of a ridge with virtually no chance of flooding.

Serving on the committee also helped me to learn about state-of-the-art techniques in civil engineering and risk-based insurance. Our colleagues in other fields do some pretty cool things, and we can all work together to make the world a better place. I’m proud of our final report – I hope it leads to more credible, fair, and transparent NFIP flood insurance premiums.



flood risks and management science

This week’s flooding in Texas highlights how vulnerable we are to flood risks. Texas is extremely prone to flooding yet is among the worst states when it comes to flood-control spending. Texas is exposed to significant riverine flooding in addition to storm surge flooding caused by hurricanes and tropical storms. Texas has the second-most flood insurance premiums in the US, behind only Florida.

In the past year, I have been serving on a National Academies committee on flood insurance for negatively elevated structures. I have learned a lot about flood insurance, incentives, and risk. I can’t say anything about the report yet except that it will be released soon, but I can tell you a little about the problem and how management science has helped us understand how to mitigate flood risks.

Floods occur frequently (although not frequently enough to motivate people to mitigate against the risks), and when floods occur, they do a lot of damage. The damage is usually measured in terms of structure and property damage, but flooding also leads to loss of life and injuries. Flooding is not just a coastal issue: floods occur along rivers, in areas with high water tables, and in urban areas where infrastructure channels water in ways that create flood risks. Cook County, Illinois has chronic and expensive urban flooding issues. Floods lead to enormous socioeconomic costs. Two-thirds of Presidential disaster declarations involve floods.

The basic approach to managing flood risks is to (1) encourage people not to build in floodplains and (2) build less flood-prone structures and communities to reduce the damage when floods occur. Getting this to happen is tough. Improving infrastructure requires an enormous investment, whether from society (e.g., flood walls), communities (e.g., FEMA’s Community Rating System), or individuals (e.g., elevating a house). A Citylab article criticizes Texas for not investing in infrastructure that could reduce the impact of floods.

On an individual level, flood insurance is required for those who live in “Special Flood Hazard Areas” (a floodplain; FEMA defines a “Special Flood Hazard Area” as an area with a 1% or greater annual chance of flooding). Flood insurance can be really expensive, which can encourage individual homeowners to either forgo insurance or mitigate against flooding. Elevating a house is expensive, but it may be a good financial choice if it reduces flood insurance premiums by thousands of dollars per year (see the sketch below). The reality is that most homeowners do not have a flood insurance policy even when one is required, because insurance is expensive and is perceived as unnecessary. Many homeowners in floodplains go decades between floods, and despite the warnings and requirements, they do not see the need to pay so much for flood insurance when they have not experienced a flood.
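The elevation decision is a classic cost-benefit calculation. Here is a minimal payback sketch; the elevation cost, premiums, and discount rate are all hypothetical numbers I chose for illustration.

```python
# Payback sketch for elevating a house. All numbers are hypothetical.
elevation_cost = 60_000          # one-time cost to elevate the house
premium_before = 5_000           # risk-based premium at negative elevation, per year
premium_after = 500              # premium once above base flood elevation, per year
discount_rate = 0.03

annual_savings = premium_before - premium_after
# present value of the premium savings over a 30-year horizon
pv = sum(annual_savings / (1 + discount_rate) ** t for t in range(1, 31))
print(f"simple payback: {elevation_cost / annual_savings:.1f} years; "
      f"30-yr PV of savings: ${pv:,.0f} vs cost ${elevation_cost:,}")
```

With these numbers the savings pay back the elevation cost in about 13 years, so elevation can make financial sense even before counting avoided flood damage.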

I recommend reading Catherine Tinsley and Robin Dillon-Merrill’s paper on “near miss” events in Management Science and their follow-up paper with Matthew Cronin. Their papers demonstrate that when someone survives an event like a natural disaster that was not fatal or too traumatic (e.g., a flood that didn’t do too much/any damage), they are likely to make riskier decisions in the future (e.g., they cancel their flood insurance).

A Wharton report by Jeffrey Czajkowski, Howard Kunreuther and Erwann Michel-Kerjan entitled “A Methodological Approach for Pricing Flood Insurance and Evaluating Loss Reduction Measures: Application to Texas” specifically analyzed flood loss reduction methods in Texas. They recommend supplementing NFIP coverage (FEMA currently provides homeowners with flood insurance through the National Flood Insurance Program) with private flood insurance to encourage more homeowners to purchase insurance. They also evaluate the impact of elevating structures in the most flood-prone areas, and they find that mitigation through elevation can be instrumental in reducing much of the damage.

What have you experienced with flooding?


staying safe from tornadoes

The devastating tornadoes in Oklahoma and Arkansas this weekend were a sad way to start tornado season. Sensing equipment and forecasting models have improved advance warning systems that help people take shelter; however, major tornadoes are still pretty deadly. The 2011 tornado in Joplin, MO was one of the deadliest ever. The “Weather Forecasting Improvement Act of 2014,” which passed the House in April 2014, seeks to address this need.

While there is room for improvement when it comes to advance tornado warnings, the cost associated with the warning systems is somewhat controversial. Federal funding for disasters is essentially a zero-sum game, so high-profile disasters like tornadoes can use up a disproportionately high share of the budget allocated to disaster research, leaving us vulnerable to other, less newsworthy disasters. Since there are so few tornado-related fatalities every year compared to other weather disasters (e.g., heat waves), there isn’t much potential to save lives regardless of how good the advance warning system becomes.

Meteorologist Eric Holthaus wrote a nice article on Slate about this issue [Link]:

While it’s necessary to continue making progress on hurricane and tornado forecasts, it should definitely not be at the expense of funding to improve forecasts of lower-profile weather and climate disasters that, in aggregate, kill dozens of times more people per year, and are increasing. Essentially, the bill invests scarce funds in high-profile weather events at the expense of those that cause many more deaths. Boosted by human-caused climate change, heat waves now kill more people in the United States each year than hurricanes, tornadoes, lightning and floods combined, according to the CDC. And weather-related traffic accidents kill 10 times more than heat waves—more than 6,000 people per year. [see the first image below to visualize these magnitudes]

Track the bill here. The bill focuses on forecasting, but I am going to take this a step further and examine the entire warning system to see if there are other ways we could save lives. The ultimate goal is to keep people safe, and there are many ways to do so. I grew up with tornado sirens, which were great so long as you were awake. I was surprised to learn that this is generally the case (see the bottom figure below). TV warnings are another old-fashioned way to warn people, which seemed to work back in the era when people watched network TV. Virginia does not have too many tornado sirens (although it has tornadoes!), and since I do not watch much TV, my family missed a couple of warnings. Later, Virginia Commonwealth University used the campus siren, installed to warn us of active shooters on campus, as a makeshift tornado alarm. In my opinion, that was terrific: they have had more tornadoes than shootings.

I later learned that I could get the best warnings from following meteorologists and weather nerds on Twitter. I highly recommend following nearby National Weather Service offices on Twitter to get high-quality information in real time (get started here). Weather apps can deliver warnings even if you aren’t actively using the app. I think these are all great options, but I am still sometimes blissfully unaware of serious weather despite being “prepared.” There are serious challenges in ensuring that warnings reach everyone quickly: even the most far-reaching alerts miss some people in our shifting technology landscape (e.g., the deaf cannot hear sirens). This is not so much a forecasting problem, unless the forecasts give much, much earlier warnings (think: hours instead of minutes).

Another problem is that I can always choose to ignore the warnings even if I get them. This will likely happen if there are too many weather warnings/false alarms (The Weather App That Cried Wolf). I blogged about this issue here.
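The false alarm tradeoff can be framed as a tiny decision problem. Under a classic cost-loss model, you should warn whenever the forecast probability of the event exceeds the ratio of the false alarm cost to the total cost; the two costs below are hypothetical.

```python
# Minimal cost-loss sketch of the warning threshold tradeoff. Costs are hypothetical.
cost_false_alarm = 1.0      # nuisance cost of an unnecessary warning
cost_miss = 200.0           # cost of failing to warn before a real tornado

# warning is worthwhile whenever p * cost_miss > (1 - p) * cost_false_alarm,
# i.e. whenever the forecast probability p exceeds this threshold:
threshold = cost_false_alarm / (cost_false_alarm + cost_miss)
print(f"warn when forecast probability exceeds {threshold:.3%}")
```

With a miss 200 times costlier than a false alarm, the optimal threshold is below 1%, so a rational warning system generates many false alarms, and fatigue is baked into the problem.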

The OR tie-in for this post may be somewhat dubious, but the connection here is that these issues cut to systems thinking and tradeoffs. Plus, I’m a daughter of the Midwest and will always be somewhat obsessed with weather and tornadoes. Your thoughts on optimal ways to keep people safe (broadly speaking) from tornadoes are welcome.

 

From “Investing More Money in Tornado Research Would Be a Disaster” by meteorologist Eric Holthaus. Click through for more information.

 

 

Related factoid: most of the world is not at an increased risk of tornadoes, so tornado prediction and preparedness are mostly a United States problem.

Parts of the world at an increased risk of tornadoes are in red. Courtesy of NOAA. Click through for more info.

Another factoid: tornadoes are far more likely to occur in the late afternoon and early evening than overnight. I was shocked to learn this.

Tornado likelihood across the United States. Click through for more information.

 



operations research, disasters, and science communication

I had the pleasure of speaking at the AAAS Meeting on February 17 in a session entitled Dynamics of Disasters: Harnessing the Science of Networks to Save Lives. I talked about my research on how to use scarce public resources in fire and emergency medical services to serve communities during severe but not catastrophic weather events. My research applies to weather events such as blizzards, flash flooding, and derechos that are not so catastrophic that the National Guard would come. Here, a community must meet demands for fire and health emergencies using the resources it has on “regular” days – e.g., ambulances and fire engines – while the transportation network is impaired due to snow, flooding, etc. Everything is temporarily altered, including the types of 911 calls that are made as well as travel and service times, which are affected by the impaired transportation network. Plus, it’s always a lot of fun to mention “Snowmaggedon” during a talk.

Anna Nagurney organized the session, and the other speakers included David McLaughlin, Panos Pardalos, Jose Holguin-Veras, and Tina Wakolbinger. They talked about a number of issues, including:

  • how to detect tornadoes temporally and spatially by deploying new types of sensors
  • how to evacuate people and even livestock during hurricanes and floods
  • what the difference between a disaster and a catastrophe is
  • what types of emergency logistics problems require our expertise: national vs. international, public vs. non-profit, mitigation vs. preparedness vs. response, short-term vs. long-term disaster

I applaud Anna Nagurney for organizing a terrific session. It was fascinating to talk to people in my field about disasters without focusing too much on the modeling details. We all mentioned which types of methodologies we used in the talk, but we focused on the takeaways, actionable results, and policy implications. And it’s clear that the opportunities in this area are almost endless.

The AAAS Meeting is all about science communication to a large audience. The talks focus on broader impacts, not specific model details. It’s not always easy for me to take a step back from my research and explain it at a higher level, but I get a lot of practice through blogging and talking about my research in my classes. Still, I was nervous. I am a mere blogger – the conference is heavily attended by real science journalists. In fact, I had to submit speaker information and a picture ahead of time so that journalists could prepare for my talk. I truly felt like an OR ambassador – it was quite an experience.

I attended another session on disasters, where the topics often revolved around forecasting power, false alarms, and risk communications. I have blogged about these issues before in posts such as what is the optimal false alarm rate for tornado warnings? and scientists convicted for manslaughter for making a type II error. This appears to be an ongoing issue. According to the scientists on the panel, part of the problem stems from journalists who want to make a good story even juicier by not portraying risk accurately, thus leading to false alarm fatigue.

Other sessions at the AAAS Meeting addressed several fascinating topics. One session was on writing about science, and it featured a writer from the Big Bang Theory. Another session was about communicating science to Congress. Many of the speakers were from science publications and PBS shows.

I have at least one other blog post on science communication in the works, so stay tuned.

My slides are below:


the logistics of the post-Sandy New York Marathon

I’m pleased to hear that the NYC marathon will be held on Sunday as planned.

The logistics will be challenging. The race organizers were expecting 50,000 runners before Hurricane Sandy hit. While many runners may sit out, I expect that most will try to run. After all, the hurricane hit well into the tapering phase of training, meaning that runners should be ready to run, even if they’ve been dealing with hurricane-related challenges. And most of the out-of-town runners will be relatively unaffected by the hurricane and should similarly be ready to run.

The main challenges as I see them will be to:

  1. Get runners into the city and into hotel rooms.
  2. Get runners and volunteers to the race.
  3. Distribute race supplies such as water and Powerade, and locate portable bathrooms.

#1 Get runners into the city

In huge marathons like this one, many of the runners will not be nearby. Last year, 20,000 of the runners came from overseas. The main ways to get to NYC are by plane and train. As of now, Amtrak still has not resumed NYC travel. They plan to partially restore travel on Friday. There have been a large number of flight cancellations, but flights are being restored and it appears that runners are making it to New York.

Runners from out of town also need hotels. Surprisingly, the lack of hotel rooms seems to be a larger problem for runners than transportation to NYC. The hotels are packed:

The city’s hotels are coping with a list of issues. Among them: Unprecedented cancellations and requests to extend stays; a high number of walk-in room requests from powerless local residents; unpredictable staffing levels; non-working land lines, and in some cases no steam heat.

The Pittsburgh Steelers likewise had problems finding a hotel to accommodate the team on Saturday night before their road game against the New York Giants. The Steelers are flying to Newark for their game Sunday morning.

#2 Get runners and volunteers to the race

Once in or near the city, all 50,000 runners and a few thousand volunteers need to get to the start of the race at more or less the same time. Driving to the start of a big race like this is generally not the best way to get there. The NYC marathon normally starts on Staten Island, which is harder to get to than most race starts. In the past, half of the runners took the subway in combination with the Staten Island ferry to the start of the race. Not so this year. The Staten Island ferry has been cancelled, and buses will instead transport the runners from a meeting point to the race in four waves at 4:30am, 5:30am, 6:30am, and 7:30am. There shouldn’t be a lot of traffic at 4:30am on Sunday morning, so I anticipate that the runners should be OK as long as they can take other forms of public transportation to the meeting point for the race buses.
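Some rough feasibility arithmetic for the bus plan; the seats-per-bus figure is my assumption, not a number from the race organizers.

```python
# Back-of-the-envelope bus arithmetic. Bus capacity is an assumption.
runners = 50_000
waves = 4                      # departures at 4:30, 5:30, 6:30, and 7:30am
bus_capacity = 50              # assumed seats per bus

per_wave = runners / waves
print(f"{per_wave:,.0f} runners per wave -> "
      f"{per_wave / bus_capacity:,.0f} busloads per wave")
```

That is roughly 250 busloads per hourly wave, which gives a sense of how large a fleet and staging area the organizers need even with empty pre-dawn streets.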

#3 Distribute race supplies such as water and Powerade, and locate portable bathrooms

Normally, setting up portable bathrooms and water/Powerade stations is not a complicated matter. With the number of road closures, etc., it will be more difficult to obtain the necessary marathon resources and get them where they need to be. Races need a huge number of bathrooms because all runners need to go to the bathroom at the same time (right before the race). I wasn’t sure that many portable bathrooms would be available, but it sounds like 1,750 bathrooms will be at the start of the race. I wrote about bathrooms before [Link]. That sounds like a lot of bathrooms per runner, but I can assure you, there will still be long lines.
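A quick calculation shows why lines are inevitable; the pre-race window and minutes per use below are my assumptions.

```python
# Why lines form even with 1,750 bathrooms: everyone needs one in the same
# pre-race window. The window length and time per use are assumptions.
runners, bathrooms = 50_000, 1_750
window_min, use_min = 45, 3          # 45-minute pre-race window, ~3 min per use

capacity = bathrooms * (window_min / use_min)   # total uses the window supports
print(f"capacity {capacity:,.0f} uses vs {runners:,} runners -> "
      f"utilization {runners / capacity:.0%}")
```

With these assumptions, demand is nearly double the capacity of the window, so long queues are guaranteed no matter how well the bathrooms are placed.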

In sum, I am amazed that the marathon will continue more or less as planned. I am surprised, however, that hotels may be the biggest challenge. I am also concerned about snafus with public transportation, since runners will rely on public transportation in new ways this time. I hope everything goes smoothly.

What other issues, bottlenecks, and shortages do you foresee?


the forecasting models behind the power outages forecasts for Hurricane Sandy

I’m thrilled to have interviewed Seth Guikema about his forecasting models for hurricane power outages between his gigs on Good Morning America and Bloomberg. Seth is a professor at Johns Hopkins University, and he is the rock star of hurricane power outage forecasts. I wrote about a Baltimore Sun article about his research not too long ago. On the podcast, he and I chat about the methodologies he uses in his models as well as how news sources like to turn scientific research into digestible sound bites.
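As a lay illustration of the general flavor of these forecasts (statistical models that map storm covariates such as wind speed and land cover to outage counts), here is a toy log-linear model. To be clear, this is not Seth's model; the variables, coefficients, and service areas below are invented.

```python
# Toy illustration of statistical outage forecasting. NOT Seth Guikema's model;
# every number here is invented for illustration.
from math import exp

def predicted_outages(wind_mph, tree_cover_frac, customers):
    # hypothetical log-linear outage rate per customer in a service area
    rate = exp(-8.0 + 0.06 * wind_mph + 1.5 * tree_cover_frac)
    return rate * customers

areas = [("Area 1", 60, 0.3, 700_000), ("Area 2", 75, 0.2, 500_000)]
for name, wind, trees, customers in areas:
    print(f"{name}: ~{predicted_outages(wind, trees, customers):,.0f} customers out")
```

Real models in this literature are fit to historical storm and outage data and validated out of sample, which is what makes forecasts like the ones below possible days before landfall.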

Listen here: (or go directly to the mp3 here)

You can listen to the episode below or go to the podcast web page (where you can subscribe via iTunes, etc.) and feed. I recommend subscribing to the feed or going directly to the Punk Rock OR Podcast iTunes page, but you can also find the podcast episodes on this blog by clicking on “Podcast” under “Categories” in the left column.

Seth’s models have gotten a lot of coverage. Here are a few places where you can see Seth’s work translated for a general lay audience:

Seth’s forecasts as of 6am on 10/29:

  • Total prediction: 11 million without power
  • MD: 2 million
  • DC: 300,000
  • NJ: 3.4 million
  • DE: 425,000
  • PA: nearly 4 million

Here is an image of where the power outages will occur:

Power outage forecasts for Hurricane Sandy (courtesy of Seth Guikema)