Tag Archives: homeland security

Translating engineering and operations analyses into effective policy

I am presenting at the AAAS Annual Meeting in a session entitled “Translating Engineering and Operations Analyses into Effective Homeland Security Policy” with Sheldon Jacobson and Gerald Brown.

In my talk, I will discuss three research questions I have advanced:

  1. How can we more effectively perform risk based security?
  2. What is the optimal way to allocate vehicles to emergency calls for service?
  3. What is the optimal way to protect critical information technology infrastructure?

My slides are below.


If you have any questions, please contact me!

risk analysis and extreme uncertainty in Beijing

I attended the International Conference on Risk Analysis, Decision Analysis, and Security at Tsinghua University in Beijing on July 21-23, 2017.
The conference was organized by Mavis Chen and Jun Zhuang in honor of my UW-Madison Industrial and Systems Engineering colleague Vicki Bier. The conference was attended by Vicki Bier’s collaborators and former students.

I enjoyed listening to Vicki’s keynote talk about her career in risk analysis and extreme uncertainty. Vicki talked about drawing conclusions with a sample size of one (or fewer!). In her career, Vicki has studied a variety of applications in low-probability, high consequence events such as nuclear power and security, terrorism, natural disasters, and pandemic preparedness. She stressed the importance of math modeling in applications in which the phenomenon you are studying hasn’t happened yet. In fact, you never want these phenomena to happen. Vicki told us, “I am a decision analyst by religion,” meaning that decision analysis is the lens through which she views the world and how she first starts thinking about problems, always recognizing that other methods may in the end provide the right tools for the problem.

Vicki has collaborated with many people over the years, and she shared several stories about her collaborations with her former students. I enjoyed hearing the stories about how her students challenged her and pushed her research in new directions. For example, Vicki told us, “Jun Zhuang made me confront my lifelong distaste for dynamic programming.” Vicki ended her keynote by outlining her future work, noting that she is not yet ready to retire.

Several conference attendees took a field trip to the Great Wall of China, the tomb of the thirteenth emperor of the Ming dynasty (the only Ming tomb whose underground chambers have been excavated), and the Ming dynasty’s summer palace. Many thanks to Mavis Chen and Jun Zhuang for their outstanding organization of the conference!

Pictures from the conference and its side trips are below.

Instagram photos shared by Laura Albert (@punkrockorlaura):

  • At the #greatwall outside of #Beijing #china
  • At #Tsinghua university in #beijing #China
  • #peace signs at #Tiananmen #Square in #Beijing #China
  • Outside the 13th #ming #dynasty #tomb in #beijing #china

the evolution of aviation security

You can listen to me talk about the evolution of aviation security on Wisconsin Public Radio. Norman Gilliland interviewed me for an hour on the program “University in the Air,” which aired on July 30, 2017. It was a lot of fun to chat about aviation security for an hour.

I recorded the program just before I left for an international trip. On my trip, I went through security at four airports on four continents (Chicago O’Hare, Amsterdam, Beijing, and Cairo) and closely followed the different procedures at each airport. It was interesting to see how different countries and airports tried to achieve similar goals in different ways. Aviation security will continue to evolve and change and will certainly be different a year or two from now. I’ll continue to blog about the evolution of aviation security 🙂


If a mathematical model is solved in a forest and a decision-maker is not around to see it, does it have any impact?

I am attending a workshop on Adversarial Risk Analysis and Critical Infrastructure at the Lorentz Center in Leiden. David Banks, a statistics professor at Duke and one of the workshop organizers, proposed* three levels for modeling that apply to research in statistics, operations research, and optimization:

  • LEVEL 1: You solve the problem.
  • LEVEL 2: You solve the problem in a cost-effective manner (e.g., using heuristics to get a quick solution that is “good enough”).
  • LEVEL 3: You solve the problem in a cost-effective manner that a decision-maker will implement.

The other conference attendees (mostly academics) seemed to like these three levels as much as I did. Academics like David and me spend most of our time on Level 1, even though we recognize that the point of doing all this academic work is to inform decision-making. We try to close the gap between Level 1 and Level 3 through our work, and we occasionally make that leap. We have clearly seen this happen in optimization modeling, where academic work on cutting planes, first developed decades ago (squarely Level 1 at the time), is now standard in off-the-shelf optimization software (definitely Level 2 without a loss in quality, and maybe Level 3). It might take a few years to get to Level 3, but that is how progress and innovation work.

But that is not why David brought this up. He mentioned these levels during a discussion about homeland security and terrorism, where making the jump from Level 2 to Level 3 is tough because the decision-makers, who are often politicians, are not always receptive to math modeling. That’s not to say we should give up, but rather that we should sometimes start with Level 3 as the goal and work backward, rethinking what the problem is and how we solve it.

What do you think?

* David said he “made up” these levels last week.

revisiting aviation security on September 11 fourteen years later

Today Dr. Paul Kantor (Rutgers and now UW ISYE) and I gave a University of Wisconsin-Madison ISYE department seminar on homeland security and operations research. I talked about passenger screening in the aftermath of 9/11 in a presentation without equations.

I was sad while preparing for my talk, as I mentally revisited the events of September 11, 2001. I started working on homeland security problems soon after; I had just begun a PhD program and started working with Sheldon Jacobson. Fourteen years later, I can conclude that we will always have homeland security challenges, and operations research will always help us address some of them. Here are the slides from my presentation.



aviation security: there and back again

This week I attended the CREATE/TSA Symposium on Aviation Security on the University of Southern California campus, jointly organized by the Center for Risk and Economic Analysis of Terrorism Events (CREATE) and the Transportation Security Administration (TSA).

It was a nice conference attended by academics, those at government agencies (TSA, DHS, Coast Guard, etc.), and those in the private sector. It was a good mix of attendees and speakers, and no one was shy about raising interesting and provocative ideas. Many issues were discussed in the conference from multiple viewpoints, including:

  • Are we more concerned with people with a nefarious intent and no threat items or people with no bad intent but with threat items?
  • How do we even begin to characterize the deterrent effect?
  • Good security means making tradeoffs between efficiency, effectiveness, and cost.
  • Government agencies want more collaboration with academics. Almost all non-academic speakers mentioned this.
  • What about drone security?

It was clear that aviation is still a favorite target among terrorists and that aviation security issues are still challenging. Operations research tools such as risk analysis and optimization are needed to put good ideas into action. It was nice to hear that the practitioners feel this way too. We will always have security challenges, and OR will always help us address some of these challenges.

My advisor Sheldon Jacobson talked about his work in this area, including his work with me that introduced the concept of risk-based screening (see a previous article here). Two other PhD students followed me and continued work in this area. Our work addresses how to optimally target scarce resources at passengers based on their risk. The models are resource allocation models that allocate screening resources to passengers statically and dynamically (in real time). The central theme is to use limited screening resources wisely. There are inherent tradeoffs in these types of decisions: with a fixed set of resources, targeting too many resources at low-risk passengers means there are fewer resources for higher-risk passengers.
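To make this tradeoff concrete, here is a toy sketch of static screening-resource allocation as a budgeted upgrade problem. The passenger classes, risk levels, detection rates, costs, and the greedy heuristic are all hypothetical illustrations, not the published models:

```python
# Toy sketch: allocate screening levels to passenger risk classes under a
# capacity budget. All numbers are hypothetical, for illustration only.

# Passenger classes: assessed threat probability and passenger volume.
classes = [
    {"name": "low", "risk": 0.0001, "count": 800},
    {"name": "medium", "risk": 0.001, "count": 150},
    {"name": "high", "risk": 0.01, "count": 50},
]

# Screening levels: detection probability given a threat, and unit cost.
levels = [
    {"name": "expedited", "detect": 0.50, "cost": 1},
    {"name": "standard", "detect": 0.80, "cost": 2},
    {"name": "enhanced", "detect": 0.95, "cost": 5},
]

budget = 1500  # total screening capacity, in cost units


def allocate(classes, levels, budget):
    """Start everyone at the cheapest level, then greedily apply the upgrade
    with the best marginal detection gain per unit cost until budget runs out."""
    base = min(levels, key=lambda lv: lv["cost"])
    assign = {c["name"]: base for c in classes}
    spent = sum(base["cost"] * c["count"] for c in classes)
    while True:
        best = None  # (gain per unit cost, class, new level, extra cost)
        for c in classes:
            current = assign[c["name"]]
            for lv in levels:
                extra = (lv["cost"] - current["cost"]) * c["count"]
                if extra <= 0 or spent + extra > budget:
                    continue
                gain = c["risk"] * (lv["detect"] - current["detect"]) * c["count"]
                if best is None or gain / extra > best[0]:
                    best = (gain / extra, c, lv, extra)
        if best is None:
            return assign, spent
        _, c, lv, extra = best
        assign[c["name"]] = lv
        spent += extra


assign, spent = allocate(classes, levels, budget)
for name in ("low", "medium", "high"):
    print(name, "->", assign[name]["name"])
# With these numbers: low -> expedited, medium -> standard, high -> enhanced.
```

With a fixed budget, the heuristic pushes the strongest screening toward the highest-risk classes, which is exactly the tradeoff described above: every unit of screening spent on a low-risk passenger is a unit unavailable for a higher-risk one.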

Some of the critical findings from our research include:

  • We want to match passenger risk with the right amount of security resources.
  • Risk-based screening is great because it uses limited screening resources in an intelligent way. Random screening or screening everyone with all of the resources is not an intelligent use of resources (although some randomness can be effective when used intelligently – it just shouldn’t be the only way to use limited resources).
  • When risk is underestimated, high value security resources get used on high risk passengers (a good thing). Finding a threat passenger is like finding a needle in a haystack. Underestimating risk helps you make a smaller haystack.
  • When risk is overestimated, high value security resources get used on low risk passengers, which may leave fewer high value security resources available for high risk passengers. Overestimating risk prevents you from making a smaller haystack (everyone looks risky!).
  • TSA PreCheck implicitly underscreens by weeding out many of the non-risky passengers to make a smaller “haystack.” PreCheck has the potential to make the air system safer in low risk, cost-constrained environments. Side note: TSA PreCheck didn’t exist when I was a PhD student working in this area, but earlier ideas and programs were out there (e.g., trusted traveler programs).
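The “smaller haystack” argument in the bullets above can be quantified with a quick Bayes’ rule calculation. The base rates and error rates below are hypothetical, chosen only to illustrate the effect:

```python
# Why threat screening is a needle-in-a-haystack problem, and why shrinking
# the haystack helps. All rates are hypothetical, for illustration only.

def p_threat_given_flag(base_rate, sensitivity=0.95, false_positive=0.05):
    """P(threat | flagged), by Bayes' rule."""
    p_flag = sensitivity * base_rate + false_positive * (1 - base_rate)
    return sensitivity * base_rate / p_flag

# Full haystack: one actual threat per million passengers.
full = p_threat_given_flag(1e-6)

# Smaller haystack: a PreCheck-style program diverts the vetted, low-risk 90%
# of the population, so the threat base rate in the remaining pool is tenfold.
smaller = p_threat_given_flag(1e-5)

print(f"full haystack:    P(threat | flag) = {full:.2e}")
print(f"smaller haystack: P(threat | flag) = {smaller:.2e}")
# A flag in the smaller haystack is roughly ten times more informative.
```

This is the same logic as the PreCheck bullet: diverting clearly low-risk travelers concentrates resources on a pool where a flag actually means something.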

It was nice hearing from TSA practitioners who read my papers with Sheldon and used our ideas to guide changes to policy.

Sheldon will give the long version of this talk in Arlington, Virginia on August 5 at a WINFORMS meeting. Details are here.

You can also listen to my podcast interview with Sheldon about aviation security from 2011 here.

Special thanks to Dr. Ali Abbas (CREATE director), Kenneth Fletcher (TSA), and Jerry Booker (TSA) for organizing the conference, and to Stephen Gee, Lori Beltran, and Michael Navarrete for their hard work. Ali promised to write an OR/MS Today article about the symposium, so stay tuned for more details.

TSA/CREATE Symposium attendees



Sheldon Jacobson talks about our aviation security research


aviation security: is more really more?

Aviation security has been in the news this week after ABC released a report suggesting that 95% of mock explosives and weapons went undetected in covert tests of checkpoint screening at airports.

There are several operations research challenges in passenger screening that address how the Transportation Security Administration (TSA) can set up, staff, and use its limited resources for screening passengers. My advisor Sheldon Jacobson has been working on aviation security issues since 1996 (!) and has amassed a large number of papers on passenger and baggage screening. His work provides the critical technical analysis at the foundation of security operations at commercial airports throughout the United States, including the fundamental technical analysis that laid the basis for risk-based security, which in turn led to TSA PreCheck. I wrote several of these papers with Sheldon when I was a PhD student.

Sheldon Jacobson was interviewed by Defense One about risk-based passenger screening issues.

“Ultimately, we’re dealing with people’s intent more than items. Which concerns you more: a person who has no bad intent but who has an item on them like a knife or a gun, or someone who has bad intent but doesn’t have such an item?” [Jacobson] said. “Most people are comfortable with the former rather than the latter. A person with bad intent will find a way to cause damage. A person without bad intent who happens to have an item on them is not the issue.”

Risk-based systems can help solve that problem, but only when used correctly. The most famous and widely used is TSA’s PreCheck, which launched in December 2013. It allows U.S. citizens and permanent residents who submit to a somewhat strict background check (including an in-person meeting and fingerprint scan) to receive expedited screening at airports for five years. Jacobson says the best thing policy-makers could do to improve airport security is to get a lot more people into PreCheck.

The TSA screening policies focus more on finding prohibited items than on preventing terrorists from carrying out attacks. As evidence of this, think about the time and energy used to find bottles of liquids and gels that we have accidentally left in our bags. As further evidence, recall that the box cutters and small knives used in the September 11, 2001 hijackings were not even prohibited in the first place.

Ultimately, an overhaul of screening requires more than just operations research. We also need new technologies for screening and new training programs. Promising new technology may be just around the corner. Fast Company has an article about how biometrics could be used to identify those with bad intent (rather than those who accidentally left a water bottle in their carry-on).

I’ll end today’s post with a recent article from The Onion. The Onion is a bit pessimistic – we will always have security challenges. Hopefully we can make big improvements in security at airports and if we do, operations research will have played an important role.

Punk Rock OR Podcast #4: Sheldon Jacobson on aviation security

The fourth edition of the Punk Rock OR Podcast is out.  With the 10th anniversary of September 11th coming up, I decided a podcast episode on aviation security was in order. Dr. Sheldon Jacobson from the University of Illinois at Urbana-Champaign agreed to chat with me about his research on aviation security to highlight the role of operations research in homeland security.

Don’t forget to subscribe to the podcast feed via the podcast web site.


Sheldon Jacobson

terrorism analytics

For this month’s blog challenge, I was inspired by one of the month’s big stories: how Osama bin Laden was caught.

The Navy SEALs deservedly get a lot of credit for the role they play in the ongoing wars on terror, but nerds also play a critical role in fighting terror. The intelligence that played a role in finding bin Laden depended on people on the ground in foreign countries as well as on analytics. Many intelligence agencies are populated by nerds who use analytical techniques on the large volume of data they collect.

We’ll never know exactly how important analytics is in fighting terrorism, but I’ve written a few thoughts here.

Various US government agencies collect and analyze an enormous amount of data on a daily basis.  The NSA collects data equivalent in size to the Library of Congress every six hours.  All of this data obviously cannot be scrutinized at a detailed level (hopefully they don’t get to all of it; I may be put on a watch list if someone looks at the Google search terms I used to write this post).  A data rich environment can lead to excellent decision-making if care is taken to determine how to use one’s limited analytical resources.  In the terrorism example, how does one determine

  1. which cell phone communications to record?
  2. which phone conversations deserve a transcript and which emails need to be translated?
  3. which data to summarize as metadata?

Another problem with terrorism analytics is the lack of a proper dependent variable.  For example, suppose you collect some cell phones that were used by known terrorists. If you want to look at the terrorists’ social networks by examining the calls sent and received from their phones, it is impossible to know whether the calls were made to other terrorists (unless some of the numbers belong to known terrorists).

This problem is not unlike, say, credit card companies trying to detect fraud.  Both terrorism and fraud detection involve finding a needle in the haystack.  However, terrorism social networks are large and involve many types of transactions (rather than, say, just credit card transactions). Osama bin Laden used flash drives and written communication delivered by courier, whereas others who are lower in the food chain use cell phones, land lines, email, etc. Credit card companies can also make decisions like dropping risky customers that don’t have analogous decisions when fighting terror.

It’s “easier” for a credit card company to determine who is fraudulent because it knows more about its customers and has more certainty about the dependent variable (whether fraud occurred). My credit card company called me while I was on vacation this winter, since my unusual purchases set off some kind of red flag.  I was able to verify that no fraud was taking place after I answered a few questions.  I was glad that they were looking out for me. No harm, no foul.

Analytics used for fighting terror includes mining cell phone traffic for patterns, performing social network analysis of terrorist organizations, and creating systems for assessing the risk of air passengers or cargo containers (this report summarizes some of the analytical techniques that have been used). There are certainly some fascinating examples that are classified, but we’ll have to speculate about those.

I’ve enjoyed the other blog posts about Analytics, especially those that discuss how analytics fits with the past and future of operations research. Please check out the other OR blogs to read more about analytics.


A post from the INFORMS computing society conference: are application-oriented conferences too specialized?

I am attending the INFORMS Computing Society (ICS) Conference this week.  This is my first ICS Conference.  In addition to the focus on computing, the conference selected homeland security for its theme, with a secondary theme of energy security.  I initially found it unusual for an already-specialized  conference to have an application area focus.

On second thought, many conferences have application area focuses, such as transportation, health, and risk analysis.  The application-focused conferences tend to attract people with a broad range of interests who use a variety of tools.  I suppose that the dual focus on computing would not be too limiting, but homeland security seems more focused than, say, health applications.

As a homeland security researcher, I have to say that I was eventually won over.  The talks were all pretty interesting to me, and there was a strong theme to the talks in nearly every session.  Best of all, the conference was populated with other homeland security enthusiasts, so the questions asked during the talks were really insightful.  The audience questions quite often gave me even more to think about than the talks.

Since I am interested in both computation and homeland security, nearly all of the talks appealed to me.  Despite having a mere five tracks, I often had to make some tough choices between talks scheduled at the same time and missed more than a few talks that I wanted to see.

The five concurrent sessions were a nice contrast with the 75 concurrent sessions at the INFORMS Annual Meeting.  Walking between buildings to session hop is not ideal.  When I started attending the INFORMS Annual Meeting, there were about 50 concurrent sessions, and all sessions fit within one convention center.  I have been hoping for the number of tracks to be cut back, but they seem to grow every year.  But this tangent should really be its own post, so I’ll stop reflecting for now and come back to this theme later.

I would like to know if those who are less interested in homeland security enjoyed the ICS theme of homeland security as much as I did.  Given that many homeland security applications are ultimately aimed at managing risk, they have broad applicability beyond homeland security and even extreme events.  So I am not sure if the application was really all that limiting.

I hope not too many were deterred by the application focus.  Did you attend ICS?  If so, what did you think of the homeland security focus?