Author Archives: Laura Albert

A digital device policy in the classroom

Over the years I have struggled with whether to ban cell phones and laptops. For me, whether to ban is the wrong question. A better question focuses on student learning, since I strongly believe in policies that support student learning in the classroom. I also strongly believe in treating my students like adults.

I don’t mind cell phones in the classroom; I mind behavior that interferes with learning. So two years ago I crafted a digital device policy that focuses on behavior instead of imposing bans.

I posted the digital device policy from my syllabi below. When I introduce the policy, I tell the students I expect them to be in categories 3 and 4. During the semester, I remind students of the policy if many are not meeting expectations, or I acknowledge that they are doing a great job when they are meeting expectations.

I’m more comfortable setting a clear policy that outlines expectations for behavior than a blanket ban. This also puts the students in charge of themselves.

It’s been two years, and I still like it. 

My digital device policy:

Laptops and tablets should be put away and closed if we are not using them for an in-class example. Research* shows that laptop use in class leads to lower grades for those with the laptops and even lower grades for those who are sitting by the laptop users due to the distractions they provide. I ask that you respect your peers’ desire to learn and not engage in distracting behavior in class. I understand that many of you like to follow along with the lecture notes on your tablet during class. I support the use of laptops and tablets in ways that are consistent with the course’s learning goals. I discourage taking notes during class using your laptop keyboard, since students frequently tell me they find typing noises during class to be extremely distracting.

* Sana, F., Weston, T. and Cepeda, N.J., 2013. Laptop multitasking hinders classroom learning for both users and nearby peers. Computers & Education, 62, pp.24-31.



Cell Phone / Laptop / Tablet / Device Use

In the real world, people have their phones and devices with them at their jobs, meetings, and courses. Adults do not have their devices taken away from them. They are expected to manage their own use and conform to professional expectations in every setting.

1. Far Below Expectations: Use is inappropriate. Device is a distraction to others. Example: A student plays games, views non-academic material, types (not for taking notes), reads non-academic articles, or has text or chat conversations.

2. Below Expectations: Use is distracting. Device is a distraction to the student, who frequently checks a phone or device during learning. Example: A student takes out their phone to look at a text several times during a class period.

3. Meeting Expectations: Device is not used except at designated appropriate times, or use is limited to a quick check of the phone during a transition or appropriate time. Example: If a student receives an important message from a parent, they quickly check while still being engaged in class and with no distraction to others.

4. Exceeding Expectations: Device is used only as an efficient academic tool for a direct purpose. Devices are not a distraction and are used at appropriate times as an extension of work or learning. Example: A student follows along with the lecture notes on a tablet and goes back a slide to correct a misconception about the lecture material, or looks up the formula for the binomial theorem for an in-class example, consistent with the course’s learning goals.


I use this meme in class, but fewer and fewer students know who Obi-Wan Kenobi is.


This updates my previous post on my preliminary digital device policy.

What is your digital device policy?


Target always thinks I am pregnant: on the costs of false positives and false negatives

In 2012, The New York Times published an article about an algorithm used by Target to identify shoppers that might be pregnant.

[A] man walked into a Target outside Minneapolis and demanded to see the manager. He was clutching coupons that had been sent to his daughter, and he was angry, according to an employee who participated in the conversation.

“My daughter got this in the mail!” he said. “She’s still in high school, and you’re sending her coupons for baby clothes and cribs? Are you trying to encourage her to get pregnant?”

The manager didn’t have any idea what the man was talking about. He looked at the mailer. Sure enough, it was addressed to the man’s daughter and contained advertisements for maternity clothing, nursery furniture and pictures of smiling infants. The manager apologized and then called a few days later to apologize again.

On the phone, though, the father was somewhat abashed. “I had a talk with my daughter,” he said. “It turns out there’s been some activities in my house I haven’t been completely aware of. She’s due in August. I owe you an apology.”

Target uses purchase history to predict which shoppers are pregnant. The New York Times article that broke this story implied that Target is extremely accurate in these predictions. I want to explore this a little further. Target sends shoppers who meet its pregnancy criteria coupon books for maternity and baby products. This article in Forbes outlines some of the data Target uses as well as its approach to predictive analytics.

[Target’s statistician] identified 25 products that when purchased together indicate a woman is likely pregnant. The value of this information was that Target could send coupons to the pregnant woman at an expensive and habit-forming period of her life.

I’ve always been skeptical about the accuracy of Target’s algorithm, mainly because they have found me continually pregnant since 2010–11 (the last time I was actually pregnant). Target has sent me many coupon books and ads over the years. It’s not just Target: sometimes I receive baby formula in the mail from a formula company. Babies are expensive, and it costs Target (and other companies) very little to send me baby coupons and ads. The upside is that if Target is correct, they have a huge potential profit. If they are wrong, it only costs them a little in advertising.

Target targeted me with this ad on Twitter. It’s not the first pregnancy ad that has been tailored to me. I also receive coupons in the mail.

We can unpack a procedure for identifying pregnant customers. John Foreman includes a “pregnant shopper” model in his book Data Smart to introduce linear and logistic regression, and it illustrates this point well. I’ve used this model in class, and students really like it. Regression models are fit to data from fictitious shoppers. A logistic regression model produces a score for each shopper based on their purchases across many types of products (similar to how the real application works). The score can be mapped to a 0–1 prediction or classification decision by choosing a cutoff: shoppers whose scores are above the cutoff get the coupon books. The lower the cutoff, the more false positives there will be. Different cutoffs lead to different values of the true positive and false positive rates (see the “ROC curve” image below from Data Smart).
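The score-to-decision step can be sketched in a few lines of Python. The coefficients, product categories, and cutoff below are invented for illustration and are not taken from Data Smart:

```python
import math

# Hypothetical coefficients for a few product categories (made up for illustration).
COEFFS = {"prenatal_vitamins": 2.1, "unscented_lotion": 0.9, "wine": -1.4}
INTERCEPT = -2.0

def pregnancy_score(purchases):
    """Logistic score in (0, 1) based on a shopper's purchase indicators."""
    z = INTERCEPT + sum(COEFFS[p] for p in purchases)
    return 1.0 / (1.0 + math.exp(-z))

def send_coupons(purchases, cutoff=0.5):
    """Map the continuous score to a yes/no mailing decision."""
    return pregnancy_score(purchases) >= cutoff

shopper = ["prenatal_vitamins", "unscented_lotion"]
print(round(pregnancy_score(shopper), 2))   # 0.73
print(send_coupons(shopper, cutoff=0.5))    # True
```

Lowering the `cutoff` argument mails coupons to more shoppers, which raises both the true positive and false positive rates.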

Two ways to measure accuracy include:

  1. Sensitivity: the true positive rate that measures the proportion of actual positives that are correctly identified.
  2. Specificity: the true negative rate (1 – the false positive rate) that measures the proportion of actual negatives that are correctly identified.
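Both definitions translate directly into code from the four confusion-matrix counts; the shopper counts below are made up for illustration:

```python
def sensitivity_specificity(tp, fp, tn, fn):
    """Sensitivity = TP / (TP + FN); specificity = TN / (TN + FP) = 1 - FPR."""
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical results: 100 truly pregnant shoppers, 900 who are not.
sens, spec = sensitivity_specificity(tp=90, fp=270, tn=630, fn=10)
print(sens, spec)  # 0.9 0.7 -> 90% true positive rate, 30% false positive rate
```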

What this means is that the algorithm is not necessarily accurate, and that Target is not necessarily aiming for accuracy in terms of the model’s predictive ability. Instead, Target chooses a point on the ROC curve by setting a cutoff that makes sense for its business model. If the cost of sending an ad to a non-pregnant shopper (the cost of a false positive) is low and the profit from true positives is high, Target would select a cutoff corresponding to a point on the curve with a high true positive rate as well as a high false positive rate (and low specificity). This is what I experience.
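This business logic can be made concrete with a small sketch: sweep candidate cutoffs and keep the one that maximizes expected profit rather than accuracy. The scores, labels, and dollar figures below are all hypothetical:

```python
# Hypothetical shopper scores and true labels (1 = pregnant), plus asymmetric economics.
scores = [0.9, 0.8, 0.7, 0.6, 0.4, 0.3, 0.2, 0.1]
labels = [1,   1,   0,   1,   0,   0,   1,   0]
PROFIT_TP = 50.0   # value of reaching a truly pregnant shopper
COST_MAIL = 1.0    # cost of mailing anyone a coupon book

def expected_profit(cutoff):
    """Profit from mailing everyone whose score clears the cutoff."""
    mailed = [y for s, y in zip(scores, labels) if s >= cutoff]
    return PROFIT_TP * sum(mailed) - COST_MAIL * len(mailed)

best = max([0.05, 0.25, 0.45, 0.65, 0.85], key=expected_profit)
print(best)  # the lowest cutoff wins: cheap mailings make false positives tolerable
```

With these numbers the profit-maximizing choice is to mail nearly everyone, which is exactly the high-false-positive regime described above.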

Other applications where the cost of a false positive is higher may lead to cutoffs with a lower false positive rate. I have rarely received baby formula in the mail, presumably because mailing formula comes at a much higher cost to those companies.

Another example that comes to mind is modeling sports injuries. The cost of a false positive could be high: resting a star player too much at the end of the season to stave off injury means the team could lose too many games and miss the playoffs. Not resting the player (risking a false negative) means the player could suffer a season-ending injury, which would mean the team could lose in the playoffs.

The economic impact of obesity on automobile fuel consumption

Back in 2005, I was a graduate student working with my advisor, Professor Sheldon Jacobson, at the University of Illinois. Hurricane Katrina damaged oil refineries along the Gulf of Mexico, causing production to drop drastically. Gas prices surged as supply fell.

Sheldon and I discussed fuel economy in the wake of Hurricane Katrina. He noted that one way to improve fuel economy is to remove all the junk out of a car, because lowering the weight inside a car improves its fuel economy on the micro level. He challenged me to quantify this small change in fuel economy. I was a curious and energetic graduate student, and I threw myself into answering this question for the next couple of weeks. During this time, Sheldon pointed out that as a country, people in the United States had gotten heavier. This additional body weight (about 25 pounds for both men and women between 1960 and 2003) was not insignificant. Given that Americans drive so much, this additional weight accounted for a lot of additional fuel usage.  I felt a little guilty because I had a baby whose car seat weighed about 25 pounds (it was one of the safest on the market) and often left heavy bags of cat litter in my car trunk if I was too tired to carry them inside.

After we crunched the numbers, the results were astounding: if we put people from 1960 into the automobiles of circa 2005, approximately 938 million gallons of fuel would have been saved by transporting lower-weight passengers, which corresponds to approximately 0.7% of the nation’s annual fuel consumption.

Our paper, entitled “The Economic Impact of Obesity on Automobile Fuel Consumption,” was published in The Engineering Economist. You can read the paper here.

Paper Abstract: Obesity has become a major public health problem in the United States. There are numerous health implications and risks associated with obesity. One socio-economic implication of obesity is that it reduces passenger vehicle fuel economy (i.e., the miles per gallon achieved by automobiles, which include cars and light trucks driven for noncommercial purposes). This article quantifies the amount of additional fuel consumed (annually) in the United States by automobiles that is attributable to higher average passenger (driver and non-driver) weights, during the period from 1960 to 2002. The analysis uses existing driving data in conjunction with historical weight data. The results indicate that, since 1988, no less than 272 million additional gallons of fuel are consumed annually due to average passenger weight increases. This number grows to approximately 938 million gallons of fuel when measured from 1960, which corresponds to approximately 0.7% of the nation’s annual fuel consumption, or almost three days of fuel consumption by automobiles. Moreover, more than 39 million gallons of fuel are estimated to be used annually for each additional pound of average passenger weight.
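The two framings of the headline number in the abstract (0.7% of annual consumption, and almost three days of automobile fuel use) are easy to cross-check with back-of-envelope arithmetic:

```python
# Cross-check the abstract's two framings of the 938 million gallon figure.
share = 0.007                            # 0.7% of annual automobile fuel use
days_equivalent = share * 365            # share of a year, in days of driving
implied_annual_gallons = 938e6 / share   # annual consumption implied by the share

print(round(days_equivalent, 1))             # 2.6 -> "almost three days"
print(round(implied_annual_gallons / 1e9))   # 134 (billion gallons per year)
```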


The paper was published in December 2006. A press release went out in October, a month into my first tenure track position. Sheldon and I anticipated that the paper would receive a lot of media attention. In preparation, we received training from the excellent University of Illinois media team. They helped us develop a series of takeaways and talking points about the paper and taught me how to stay on point during interviews. Sheldon was a lot more eager to talk to the media than I was. He agreed to do the heavy lifting when it came to media engagement.

When the press release came out, the paper received a lot of attention. I received phone calls at home at the crack of dawn asking for interviews for radio programs. I was on television. I was quoted in newspapers. I was on the radio. Journalists asked me how much I weighed during interviews (really!), and one asked me if I “hated fat people.” The Associated Press wrote an article that was published in hundreds of newspapers. In all, our research findings were reported in more than 400 online and print newspapers and magazines and were featured on several national cable news shows and regional radio shows. My interview with WDBJ7, a CBS affiliate in Roanoke, Virginia, appeared on the evening news. It was scary, but overall it was a great experience. Amazingly, I saw references to this paper in the popular press years after the press release. For example, a CNBC quiz asked how much gas is consumed annually due to Americans’ weight gains since 1960. I got the right answer (938 million gallons of gas).

The media firestorm helped me become comfortable working with the media, despite my introversion. I could see the value of media attention for scientific topics such as analytics, and now I always embrace bringing engineering, operations research, and analytics to the public. I am much better at talking with journalists, and I’m happy to say that no one since has asked me about my weight.

the logistics of hosting big events like the Democratic National Convention

The Democratic National Convention will be in Milwaukee next year. Since the announcement, journalists and experts have noted Milwaukee’s lack of hotel rooms for this event, with hotel reservations already stretching south to Chicago. Madison is about 75 miles away and might be completely booked for the DNC. The Milwaukee Journal Sentinel published an article about the logistical mess.

The DNC is similar to other large events hosted by cities.

Jacksonville, Florida, hosted the 2005 Super Bowl. Jacksonville (population 892,000) was the smallest city ever to host the Super Bowl. It did not meet the National Football League (NFL) rule of having a single hotel with at least 750 rooms near the stadium. Predictably, hotel and travel logistics were a nightmare. Jacksonville made up for the lack of hotel rooms with seven cruise ships, which provided about 3,700 of the 17,500 four- or five-star hotel rooms the NFL requires for Super Bowl week. The city in total had 35,000 rooms for Super Bowl week.

Milwaukee is on Lake Michigan, which probably cannot accommodate the same large cruise ships. While Milwaukee isn’t Jacksonville, perhaps temporary housing (on boats and through Airbnb) could make the hotel shortage more manageable.

What are other ways cities have managed the logistics of large events?


Today I am officially a full professor!

Today I am officially a full professor! I’ve had a wonderful journey as a professor. Below I pasted my research statement that summarizes all of the research I’ve done since my PhD. I thought this was worth sharing with people other than my colleagues on the promotions committee.

Research statement

My research applies operations research methodologies to important societal applications.  My primary methodological base is discrete optimization, including integer programming and Markov decision processes (MDPs).  I have applied operations research methodologies to address public sector problems related to emergency medical services, homeland security and infrastructure protection, and disaster response and recovery.

The public sector applications I have studied are complex systems that span people, processes, vehicles, and critical infrastructure. My research has studied how to cost-effectively allocate resources, decisions that are not made in isolation. Rather, decisions are interrelated, with every decision potentially affecting every other decision due to congestion, processing delays, capacities, and uncertainty about what can happen next. While many papers in the literature apply operations research methods to public sector problems, these papers have limitations. First, research models make simplifying assumptions that limit the applicability of the research to real systems. Second, the papers in the literature have not been adequate for addressing problems in emerging areas, such as in homeland security, infrastructure protection, and disaster management, which have become important issues of national concern in recent years.

To overcome these limitations, I have identified problems that are important from a societal point of view and interesting from a methodological point of view. My approach has been to either lift simplifying assumptions made in the literature to broaden the scope of applicability and/or formulate new models to represent problems not yet considered in the literature. Moreover, I have worked with stakeholders to gain domain expertise and have used real-world data whenever possible to ensure that the most salient aspects of the problems are reflected in my models and analysis.

My research has made several substantive and technical contributions to operations research and public sector applications. I have identified important public sector problems that have not been addressed in the literature and can be modeled using operations research techniques, thereby expanding the operations research discipline to encompass new applications that motivate new models and solution techniques. I typically formulate a new optimization model for each problem at hand, drawing on methods and ideas relevant to the application, since one generic model cannot be meaningfully applied to many settings. I have employed a broad set of methods, drawing from integer programming, stochastic programming, network optimization, network interdiction, MDPs, and queueing, since a broad methodological toolkit is essential for solving real problems. I have analyzed each of the models to provide insights about a particular problem, facilitate numerical implementation, and identify exact and heuristic algorithmic approaches. Finally, I solve the models, using real-world data or representative data sets.

My emergency medical service (EMS) research dates to 2007 and focuses on how to enhance emergency healthcare by deploying and routing multiple types of vehicles to emergency calls for service. I formed a community partnership with local fire and EMS departments, and these collaborations provided the motivation and domain knowledge for my research. While there exist many models for EMS planning in the literature, most models assume a uniform fleet of homogeneous ambulances and do not consider prioritized patients. My research has lifted these assumptions to design risk-based planning and response protocols using a heterogeneous set of vehicles and to gain policy insights. My research has led to an analysis of performance measures to support patient outcomes and survivability, new models for informing ambulance deployment and routing decisions, and new insights for balancing multiple criteria in making these decisions. My research on deploying non-transport vehicles to emergency healthcare patients was the first of its kind, which led to important policy insights for public safety leaders. This research was put into practice by Hanover County Fire & EMS in Virginia. I was part of the Hanover County team that was awarded a National Association of Counties (NACo) award in 2010: Best of Category for the Achievement Award in the Next-Generation Emergency Medical Response through Data Analysis & Planning, based on the substantial improvement my research models made in a real system. My 2012 paper in the management journal Interfaces reports the impact of this research observed in practice.

Since tenure, my research has focused on fire and EMS planning problems in tiered systems with multiple types of vehicles. There is not an analytical framework to support making decisions in tiered fire and EMS systems that send a mixture of vehicles to prioritized patients, and my research has focused on filling this important knowledge gap. My research group has formulated new MDP, integer programming, and approximate queueing models to study several important issues, such as how to cost-effectively route ambulances to patients with uncertain medical conditions, how to make joint deployment and routing decisions instead of assuming that these two interrelated decisions are separate, how to deploy a heterogeneous set of vehicles on a network for responding to prioritized calls for service, and how to design new risk-based response paradigms in congested networks. In this stream of research, I produced a suite of new models to support the design and operation of fire and EMS systems. My research group developed the first queueing model to evaluate the performance of fire departments when multiple vehicles respond to calls for service. This past year, this model was incorporated into the software used to route fire engines to calls in Canada.

My homeland security and infrastructure protection research dates to 2001. My early research was in the area of aviation security, where I developed and analyzed new integer programming and MDP models to assess the viability of risk-based prescreening systems.  The common theme that pervades this research is the novel formulation and application of dynamic optimization models to capture and improve the operation of aviation security systems.  The approach uses real-time assignment policies to adapt to variations in the day-to-day airport threat environment. My research provided policy insights regarding risk-based aviation security, served as the model for the TSA PreCheck paradigm, and provided the technical validation that helped facilitate its launch by the TSA. For my pivotal role in the creation and widespread adoption of risk-based aviation security strategies, my collaborators and I were awarded the INFORMS Impact Prize in 2018.

My research in this area took a new direction into cybersecurity after tenure. I recognized an opportunity to apply operations research methods to supply chain risk management (SCRM) to protect critical information technology infrastructure and enhance cybersecurity. My research group has formulated the first optimization models for enhancing SCRM by identifying a set of cybersecurity mitigations that are effective with respect to cost and risk-reduction and are robust to uncertainty. My research has introduced integer programming models to address security mitigation prioritization as well as a new exact algorithm for solving bi-level network interdiction models for critical infrastructure protection. My research group has also developed a new computational algorithm that efficiently solves a difficult class of infrastructure protection problems that considers the impact of adaptive adversaries, an important feature of the models in this area, on a network.

My disaster response and recovery research studies how to respond to emergencies on a network during and immediately after mass casualty incidents and how to restore a network after a disaster. This research area draws upon my emergency management and infrastructure protection expertise gained through the previous two areas of research. In this area of research, my group has investigated how community-driven data can be used to manage interdependent networks to enhance their recovery after a disruption. My research group has formulated and analyzed new models that study how to coordinate the activities of multiple types of service providers to restore a network after a disaster. The results provide insight into our understanding and management of infrastructure recovery from natural disasters.

My research formulates new operations research models and algorithms for solving important and interesting real-world problems of national interest and concern. I take this responsibility to serve my profession and our nation quite seriously. I believe that it is essential for researchers who are working on problems in the public sector to disseminate their research findings to the public through outreach in addition to dissemination in academic journals. Translating research concepts into practical messages is critical for influencing public policy and transitioning research concepts into practice. This is a common theme that permeates all my research activities.

optimization for cyber-security and protecting critical infrastructure

In the past few years, I’ve been working on cyber-security and infrastructure protection research by applying stochastic programming and network interdiction methodologies. My department posted a news article about my research with former PhD student Kay Zheng that you can read here. The research was supported by National Science Foundation grant #1422768.

My oldest daughter helped me make a short (and campy) YouTube video about my cyber-security research. It looks like a movie trailer simply because my daughter likes making movie trailers. I’d totally see this summer blockbuster 😉



The abstracts and links to my papers in cyber-security are below:

Zheng, K., Albert, L., Luedtke, J.R., Towle, E. 2019. A budgeted maximum multiple coverage model for cybersecurity planning and management, To appear in IISE Transactions. DOI: 10.1080/24725854.2019.1584832

Abstract: This article studies how to identify strategies for mitigating cyber-infrastructure vulnerabilities. We propose an optimization framework that prioritizes the investment in security mitigations to maximize the coverage of vulnerabilities. We use multiple coverage to reflect the implementation of a layered defense, and we consider the possibility of coverage failure to address the uncertainty in the effectiveness of some mitigations. Budgeted Maximum Multiple Coverage (BMMC) problems are formulated, and we demonstrate that the problems are submodular maximization problems subject to a knapsack constraint. Other variants of the problem are formulated given different possible requirements for selecting mitigations, including unit cost cardinality constraints and group cardinality constraints. We design greedy approximation algorithms for identifying near-optimal solutions to the models. We demonstrate an optimal (1–1/e)-approximation ratio for BMMC and a variation of BMMC that considers the possibility of coverage failure, and a 1/2-approximation ratio for a variation of BMMC that uses a cardinality constraint and group cardinality constraints. The computational study suggests that our models yield robust solutions that use a layered defense and provide an effective mechanism to hedge against the risk of possible coverage failure. We also find that the approximation algorithms efficiently identify near-optimal solutions, and that a Benders branch-and-cut algorithm we propose can find provably optimal solutions to the vast majority of our test instances within an hour for the variations of the proposed models that consider coverage failures.
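The greedy idea behind the (1 − 1/e) guarantee can be sketched for the simplest unit-cost, single-coverage case. This is a toy illustration, far simpler than the paper’s full BMMC model, and the mitigation-to-vulnerability map is hypothetical:

```python
def greedy_max_coverage(mitigations, budget):
    """Pick mitigations one at a time, each maximizing newly covered vulnerabilities."""
    covered, chosen = set(), []
    for _ in range(budget):
        best = max(mitigations, key=lambda m: len(mitigations[m] - covered))
        if not mitigations[best] - covered:
            break  # nothing new can be covered; stop early
        chosen.append(best)
        covered |= mitigations[best]
    return chosen, covered

# Hypothetical map from security mitigation to the vulnerabilities it covers.
mitigations = {
    "patching":   {1, 2, 3},
    "mfa":        {3, 4},
    "monitoring": {4, 5, 6},
    "training":   {1, 6},
}
chosen, covered = greedy_max_coverage(mitigations, budget=2)
print(chosen, sorted(covered))  # ['patching', 'monitoring'] [1, 2, 3, 4, 5, 6]
```

With a budget of two, greedy selection already covers every vulnerability here; the paper’s models additionally handle mitigation costs, multiple (layered) coverage, and the possibility of coverage failure.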

Zheng, K., and Albert, L.A. A robust approach for mitigating risks in cyber supply chains, To appear in Risk Analysis. DOI: 10.1111/risa.13269

In recent years, there have been growing concerns regarding risks in federal information technology (IT) supply chains in the United States that protect cyber infrastructure. A critical need faced by decisionmakers is to prioritize investment in security mitigations to maximally reduce risks in IT supply chains. We extend existing stochastic expected budgeted maximum multiple coverage models that identify “good” solutions on average that may be unacceptable in certain circumstances. We propose three alternative models that consider different robustness methods that hedge against worst-case risks, including models that maximize the worst-case coverage, minimize the worst-case regret, and maximize the average coverage in the (1 − α) worst cases (conditional value at risk). We illustrate the solutions to the robust methods with a case study and discuss the insights their solutions provide into mitigation selection compared to an expected-value maximizer. Our study provides valuable tools and insights for decisionmakers with different risk attitudes to manage cybersecurity risks under uncertainty.

Zheng, K., and Albert, L.A. Interdiction models for delaying adversarial attacks against critical information technology infrastructure. To appear in Naval Research Logistics.

Information technology (IT) infrastructure relies on a globalized supply chain that is vulnerable to numerous risks from adversarial attacks. It is important to protect IT infrastructure from these dynamic, persistent risks by delaying adversarial exploits. In this paper, we propose max-min interdiction models for critical infrastructure protection that prioritize cost-effective security mitigations to maximally delay adversarial attacks. We consider attacks originating from multiple adversaries, each of which aims to find a “critical path” through the attack surface to complete the corresponding attack as soon as possible. Decision-makers can deploy mitigations to delay attack exploits; however, mitigation effectiveness is sometimes uncertain. We propose a stochastic model variant to address this uncertainty by incorporating random delay times. The proposed models can be reformulated as a nested max-max problem using dualization. We propose a Lagrangian heuristic approach that decomposes the max-max problem into a number of smaller subproblems, and updates upper and lower bounds to the original problem via subgradient optimization. We evaluate the perfect information solution value as an alternative method for updating the upper bound. Computational results demonstrate that the Lagrangian heuristic identifies near-optimal solutions efficiently, outperforming a general-purpose mixed-integer programming solver on medium and large instances.

Zheng, K., and Albert, L.A. An exact algorithm for solving the bilevel facility interdiction and fortification problem. Operations Research Letters 46(6), 573 – 578.

We present an exact approach for solving the r-interdiction median problem with fortification. Our approach consists of solving a greedy heuristic and a set cover problem iteratively that guarantees to find an optimal solution upon termination. The greedy heuristic obtains a feasible solution to the problem, and the set cover problem is solved to verify optimality of the solution and to provide a direction for improvement if not optimal. We demonstrate the performance of the algorithm in a computational study.


EURO Working Group on Locational Analysis meeting plenary

I gave a plenary talk at the EURO Working Group for Locational Analysis (EWGLA) XXV Conference entitled “On designing public sector systems.” I am grateful to Prof. Dr. Lieselot Vanhaverbeke, organizer of the 2019 EURO Working Group on Locational Analysis meeting and professor at Vrije Universiteit Brussel (VUB) for inviting me. Her hospitality and the hospitality of the entire conference organizing committee was amazing. They even gave me a box of very nice locally made gourmet speculoos cookies, which I greatly appreciate.

My slides are below.

The references from my presentation capture almost two decades of research. The papers can be found in the Research section of my blog and are listed at the bottom of this blog post.


I captured much of my visit on twitter. Here are some highlights:


Aviation security

1. Jacobson, S. H., J. E. Virta, L. A. McLay, J. E. Kobza, 2005. Integer Program Models for the Deployment of Airport Baggage Screening Security Devices, Optimization and Engineering 6(3), 339 – 359.

2. Jacobson, S. H., L. A. McLay, J. E. Kobza, J. M. Bowman, 2005. Modeling and Analyzing Multiple Station Baggage Screening Security System Performance, Naval Research Logistics 52(1), 30 – 45.

3. McLay, L. A., S. H. Jacobson, and J. E. Kobza, 2006. A Multilevel Passenger Prescreening Problem for Aviation Security, Naval Research Logistics 53(3), 183 – 197.

4. Lee, A. J., L. A. McLay, and S. H. Jacobson, 2009. Designing Aviation Security Passenger Screening Systems using Nonlinear Control. SIAM Journal on Control and Optimization 48(4), 2085 – 2105.

5. McLay, L. A., S. H. Jacobson, and A. G. Nikolaev, 2009. A Sequential Stochastic Passenger Screening Problem for Aviation Security, IIE Transactions 41(6), 575 – 591.

6. McLay, L. A., S. H. Jacobson, A. J. Lee, 2010. Risk-Based Policies for Aviation Security Checkpoint Screening. Transportation Science 44(3), 333 – 349.

Infrastructure Protection

1. Albert McLay, L., 2015. Discrete optimization models for homeland security and emergency management, TutORial at the 2015 INFORMS Annual Meeting, November 1-4, 2015, Philadelphia, PA.

2. Zheng, K., Albert, L., Luedtke, J. R., Towle, E., 2019. A budgeted maximum multiple coverage model for cybersecurity planning and management. To appear in IISE Transactions. DOI: 10.1080/24725854.2019.1584832

3. Zheng, K., and Albert, L. A., 2019. Interdiction models for delaying adversarial attacks against critical information technology infrastructure. To appear in Naval Research Logistics.

Emergency Medical Services

1. McLay, L. A., 2009. A Maximum Expected Covering Location Model with Two Types of Servers, IIE Transactions 41(8), 730 – 741.

2. McLay, L. A., 2010. Emergency Medical Service Systems that Improve Patient Survivability. Encyclopedia of Operations Research in the area of “Applications with Societal Impact,” eds. J. J. Cochran, L. A. Cox, Jr., P. Keshinocak, J. C. Smith. John Wiley & Sons, Inc., Hoboken, NJ (published online: DOI: 10.1002/9780470400531.eorms0296).

3. McLay, L. A. and M. E. Mayorga, 2010. Evaluating Emergency Medical Service Performance Measures. Health Care Management Science 13(2), 124 – 136.

4. McLay, L. A., Mayorga, M. E., 2011. Evaluating the Impact of Performance Goals on Dispatching Decisions in Emergency Medical Service. IIE Transactions on Healthcare Service Engineering 1, 185 – 196.

5. Ansari, S., McLay, L. A., Mayorga, M. E., 2015. A maximum expected covering problem for locating and dispatching servers. To appear in Transportation Science.

6. McLay, L. A., Moore, H., 2012. Hanover County Improves Its Response to Emergency Medical 911 Calls. Interfaces 42(4), 380 – 394.

7. Ansari, S., McLay, L. A., Mayorga, M. E., 2015. A Maximum Expected Covering Problem for District Design, Transportation Science 51(1), 376 – 390.

8. Grannan, B. C., Bastian, N., McLay, L. A., 2015. A Maximum Expected Covering Problem for Locating and Dispatching Two Classes of Military Medical Evacuation Air Assets. Optimization Letters 9, 1511 – 1531.

9. Yoon, S. and Albert, L. A., 2018. Dynamic Resource Assignment for Emergency Response with Multiple Types of Vehicles. Under review at Operations Research, October 2018.

10. Yoon, S., and Albert, L. A., 2019. A dynamic ambulance routing model with multiple response. Under review at Transportation Research Part E: Logistics and Transportation Review.