Among the world’s great cultural attractions is Stockholm’s Vasa Museum, home to the world’s best-preserved 17th-century warship: the Vasa, which sank in 1628 on its maiden voyage.

When Sweden salvaged the wreck in 1961, it recovered about 98 percent of the ship’s original materials. As a result, visitors get a nearly intact ship and a rare window into maritime history.

In addition to displaying the ship itself, the museum features a range of exhibits that provide context for the ship’s construction, the circumstances of its sinking and the social and political life of 17th-century Sweden. These exhibits include artifacts recovered from the ship, such as tools, clothing and personal items.

It’s estimated that some 30 people lost their lives when the ship capsized. An inquiry reached no conclusions about who was responsible for the disaster. In retrospect, the culprits are clear: poor design, construction flaws, inadequate testing and management failures.

The Vasa was top-heavy. Its design, with two gun decks, made it unstable: the ballast was insufficient to counterbalance the weight of the upper decks and cannons, leaving the ship prone to tipping over.

Last-minute modifications, ordered by the Swedish king, added armaments and decorative features, and thus weight, to the ship’s upper decks. The design also pushed the limits of contemporary shipbuilding knowledge.

The stability tests performed on the Vasa were inadequate. A crucial test involved having 30 men run back and forth across the deck to see how much the ship would sway. The test was stopped prematurely because the ship began to rock dangerously, yet the results were ignored and the ship was deemed ready to sail.

There was significant pressure from the king to launch the ship quickly to join the ongoing war effort. Under this pressure, essential safety checks and thorough testing were bypassed.

In addition, communication breakdowns among the various parties involved in the ship’s construction, including the shipbuilders, designers and royal officials, allowed critical stability problems to go unaddressed.

On the day of its maiden voyage, a gust of wind filled the Vasa’s sails as it left the harbor. The ship began to lean to one side, and water started to enter through the open gun ports. Unable to right itself, the Vasa sank quickly, just a short distance from shore.


Technological failures are a common feature of contemporary life. Consider some of the most notorious examples:

  • The partial meltdown of a reactor core and the release of radioactive gases at the Three Mile Island nuclear facility in 1979 due to a combination of mechanical failures, design-related problems and human error.
  • The release of toxic gases in 1984 at Union Carbide’s pesticide plant in Bhopal, India, killing 3,000 to 4,000 people and harming tens of thousands, due to design flaws, safety systems failures, poor maintenance and critical equipment that was out of service.
  • Radiation overdoses by Therac-25 radiation therapy machines between 1985 and 1987, resulting in serious injuries and deaths of patients, attributed to software bugs and operator errors.
  • The 1986 space shuttle Challenger explosion, triggered by the failure of an O-ring seal on its right solid rocket booster due to cold temperatures, which killed all seven crew members.
  • The 1986 Chernobyl nuclear disaster caused by a flawed reactor design and mistakes made by plant operators.
  • The failure to convert between metric and imperial units, which in 1999 sent the Mars Climate Orbiter into the Martian atmosphere on the wrong trajectory, destroying the spacecraft.
  • The 2010 Deepwater Horizon oil spill, resulting from a failed blowout preventer and a series of poor safety decisions.
  • The 2011 Fukushima Daiichi nuclear disaster, in which a massive earthquake and tsunami disabled the power supply and cooling systems of three reactors, causing meltdowns and a major release of radioactive materials.

Not only do technological failures abound, but because of increasingly interconnected global networks and supply chains, a glitch or mishap can have international consequences. In 1965, a series of technical failures and human errors led to a massive power outage across the northeastern United States and parts of Canada.

The initial trigger was a faulty relay at a generating station near Niagara Falls in Ontario. The relay was designed to protect a transmission line from overload, but it tripped when it should not have, disconnecting the line and shifting its load onto neighboring lines, which then overloaded and shut down in sequence. Miscommunication and delays in decision-making exacerbated the problem.

The blackout affected approximately 30 million people across an area of 80,000 square miles, including major cities like New York, Boston and Toronto. Power was out for up to 13 hours in some areas. Public transportation systems, including subways and trains, came to a halt. People were trapped in elevators, and businesses and services were forced to shut down temporarily.

Just the other day, a seemingly simple software update, CrowdStrike’s faulty security patch, crashed Windows systems worldwide, disrupting flights and hospital operations.


Major technological failures are complex events that rarely stem from a single mistake or even a string of oversights. Instead, they are typically the result of multiple interacting factors that include cultural, human, psychological, sociological and technical dimensions.

Often, organizational cultures contribute to an environment in which disaster can take place. The culture at NASA at the time of the Challenger explosion placed a high value on meeting deadlines and launching despite potential risks. Engineers had concerns about the O-rings’ performance in cold temperatures, but management pressure and the normalization of deviance (accepting minor anomalies as normal) led to the fatal decision to launch.

The 2011 Fukushima Daiichi nuclear disaster arose, in part, from a corporate culture that underestimated the potential for a catastrophic tsunami. Historical data and risk assessments were either ignored or inadequately communicated among the stakeholders, contributing to the disaster.

Human factors also contribute to technological disasters. Operators at the Chernobyl nuclear plant conducted a test that involved disabling safety systems. Combined with a lack of proper training and understanding of the reactor’s behavior under low-power conditions, this test led to the explosion and subsequent meltdown.

The 2010 Deepwater Horizon oil spill partly resulted from decision-making under cost and time pressures. BP engineers bypassed safety protocols and ignored warning signs of an imminent blowout in their urgency to complete drilling of the well.

Psychological factors, too, play a role. With the Therac-25 radiation therapy machine, overconfidence in the software’s reliability led operators to dismiss anomalies as false alarms. This cognitive bias toward trusting technology without adequate skepticism resulted in lethal radiation overdoses to patients.

At Three Mile Island, operators were under significant stress and confusion during the accident. The failure to correctly interpret the control room indicators due to cognitive overload and fatigue contributed to the partial meltdown of the reactor.

Sociological factors and social dynamics of organizations are also crucial. In the Bhopal tragedy, poor labor practices, insufficient safety measures and lack of maintenance were influenced by the socioeconomic conditions and power dynamics within the organization and between the company and the local community.


Technological disasters, then, are not simply the product of a string of oversights. They arise from a complex interplay of engineering flaws, human error, psychological limitations, organizational mismanagement and cultural attitudes toward risk.

Addressing these issues requires an interdisciplinary approach, one that draws insights from historical case studies, sociological analysis, psychological research and science and engineering to minimize systemic vulnerabilities and prevent future disasters.

Historical analysis of past disasters offers valuable lessons about the causes of technology failures, their long-term impact and the evolution of safety standards. Sociological insights help us understand the social dynamics and organizational cultures that contribute to technological disasters. And since human error and cognitive biases play a significant role in many technological disasters, understanding psychological principles can lead to better training programs and decision-making processes.

By integrating these diverse perspectives, an interdisciplinary approach can provide a more thorough understanding of the causes of technological disasters and lead to more effective mitigation strategies.


Since technological mishaps have cultural, psychological, sociological and technical dimensions, we need more than legal, legislative or technical quick fixes. We need to understand failure through a multidisciplinary lens.

Disaster and systems failure studies is an interdisciplinary field that examines both natural and human-made disasters. It typically encompasses meteorology, seismology, engineering, environmental science and emergency management, and it focuses on disaster preparedness, risk assessment, emergency response, recovery and reconstruction.

Disaster researchers have made a number of important findings. One is that early warning systems can significantly reduce the death toll and economic impact of disasters. Another is the importance of identifying and addressing the special needs of highly vulnerable populations, whether defined in terms of age, poverty or geographic location. Disaster research has also found that reliance on rigid protocols tends to inhibit responses in emergencies, when a swift response is of the essence; that delays due to coordination, jurisdictional and communication issues are widespread; and that in the aftermath of a technology failure, there is a tendency not to hold individuals to account.

Thus, in the case of Hurricane Katrina in 2005, there were significant coordination failures among federal, state and local governments. Delays in decision-making, unclear command structures and inadequate communication exacerbated the disaster’s impact.

Disaster and systems failure studies could be even stronger if these programs included greater participation from history, organizational psychology, public policy and sociology.

Historical analysis can offer critical insights into the causes and consequences of past disasters, helping to identify patterns and recurring issues. It can also show how safety standards and regulations have evolved in response to disasters. Understanding this evolution might help in formulating more effective regulatory frameworks.

Organizational psychology examines how cognitive biases, stress, communication failures and decision-making processes contribute to disasters. Building a robust safety culture within organizations is crucial for disaster prevention. Organizational psychology provides tools and frameworks for fostering a culture where safety is prioritized and employees feel empowered to report potential hazards without fear of reprisal.

Public policy plays a critical role in disaster prevention through the creation and enforcement of safety regulations. Effective policies are based on comprehensive risk assessments and are designed to mitigate identified hazards while balancing safety, efficiency and cost.

Sociology offers critical insights into the social dynamics and structures that influence both the occurrence of technological disasters and their aftermath. By examining the interplay between human behavior, societal norms, organizational cultures and systemic inequalities, sociologists can contribute to more effective prevention, response and recovery strategies.


Many of today’s most serious problems have multiple dimensions, yet universities remain largely organized around academic departments that are more than a century old. The reasons are obvious: Redesigning universities to be more interdisciplinary requires significant changes in institutional priorities, funding structures, evaluation criteria and administrative practices, and changing entrenched systems is extremely difficult.

But it’s not necessary to radically redesign the university to promote the study of a particular issue from a variety of disciplinary perspectives—though a university must address workload and evaluation issues if a more interdisciplinary approach is to succeed.

Obvious areas for collaborative teaching and research include:

  • Childhood studies programs, which examine children’s development and the social forces that influence them, including children’s history, cultures, health, media, socialization, schooling and public policy.
  • Conflict resolution studies, which explores the causes, dynamics and resolution of conflict, from the personal to the international.
  • Criminal justice studies, which offers a comprehensive understanding of crime, criminal behavior, law enforcement, legal processes and corrections.
  • Design studies, which seeks to ensure that students acquire the skills and knowledge to create innovative, sustainable and impactful solutions to various business, policy, product, service, social and systems challenges.
  • Digital arts and media programs, which seek to introduce students to graphic design, 3-D modeling, animation, film and video production, interactive web and game design and development, sound and music production, and digital communication strategies.
  • Environment and sustainability programs, which study ecosystem dynamics, biodiversity and conservation, climate science, environmental history and sociology, sustainable development, natural resource management, and environmental justice, policy and law.
  • Health policy, which studies health care systems, health economics, health-care policy development and implementation, epidemiology, public health interventions, health law and ethics, the social determinants of health and health disparities.

An interdisciplinary program in technology and society would examine:

  • The design, operation and failure modes of complex technological systems.
  • Techniques for identifying, assessing and mitigating risks associated with technological systems.
  • The role of human behavior in technological failures, including cognitive biases, decision-making processes and organizational culture.
  • The social impact of technological disasters.
  • Lessons learned from past disasters and how they have shaped current practices and policies.
  • National and international regulations governing technological systems and disaster management.
  • Techniques for developing and implementing policies that mitigate the impact of technological disruptions and disasters.
  • Strategies for effective disaster response and recovery, including coordination among government agencies, nonprofits and private sector entities.

A fascinating article argues that in seeking to understand why the Secret Service failed to stop the assassination attempt against former president Trump, investigators should draw upon the insights that disaster science offers. Its author, James B. Meigs, contends that the focus should be not just on individual lapses but on the Secret Service’s culture and organizational dynamics, which may have led the organization to “gradually cut corners, take on greater risks and allow workers to grow complacent.”

For example, since safety lapses often take place in the “seams” between various organizational units (in this case, between the Secret Service and local police), what procedures were in place to ensure that warnings were communicated quickly and taken seriously?

What, then, are questions that investigators might ask about the Secret Service’s actions on July 13, 2024?

Some questions are obvious:

  • Why, once a threat was identified, did the agents not delay the former president’s appearance or rush him off the stage?
  • Why did the countersniper team wait until bullets were fired before shooting the would-be assassin? Did they lack a protocol for quickly distinguishing friend from foe?

But the more important questions have to do with the Secret Service’s organizational culture. In her 2021 book, Zero Fail: The Rise and Fall of the Secret Service, Washington Post investigative reporter Carol Leonnig depicts an agency “frat-boy culture” that lacks focus, self-discipline and close attention to detail. Is that still the case? Does the Secret Service have a pattern of ignoring “weak signals” suggesting that its agents are sloppy, disorganized and undisciplined, and that they flout security protocols?

Does the organization have sufficiently clear priorities? Is it devoting too little time to agent training? Are competing goals diverting the agency from its core focus on providing security? Is it spread too thin, given its many responsibilities, which, apart from its protective duties, include investigating cybercrime, identity theft, missing-children cases and credit-card fraud?


Our institutions need to foster cross-disciplinary collaboration if we are to truly understand the broader implications of technology and to respond to the challenges that lie ahead. An interdisciplinary approach ensures that ethical considerations are integrated into the development and deployment of new technologies. Understanding the societal and environmental impact of technology can also help in developing policies that mitigate the risk that technology will exacerbate existing social inequalities or create new environmental problems.

Since many contemporary issues are interconnected, a multidisciplinary approach is necessary to understand these systems fully and develop effective solutions.

In today’s globally interconnected world, the interdisciplinary study of technological disasters and systems failures isn’t a frill; it’s imperative. Such an approach needs to be truly holistic, encompassing history, psychology, public policy and sociology. Technology is too important to be left exclusively to engineers. Engineering the future requires many disciplines’ input.

Steven Mintz is professor of history at the University of Texas at Austin and the author, most recently, of The Learning-Centered University: Making College a More Developmental, Transformational and Equitable Experience.
