
Medicine’s future doesn’t lie ahead. It’s already here.

Health care is undergoing a radical transformation, an upheaval that is, in many respects, comparable to the biggest disruptions of the past that dramatically changed the understanding and practice of medicine.

These developments included the systematic study of human anatomy during the Renaissance; the introduction of vaccination beginning in the late 18th century; the shift from miasma and humoral theories of illness to the germ theory of disease in the late 19th century; the advent of anesthesia, which made complex, longer surgeries possible; the birth of evidence-based medicine beginning in the early 20th century; the mid-20th-century discovery of antibiotics; the introduction of medical imaging, starting with X-rays and subsequent innovations like CT scans, MRIs and PET scans; and the dawn of organ transplantation, beginning with the first successful kidney transplant in 1954, followed by advances in immunosuppressive drugs.

Even those with only a superficial understanding of medicine recognize that we are at one of the field’s pivot points in which far-reaching shifts in diagnostics, staffing and treatment are underway. Think of the innovations we are living through:

  • The advances in genetic testing, genetic sequencing and gene therapy that have made it possible to tailor treatments to a patient’s genetic makeup, optimizing therapeutic effectiveness and reducing adverse effects.
  • The integration of digital technology in health care, including electronic patient records, remote patient monitoring and automated systems to schedule appointments and assist physicians in making diagnoses.
  • The use of artificial intelligence and machine learning to systematically analyze medical images and vast amounts of data, making it possible to detect patterns that might be missed by the human eye.
  • The introduction of minimally invasive surgical procedures that allow for quicker recovery times, as well as highly precise robot-assisted surgery.

Then, there are the innovations just beginning to be introduced:

  • Next-generation mRNA vaccines that can be much more rapidly developed and produced than earlier vaccines.
  • New treatments for the reduction of low-density lipoproteins (LDL-C), a significant contributor to cardiovascular disease, as well as novel drug treatments for obesity and diabetes.
  • Implants for those with severe paralysis that allow patients to recover lost motor control.
  • The use of artificial intelligence to accelerate the detection of sepsis and hypertension.
  • The adoption of medical wearables to facilitate in-home testing and remote monitoring.
  • Gene-editing techniques that hold out the prospect of preventing or treating sickle cell anemia, congenital blindness, heart disease, diabetes, cancer and HIV.
  • Immunotherapies that harness the patient’s immune system to fight diseases.
  • Noninvasive tests that analyze blood samples to allow for earlier detection of circulating tumor DNA and other markers of disease.
  • Microneedle patches that deliver medications through the skin, offering a less painful alternative to traditional injections.

No wonder a recent New York Times headline declares, “Suddenly, It Looks Like We’re in a Golden Age for Medicine.”

What sounds like science fiction is increasingly taking place.

Data mining, new technologies and the rapid, automated analysis of data hold out the prospect of reducing medical errors, misdiagnoses and dangerous prescription drug interactions.

We are in the midst of what the eminent cardiologist Dr. Eric J. Topol calls medicine’s Gutenberg or Schumpeter moment. In a series of highly influential books and articles, Dr. Topol describes “the creative destruction of medicine”: the process of innovation and technological change that is poised to transform the practice of medicine.

The embrace of genomics, the rapid analysis of big data and the widespread use of artificial intelligence–powered diagnostics are underway, resulting in new techniques for diagnosis, prognosis, treatment and patient monitoring.

But if these breakthroughs and innovations are to be effective, as one commentator points out, “patients, caregivers, health care workers, [regulators,] and policy-makers need to be better armed with information to make confident knowledgeable decisions about the new directions that health care will take.”

I recently had a chance to chat briefly with Dr. Topol, who is one of the top 10 most cited researchers in medicine, a member of the National Academy of Medicine and the author or co-author of 1,200 peer-reviewed articles. That conversation and a talk that he subsequently gave, “How Multimodal AI Will Transform Medicine’s Future,” raise a question that all of us in higher education ought to ponder: How can we best prepare students for this brave new world? Also, do only major research universities have the expertise to train not only the next generation of geneticists, genetic counselors and lab professionals, but astronomers, computer scientists, economists, environmental scientists, neuroscientists, physicists and others?

I have been somewhat ambivalent about the education that my research-focused institution offers. But after speaking with Dr. Topol, I have begun to wonder whether it might be the case that only institutions like UT—with very large faculties and a broad array of technical specialties—can equip especially talented, driven and ambitious students for success in the fields that are rapidly emerging. Not just in medicine, but astronomy, chemistry, criminal justice and law enforcement, economics, environmental science, physics, political science, public policy, sociology and anthropology, urban planning—and even education and the humanities.

I worry that the liberal arts colleges that I intensely admire, as well as many of the less resourced, less selective regional comprehensives and urban campuses that serve the majority of undergraduates, do not currently have the capabilities to prepare undergraduates for success in the fields that are rapidly emerging in artificial intelligence, computer science, data science, genomics and machine learning.

Take a few examples.

In astronomy, artificial intelligence is already an indispensable tool in analyzing and making sense of the vast amounts of data generated by modern astronomical instruments. New technologies can identify relevant patterns or signals amid noise and terrestrial interference; assist in the automated classification of celestial objects such as stars, galaxies, supernovas and exoplanets based on their distinct characteristics; identify repeating signals (like those from pulsars) and find irregularities in light curves, which may indicate exoplanets; simulate cosmic structure formation, galaxy evolution and other large-scale cosmic phenomena; detect gravitational waves; monitor and track objects in space; enhance image quality; and organize, retrieve, visualize and interpret complex astronomical data sets.
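To give a concrete, if deliberately simplified, sense of what that kind of pattern-finding involves, here is a brief sketch of my own (not drawn from any particular research project) showing how a periodic signal might be pulled out of a noisy, unevenly sampled light curve. It assumes Python with the NumPy and Astropy libraries, and the data it analyzes are synthetic.

```python
# Illustrative sketch: recovering a periodic signal from a noisy, unevenly
# sampled light curve, the kind of task sky surveys automate at enormous scale.
# The data below are synthetic; the period is a made-up value.
import numpy as np
from astropy.timeseries import LombScargle

rng = np.random.default_rng(42)
t = np.sort(rng.uniform(0, 100, 500))       # irregular observation times (days)
true_period = 3.7                           # days (illustrative only)
flux = (1.0 + 0.05 * np.sin(2 * np.pi * t / true_period)
            + 0.02 * rng.normal(size=t.size))

# The Lomb-Scargle periodogram handles unevenly spaced observations,
# which is typical of ground-based survey photometry.
frequency, power = LombScargle(t, flux).autopower()
best_period = 1.0 / frequency[np.argmax(power)]
print(f"Recovered period: {best_period:.2f} days (true value: {true_period})")
```

The point is not this particular tool but the habit of mind: framing an observational question so that an algorithm can search millions of such light curves far faster than any human could.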

In economics and business, artificial intelligence and machine learning are already used in predictive analysis of various economic indicators, market trends and the impact of policy changes. In the field of behavioral economics, AI can sift through vast amounts of consumer data and provide insights into people’s behavior, decision-making processes and cognitive biases. Machine learning is also used in macroeconomic modeling—to forecast economic growth, inflation rates and other macroeconomic variables—and financial analysis: to predict stock prices, identify trading patterns and analyze news, social media sentiments and other external factors to predict market movements. In addition, AI and ML can be used to analyze labor market trends, predict future job market and skills in demand, enhance econometric models and optimize resource allocation and supply chains, logistics and inventories.

In psychology and neuroscience, genomics and artificial intelligence are already contributing to our understanding of brain structures and functions, as well as the localization of brain activities linked to specific cognitive tasks or emotions. These tools are also helping researchers model neural networks, neural dynamics and brain states; model cognitive processes; understand the genetic factors that influence brain development; and identify genes associated with certain behaviors or psychological traits and the interaction of multiple genes.

With the ability to analyze vast and complex data sets, model intricate systems and make predictions, AI and ML play a particularly important role in environmental and sustainability studies: in climate modeling and prediction; in monitoring air quality, biodiversity, deforestation, oceans and fresh water; and in assessing the economic impact of environmental policies and projects.

Of course, these tools can also be used in urban planning, for example, to predict traffic patterns and optimize public transport routes; in law enforcement and criminal justice, to predict crime patterns, allocate resources and assist in forensics; and in personalizing education, by adjusting content to students’ interests and pace, addressing forgetting curves, embedding remediation and identifying confusions or disengagement.

In my own neck of the woods, the humanities, AI and ML can enrich research, analysis and understanding. In the study of literature, these tools can sift through vast volumes of text to identify recurring topics, themes or motifs and trace changes in sentiment and emotional tone over time or across different cultural contexts. These technologies can also identify patterns and styles in art, help in the analysis of rhythm and pitch in different musical traditions, assist in the authentication of artworks, automate translations and transcriptions, aid in the creation of digital maps and analyze vast amounts of archival data, newspapers and other documents.
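To make this less abstract, here is a tiny, self-contained sketch of the kind of topic-surfacing just described. It is my own illustration rather than anything from a real digital humanities project: the five-sentence "corpus" is invented, and the example assumes Python with the scikit-learn library.

```python
# Illustrative sketch: surfacing recurring topics in a small text corpus.
# The "corpus" below is a stand-in for real archival or literary texts.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

corpus = [
    "the harvest failed and the village faced famine through the winter",
    "steam engines and railways transformed the movement of goods",
    "letters between the sisters describe courtship, marriage and household duties",
    "factory owners and workers clashed over wages and working hours",
    "the diary records planting, weather and the rhythms of rural labor",
]

# Count the words, dropping common English stop words.
vectorizer = CountVectorizer(stop_words="english")
counts = vectorizer.fit_transform(corpus)

# Fit a very small topic model; a real project would use far more
# documents and topics.
lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(counts)
terms = vectorizer.get_feature_names_out()
for i, topic in enumerate(lda.components_):
    top_terms = [terms[j] for j in topic.argsort()[-5:][::-1]]
    print(f"Topic {i}: {', '.join(top_terms)}")
```

In practice a scholar would feed in thousands of documents, but the workflow (count the words, fit a model, read the topics it proposes) is the same, and interpreting the results still requires the contextual judgment the humanities teach.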

At an institution like MIT, it’s natural for almost all students to be exposed to the latest and greatest advances. STEM students at places like MIT simply absorb the latest advances in informatics, machine learning and algorithm development and use them in their classes or, more likely, when they work in a lab. These skills might not be “taught” per se but simply learned through hands-on doing, since they will be part of whatever a PI is up to.

So what should institutions that aren’t MIT or Caltech do? I think it is more important than ever that the broad-access public universities and liberal arts institutions focus on ensuring their students develop advanced thinking and writing skills and quantitative and cultural literacies, with the expectation that their students might learn the newest tech on the job and in advanced degree programs.

I wholeheartedly agree with the argument that the economist Tyler Cowen advances in his 2013 book, Average Is Over: “High earners are taking ever more advantage of machine intelligence in data analysis and achieving ever-better results. Meanwhile, low earners who haven’t committed to learning, to making the most of new technologies, have poor prospects.”

Let’s not fool ourselves: the intersection of technology and human capabilities is reshaping every facet of society, and those who don’t keep up will be left behind. Automation and artificial intelligence, data science, genomics and machine learning present enormous opportunities for those who can leverage that knowledge and those skills and tools—and peril for those who don’t.

A very modest recent decrease in inequalities of income and wealth mustn’t blind us to the fact that without the proper skill sets, many Americans are fated to a financially insecure and unstable life. As one reader put it, “The ‘winners’ in the coming economy are those who can effectively use machines—not necessarily programmers but those with enough skill to use the data that machines can give us, while making the machines do what we want … The ‘losers’ will be those who do not adapt.”

Are our institutions doing a truly effective job of preparing undergraduates for that world? I have serious doubts. I fear that we are in fact reinforcing the divide between those who are learning to function in the emerging ecosystem—learning, for example, about informatics, computation, data analysis and statistics—and those who aren’t.

Nor are we doing enough to explain that graduates will need advanced education well beyond the skills an undergraduate education imparts.

It’s easy to look backward and scoff at the time when facility with Latin was a prescribed part of the college curriculum. How irrelevant, we say. But if we don’t do a better job of ensuring that our students graduate with certain essential skills—analytic, computational, interpretive—and literacies—cultural, psychological, quantitative and scientific—we will in fact destine most of them to a life in the lower tiers of a highly stratified society. We mustn’t be deluded. If we fail to prepare students for this society’s technologically dynamic sectors, they will likely find themselves adrift in a sea of uncertainty.

In his 1845 novel Sybil, published the same year as Friedrich Engels’s The Condition of the Working Class in England in 1844, Benjamin Disraeli described

“Two nations; between whom there is no intercourse and no sympathy; who are as ignorant of each other’s habits, thoughts and feelings, as if they were dwellers in different zones or inhabitants of different planets; who are formed by a different breeding, are fed by a different food, are ordered by different manners.”

This country is already seeing one manifestation of a stark economic, political and cultural divide between those who graduate from college and those who don’t. During his failed 2004 presidential campaign, former U.S. senator John Edwards spoke of “two Americas: the America of the privileged and the wealthy and the America of those who lived from paycheck to paycheck. I spoke of the difference in the schools, the difference in the loan rates, the difference in opportunity.” He might have added disparities in household structure, health and life expectancy.

I worry that we are about to see a deepening gap among those who do graduate from college—between those who have mastered the knowledge, skills and tools demanded by the new economy and those who haven’t.

If you think that the doubts about a college education’s value are troubling today, just wait until the level of disappointment and frustration among many graduates intensifies, as too many discover that college’s historic promise is unfulfilled, that they didn’t receive the preparation they needed to thrive in the new world that is rapidly emerging.

Steven Mintz is professor of history at the University of Texas at Austin.
