Will technological advancements replace your doctor?

I’m an old-school doctor who firmly believes that if you actually listen to your patient, most of the time they’re going to tell you exactly what’s wrong with them—or at the very least, point you in the right direction once you bring their family history into play.

Part of this comes from doing my clinical rotations in the U.K. during medical school. In a nationalized health system, making the best use of money is a constant concern. Fancy, high-dollar tests were off the table, so I truly learned how to become a good diagnostician the old-fashioned way, and I’m very proud of that.

I realize, however, that I’m a dying breed. And the fight to replace the tried-and-true with cutting-edge technology is on, even in medicine…

Welcome to the age of “omics”

Modern medicine is entering an era of “multiomics”—a term that covers the vast array of data we can now draw from patients through specialized testing. (Think genomics, epigenomics, microbiomics, metabolomics, and so forth.)

It’s a breakthrough—there’s no arguing that. Computerized advances have made it easier than ever to gather and make sense of all kinds of data using high-tech sensors, scanners, and wearable devices.    

And this, in turn, gives doctors and researchers a truly accurate and precise picture of patient health—which makes personalized care possible without relying on potentially incomplete or inaccurate patient recollection and family history.

In fact, “omics” technology has gotten so good at this that some doctors are starting to question whether traditional physical exams and patient histories will even be necessary much longer.

Less than half a century ago, in 1975, research showed that physical exams and medical histories were the only things necessary for an accurate diagnosis more than 90 percent of the time. And even in the last decade, studies have continued to support this traditional approach.

But a new study, published this past spring, showed that prediction models of insulin resistance—a modern approach that combines clinical findings with data from the genome, the immunome, the metabolome, the microbiome, and a handful of other “omes”—were accurate enough to completely replace more cumbersome traditional testing in the diagnosis of type 2 diabetes.

Preserving the human connection

As you might imagine, the tech geeks are all over this. (Though I should note that cost wasn’t mentioned in even one of these studies, despite being one of the most important considerations in practice.)

That said, there’s surely a happy medium to be found here.

There’s no stopping the wheels of change—and why would we want to, anyway? If this new technology can increase accuracy of diagnosis and prognosis, and streamline and expedite treatments and cures, then why wouldn’t we embrace it?

On the other hand, technology shouldn’t supplant one-on-one care any more than it already has in this world of 5-minute office visits. Patients are people—not a set of numbers. And all the fancy tests in the world can’t replace sound clinical judgment and a strong bedside connection.

The more we can do to bring diagnosis and treatment into focus, the better. But there’s no technological substitute for actual human doctors—though I’m certain we’re moving toward exactly that. (Trust me, self-checkouts are only the beginning of the technology takeover.)

If the doctor disappears, so does the human medical care we all deserve. And that wouldn’t just be a mistake. It would be the end of patient-focused medicine as we know it.


“Are the History and Physical Coming to an End?” Medscape Medical News, 09/09/2019. (medscape.com/viewarticle/917730)