Yes, There's Now a Google for Doctors
How AI finally delivered on my colleague’s joke — and how it’s changing the way clinicians look up medical information.
Back in the early internet days, one of my colleagues confessed he frequently brought up a Google search in front of his patients during office visits to look up some medical fact — medication dosing, rare side effects, best diagnostic test. He did this even though, obviously, the patient could do the same thing on their own. Plus, there has always been an unwritten rule that somehow doctors were better if they just knew stuff — that the smarter docs had the answers ready from the depths of their vast knowledge base.
How did he justify his apparent weakness?
“I tell them there’s a special Google for doctors, one you can only use if you have an active medical license,” he said. “This gives my search greater authority.”
I’ve thought of this anecdote numerous times, and even sometimes used it — jokingly — in the exam room with my patients. It’s far better to look something up and get it right than to pretend you know something and make a medical error. Just the fact that I have to write that sentence speaks to the absurdity of the all-knowing-doctor concept.
Today, artificial intelligence (AI) has quickly and dramatically changed the way clinicians do information searches. The most prominent of these tools is OpenEvidence, which in a sense brings my colleague’s claim of a Google for doctors to fruition.
What is OpenEvidence? From their “About” page:
To tame the medical information firehose, we built OpenEvidence to aggregate, synthesize, and visualize clinically relevant evidence in understandable, accessible formats that can be used to make more evidenced-based decisions and improve patient outcomes.
It’s free to use, but there’s a catch — only verified healthcare professionals and medical students can get unlimited access. Voilà, Google for doctors — and nurses, physician assistants, and students, too!
Before AI: UpToDate
Of course, many platforms have tried before to solve this problem of providing quick and accurate information to clinicians. Clearly the most successful has been UpToDate, essentially an online medical textbook that aims to provide comprehensive and authoritative guidance for clinicians with their most common questions. Full disclosure, I’ve been an editor at UpToDate for years, and knew the visionary founder Dr. Bud Rose personally.
(Here’s a remembrance I wrote about him when he died. He really was ahead of his time.)
With a staff of physician editors working full-time, and literally thousands of experts around the world providing regular input and peer review, UpToDate offered something no textbook could match — quick, authoritative revisions to guidelines based on the latest medical evidence. Compared to the glacial pace of textbook publishing, it was lightning fast and lived up to its name.
Not surprisingly, UpToDate became the dominant source of medical information for many of us, and was purchased from its founder years ago by a large medical publisher. Here’s a slide I’ve used when teaching doctors and nurses about the best sources for information in my specialty, Infectious Diseases, leading off with the 10,000-pound gorilla, referring to UpToDate:
If you want an idea of how valued this resource is among doctors, you should hear the chorus of complaints when hospitals switch their subscription to a less expensive alternative such as DynaMed. In addition, the American Board of Internal Medicine allows access to UpToDate — and nothing else — during the proctored recertification exams in both general internal medicine and the various subspecialties.
The Challenger: OpenEvidence
But now, we have a true challenger — which means that lecture of mine now has a new slide:
So what does OpenEvidence offer that UpToDate does not? For one, as mentioned previously, it’s free to use — for now. More importantly, it uses the strengths of large language models to take clinician queries further. Rather than search items by topic, you literally write your question in the search box, and it provides a remarkably accurate answer, one written to order for your question and your question alone. I’ve been using it extensively for months, and at times it borders on miraculous.
Allow me to share one particularly successful recent query. But first, a little medical background about the question I’ll be asking: Helicobacter pylori — or H. pylori — is a spiral-shaped bacterium that lives in the stomach and, for reasons still not entirely clear, can cause inflammation and ulcers. We look for it when someone has persistent upper-abdominal pain, unexplained iron deficiency (which may be from internal bleeding), or a history of ulcers, since getting rid of the infection often fixes the problem. Treatment means a short course of antibiotics and acid-reducing medicine — basically a well-targeted eviction notice for a troublesome tenant.
This is a very common outpatient problem, and the treatment is a bunch of pills taken for a couple of weeks. That’s right, pills. But what’s not common is when, for whatever reason (pain, other medical issues, recent surgery), a patient can’t take any pills at all. How can the infection be treated?
It’s been several years since I’ve been asked this question, as it’s just not an issue we run into regularly. I knew the answer would be something I’d have to look up.
Enter Stage Left, OpenEvidence, and literally seconds later, this impressive aria:
Wow. That’s pretty darn amazing — especially since the footnotes referred to several highly relevant published papers. Could I have found this information elsewhere? Likely yes, but not nearly so efficiently. For the record, the UpToDate search was not helpful.
Now here’s a whopping big caveat: like all AI products, not all the recommendations OpenEvidence provides are correct. In one recent example, in management of a complex infection, it suggested strongly that I continue a treatment that was overly complicated and potentially toxic — a combination of two antibiotics. It advised against what was clearly the better choice: a simpler treatment that I knew was just as effective, plus was safer and less expensive.
Investigating the source of the wrong advice, I found an overly dogmatic interpretation of consensus guidelines. This can be a weakness of the OpenEvidence approach, which relies heavily on such sources for its suggestions. And in this case, UpToDate clearly provided more nuanced guidance.
How about out-and-out wrong information? Sure, here’s an example — everyone should know that you can’t rule out a blood clot in the lung (pulmonary embolism, or PE) with a chest X-ray. And what about references that don’t exist, the infamous “hallucinations” of AI? While I haven’t personally found made-up references, I have found some that are clearly not related to my question.
Not surprisingly, the editors at UpToDate are hard at work adapting their platform to this era of AI, taking what they call a more “responsible” approach. Can they provide the best of both worlds? We’ll see.
Some Thoughts About the Future
Where does OpenEvidence go from here when it comes to the high stakes of patient care? For this already polished product, one could easily see subscription fees, institutional licenses, or other ways to monetize a valuable service that’s now free to use. Certainly investors agree — OpenEvidence has reported revenue of $50 million, yet a valuation of $6.1 billion.
In a best case scenario, OpenEvidence continues to refine its product, making it even more useful for clinicians by integrating itself into electronic medical records, medical-school curricula, and clinical-decision-support tools. Imagine an abnormal laboratory test accompanied by easily accessed information on causes and next steps in evaluation.
Another scenario has it going further, replacing doctors entirely, or at the very least “providing” care in settings where there are doctor shortages. Here’s Dr. Brian Carmody making that argument:
This “AI replaces doctors” concern has motivated some states to ban AI chatbots from posing as licensed healthcare providers. I certainly understand the worry: like any novice trying to accomplish a high-risk task, even the best of AI requires supervision for optimal results.
Of course, there’s also a worst-case scenario. OpenEvidence could be acquired by a rapacious private equity firm hungry for profits, leading to a watered-down product with multiple subscription tiers, distracting ads, and the sale of personal information. In this way, it would follow the path of previously useful but now bloated and enshittified sites — TripAdvisor, anyone?
For now, I’m enjoying this new “Google for doctors,” alongside — but not replacing — the trusty UpToDate. It’s fast, accurate, and occasionally humbling, a reminder that the sum of medical knowledge is now well beyond the capacity of any single human brain.
Still, I can’t help wondering if someday it will politely suggest that I take a break while it finishes seeing my afternoon patients.
Note to readers: I’m still posting over at the NEJM Group site on ID topics — including this review of two recent studies on the COVID vaccines, one with practical implications.

OpenEvidence truly is a major advance, and it has become much more reliable since it struck the licensing deal with NEJM and JAMA. Of course, it lacks context, and you must be cautious when using it to make nuanced clinical decisions. But with the rapidly growing clinical knowledge base, an AI brain is going to be essential.
For professionals who present and interpret medical data, should we ask that they disclose any financial interests in such data companies as a form of conflict of interest, as is generally requested of investors in pharma?