Can AI replace most General Practitioner functions?

I cannot believe I am asking this question, but after the last couple of days talking to Claude.AI about my most recent bloodwork, I do not see how AI taking over much of the work of GPs is anything but an inevitability. The quick answers, specificity, treatment plans, etc. have all been extremely useful. I have been more than impressed and will certainly continue using it for quick advice.

Of course, I think something like medicine will always (at least for the foreseeable future) need to involve the human element. Older generations who grew up in a time many years removed from the internet will have a harder time adjusting. That is not saying anything bad, just a mostly natural reaction. But younger generations who grow up with this stuff will almost certainly not have as hard of a time taking medical advice from a computer.

Could AI replace most GP-type doctors at some point in the future? Could we see insurance companies implementing AI to cut them out of the equation completely and give direct referrals to specialists when needed? Could it even get to a point where it could write some prescriptions? Or are there other things I am not thinking of it could be used for in this context?
 
I would say yes. Most family practice doctors are simply, and uncritically, matching the symptom list presented to them to whatever the standard-of-care treatment is, while ordering the recommended labwork and vaccine schedule for your age range.

Now there are good family practice doctors out there who actually apply judgement and discernment to customize those things, but when you encounter one of them it's an unexpected pleasant surprise.
 
I asked ChatGPT whether to use post oak or hickory for smoking a brisket the other day. I figured it would scan smoking forums and other barbecue sources online and come back with whichever one was more strongly sworn by.

It happened to come back with a "Which response do you prefer?" prompt, where it normally says the same thing in two different styles, which is a thing it does from time to time. The info WAS presented in two different ways, but on top of that, one response argued definitively for hickory and the other one SWORE by post oak.
 
I mean you are asking it an OPINION question rather than a FACT question. Now to be fair, medical care is ultimately an opinion question, but most doctors treat it as a fact question and assume the guidelines they're presented with are the ultimate source of truth, which greatly reduces their usefulness.
 
Could AI replace most GP-type doctors at some point in the future? Could we see insurance companies implementing AI to cut them out of the equation completely and give direct referrals to specialists when needed? Could it even get to a point where it could write some prescriptions? Or are there other things I am not thinking of it could be used for in this context?

How about more than just primary care.... Think big. AI is being used to scan imaging, diagnose illnesses, create treatment plans, etc.

AI is so much better at helping manage my sleep apnea treatment than my doctor's office, which is a specialist practice with a team of respiratory therapists. I upload reports from my CPAP machine to AI, and it analyzes them, gives feedback, and suggests adjustments to my CPAP settings. All my doctor's office says to me is that my treatment is so very effective 🙄.

As far as the referrals you mentioned, that's happening now; I just read an article about how traditional Medicare beneficiaries will now have to seek AI approval for prior authorization for 17 procedures. No industry is immune from AI.
 
Relying on ChatGPT (or similar AIs) for medical advice without some external validation, preferably from a doctor, is potentially dangerous.
The biggest current problem with ChatGPT for medical advice is hallucinations. Doctors can also give you confidently incorrect information, but ChatGPT is more likely to do so, and doctors are much more likely to order further tests, refer you to a specialist, or sometimes just wait and review later to see how something progresses if they do not know where to go with a problem. And that is very common; not all medical issues have clear diagnoses.
I have used it a lot for medication-related questions and to assist in trying to work out a difficult-to-diagnose rash. After several months, blood tests, 3 biopsies, my GP, 2 dermatologists, a couple of dermatology residents, and a huge amount of reading and looking at dermatology images, literally hundreds of hours of research, I finally got a vague answer from the experts that was almost identical to what I plus ChatGPT came up with. I actually found this reassuring even without a specific diagnosis or any treatment options.
I did find that the language used was very important: using correct medical terminology matters for the quality of the answers, and how the problem is framed is extremely important, since asking the same question in different ways will get different answers. In my profile I instructed it to tell me if my statements were wrong, which is probably critical to stop it from agreeing with incorrect assumptions; without this, its tendency to agree with you could be very problematic.
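For what it's worth, here is a rough sketch of the kind of "push back on me" instruction I mean, done through the API rather than the profile settings. The model name, prompt wording, and example question are just placeholders for illustration, not what I actually use:

```python
# Minimal sketch: a system prompt that tells the model to flag wrong assumptions
# instead of agreeing with them. Model name and prompt text are illustrative only.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

SYSTEM_PROMPT = (
    "You are assisting with medical research questions. "
    "If any statement or assumption in my message is wrong or unsupported, "
    "say so explicitly before answering. Do not agree just to be agreeable."
)

response = client.chat.completions.create(
    model="gpt-4o",  # placeholder model name
    messages=[
        {"role": "system", "content": SYSTEM_PROMPT},
        {
            "role": "user",
            "content": (
                "My rash only appears after sun exposure, so it must be a "
                "photosensitive drug reaction. What should I rule out?"
            ),
        },
    ],
)

print(response.choices[0].message.content)
```

The point of the system prompt is only to counter the agreeable default; the framing of the user question still matters just as much.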
In terms of knowledge, it is amazing, and in my opinion (and per some research papers) fairly close to specialist level, in some areas with the right prompts better, and definitely better than a GP, by quite a lot.
In terms of logic, ChatGPT is really still not that great and is inferior to most humans, so where logical reasoning is needed it can make serious errors. Logical reasoning is not always needed for medical advice, but sometimes it is critical.
In terms of clinical perspective and judgement of what is important and how to go about solving problems, it is surprisingly good, but still generally inferior to an actual doctor's advice.
But the fact that it is more or less free and available at any time without waiting is great, it does improve access to expert advice, and it is especially good at diagnosing rarer problems.
 
I use ChatGPT to help learn about and manage my weight loss and peptide use. I feed it all of my labs and bloodwork, my prescriptions, and what my daily life is like, from work to my current weight.

It knows me REALLY well. And it also acts as medical advice, because I'm not trying to tell my dr. that I am split dosing reta and tirz, taking 2 daily shots of a wolverine stack, or that when my ipamorelin and HGH come in, I might try to stack those too. ChatGPT will help me through all of that. I know it's wrong on occasion, and I'm ok with that.

It doesn't judge me and still helps me through whatever I'm about. Which right now is shooting chems in unmarked vials from strangers on the internet into my body several times each day.

Yes. I think that ChatGPT will replace most general practitioners and eventually the specialists.
 
Considering that medical malpractice is the third leading cause of death in the US, I probably trust AI more. I used to work in a healthcare facility, and the nurses complained more than they worked, the doctors cared more about their cars than their patients, the administrators only cared about getting paid, and they all slept together.
 
AI has already started to take over healthcare, even GPs.

I went to the doctor last year for a follow-up on an illness. My normal doctor wasn't available, so I saw another doctor at the same practice.

He caught me off guard by asking permission to use his cell phone to transcribe our conversation.

I didn't like that idea but didn't want to cause friction with a new doctor, so I said "OK."

My overall impression is that his phone (don't know which app) listened to our entire conversation and transcribed it. It then summarized what we said, and was somehow tied into the computer each patient room has, which is connected to their network.

He didn't seem to be doing any thinking on his own; the app summarized everything, then he started using the computer terminal in the room to look up treatments/medications.

When I asked specific questions, he typed them into the computer and waited for it to tell him what to say and what to recommend to me.

How is that any different from an average person using chat-gpt or grok, etc, to provide them advice?

People may say that, at a real doctor's office, at least you'll have a real doctor to review what the AI says... but in my experience, some doctors don't seem to do that. They seem to rely almost entirely on AI to listen, take notes, evaluate, and make recommendations.

And I'm supposed to pay outrageous premiums and visit fees for a person to use AI to 'treat' me? 🤔😔
 
Some doctors suck, and Claude will be better. But there are still some legitimately good doctors who will spot things with their eyes and brains that you won't have mentioned in a chat with Claude, and that remains valuable. And LLMs still hallucinate, even if they're getting better, and they hallucinate with utter confidence, sounding eminently believable. I take anything I get from Claude and cross-check it against traditional sources to make sure the analysis was sound.
 
How about more than just primary care.... Think big. AI is being used to scan imaging, diagnose illnesses, create treatment plans, etc.

AI is so much better at helping manage my sleep apnea treatment than my doctor's office, which is a specialist practice with a team of respiratory therapists. I upload reports from my CPAP machine to AI, and it analyzes them, gives feedback, and suggests adjustments to my CPAP settings. All my doctor's office says to me is that my treatment is so very effective 🙄.

As far as the referrals you mentioned, that's happening now; I just read an article about how traditional Medicare beneficiaries will now have to seek AI approval for prior authorization for 17 procedures. No industry is immune from AI.
Yep, AI already is doing that. How I ended up on Mounjaro initially is that I had about the highest blood glucose level you can have and not be diabetic. I wear glasses, near-sighted with bifocals for reading, but I have progressive lenses, so they cover the entire range of focal lengths between far and close.

Two months after the about-to-be-diabetic-in-about-15-minutes diagnosis (and still on a waiting list to get the starter dose of Mounjaro), I could finally go (insurance time restrictions) for an eye exam. I knew I needed a new glasses Rx and just thought I needed higher power to read, as my current ones were now off. So the eye doctor goes, uh, your eyesight got BETTER by 2 full diopters in the last year. That is unusual and generally indicates high blood sugar, as it causes the lenses in your eyes to swell and change shape, and far-sighted people get less far-sighted. I say omg! I just got that diagnosis!

So my eye doctor says, well, the standard of care is to take pictures of your retina to look for signs of diabetic retinopathy. So we do, and AI software now does an analysis on the picture to detect that. My doctor said she would also look herself, as she doesn't quite trust it completely, but it has not been wrong yet since she's been using it. Good news, as there was no sign of it. And I got new glasses. Also, as my blood sugar went back to normal over the following year, the lenses returned to their old shape and my eyesight went back to being worse.
 