Can AI replace most General Practitioner functions?

mraajr

GLP-1 Apprentice
Member Since
Oct 13, 2025
Posts
65
Likes Received
203
Location
Missouri
United States
I cannot believe I am asking this question, but after the last couple of days talking to Claude.AI about my most recent bloodwork, I do not see how AI taking over much of the work of GPs is anything but inevitable. The quick answers, specificity, treatment plans, etc. have all been extremely useful. I have been more than impressed and will certainly continue using it for quick advice.

Of course, I think something like medicine will always (at least for the foreseeable future) need to involve the human element. Older generations who grew up in a time many years removed from the internet will have a harder time adjusting. That is not saying anything bad, just a mostly natural reaction. But younger generations who grow up with this stuff will almost certainly not have as hard of a time taking medical advice from a computer.

Could AI replace most GP-type doctors at some point in the future? Could we see insurance companies implementing AI to cut them out of the equation completely and give direct referrals to specialists when needed? Could it even get to a point where it could write some prescriptions? Or are there other things I am not thinking of it could be used for in this context?
 
I would say yes. Most family practice doctors are simply uncritically matching a symptom list presented to them to whatever the standard of care treatment is while ordering the recommended labwork and vaccine schedule for your age range.

Now there are good family practice doctors out there who actually apply judgement and discernment to customize such things, but when you encounter one of those, it's a pleasant surprise you weren't expecting.
 
I asked ChatGPT whether to use post oak or hickory for smoking a brisket the other day. I figured it would scan smoking forums or other smoking-related sites online and come back with whichever one was more strongly sworn by.

It happened to come back with a "Which response do you prefer?" prompt, where it gives the same answer in two different styles, which is a thing it does from time to time. The info WAS presented in two different ways, and also, one argued definitively for hickory and the other one SWORE by post oak.
 
I mean, you are asking it an OPINION question rather than a FACT question. Now, to be fair, medical care is ultimately an opinion question, but most doctors treat it as a fact question and assume the guidelines they're presented with are the ultimate source of truth, which greatly reduces their usefulness.
 
How about more than just primary care? Think big. AI is already being used to scan imaging, diagnose illnesses, create treatment plans, etc.

AI is so much better at helping me manage my sleep apnea treatment than my doctor's office, where a specialist works with a team of respiratory therapists. I upload reports from my CPAP machine to AI, and the reports are analyzed, feedback is provided, and I'm given adjustments to my CPAP settings. All my doctor's office says to me is that my treatment is so very effective 🙄.

As far as the referrals you mentioned, that's happening now. I just read an article about how traditional Medicare beneficiaries will now have to seek AI approval for prior authorization for 17 procedures. No industry is immune from AI.
 
Relying on ChatGPT (or similar AIs) for medical advice without some external validation, preferably from a doctor, is potentially dangerous.
The biggest current problem with ChatGPT for medical advice is hallucinations. Doctors can also give you confidently incorrect information, but ChatGPT is more likely to do so, and doctors are much more likely to order further tests, refer you to a specialist, or sometimes just wait and review later to see how something progresses if they do not know where to go with a problem. And this is very common; not all medical issues have clear diagnoses.
I have used it a lot for medication-related questions and to assist in trying to work out a difficult-to-diagnose rash. After several months, blood tests, 3 biopsies, my GP, 2 dermatologists, a couple of dermatology residents, and a huge amount of reading and looking at dermatology images (literally hundreds of hours of research), I finally got a vague answer from the experts that was almost identical to what I plus ChatGPT had come up with. I actually found this reassuring, even without a specific diagnosis or any treatment options.
I did find that the language used was very important: using correct medical terminology matters for the quality of answers, and how the problem is framed is extremely important, since asking the same question in different ways will get different answers. In my profile I instructed it to tell me when my statements were wrong, which is probably critical to stop it from agreeing with incorrect assumptions; without this, its tendency to agree with you could be very problematic.
In terms of knowledge, it is amazing, and in my opinion (and in some research papers) fairly close to specialist level, in some areas with the right prompts better, and definitely better than a GP, by quite a lot.
In terms of logic, ChatGPT is really still not that great, and inferior to most humans, so where logical reasoning is needed it can make serious errors. That is not always needed for medical advice, but sometimes it is critical.
In terms of clinical perspective and judgement of what is important and how to go about solving problems, it is surprisingly good, but still generally inferior to an actual doctor's advice.
But the fact that it is more or less free and available at any time without waiting is great; it does improve access to expert advice, and it is especially good at diagnosing rarer problems.
 
I use ChatGPT to help learn about and manage my weight loss and peptide use. I feed it all of my labs and bloodwork, my prescriptions, and what my daily life is like, from work to my current weight.

It knows me REALLY well. And it also acts as medical advice, because I'm not trying to tell my doctor that I am split-dosing reta and tirz, taking 2 daily shots of a Wolverine stack, or that when my ipamorelin and HGH come in, I might try to stack those too. ChatGPT will help me through all of that. I know it's wrong on occasion, and I'm OK with that.

It doesn't judge me and still helps me through whatever I'm up to. Which, right now, is shooting chems in unmarked vials from strangers on the internet into my body several times each day.

Yes. I think that ChatGPT will replace most general practitioners and eventually the specialists.
 
Considering the fact that medical malpractice is the third leading cause of death in the US, I probably trust AI more. I used to work in a healthcare facility, and the nurses complained more than they worked, the doctors cared more about their cars than their patients, the administrators only cared about getting paid, and they all slept together.
 