Yes, I was a bit too extreme with my answer above. However, you'll be hard-pressed to find people who don't already know the answer yet can formulate a question as good as:
"Can I take ibuprofen if I just took a cold medicine that contains acetaminophen?"
Refilling meds, absolutely… As long as the AI has access and can accurately interpret your medical history
This subject is super nuanced, but the gist is that AI is massively hyped right now, and it's in the best interest of the people pumping that hype to keep the bubble growing. As such, Nvidia selling us on the opportunities in AI is like wolves telling us how delicious, and morally sound, it is to eat sheep three times daily.
Oh, and I don't know what kind of "therapy" you were referring to… But psychotherapy simply cannot be done by AI. You might as well tell people to get a dog, or say "when you feel down, smile."
As long as the AI has access and can accurately interpret your medical history
This is the crux of the issue, imo. Interpreting real people's medical situations is HARD. Say the patient has a history of COPD in the chart. Who entered it? Was the right testing done to confirm it? Have they been taking their inhalers and prophylactic antibiotics? The patient says yes, but their outpatient pharmacy fill history says otherwise (or even the opposite, lol). Who do we believe, and how do we find out what most likely happened? Also, their home BiPAP machine is missing a part, so better find somebody to fix that, or get a new machine.
Everyone wants to believe that medicine is as simple as "patient has x y z symptoms, so statistics say they've got x y z condition," when in reality everything is intense shades of grey and difficult-to-parse, overlapping problems.
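To make the pharmacy-fill point concrete, here's a minimal sketch of the kind of adherence check software can actually do. All names, dates, and thresholds are hypothetical, made up for illustration: even when the arithmetic flags a conflict between the chart and the fill history, it can't tell you which source is right.

```python
from datetime import date

def fill_gaps(fills, days_supply):
    """Return days late for each refill: how long after the previous
    supply should have run out did the next fill actually happen."""
    gaps = []
    for prev, nxt in zip(fills, fills[1:]):
        expected_refill = prev.toordinal() + days_supply
        gaps.append(max(nxt.toordinal() - expected_refill, 0))
    return gaps

# Hypothetical patient: chart says adherent, fills say otherwise.
chart_says_adherent = True
fills = [date(2024, 1, 5), date(2024, 3, 20), date(2024, 7, 1)]  # 30-day supplies

gaps = fill_gaps(fills, days_supply=30)
print(gaps)  # → [45, 73]: each refill came in weeks late

# The two sources disagree; deciding whether the chart, the pharmacy,
# or the patient's own account is correct is the hard human part.
conflict = chart_says_adherent and any(g > 14 for g in gaps)
print(conflict)  # → True
```

The detection is trivial; the resolution (wrong chart entry? samples from the clinic? a second pharmacy?) is exactly the grey area described above.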
That's exactly right… I've been working in healthcare IT for over 20 years and have seen this over and over.
Even IT stuff, which is 1000 times closer to binary than the human body, is very hard to troubleshoot when humans are involved.