Clinical decision support coupled with cost optimization will grow over the next ten years as AI matures and companies like Amazon move into healthcare. So what do I mean by this in a practical sense?
Say you are a patient with abdominal pain. Your doctor thinks you may have a gallstone, but it could also be a kidney stone, or perhaps a bulging disc in your back, among other conditions. A test would help your doctor diagnose you. But which test? It could be an ultrasound, CT, MRI, or a nuclear medicine study, along with blood work. And which would be the best test for you, given your medical history, the tests you've already undergone, and the cost of those tests in your area?
This is far too much information for a busy doctor to analyze in a 15-minute office visit. Ahh, but suppose there is an AI platform that holds your entire medical history, along with the costs of local services that Amazon has already figured out. Your doctor, working in conjunction with the platform, can enter relevant clinical information from their physical exam and interaction with you to determine the optimal test to order to diagnose your condition. This would allow high-quality, cost-effective care to be delivered with the help of AI, in collaboration with your doctor's input.
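To make the idea concrete, the test-selection step described above can be sketched as a simple scoring function. Everything here is an illustrative assumption: the test names, diagnostic values, local prices, the discount for recently repeated tests, and the cost weight are all invented for the example, not real clinical data or any vendor's actual algorithm.

```python
# Hypothetical sketch: rank candidate diagnostic tests by weighing
# diagnostic value against local cost and the patient's recent history.
# All numbers below are illustrative assumptions, not clinical data.

def rank_tests(candidates, history, cost_weight=0.3):
    """Score each test: its diagnostic value, discounted if the patient
    recently had it, minus a penalty proportional to its local cost."""
    max_cost = max(t["local_cost"] for t in candidates)
    scored = []
    for test in candidates:
        value = test["diagnostic_value"]
        if test["name"] in history["recent_tests"]:
            value *= 0.5  # a recent duplicate adds less new information
        score = value - cost_weight * (test["local_cost"] / max_cost)
        scored.append((score, test["name"]))
    return sorted(scored, reverse=True)  # best test first

candidates = [
    {"name": "ultrasound", "diagnostic_value": 0.80, "local_cost": 200},
    {"name": "CT",         "diagnostic_value": 0.90, "local_cost": 900},
    {"name": "MRI",        "diagnostic_value": 0.92, "local_cost": 1600},
]
history = {"recent_tests": ["CT"]}  # patient already had a CT recently

ranking = rank_tests(candidates, history)
```

With these made-up numbers, the cheap ultrasound comes out on top and the recently repeated CT drops to the bottom; a real platform would of course use clinically validated models rather than a hand-tuned formula like this.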
How does this change the job of a doctor? It would make them more efficient and able to provide higher-quality care. They would need to learn to interact more closely with an AI platform and embrace a stronger partnership with technology. The result could be a decreased mental burden, lower rates of provider burnout, and increased job satisfaction, with more individuals wanting to enter medicine. That, in turn, could help address our problems with limited access at a time when an aging population requires more services.
Healthcare jobs that are purely clinical, such as drawing and processing blood samples, will be endangered. You want to focus on higher-level, more consultative roles: doctors, physician assistants, nurses, and so on.
The AI might run the test and make the diagnosis, but humans would still have to explain the diagnosis, help patients understand what they have to do, and follow up to make sure they are implementing it. In other words, healthcare workers would become change managers, with the patient as the project.
I'm building and selling Healthcare AI right now - and these are snippets from what I've seen so far.
1. Diagnosis should be a machine problem. There are far more clinically relevant data points and factors than a doctor can reasonably fit into her head.
2. However, to get to machine diagnosis using AI, the data needs to be available in the first place - which means your entire medical history (correctly and extensively documented in a standard format), all hardware/vendor systems, and other ancillary systems must be able to communicate with each other. That API-fication of healthcare is largely missing, and given that the vendors of most systems are archaic tech giants, it is unlikely to change very fast.
3. Having said that, AI-based diagnosis can certainly power "screenings" - large public health programs looking for diseases like TB or diabetic retinopathy. It can also do mechanical tasks such as preparing reports for a specific blood sample. Note that both of these applications are extremely narrow: they take out some tasks rather than entire jobs.
4. Finally, diagnosis is at best 20% of a doctor's job. She also has to counsel patients and perform surgeries. While there are advancements on these fronts, they are not robust enough to replace doctors entirely.
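The "standard format" gap in point 2 is what HL7 FHIR, the real-world interoperability standard, is meant to close. Below is a minimal sketch: a heavily simplified FHIR-style Observation resource (a hemoglobin lab result, with made-up values) and a tiny parser that flattens it into something any downstream system or model could consume. Real FHIR resources carry many more fields than shown here.

```python
# Sketch of a simplified HL7 FHIR-style Observation (a lab result).
# The record's values are illustrative, not from any real system.
import json

record = json.dumps({
    "resourceType": "Observation",
    "status": "final",
    "code": {"coding": [{"system": "http://loinc.org",
                         "code": "718-7",
                         "display": "Hemoglobin [Mass/volume] in Blood"}]},
    "valueQuantity": {"value": 13.2, "unit": "g/dL"},
})

def summarize(raw):
    """Flatten a FHIR-style Observation into a (name, value, unit) tuple."""
    obs = json.loads(raw)
    coding = obs["code"]["coding"][0]
    qty = obs["valueQuantity"]
    return coding["display"], qty["value"], qty["unit"]

summary = summarize(record)
```

The point of a standard like this is that the parser does not care whether the record came from a lab analyzer, a hospital EHR, or a wearable - which is exactly the API-fication that is still largely missing in practice.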
To sum up:
Specifically for the next 10 years, there won't be any significant increase or decrease in healthcare jobs - AI is not yet big enough to eat healthcare. However, as these systems slowly settle into the ecosystem, the way doctors work will start changing: they will become more and more reliant on systems, and for the patient we can hope for better diagnoses, faster turnaround times, and an overall better experience. The cost to the patient, though, is an entirely separate question. Will patients absorb the millions of dollars sunk into building a healthcare AI? What are the implications for the hospital, the patient, and the insurance provider?
I love discussing these; please do let me know if you want a follow-up conversation!