By Katrina Manaloto
What does it mean to be “healthy”? Is it eating your daily fruits and vegetables, or walking 10,000 steps a day? Maybe today, it’s meditating in the morning and meeting up with a friend in the afternoon…but tomorrow, it could be sleeping in and cocooning at home. With today’s focus on holistic health, which prioritizes physical and mental wellbeing equally, staying “healthy” demands an endless series of choices and commitments. It’s an almost impossible task for anyone to manage. But with the advancement of artificial intelligence (AI), health and wellness are becoming more accessible and attainable.
In the clinical setting, AI is already proving to be a valuable tool. In neuroimaging, AI streamlines image analysis, diagnosis, and ultimately patient care. For example, Viz.ai’s Viz LVO software automatically detects large vessel occlusions (blockages in the brain’s major blood vessels that indicate a stroke) and alerts the patient’s care team about its findings within 6 minutes of the patient’s imaging session. At Drexel University, researchers discovered that GPT-3, the language model that underlies ChatGPT, can identify subtle speech cues in patients that may signify Alzheimer’s disease, suggesting AI can aid its early detection. Beyond clinical uses that supplement healthcare providers’ work, Google announced in May 2023 that Med-PaLM 2, its large language model for medicine, will be able to analyze and interpret medical text and images. Med-PaLM 2 is also capable of performing medical consultations, empowering patients to actively address their concerns without unnecessary in-person visits. This gives patients greater agency to take responsibility for their health beyond their yearly physical. One thing is clear: AI is quickly moving beyond its current role as a diagnostic tool and transforming approaches to preventative care.
As AI expedites the management of our physical health, it creates greater opportunities for care providers to address our mental health and wellness. There are astounding applications of AI and neurotech for wellness, from emotion AI-powered glasses that help kids with autism interpret social cues, to personalized fitness and nutrition coaching, to a free chatbot called Woebot that adapts principles of cognitive behavioral therapy to converse with users. In the mental health field, psychiatric and therapeutic services are stretched thin by overwhelming demand and climbing health insurance costs, especially post-COVID. According to the American Psychological Association, about 84% of psychologists who treat anxiety disorders and 74% of those who treat depression have witnessed an increase in treatment demand since the onset of the pandemic. Since many psychologists are unable to meet this rapid rise in demand, AI-driven technologies geared towards mental health feel refreshingly accessible and user-centric. Like AI’s clinical applications, these wellness-geared applications give people the agency to control and optimize their personal wellbeing. For Gen Z, technologies like these are not a far cry from the social media and apps they already use to find social support, like Snapchat’s “My AI” feature and ChatGPT. And as people become more comfortable with AI, companies can leverage this familiarity to create more AI tech that caters to enhancing wellness.
However, AI-driven healthcare still raises many ethical questions and cautions about its role and significance. One fear is that AI’s takeover of wellness will further commodify and mechanize it, especially in mental healthcare. Many AI-driven mental health apps and wearables carry steep fees, making care that is desperately needed in a post-pandemic world inaccessible. Moreover, deferring to digital solutions while talk therapy and psychiatry remain inaccessible does not resolve the problems of those with more severe mental health issues, who require consistent and intuitive care. Unfortunately, AI cannot yet provide that kind of intensive care, with most AI-driven mental health apps seeing poor retention rates and a lack of specificity. If these trends continue, along with the many additional concerns about privacy and government regulation, AI could insidiously become a way to profit from people’s suffering rather than contribute to wellness. Though we like to imagine AI as a panacea of sorts, we must still acknowledge its current failings and continue to find balanced approaches that utilize both in-person and AI-driven resources. As researchers continue to map the limits and capabilities of AI, its role in healthcare and other realms can be reevaluated and expanded in a measured manner, as experts see fit.
Despite these potential issues, AI is undoubtedly a useful and revolutionary tool that can advance human practices beyond what was imaginable only a decade ago. This is best seen in the clinical setting, where AI has rapidly integrated into a healthcare provider’s workday, contributing to better-quality care by filling information gaps and expediting certain processes. Though the pandemic largely accelerated this integration of AI into clinical care, its success is evident: over half of US healthcare executives believe that AI is often or very effective at improving clinical outcomes. AI in the wellness space, however, has not yet achieved similar success; for instance, 71% of Americans who have heard about mental health chatbots say they would not want to use one for their own mental health. Rather than try to replace therapy or social support entirely, as some of these technologies have attempted, wellness AI should take a cue from its clinical counterpart and collaborate with in-person resources to coordinate care symbiotically. With the proper precautions and research, AI will continue to transform and even revolutionize the landscape of healthcare, ensuring that good health remains attainable.
How would you like to see AI become part of your daily health care? Share your answer here.