“We propose that a 2 month, 10 man study of artificial intelligence be carried out during the summer of 1956 at Dartmouth College in Hanover, New Hampshire.” One of the first uses of the term “artificial intelligence” came in this 1955 proposal for a summer research project. At that point, artificial intelligence (AI) was still a nascent field, one that wouldn’t enter healthcare for another 20 years.
Now, artificial intelligence – or the combination of computer science and datasets to simulate human intelligence and skills – has a variety of applications in healthcare, such as automating specific tasks, analyzing medical information, or making diagnoses. Algorithms, chatbots, smart assistants, and other forms of AI can perform these tasks – and then improve their own performance – based on data. They’re given data, analyze it, and receive feedback that reinforces their correct answers, decisions, and information and discourages – and therefore limits – any incorrect ones.
However, in women’s healthcare in particular, AI can lack data – and as a result, lack effectiveness, whether in correctly transcribing the terms for women’s anatomy or in recognizing sex differences across health conditions. For context, women were largely excluded from clinical trials until 1993; findings from trials before that year were based on men and generalized to women. While women’s participation in clinical trials has increased over the past 30 years, the share of studies that analyze data by sex has not. An article published in 2018 found that, on average, only about 34% of studies – across conditions such as cardiovascular disease prevention, depression treatment, and emergency medicine – analyzed data by sex.
Without sex-disaggregated data, humans – and by extension, AI – don’t have the information they need to diagnose correctly, recommend appropriate and effective treatments, or even save patients from a preventable death, especially if those patients are women. For example, the most widely recognized symptoms of heart attacks, such as chest pain, shortness of breath, and lightheadedness, are typical for men. Women’s symptoms can include chest pain but also back pain, flu-like fatigue, nausea, night sweats, and stomach pain. Because men’s symptoms are treated as the default, however, women are 50% more likely to be misdiagnosed following a heart attack and 50% more likely to die within the year after a heart attack than men are. AI that is trained to recognize an imminent heart attack might only perpetuate these same human errors.
The relative lack of information around women’s healthcare is not the only bias that can hamper the usefulness and effectiveness of AI, though. Developers, who are usually white and male, may unknowingly ingrain their own subconscious stereotypes, such as those around sex, gender, and gender roles, into these tools. AI can thus internalize biases (such as that men are the physical norm and women are abnormal, that female test subjects are more variable than male ones, or even that men are the doctors and women are the nurses) and present them as facts.
As Dr. Helena Deus, Director of Disruptive Technologies at Elsevier, summarized in one of her presentations, “AI is only as good as the data used to train it.” Data that is incomplete (due to the lack of sex-specific healthcare data) or incorrect (due to biases around women and health) can not only limit the usefulness of these tools but also make them dangerous to patients.
Despite these challenges, artificial intelligence in women’s health can have positive effects. Over the past few years, AI has helped accurately flag precancerous changes in the cervix, reduce false positives and false negatives in mammogram readings, identify at-risk pregnant women in their first trimester, and, in the case of Wellen, increase the accessibility of information about osteopenia and osteoporosis.
Wellen, founded and led by Priya Patel, aims to provide services for healthy living and active aging, starting with personalized fitness programs for people with osteopenia and osteoporosis. Osteopenia is the initial stage of bone loss and can progress to osteoporosis, which affects about 10 million Americans over 50, 80% of whom are women. Women can lose up to 20% of their bone density in the five to seven years following menopause – and every 10% drop in bone density increases their risk of fracture by two to three times. In fact, half of women over 50 will suffer an osteoporotic fracture.
Wellen’s exercise programs focus on strength, balance, and posture to help improve bone health, prevent falls and broken bones, and mitigate some of the effects of osteopenia or osteoporosis. The company also offers the experimental Wellen Chat: a chatbot that answers users’ questions about bone health. Users can type their question – such as “What exercises can I do to prevent osteoporosis?” – into Wellen Chat, which will give them both an answer and a link to at least one piece of Wellen content with additional relevant information.
Ms. Patel calls AI an “amazing tool in the toolkit” but also recognizes its limitations, noting that chatbots built on this technology, like ChatGPT and Wellen Chat, need guardrails. Wellen Chat therefore pulls its answers to users’ questions from Wellen’s content, which is written by doctors and physical therapists. That way, users get AI’s efficiency – they don’t have to read through every piece of Wellen content themselves to find the answer to their question – while also trusting that the information they receive is science-backed and expert-approved, rather than pulled indiscriminately from the internet.
While Wellen Chat is available for anyone to use, it does specifically target a demographic that Ms. Patel notes has been historically “underrecognized and underserved” in healthcare: women over 50. These women can face both sexism and ageism when seeking medical advice. While AI without any guardrails has the potential to learn from, feed into, and perpetuate these biases, well-trained AI, like Wellen Chat, can operate outside of human prejudices. In the process, it can fulfill Ms. Patel’s goal of improving the healthcare experiences – and outcomes – of these patients.
As an article published in 2015 summarized, “Longstanding efforts to improve the health of women have called for increasing the representation of women in clinical studies, analysis of research results by sex or gender, and awareness both within the research community and the public about the essential need to study the influence of sex and gender on health.” With the rise of AI tools – including those in healthcare – that learn from this type of data, correct and complete information becomes increasingly important, both to give users the most accurate and unbiased information and to improve the patient experience. At Wellen, Ms. Patel and her team implemented guardrails on Wellen Chat from the beginning to reap the benefits of AI tools while limiting their ability to harm. This tactic, among others, can help make AI a positive healthcare tool for users around the world, whether they are looking for information about osteoporosis or simply looking for a transcription service that correctly recognizes and transcribes the words for women’s anatomy.