AUGUSTA, Ga. (WRDW/WAGT) – As artificial intelligence begins to play a larger role in healthcare, some experts are raising concerns that it could deepen existing disparities, particularly for communities of color, if not properly regulated.

Georgia is already facing a severe shortage of healthcare professionals. According to the Cicero Institute, 143 of the state’s 159 counties are experiencing provider shortages, and an additional 8,000 doctors will be needed by 2030.

While AI isn’t expected to replace doctors, it is being introduced to help relieve pressure on overwhelmed offices by handling administrative tasks and assisting with faster diagnoses.

Some AI tools are already in use. For instance, AI software is used to analyze mammograms for breast cancer risk or to aid surgeons with real-time decision-making.

“AI holds a lot of promise,” said Jessica Roberts, a law professor at Emory University. “But we can replicate the same kinds of problems we make as human decision-makers.”

Roberts warned that the medical data used to train AI systems is not always representative. Much of the existing data comes from white physicians treating white patients, meaning the technology could produce biased or incomplete results for patients of color.

She’s concerned about its effect on growing crises, such as maternal mortality.

“When you look at the Latino population and African Americans, we make up about 35% of the U.S. population, but only about 7% of all providers,” said Dr. Cecil Bennett.

Black women in Georgia die from pregnancy-related causes at nearly three times the rate of white women, and studies show their pain is often dismissed by medical professionals, leading to delays in care.

“There’s just more difficulty sometimes communicating between doctor and patient when their cultures are different,” said Bennett.

Roberts and other advocates are calling for stronger regulations to ensure AI tools are trained on diverse datasets and developed with equity in mind.

“We have to make sure that when we’re using AI and we’re integrating it, that we’re doing so equitably,” Roberts said.

Roberts believes developers share a responsibility to make sure algorithms reflect the populations they serve.
