The world is facing a maternal health crisis. According to the World Health Organization, about 810 women die every day from preventable causes related to pregnancy and childbirth. Two-thirds of these deaths occur in sub-Saharan Africa. In Rwanda, one of the leading causes of maternal death is infection of cesarean section wounds.
An interdisciplinary team of physicians and researchers from MIT, Harvard, and Partners In Health (PIH) in Rwanda has developed a solution to this problem: a mobile health (mHealth) platform that uses artificial intelligence and real-time computer vision to predict C-section wound infection with about 90 percent accuracy.
“Early detection of infection is an important problem worldwide, but it is exacerbated in resource-poor areas such as rural Rwanda, due to a shortage of trained doctors and a high prevalence of antibiotic-resistant bacterial infections,” says Richard Ribon Fletcher ’89, SM ’97, PhD ’02, a research scientist in mechanical engineering at MIT and technical lead for the team. “The idea is to give community health workers mobile phones they can use when visiting new mothers, so they can check the wounds and detect infection.”
This summer, the team, led by Harvard Medical School professor Bethany Hedt-Gauthier, won the $500,000 first prize in the NIH Maternal Health Technology Accelerator Challenge.
“Across the developing world, the lives of women who deliver by cesarean section are limited by their access to high-quality surgery and postpartum care,” adds team member Fredrick Kateera of PIH. “The use of mobile health technologies for early identification and reasonably accurate diagnosis of surgical site infections in these communities would be a scalable game changer for optimizing women’s health.”
Training an Algorithm to Detect Infection
The project grew out of a series of chance encounters. In 2017, Fletcher and Hedt-Gauthier met on the Washington, D.C. Metro during an NIH investigators’ meeting. Hedt-Gauthier, who had been working on research projects in Rwanda for five years, was looking for solutions to the gaps in cesarean care that she and her collaborators had encountered in their research. Specifically, she was interested in exploring the use of cell phone cameras as a diagnostic tool.
Fletcher, who leads a group of students in Professor Sanjay Sarma’s AutoID lab and has been applying cell phones, machine learning algorithms, and other mobile technologies to global health for decades, was a natural fit for the project.
“Once we realized that these types of image-based algorithms could support home-based care for women after cesarean delivery, we approached Dr. Fletcher as a collaborator, given his extensive experience developing mHealth technologies in low- and middle-income settings,” Hedt-Gauthier says.
On that same trip, Hedt-Gauthier happened to sit next to Audace Nakeshimana ’20, then a new MIT student from Rwanda who later joined Fletcher’s team at MIT. During his senior year, under Fletcher’s mentorship, Nakeshimana founded Insightiv, a Rwandan startup applying AI algorithms to clinical image analysis, which won the top grant award in MIT’s annual IDEAS competition in 2020.
The first step in the project was to build a database of wound images, taken by community health workers in rural Rwanda. The team collected more than 1,000 images of infected and uninfected wounds, then used this data to train the algorithm.
A core problem emerged with this first dataset, collected between 2018 and 2019: many of the photos were of poor quality.
“The quality of the wound images collected by the health workers was highly variable, and it required a large amount of manual labor to crop and resample the images. Because these images are used to train the machine learning model, the image quality and variability fundamentally limit the performance of the algorithm,” Fletcher says.
To solve this problem, Fletcher turned to tools he had used in previous projects: real-time computer vision and augmented reality.
Improving Image Quality with Real-Time Image Processing
To encourage community health workers to capture higher-quality images, Fletcher and his team revised the wound-screening mobile app and paired it with a simple paper frame. The frame contains a printed calibration color pattern and another optical pattern that guides the app’s computer vision software.
Health workers are instructed to place the frame around the wound and open the app, which provides real-time feedback on the camera’s placement. The app uses augmented reality to display a green checkmark when the phone is within the correct range; once in range, the computer vision software automatically color-balances, crops, and transforms the image to correct for parallax.
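The two geometric and photometric corrections described above can be sketched in plain NumPy: a homography estimated from the frame’s four detected corners undoes perspective (parallax) distortion, and a printed neutral-gray reference patch drives white balancing. The function names and the direct-linear-transform approach here are illustrative assumptions, not the team’s actual implementation; a production app would more likely build on a library such as OpenCV.

```python
import numpy as np

def homography_from_points(src, dst):
    """Estimate the 3x3 homography mapping the four detected frame corners
    (src) onto the canonical frame rectangle (dst) via the direct linear
    transform. Warping with this matrix corrects for parallax when the
    phone is not held exactly face-on to the wound."""
    A = []
    for (x, y), (u, v) in zip(src, dst):
        A.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        A.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    # The homography is the null vector of A (last right-singular vector).
    _, _, Vt = np.linalg.svd(np.asarray(A, dtype=float))
    H = Vt[-1].reshape(3, 3)
    return H / H[2, 2]

def apply_homography(H, pts):
    """Map 2-D points through H using homogeneous coordinates."""
    pts_h = np.hstack([pts, np.ones((len(pts), 1))])
    mapped = pts_h @ H.T
    return mapped[:, :2] / mapped[:, 2:3]

def white_balance(img, gray_patch):
    """Scale each color channel so the printed neutral-gray reference patch
    comes out with equal mean R, G, and B."""
    gains = gray_patch.mean() / gray_patch.mean(axis=(0, 1))
    return np.clip(img * gains, 0, 255)
```

In a full pipeline, the four `src` corners would come from detecting the frame’s optical pattern in the camera preview, and every pixel of the photo would be warped through the same homography before cropping.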
“By using real-time computer vision at the time of data collection, we are able to generate beautiful, clean, uniform, color-balanced images that can then be used to train our machine learning models, without any need for manual data cleaning or post-processing,” Fletcher says.
Using a convolutional neural network (CNN) machine learning model together with a method called transfer learning, the software has been able to predict infection in C-section wounds within about 10 days of childbirth with roughly 90 percent accuracy. Women flagged as infected through the app are then referred to a clinic, where they can receive diagnostic bacterial testing and be prescribed life-saving antibiotics as needed.
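Transfer learning here means freezing a CNN backbone pretrained on a large generic image corpus and training only a small classification head on the roughly 1,000 wound images. The sketch below is a deliberately simplified illustration under stated assumptions: the frozen backbone is stubbed with a fixed random ReLU projection (in the real pipeline it would be, for example, the penultimate layer of an ImageNet-pretrained CNN), the head is a logistic-regression classifier, and the data are synthetic; the team’s actual model and hyperparameters are not described in this article.

```python
import numpy as np

rng = np.random.default_rng(0)

def frozen_features(images, W):
    """Stand-in for a frozen pretrained backbone: the raw input concatenated
    with a fixed random ReLU projection. W is never updated during training."""
    return np.hstack([images, np.maximum(images @ W, 0)])

def train_head(feats, labels, lr=0.1, epochs=1000):
    """Train only the small classification head (logistic regression) by
    gradient descent, leaving the backbone untouched -- the essence of
    transfer learning on a small dataset."""
    w = np.zeros(feats.shape[1])
    b = 0.0
    for _ in range(epochs):
        z = np.clip(feats @ w + b, -30.0, 30.0)  # clip to avoid exp overflow
        p = 1.0 / (1.0 + np.exp(-z))             # predicted P(infected)
        grad = p - labels                        # gradient of BCE w.r.t. z
        w -= lr * feats.T @ grad / len(labels)
        b -= lr * grad.mean()
    return w, b

# Toy stand-in data: 200 flattened "images" with a simple separable label.
X = rng.normal(size=(200, 8))
y = (X[:, 0] + X[:, 1] > 0).astype(float)
W = rng.normal(size=(8, 64))      # fixed "backbone" weights
F = frozen_features(X, W)
w, b = train_head(F, y)
accuracy = (((F @ w + b) > 0) == y).mean()
```

Because only the head’s weights are trained, a dataset of around a thousand labeled images, far too small to train a deep CNN from scratch, is enough to reach useful accuracy.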
The app has been well received by women and community health workers in Rwanda.
“The trust that women place in community health workers was an important enabler of the app, and it meant that the mHealth tool was embraced by women in rural areas,” adds Anne Niyigena of PIH.
Using Thermal Imaging to Address Algorithmic Bias
One of the biggest barriers to scaling this AI-based technology to a more global audience is algorithmic bias. When trained on a relatively homogeneous population, such as that of rural Rwanda, the algorithm performs as expected and successfully predicts infection. But when images of patients with different skin tones are introduced, the algorithm becomes less effective.
To address this problem, Fletcher turned to thermal imaging. Simple thermal camera modules, designed to attach to a cell phone, cost about $200 and can be used to capture infrared images of wounds. The algorithm can then be trained to predict infection from the thermal patterns in the infrared wound images. A study published last year showed over 90 percent prediction accuracy when these thermal images were paired with the app’s CNN algorithm.
Although more expensive than simply using the phone’s built-in camera, the thermal imaging approach could allow the team to extend its mHealth technology to a more diverse, global population.
“This gives health workers two options: In a homogeneous population, like rural Rwanda, they can use a standard phone camera, with a model trained on data from the local population. Otherwise, they can use the more general model, which requires a thermal camera attachment,” Fletcher says.
While the current generation of the mobile app uses a cloud-based algorithm to run the infection-prediction model, the team is now developing a standalone mobile app that does not require internet access, and that addresses all aspects of maternal health, from pregnancy to postpartum care.
In addition to developing the library of wound images used by the algorithm, Fletcher is working closely with his former student Nakeshimana and the Insightiv team on the app’s development, using Android phones that are manufactured locally in Rwanda. PIH will then conduct user testing and field validation in Rwanda.
As the team looks to develop comprehensive applications for maternal health, privacy and data protection are top priorities.
“As we develop and improve these tools, greater attention must be paid to patients’ data privacy. More data security detail should be incorporated, so that the tool addresses the gaps it aims to bridge and maximizes users’ trust, which will ultimately favor its wider adoption,” Niyigena says.
The winning team members include: Bethany Hedt-Gauthier of Harvard Medical School; Richard Fletcher of MIT; Robert Riviello of Brigham and Women’s Hospital; Adeline Boatin of Massachusetts General Hospital; Anne Niyigena, Fredrick Kateera, Laban Bikorimana, and Vincent Cubaka of PIH in Rwanda; and Audace Nakeshimana ’20, founder of Insightiv.ai.