An AI system can accurately identify surgical site infections from patient-submitted postoperative wound photos.
The development could change the way postoperative care is provided.
A study published in the Annals of Surgery describes an AI-based pipeline designed to:
• Automate the identification of surgical incisions
• Assess image quality
• Flag signs of infection in photos submitted by patients via online portals.
The system was trained on more than 20,000 images from more than 6,000 patients across nine Mayo Clinic hospitals in the US.
Cornelius Thiels, a hepatobiliary and pancreatic surgical oncologist at Mayo Clinic and co-senior author of the study, said: ‘We were motivated by the increasing need for outpatient monitoring of surgical incisions in a timely manner. This process, currently done by clinicians, is time-consuming and can delay care. Our AI model can help triage these images automatically, improving early detection and streamlining communication between patients and their care teams.’
The AI system uses a two-stage model:
• It first detects whether an image contains a surgical incision
• It then evaluates whether that incision shows signs of infection.
The model, based on the Vision Transformer architecture, achieved 94% accuracy in detecting incisions and an area under the curve (AUC) of 81% in identifying infections.
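The two-stage routing described above can be sketched in a few lines. This is an illustrative stand-in only: the function names, thresholds, and scores below are hypothetical, and the study's actual classifiers are Vision Transformer models rather than simple threshold rules.

```python
# Illustrative sketch of a two-stage triage pipeline (NOT the study's code).
# Each "score" stands in for a classifier's confidence output; the 0.5
# thresholds are arbitrary placeholders.

def stage1_has_incision(score: float, threshold: float = 0.5) -> bool:
    """Stage 1: does the photo contain a surgical incision?"""
    return score >= threshold

def stage2_infection_flag(score: float, threshold: float = 0.5) -> bool:
    """Stage 2: does the detected incision show signs of infection?"""
    return score >= threshold

def triage(incision_score: float, infection_score: float) -> str:
    """Route a patient-submitted photo through both stages."""
    if not stage1_has_incision(incision_score):
        # Stage 2 never runs if no incision is found (e.g. poor image quality)
        return "no incision detected - request a new photo"
    if stage2_infection_flag(infection_score):
        return "possible infection - alert care team"
    return "no infection signs - routine follow-up"

print(triage(0.9, 0.8))  # both stages fire: flag for the care team
print(triage(0.2, 0.9))  # stage 1 fails, so stage 2 is skipped
```

The key design point this illustrates is that the second classifier only sees images the first has already confirmed contain an incision, which keeps unusable or irrelevant photos out of the infection model.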
Hala Muaddi, a hepatopancreatobiliary fellow at Mayo Clinic and first author, explained: ‘This work lays the foundation for AI-assisted postoperative wound care, which can transform how postoperative patients are monitored. It’s especially relevant as outpatient operations and virtual follow-ups become more common.’
With further development, the technology could:
• Enable patients to receive faster responses
• Reduce delays in diagnosing infections
• Support better care for those recovering from surgery at home
• Become a frontline screening tool that alerts clinicians to concerning incisions
• Pave the way for developing algorithms capable of detecting subtle signs of infection, potentially before they become visually apparent to the care team
• Lead to earlier treatment, decreased morbidity and reduced costs.
Dr Muaddi added: ‘For patients, this could mean faster reassurance or earlier identification of a problem. For clinicians, it offers a way to prioritise attention to cases that need it most, especially in rural or resource-limited settings.’
Notably, the model demonstrated consistent performance across diverse patient groups, addressing concerns about algorithmic bias.
The hope is that the AI models and the large dataset they were trained on will fundamentally reshape how surgical follow-up is delivered.
Prospective studies are underway to assess the integration of this tool into day-to-day surgical care.


