A study has unveiled a new deep learning model capable of estimating a person’s biological age using facial photographs.
The innovation could revolutionise surgical and oncological decision-making and ‘opens the door to a whole new realm of biomarker discovery’, researchers suggest.
Mass General Brigham investigators have developed an AI tool called FaceAge that estimates a person’s biological age from a photograph of their face.
In a study, the researchers found that a cancer patient’s FaceAge estimate was linked to their survival: patients with younger FaceAge estimates fared better after cancer treatment, while those with older FaceAge estimates had worse outcomes.
The researchers developed FaceAge to give doctors an objective measure to inform their treatment plans, one that goes beyond the year a person was born or a subjective ‘eyeball test’ during an examination.
They hope FaceAge will also find uses beyond cancer care, in other diseases of ageing, and allow people to track their own health.
It predicts survival outcomes in cancer patients more accurately than chronological age alone, marking a significant advancement in personalised cancer care.
Emerging research has shown that people age at different rates due to a complex mix of genetics, lifestyle and environmental factors.
Unlike chronological age, biological age reflects a person’s true physiological status, with potential implications for survival, treatment tolerance and recovery. Until now, clinicians lacked an objective tool to measure this.
In clinical practice, subjective assessments like ‘performance status’ often guide treatment decisions, especially in oncology.
These evaluations, while important, are prone to human error and variability.
FaceAge offers a quantitative alternative by analysing standard face photographs, commonly taken during radiotherapy planning.
Developed using large publicly available datasets, it uses deep learning algorithms to extract subtle ageing markers from facial features.
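The study does not publish its model here, but the general approach, a convolutional network regressing age from a face crop, can be sketched. The snippet below is a minimal illustration rather than the authors’ FaceAge architecture; the backbone, loss function and hyperparameters are assumptions for the sake of the example.

```python
# Minimal sketch (not the published FaceAge architecture): a pretrained CNN
# backbone with a single regression output, trained to predict age in years
# from a face crop. All names and hyperparameters are illustrative.
import torch
import torch.nn as nn
from torchvision import models, transforms

class FaceAgeRegressor(nn.Module):
    def __init__(self):
        super().__init__()
        self.backbone = models.resnet50(weights=models.ResNet50_Weights.IMAGENET1K_V2)
        # Replace the classification head with a one-unit regression head
        self.backbone.fc = nn.Linear(self.backbone.fc.in_features, 1)

    def forward(self, x):
        return self.backbone(x).squeeze(-1)

# Standard ImageNet-style preprocessing applied to each face crop
preprocess = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
])

model = FaceAgeRegressor()
criterion = nn.L1Loss()  # mean absolute error in years is a common choice
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)

# One training step on a batch of (face_crop, chronological_age) pairs would be:
# preds = model(images); loss = criterion(preds, ages); loss.backward(); optimizer.step()
```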
Researchers validated its prognostic power across three large cohorts of cancer patients. In the largest cohort, the model effectively stratified patients into risk groups based on predicted biological age.
Those with an older-appearing FaceAge had significantly worse survival outcomes, even after adjusting for tumour type, sex and actual age.
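To picture how such a stratified, covariate-adjusted survival analysis is typically run (this is not the study’s own code), a Kaplan–Meier comparison across FaceAge groups and a Cox model adjusting for tumour site, sex and chronological age might look like the sketch below; the data, column names and cut-points are all synthetic assumptions.

```python
# Illustrative sketch only (not the study's analysis code): stratify patients by
# predicted FaceAge and test its association with survival while adjusting for
# tumour site, sex and chronological age. The data below are synthetic.
import numpy as np
import pandas as pd
from lifelines import KaplanMeierFitter, CoxPHFitter

rng = np.random.default_rng(0)
n = 500
df = pd.DataFrame({
    "age": rng.normal(68, 10, n),                    # chronological age
    "faceage": rng.normal(73, 12, n),                # model-predicted biological age
    "sex": rng.choice(["female", "male"], n),
    "tumour_site": rng.choice(["breast", "GI", "GU"], n),
    "survival_months": rng.exponential(24, n),
    "event": rng.integers(0, 2, n),                  # 1 = death observed, 0 = censored
})

# Risk groups by predicted biological age (cut-points are illustrative)
df["faceage_group"] = pd.cut(df["faceage"], bins=[0, 65, 75, 200],
                             labels=["<65", "65-75", ">75"])

# Kaplan-Meier survival curves per FaceAge group
km = KaplanMeierFitter()
for group, sub in df.groupby("faceage_group", observed=True):
    km.fit(sub["survival_months"], event_observed=sub["event"], label=str(group))
    km.plot_survival_function()

# Cox proportional hazards model: FaceAge adjusted for the other covariates
cox_df = pd.get_dummies(df[["faceage", "age", "sex", "tumour_site"]],
                        drop_first=True).astype(float)
cox_df[["survival_months", "event"]] = df[["survival_months", "event"]]
cph = CoxPHFitter()
cph.fit(cox_df, duration_col="survival_months", event_col="event")
cph.print_summary()  # hazard ratio per year of FaceAge, adjusted for age, sex and site
```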
On average, cancer patients appeared around five years older than their chronological age, a reflection of both disease burden and treatment effects.
FaceAge consistently outperformed chronological age as a predictor of survival across breast, gastrointestinal and genitourinary cancers.
In patients receiving palliative radiotherapy, where precise survival estimates are critical, FaceAge significantly improved the performance of validated risk models.
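One way to make ‘improving the performance of a risk model’ concrete is to compare the discrimination of a survival model with and without FaceAge, for example via the concordance index. The sketch below uses simple Cox models rather than the validated clinical risk models evaluated in the study, and continues the synthetic data from the previous sketch.

```python
# Hedged illustration (not the published evaluation): compare how well two Cox
# models discriminate survival, one using chronological age only and one adding
# FaceAge, via the concordance index. Reuses the synthetic `df` defined above.
from lifelines import CoxPHFitter

base = CoxPHFitter().fit(
    df[["age", "survival_months", "event"]],
    duration_col="survival_months", event_col="event")

with_faceage = CoxPHFitter().fit(
    df[["age", "faceage", "survival_months", "event"]],
    duration_col="survival_months", event_col="event")

print("c-index, chronological age only:", round(base.concordance_index_, 3))
print("c-index, age plus FaceAge:", round(with_faceage.concordance_index_, 3))
```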
The study also found that clinicians, especially those with lower baseline predictive accuracy, made more accurate survival predictions when FaceAge data were available.
Further genomic analysis revealed that FaceAge correlates with genes involved in cellular senescence and cell-cycle regulation, such as CDK6, lending biological credibility to its predictions.
Despite its promise, the study’s authors urge caution. The model was trained on non-clinical datasets that may include biases, such as cosmetic alterations or demographic imbalances.
Co-senior and corresponding author Hugo Aerts, director of the Artificial Intelligence in Medicine (AIM) program at Mass General Brigham, said: ‘We can use artificial intelligence (AI) to estimate a person’s biological age from face pictures, and our study shows that information can be clinically meaningful. This work demonstrates that a photo like a simple selfie contains important information that could help to inform clinical decision-making and care plans for patients and clinicians. How old someone looks compared to their chronological age matters – individuals with FaceAges younger than their chronological ages do significantly better after cancer therapy.’
Fellow co-senior author Ray Mak, also a faculty member in the AIM program at Mass General Brigham, said: ‘This opens the door to a whole new realm of biomarker discovery from photographs, and its potential goes far beyond cancer care or predicting age. As we increasingly think of different chronic diseases as diseases of ageing, it becomes even more important to be able to accurately predict an individual’s ageing trajectory. I hope we can ultimately use this technology as an early detection system in a variety of applications, within a strong regulatory and ethical framework, to help save lives.’
Ethnic bias was evaluated and found to be minimal, but the researchers stress the need for more diverse and representative training datasets to ensure equitable clinical performance. They also acknowledge broader ethical concerns, urging strong regulatory oversight and transparent clinical governance before deployment.
For surgeons, especially those managing oncologic cases, FaceAge presents a potential tool for risk stratification, operative planning and shared decision-making.
In patients with borderline fitness, knowing a more precise biological age may help balance the risks and benefits of surgery, chemotherapy or radiotherapy.
The researchers call for further prospective validation and integration into clinical workflows, but are optimistic that this technology could one day serve as an actionable biomarker in precision medicine.
The study was published in The Lancet Digital Health.