GestaltGAN

Synthesizing photorealistic portraits of rare genetic disorders that preserve characteristic clinical features

The facial gestalt (overall facial morphology) is a characteristic clinical feature in many genetic disorders and is often essential for suspecting and establishing a specific diagnosis. For that reason, publishing images of individuals carrying pathogenic variants in disease-associated genes has been an important part of scientific communication. Medical imaging data is also crucial for teaching and for training artificial intelligence methods such as GestaltMatcher. However, medical data is often sparsely available, and sharing patient images carries risks of privacy loss and re-identification.

We therefore explored whether generative neural networks can synthesize accurate portraits for rare disorders. We modified a StyleGAN architecture and trained it to produce random, condition-specific portraits for multiple disorders, and we present a technique that generates a sharp, detailed average patient portrait for a given disorder. We trained GestaltGAN on the 20 most frequent disorders in the GestaltMatcher database. We used Real-ESRGAN to increase the resolution of low-quality portraits in the training data, and we colorized black-and-white images. All training images were aligned and cropped to a uniform format. To broaden the model's understanding of human facial features, we added an unaffected class to the training data.

We evaluated the generated portraits with 63 human experts. Our findings demonstrate that the model generates photorealistic portraits that capture the characteristic features of a disorder while preserving patient privacy. The output of our approach holds promise for various applications, including visualizations for publications, educational materials, and augmented training data for deep learning.
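To illustrate the idea behind the average-portrait technique, here is a minimal sketch assuming the common StyleGAN practice of averaging latent codes of one class and decoding the mean latent. The `stub_generator` function, the latent dimensionality, and the image size are placeholders, not the actual GestaltGAN model or its parameters.

```python
import numpy as np

LATENT_DIM = 512  # StyleGAN-style latent size (assumption, not the paper's value)

rng = np.random.default_rng(0)
# Fixed random projection standing in for a trained generator (illustration only).
_PROJ = rng.standard_normal((LATENT_DIM, 64 * 64 * 3)) * 0.01


def stub_generator(w: np.ndarray) -> np.ndarray:
    """Placeholder for the trained generator: maps one latent vector to a
    small RGB image in [-1, 1]. The real model is a conditional StyleGAN."""
    return np.tanh(w @ _PROJ).reshape(64, 64, 3)


def average_portrait(latents: np.ndarray) -> np.ndarray:
    """Average several latent codes belonging to one disorder class and
    decode the mean latent into a single 'average patient' portrait."""
    w_mean = latents.mean(axis=0)
    return stub_generator(w_mean)


# Latents that would correspond to portraits of one disorder class
ws = rng.standard_normal((16, LATENT_DIM))
portrait = average_portrait(ws)
print(portrait.shape)  # (64, 64, 3)
```

Averaging in latent space rather than averaging pixels is what keeps the resulting portrait sharp: the mean latent is decoded through the generator, so the output is a plausible face rather than a blurry pixel-wise mean.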

If you are interested in having images or an average generated for your cohort, please do not hesitate to contact us!
Paper preprint: https://doi.org/10.1101/2024.07.18.24308205