Can We Achieve Temporally Consistent Video Face Re-Aging?

Original title: Video Face Re-Aging: Toward Temporally Consistent Face Re-Aging

Authors: Abdul Muqeet, Kyuchul Lee, Bumsoo Kim, Yohan Hong, Hyungrae Lee, Woonggon Kim, Kwang Hee Lee

The article addresses video face re-aging: altering a person's apparent age in a video while preserving identity and temporal coherence. Progress is limited by the scarcity of paired video datasets with consistent identity and age changes. Existing image-based methods process each frame independently and therefore neglect temporal consistency, while methods that do enforce temporal coherence tend to lose accuracy in the age transformation itself. To overcome these hurdles, the article introduces (1) a synthetic video dataset covering diverse age groups, (2) a baseline architecture that validates the dataset's effectiveness, and (3) three metrics tailored to assessing temporal consistency in video re-aging. Experiments on the VFHQ and CelebV-HQ datasets show that the proposed method outperforms existing approaches in both age-transformation accuracy and temporal consistency.
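The summary does not spell out the paper's three temporal-consistency metrics, so the snippet below is only a minimal sketch of one generic way to quantify frame-to-frame stability in a re-aged clip: the mean absolute difference between consecutive frames, optionally restricted to a face-region mask. The function name `temporal_consistency`, the mask argument, and the formulation are illustrative assumptions, not the metrics proposed in the paper.

```python
import numpy as np

def temporal_consistency(frames: np.ndarray, mask: np.ndarray | None = None) -> float:
    """Mean absolute difference between consecutive frames (lower = smoother).

    frames: (T, H, W, C) float array in [0, 1] holding the re-aged video frames.
    mask:   optional (H, W) boolean face-region mask; if given, only masked
            pixels contribute to the score.

    NOTE: this is an illustrative flicker proxy, not the paper's metric.
    """
    diffs = np.abs(frames[1:] - frames[:-1])   # (T-1, H, W, C) frame-to-frame changes
    if mask is not None:
        diffs = diffs[:, mask, :]              # keep only face-region pixels
    return float(diffs.mean())

# A perfectly static clip scores 0; per-frame flicker raises the score.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    static = np.repeat(rng.random((1, 64, 64, 3)), 8, axis=0)
    flicker = static + rng.normal(0.0, 0.05, static.shape)
    print(temporal_consistency(static))   # ~0.0
    print(temporal_consistency(flicker))  # > 0
```

A per-pixel difference like this only captures low-level flicker; metrics of this kind are usually paired with identity- and age-estimation scores so that a method cannot "win" simply by producing static, unedited frames.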

Original article: https://arxiv.org/abs/2311.11642