Please use this reference to cite this resource: doi:10.22028/D291-43591
Title: AvatarStudio: Text-Driven Editing of 3D Dynamic Human Head Avatars
Author(s): Mendiratta, Mohit
Pan, Xingang
Elgharib, Mohamed
Teotia, Kartik
R, Mallikarjun B.
Tewari, Ayush
Golyanik, Vladislav
Kortylewski, Adam
Theobalt, Christian
Language: English
Journal title: ACM Transactions on Graphics (TOG)
Volume: 42
Issue: 6
Publisher/Platform: ACM
Year of publication: 2023
DDC subject class: 004 Computer science
Document type: Journal article
Abstract: Capturing and editing full-head performances enables the creation of virtual characters with various applications such as extended reality and media production. The past few years witnessed a steep rise in the photorealism of human head avatars. Such avatars can be controlled through different input data modalities, including RGB, audio, depth, IMUs, and others. While these data modalities provide effective means of control, they mostly focus on editing the head movements such as the facial expressions, head pose, and/or camera viewpoint. In this paper, we propose AvatarStudio, a text-based method for editing the appearance of a dynamic full head avatar. Our approach builds on existing work to capture dynamic performances of human heads using Neural Radiance Field (NeRF) and edits this representation with a text-to-image diffusion model. Specifically, we introduce an optimization strategy for incorporating multiple keyframes representing different camera viewpoints and time stamps of a video performance into a single diffusion model. Using this personalized diffusion model, we edit the dynamic NeRF by introducing view-and-time-aware Score Distillation Sampling (VT-SDS) following a model-based guidance approach. Our method edits the full head in a canonical space and then propagates these edits to the remaining time steps via a pre-trained deformation network. We evaluate our method visually and numerically via a user study, and results show that our method outperforms existing approaches. Our experiments validate the design choices of our method and highlight that our edits are genuine, personalized, as well as 3D- and time-consistent.
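To make the optimization step described in the abstract concrete, the following is a minimal, self-contained sketch of a Score Distillation Sampling (SDS)-style update against a frozen diffusion model, in the spirit of the VT-SDS step. The DummyDenoiser, the noise schedule, and the weighting below are illustrative assumptions for this sketch, not the authors' personalized diffusion model or their actual implementation.

```python
# Minimal sketch of an SDS-style gradient, assuming a frozen noise predictor.
# All names here (DummyDenoiser, sds_gradient) are hypothetical placeholders.
import torch
import torch.nn as nn

class DummyDenoiser(nn.Module):
    """Stand-in for a frozen diffusion model's noise predictor eps(x_t, t)."""
    def __init__(self, channels=3):
        super().__init__()
        self.net = nn.Conv2d(channels + 1, channels, kernel_size=3, padding=1)

    def forward(self, x_t, t):
        # Broadcast the normalized timestep as an extra conditioning channel.
        t_map = t.view(-1, 1, 1, 1).expand(-1, 1, *x_t.shape[-2:])
        return self.net(torch.cat([x_t, t_map], dim=1))

def sds_gradient(rendered, denoiser, alphas_cumprod, num_steps=1000):
    """SDS-style gradient w.r.t. a differentiably rendered image.

    rendered: (B, 3, H, W) image from a differentiable renderer (e.g. a NeRF).
    denoiser: frozen noise predictor.
    alphas_cumprod: (num_steps,) cumulative noise schedule.
    """
    b = rendered.shape[0]
    t = torch.randint(0, num_steps, (b,), device=rendered.device)
    alpha_bar = alphas_cumprod[t].view(-1, 1, 1, 1)

    # Diffuse the rendering to a random noise level.
    noise = torch.randn_like(rendered)
    noisy = alpha_bar.sqrt() * rendered + (1 - alpha_bar).sqrt() * noise

    with torch.no_grad():  # the diffusion model stays frozen
        pred_noise = denoiser(noisy, t.float() / num_steps)

    # SDS: weighted residual between predicted and injected noise.
    w = 1 - alpha_bar
    return w * (pred_noise - noise)

# Usage sketch: here the rendered image itself is the optimizable quantity;
# in practice the gradient would be backpropagated into the NeRF parameters.
render = torch.rand(1, 3, 64, 64, requires_grad=True)
denoiser = DummyDenoiser()
alphas_cumprod = torch.linspace(0.999, 0.01, 1000)

grad = sds_gradient(render, denoiser, alphas_cumprod)
render.backward(gradient=grad)
print(render.grad.shape)
```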
DOI of the first publication: 10.1145/3618368
URL of the first publication: https://dl.acm.org/doi/10.1145/3618368
Link to this record: urn:nbn:de:bsz:291--ds-435919
hdl:20.500.11880/39056
http://dx.doi.org/10.22028/D291-43591
ISSN: 1557-7368
0730-0301
Date of record entry: 28-Nov-2024
Faculty: MI - Faculty of Mathematics and Computer Science
Department: MI - Computer Science
Professorship: MI - Not assigned to a professorship
Collection: SciDok - The Science Server of Saarland University

Files in this record:
File: 3618368.pdf
Size: 23.74 MB
Format: Adobe PDF


This resource was published under the following copyright terms: Creative Commons License