I tried to export the face skeletal mesh for a MetaHuman into Omniverse using the connector for UE 5.1 - the skin texture did not work despite trying different export settings.

Bump.

I don't want to start off in Audio2Face using some generic male avatar head… what if my character is female? It would be much better to be able to import your own custom character, whatever that is: Character Creator, iClone, MetaHuman, DazStudio, Poser, a custom mesh, etc. You would then know exactly what you would be getting, and you could begin the animation process with the character that will ultimately be using the animation, so you don't waste time tweaking the animation on some random generic proxy character. The final facial animation and lipsync should be exactly what you see inside Audio2Face.

I actually think the whole approach to this software is off. If I had to tweak the animation a little here and there I would be okay with that, but the results I got were not usable at all, and I ended up having to animate the entire thing by hand, which defeats the whole point of using this software. The animation that Audio2Face generates needs to be exactly the same when it's applied to a custom character - in this case, to Epic Games' MetaHuman characters. After I exported from Audio2Face and imported into Unreal, the entire animation was way off; the mouth never even closes. The blendshape solve basically destroys all the great AI lipsync that Audio2Face generates. I basically had to redo the entire animation by hand, creating blendshape poses for all the phoneme shapes and then re-animating the entire thing. So yeah, there is definitely room for improvement.
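One way to catch the "mouth never closes" problem before re-animating everything by hand is to sanity-check the exported blendshape weight curves directly. This is a minimal sketch, not the actual Audio2Face export schema: the curve name `jawOpen`, the data layout, and the threshold are all assumptions standing in for whatever your exporter actually produces.

```python
# Hypothetical sanity check on exported blendshape weight curves.
# The curve name "jawOpen" and the dict-of-lists layout are assumptions;
# adapt them to the actual names and format of your export.

def mouth_closes(curves, curve_name="jawOpen", threshold=0.05):
    """Return True if the named curve dips near zero at some frame,
    i.e. the mouth actually closes at least once during the clip."""
    weights = curves.get(curve_name, [])
    return any(w < threshold for w in weights)

# Example data standing in for a parsed export.
good = {"jawOpen": [0.0, 0.6, 0.8, 0.4, 0.02, 0.7]}
bad = {"jawOpen": [0.5, 0.6, 0.8, 0.4, 0.3, 0.7]}  # never returns to closed

print(mouth_closes(good))  # True
print(mouth_closes(bad))   # False - the solve likely flattened the curve
```

If a check like this fails on the exported data but the animation looks correct inside Audio2Face, the loss is happening in the blendshape solve or export step rather than on the Unreal side.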