aws-samples / amazon-polly-metahumans

This Unreal Engine sample project demonstrates how to bring Epic Games' MetaHuman digital characters to life using the Amazon Polly text-to-speech service from AWS. Use this project as a starting point for creating your own Unreal Engine applications that leverage Amazon Polly to give voice to your MetaHumans.
MIT No Attribution

Not able to produce the animation when adding a new MetaHuman #6

Closed · OnlinePage closed this issue 1 year ago

OnlinePage commented 2 years ago

Hello, first of all, thanks for making the Polly integration so easy. I am able to run the provided sample successfully. Now I am trying to add another MetaHuman as well, following the steps given in the Developer Guide. I was able to import the MetaHuman and add all the speech Blueprints with the sequence, but I am stuck on the next steps.

I have some questions:

1. After adding the Speech_AnimBP to the Face component, the facial animation does not work, but the audio plays back successfully.
2. Similarly, on the Body component I can't add the Bodyidle animation; even dragging and dropping the Bodyidle animation into the Anim Class doesn't work.

I also observed that if the face mesh of the new MetaHuman is changed back to ada_FaceMesh with the Speech_AnimBP, it works fine. So is the animation only for ada_FaceMesh (Ada)?

I tried to change the face mesh directly by editing the Speech_AnimBP, but couldn't, as it doesn't show any face mesh other than Ada's.

Please advise. @cwalkere

Legumtechnica commented 2 years ago

Could you find any solution?

Also, my editor crashes as soon as I open BP_Ada. I tried adding the Speech component to another MetaHuman, but it crashes again.

Can you help?

cuijiaxu commented 2 years ago

I'm running into the same issue. It looks like the visemes and Bodyidle animation for the new MetaHuman still need to be set up.

cwalkere commented 2 years ago

Edit: Are you using UE4 or UE5? This project was created with UE4 and it looks like UE5 metahumans are not compatible with UE4 metahumans.

yogeshchandrasekharuni commented 2 years ago

Running into the same issue. Did you happen to find a fix?

Krxtopher commented 2 years ago

We provide step-by-step instructions for adding speech capability to any MetaHuman character. You'll find those instructions in the "Adding New Metahumans" section of our Developer Guide. If those instructions don't work for you, please let us know where they fail so we can improve them.

Please report back to us on whether this addresses your original issue. Thanks.

Krxtopher commented 2 years ago

@OnlinePage @cuijiaxu and @Ishu07, today I was able to get a custom MetaHuman working, but I did have to make a few changes to the structure of the Content folder in order to do so. Also, there's a small (but important) flaw in one of our documentation images showing how to set up your Blueprint logic. I'll see if I can submit a PR that addresses both of these issues over the next few days.

yogeshchandrasekharuni commented 2 years ago

@Krxtopher, in case you're unable to submit a PR, could you please share the steps you took to get it working, so that one of us can push a fix? Thanks!

Legumtechnica commented 2 years ago

I was able to compile and run everything with a custom MetaHuman and do everything this code is intended for. I haven't had issues; it just takes some fiddling.

Anyway, just one question: how do I add visemes for another language, like Hindi?

cwalkere commented 2 years ago

IIRC, the viseme animations were hand-made by an in-house animator. The reference/sample animations can be found in the Content/AmazonPollyMetaHuman/Animation/Visemes folder. You'd have to try out Amazon Polly for Hindi, see what it returns, and create any missing viseme animations yourself. Or maybe you can find some on the internet.
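If you want to see exactly which viseme codes Polly returns for Hindi before authoring animations, you can request viseme speech marks directly with the AWS C++ SDK. The sketch below is a minimal standalone example (outside Unreal), assuming default AWS credentials and the Hindi-capable Aditi voice; the sample text and output handling are illustrative only:

```cpp
// Minimal AWS C++ SDK sketch: ask Amazon Polly for viseme speech marks
// so you can inspect which viseme codes come back for Hindi text.
#include <aws/core/Aws.h>
#include <aws/polly/PollyClient.h>
#include <aws/polly/model/SynthesizeSpeechRequest.h>
#include <iostream>

int main()
{
    Aws::SDKOptions options;
    Aws::InitAPI(options);
    {
        Aws::Polly::PollyClient polly;

        Aws::Polly::Model::SynthesizeSpeechRequest request;
        request.SetText("नमस्ते, आप कैसे हैं?");                              // sample Hindi text
        request.SetVoiceId(Aws::Polly::Model::VoiceId::Aditi);               // Hindi-capable voice
        request.SetLanguageCode(Aws::Polly::Model::LanguageCode::hi_IN);
        request.SetOutputFormat(Aws::Polly::Model::OutputFormat::json);      // speech marks arrive as JSON lines
        request.SetSpeechMarkTypes({ Aws::Polly::Model::SpeechMarkType::viseme });

        auto outcome = polly.SynthesizeSpeech(request);
        if (outcome.IsSuccess())
        {
            // Each line looks like: {"time":125,"type":"viseme","value":"p"}
            std::cout << outcome.GetResult().GetAudioStream().rdbuf() << std::endl;
        }
        else
        {
            std::cerr << outcome.GetError().GetMessage() << std::endl;
        }
    }
    Aws::ShutdownAPI(options);
    return 0;
}
```

Any viseme value in that output that has no matching animation in the Visemes folder is one you'd need to author (or source) yourself.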

Krxtopher commented 2 years ago

@Ishu07 visemes are actually language agnostic. So supporting new languages doesn't require new visemes. A viseme is the shape the mouth, lips, tongue, and jaw make when a human makes a particular vocal sound. It doesn't matter whether that sound is being used to produce a word in English, Hindi, or any other language.
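To make the "language agnostic" point concrete, here is a purely illustrative Unreal C++ sketch; it is not the sample's actual implementation (which drives this through Speech_AnimBP), and the VisemeToPose map and ResolveVisemePose function are hypothetical names. The point is that a single viseme-code-to-pose lookup never needs to know which language produced the codes:

```cpp
#include "Animation/AnimSequence.h"
#include "Containers/Map.h"

// Hypothetical lookup table; in real project code this would live on a
// component as a UPROPERTY so the assets are kept alive by the GC.
static TMap<FString, UAnimSequence*> VisemeToPose;

static UAnimSequence* ResolveVisemePose(const FString& PollyVisemeCode)
{
    // Codes like "p", "t", "k", "a", "@", and "sil" are the same whether
    // the spoken text was English, Hindi, or any other supported language.
    if (UAnimSequence** Found = VisemeToPose.Find(PollyVisemeCode))
    {
        return *Found;
    }
    // Fall back to the neutral/silence pose for any unmapped code.
    return VisemeToPose.FindRef(TEXT("sil"));
}
```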

If you need more help, LMK. I'd love to help you get this working with other languages if you run into issues.

Krxtopher commented 2 years ago

Regarding how I got custom hosts working, here are (roughly) the steps I went through, from memory:

1. In the Content Browser, move the "Animation" and "Common" folders from "AmazonPollyMetaHuman" into a folder called "MetaHumans".

2. Import your custom MetaHuman into the project using Quixel Bridge.

3. An error will pop up saying that you must update some files manually (using Windows Explorer) by copying them from a temporary location it tells you into your Content folder. Do this.

4. Re-import the custom MetaHuman one more time (I think I had to do this); this time you won't get the file conflict message.

5. After you've done the above, you should be able to follow the regular step-by-step instructions in our Developer Guide. However, the image showing the Blueprint logic is missing an important piece: be sure to feed the "Return Value" from the "Start Speech" node into the "Sound" input of the "Play Sound 2D" node, as shown in this updated image: BP fix
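For anyone doing the same wiring from C++ rather than Blueprints, here is a rough equivalent of that corrected connection. It assumes the "Start Speech" node's Return Value is a playable USoundBase*; the function and parameter names below are illustrative, not the project's actual API:

```cpp
#include "Kismet/GameplayStatics.h"
#include "Sound/SoundBase.h"

// Hypothetical helper mirroring the corrected Blueprint wiring: the
// Return Value of "Start Speech" must end up in the Sound input of
// "Play Sound 2D", otherwise nothing is played.
static void PlayGeneratedSpeech(const UObject* WorldContext, USoundBase* StartSpeechReturnValue)
{
    if (StartSpeechReturnValue != nullptr)
    {
        // Equivalent of the "Play Sound 2D" node with its Sound pin connected.
        UGameplayStatics::PlaySound2D(WorldContext, StartSpeechReturnValue);
    }
}
```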

Krxtopher commented 2 years ago

I've submitted a pull request that should help users who've had problems incorporating new MetaHumans into this sample project. You'll find the PR here: PR #15

Krxtopher commented 2 years ago

@Ishu07 I have a follow-up on my language/viseme comment above. I was looking through our source code and realized that, while visemes shouldn't block you from having a host speak a different language, there are two blockers that will get in your way: 1) we currently hard-code which voices are available, and we've only included the English-speaking voices; 2) we are not passing an explicit language ID when asking Polly to generate speech, so Polly will always use the default language ID, which is "en-US". Some code changes will therefore be needed to get other languages working.
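The second blocker boils down to a change along these lines wherever the plugin builds its Polly request. This is only a sketch of the idea using AWS C++ SDK types, not the project's actual code; the function name, voice, and sample rate are illustrative:

```cpp
#include <aws/core/Aws.h>
#include <aws/polly/model/SynthesizeSpeechRequest.h>

// Hypothetical request builder showing an explicit language ID and a
// matching non-English voice instead of relying on the en-US default.
Aws::Polly::Model::SynthesizeSpeechRequest BuildHindiRequest(const Aws::String& Text)
{
    Aws::Polly::Model::SynthesizeSpeechRequest Request;
    Request.SetText(Text);
    Request.SetOutputFormat(Aws::Polly::Model::OutputFormat::pcm);    // raw audio for streaming playback
    Request.SetSampleRate("16000");
    Request.SetVoiceId(Aws::Polly::Model::VoiceId::Aditi);            // blocker 1: a non-English voice
    Request.SetLanguageCode(Aws::Polly::Model::LanguageCode::hi_IN);  // blocker 2: the explicit language ID
    return Request;
}
```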

If you end up making those code changes yourself, please consider sharing back via a pull request. Otherwise, we'll consider adding this capability in the future. I'll create a new feature request issue so we can track it.

Legumtechnica commented 2 years ago

Alright, thanks!

yogeshchandrasekharuni commented 2 years ago

Awesome! I can confirm that now I am able to add a custom MetaHuman and that it works as expected and documented. Thank you!

cuijiaxu commented 2 years ago

@Krxtopher Got it, thanks.

Krxtopher commented 1 year ago

I believe everyone has confirmed that the comments and changes above addressed this issue. Closing.