Surya-Prakash-Reddy closed this issue 10 months ago.
++ @AmitMY, can you help with this?
Sounds like you built a fingerspelling model. Why do you want to show the sign language output instead of text?
If you do want to show sign language output, you can use this project. There is an Android app, and you can run it with:

```bash
npm run build
npx cap sync
npx cap run android
```
You will have to feed the pose estimation output into your TFLite model as input (we have a `loadTFJS` function to load TensorFlow.js, and loading a model from there is straightforward).
To get the sign language output, you can then run:

```ts
const text = "test"; // model.predict(...)
this.store.dispatch(new SetSpokenLanguageText(text));
```

and the sign language output will appear.
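To make that concrete, here is a minimal sketch of how the two pieces above could be glued together in the app's Angular/NGXS code. `loadTFJS` and `SetSpokenLanguageText` are the names mentioned in this thread; the import path, model URL, input shape, label list, and class names below are illustrative placeholders for your own fingerspelling model, not part of this repository.

```ts
import * as tf from '@tensorflow/tfjs';
import {Store} from '@ngxs/store';
// Hypothetical import path; point it at wherever SetSpokenLanguageText lives in your checkout.
import {SetSpokenLanguageText} from './modules/translate/translate.actions';

// Placeholder labels for the classes your fingerspelling model was trained on.
const LABELS = ['hello', 'thanks', 'yes', 'no'];

export class FingerspellingOutput {
  private model?: tf.GraphModel;

  constructor(private store: Store) {}

  async load(): Promise<void> {
    // The app's `loadTFJS` helper lazy-loads TensorFlow.js; once tf is available,
    // loading a converted model is a single call.
    this.model = await tf.loadGraphModel('assets/models/fingerspelling/model.json');
  }

  // `landmarks` is the pose-estimation output, e.g. [frames][keypoints][x, y, z].
  predictAndShow(landmarks: number[][][]): void {
    if (!this.model) {
      throw new Error('Call load() before predicting');
    }
    const input = tf.tensor(landmarks).expandDims(0);       // add a batch dimension
    const scores = this.model.predict(input) as tf.Tensor;  // shape [1, numClasses]
    const classIndex = scores.argMax(-1).dataSync()[0];
    const text = LABELS[classIndex];

    // Dispatching the recognized text is what makes the sign language output appear.
    this.store.dispatch(new SetSpokenLanguageText(text));

    input.dispose();
    scores.dispose();
  }
}
```

The only part that matters for this project is the final dispatch; everything before it is your own recognition logic.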
To be clear: this is a web app. The Android app is a wrapper around the web application, built with Capacitor and Ionic.
> Sounds like you built a fingerspelling model. Why do you want to show the sign language output instead of text?
We need output in sign language for two-way communication. We will have hardcoded text responses for various user gestures, and I want to show each response in both text and sign language form.
Thanks for sharing the details about the app. A few follow-ups:
You can try to integrate this app into your app, but I think it might be too cumbersome.
For the translation between text and video, we use the open-source project https://github.com/ZurichNLP/spoken-to-signed-translation. You must give it a lexicon/dictionary, and it stitches the signs together.
@AmitMY Hey! I am developing a text-to-sign-language project for my final-year project (FYP), targeting my local language. My goal is to implement it by understanding all the bits and pieces. Can you recommend where I should start? This project is too overwhelming for me. Is this guide good enough for beginners?
@imsamimalik the research overview at https://research.sign.mt/ is a good place to start for an overview of what is happening in the field.
It really depends on what your local language is. Different languages have different resources.
However, in all cases, I would strongly recommend starting with this project as a baseline. You will have to implement a `download_lexicon` step (or otherwise create a CSV, as sketched below), but then you will have a very basic baseline to work with.
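To illustrate the "otherwise create a CSV" route, here is a small sketch of writing such a lexicon file: a table that maps spoken-language words to glosses and to `.pose` files of the corresponding signs. The column names and the helper below are my assumptions for illustration only; check the spoken-to-signed-translation README for the exact schema it expects (it may require additional columns).

```ts
import {writeFileSync} from 'node:fs';

// One dictionary entry: a spoken-language word, its gloss, and a .pose file of the sign.
interface LexiconEntry {
  path: string;            // e.g. 'poses/hello.pose'
  spokenLanguage: string;  // e.g. 'en'
  signedLanguage: string;  // e.g. 'ase' (American Sign Language)
  words: string;           // e.g. 'hello'
  glosses: string;         // e.g. 'HELLO'
  priority: number;        // used to break ties when several entries match
}

// Assumed header row; verify against the project's documentation.
const HEADER = 'path,spoken_language,signed_language,words,glosses,priority';

export function writeLexicon(entries: LexiconEntry[], outPath: string): void {
  const rows = entries.map(e =>
    [e.path, e.spokenLanguage, e.signedLanguage, e.words, e.glosses, e.priority].join(',')
  );
  writeFileSync(outPath, [HEADER, ...rows].join('\n') + '\n', 'utf8');
}

// Example: a tiny two-entry lexicon.
writeLexicon(
  [
    {path: 'poses/hello.pose', spokenLanguage: 'en', signedLanguage: 'ase', words: 'hello', glosses: 'HELLO', priority: 1},
    {path: 'poses/thanks.pose', spokenLanguage: 'en', signedLanguage: 'ase', words: 'thanks', glosses: 'THANKS', priority: 1},
  ],
  'lexicon.csv'
);
```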
@AmitMY thank you for your reply. I will try running the sample project and see how far I can go.
May I know how to convert text to a sign language video?
@Fung1117
For the translation between text and video, we use the open-source project https://github.com/ZurichNLP/spoken-to-signed-translation.
Problem
Hi, I am trying to solve a problem where part of it requires translating text into a sign language animation. I need to do this in an Android application, and ideally everything should run in the app without connecting to any server.
I take input using the camera and detect some specific symbols, and based on those symbols I need to show a sign language animation. For the input side, I have trained a TensorFlow model on the HaGRID dataset using MediaPipe, and I run the resulting TFLite model on-device.
I am now exploring the output part. I will have some text, and I need to show it as a sign language animation. Is it possible to use this repo for that purpose? If not, can you help me understand how you convert text to sign language so that I can do something similar?