To run the sign interpreter, start by cloning the repository with `git clone {repo url}`. From there, navigate to the `src` directory and open the `index.html` file in a modern browser. Alternatively, open the file in Visual Studio Code and serve it with the "Go Live" button from the Live Server extension.
The project uses p5.js and ml5.js to host an interactive sign language interpreter. The interpreter captures frames from live video and predicts which sign is being shown in each frame! The project is meant to introduce people to machine learning as well as ASL.
Currently, the model is trained only on the letters A-F. We plan to expand the letter set and the experience options in the future!
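The capture-and-classify loop described above follows the standard ml5.js pattern: load a trained image classifier, feed it the webcam video element, and re-classify on each callback. A minimal sketch of that loop is shown below; the model path `model/model.json` and the label text are assumptions, not the project's actual file layout.

```javascript
// Minimal p5.js + ml5.js sketch (runs in the browser via index.html).
// Assumption: the trained A-F model is exported to model/model.json.
let video;
let classifier;
let label = "waiting...";

function preload() {
  // Load the custom sign-language classifier (path is an assumption)
  classifier = ml5.imageClassifier("model/model.json");
}

function setup() {
  createCanvas(640, 480);
  video = createCapture(VIDEO); // live webcam feed
  video.hide();                 // draw it ourselves in draw()
  classifyVideo();              // start the prediction loop
}

function classifyVideo() {
  // Ask the model to classify the current video frame
  classifier.classify(video, gotResult);
}

function gotResult(error, results) {
  if (error) {
    console.error(error);
    return;
  }
  label = results[0].label; // top prediction, e.g. "A".."F"
  classifyVideo();          // loop: classify the next frame
}

function draw() {
  image(video, 0, 0);
  fill(255);
  textSize(32);
  text(label, 20, height - 20); // overlay the predicted letter
}
```

Because ml5.js and p5.js are browser libraries, this sketch must be served from a page (e.g. via Live Server) rather than run from the command line.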