calpoly-csai / swanton

Swanton Pacific Ranch chatbot with a knowledge graph
MIT License

Run Assistant script with support functions and data #22

Closed. chidiewenike closed this 3 years ago.

chidiewenike commented 3 years ago

Summary

Added the run_assistant.py script, which uses rasa_api_call.py to hit the local Rasa API server for an intent response. Supporting data includes the Rasa model data, trained models, and audio response files.
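For context, the local Rasa server exposes a REST parse endpoint, so the call made by rasa_api_call.py presumably looks something like the sketch below. The port and the response handling are assumptions for illustration, not the actual contents of the script.

```python
import requests

# Default Rasa server URL; the host/port used by rasa_api_call.py may differ.
RASA_PARSE_URL = "http://localhost:5005/model/parse"


def get_intent(text: str) -> str:
    """Send the transcribed text to the local Rasa server and return the top intent name."""
    response = requests.post(RASA_PARSE_URL, json={"text": text})
    response.raise_for_status()
    return response.json()["intent"]["name"]
```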

Details

When run_assistant.py is running, the system waits for the user to press the button and processes the input audio while the button is held. The model predicts on the audio stream in real time, and the transcribed audio is sent as a string to the Rasa server. The predicted intent maps to an audio file, which is played through the speaker.
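A rough sketch of that push-to-talk loop is below. The GPIO pin, the `transcribe_while_pressed` helper, the `responses/` file layout, and the Rasa URL are all assumptions for illustration; they are not the actual code in run_assistant.py.

```python
import subprocess

import requests
import RPi.GPIO as GPIO  # standard GPIO library on the Raspberry Pi

BUTTON_PIN = 17  # hypothetical push-to-talk pin; the actual wiring may differ
RASA_PARSE_URL = "http://localhost:5005/model/parse"  # default Rasa REST endpoint (assumed)

GPIO.setmode(GPIO.BCM)
GPIO.setup(BUTTON_PIN, GPIO.IN, pull_up_down=GPIO.PUD_UP)


def transcribe_while_pressed() -> str:
    """Hypothetical stand-in for the real-time transcription described above:
    stream microphone audio to the speech model while the button is held and
    return the transcript once the button is released."""
    raise NotImplementedError


def play_response(intent: str) -> None:
    """Play the audio clip mapped to the predicted intent (file layout assumed)."""
    subprocess.run(["aplay", f"responses/{intent}.wav"], check=True)


try:
    while True:
        # Block until the push-to-talk button is pressed (falling edge with a pull-up).
        GPIO.wait_for_edge(BUTTON_PIN, GPIO.FALLING)
        text = transcribe_while_pressed()
        # Same parse call sketched above: ask the local Rasa server for the top intent.
        intent = requests.post(RASA_PARSE_URL, json={"text": text}).json()["intent"]["name"]
        play_response(intent)
finally:
    GPIO.cleanup()
```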

Testing

The system was tested on a Raspberry Pi 4 (4 GB) without any issues.

chidiewenike commented 3 years ago

Good catches with the data, @Jason-Ku. I will test out the responses to those questions later and see what kind of output I get.

chidiewenike commented 3 years ago

@snekiam @Jason-Ku Do you guys think it would be better to label the intents with numbers or with the entire string joined by underscores?
`## intent:Search_for_Swanton_Pacific_Ranch_California_in_your_internet_browser` is just a label and doesn't affect the model's predictions. We could label it like `## intent:intent#1` and map the intent number to a string, as with the audio responses. The current format is just for readability when looking at the NLU data.
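If the numeric labels win out, the lookup from intent label back to the readable question and its audio response could be a small dictionary, along the lines of this sketch. The label names, text, and file paths here are illustrative only, not part of this PR.

```python
# Hypothetical mapping from short intent labels to the readable question and
# the audio response file; every name and path below is made up for illustration.
INTENT_MAP = {
    "intent_1": {
        "text": "Search for Swanton Pacific Ranch California in your internet browser",
        "audio": "responses/intent_1.wav",
    },
}


def audio_for(intent_name: str) -> str:
    """Return the audio file for a predicted intent, falling back to a default clip."""
    return INTENT_MAP.get(intent_name, {}).get("audio", "responses/fallback.wav")
```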

snekiam commented 3 years ago

> @snekiam @Jason-Ku Do you guys think it would be better to label the intents with numbers or with the entire string joined by underscores? `## intent:Search_for_Swanton_Pacific_Ranch_California_in_your_internet_browser` is just a label and doesn't affect the model's predictions. We could label it like `## intent:intent#1` and map the intent number to a string, as with the audio responses. The current format is just for readability when looking at the NLU data.

I think it's fine the way it is.