Closed: msporny closed this issue 2 years ago
I did item # 2 and charles did item # 4 above. The auto-transcription feature is working much better now that we're using a more expensive audio AI model. Google has a great business model here -- "Those are some pretty words you just said... it'd be a real shame if something were to happen to 'em, pal." :P
Err, I mean, Google is wonderful and there is nothing wrong with charging good money for a service that provides great value.
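For context, a minimal sketch of what "paying for the better model" can look like, assuming the bot calls Google Cloud Speech-to-Text and opts into one of its enhanced (pricier) models; the actual service, model name, and settings used by our bot may differ.

```python
# A minimal sketch, assuming Google Cloud Speech-to-Text with an enhanced model.
# The model choice and settings here are illustrative, not the bot's actual config.
from google.cloud import speech


def transcribe(audio_uri: str) -> str:
    client = speech.SpeechClient()
    config = speech.RecognitionConfig(
        language_code="en-US",
        # Enhanced models cost more per minute of audio, but they handle
        # multi-speaker meeting audio and technical jargon noticeably better.
        use_enhanced=True,
        model="video",
        enable_automatic_punctuation=True,
    )
    audio = speech.RecognitionAudio(uri=audio_uri)  # e.g. a gs:// recording
    operation = client.long_running_recognize(config=config, audio=audio)
    response = operation.result(timeout=3600)
    return "\n".join(r.alternatives[0].transcript for r in response.results)
```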
The transcriptions seem to be good enough to replace human scribes at this point. The quality is not as good as that of a /good/ human scribe, and the bots capture every single word that is said (probably too much), but the days of human transcription seem to be numbered.
I've optimized the transcription bot so it no longer records every single utterance; one- and two-word quips are skipped now. That filtering removes most of the "clean up" work required for auto-transcribed minutes these days... which takes less time (at least for me) than dealing with a human scribe, and the output from meeting to meeting is far more consistent now.
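For the curious, the filtering amounts to something like the sketch below: drop very short utterances before they land in the minutes. Function and threshold names are illustrative, not the bot's actual code.

```python
# A minimal sketch of skipping one- and two-word quips in a transcript.

MIN_WORDS = 3  # utterances shorter than this are treated as quips and skipped


def keep_utterance(text: str, min_words: int = MIN_WORDS) -> bool:
    """Return True if a transcribed utterance is long enough to record."""
    return len(text.split()) >= min_words


def filter_transcript(utterances):
    """Yield only the (speaker, text) pairs worth keeping in the minutes."""
    for speaker, text in utterances:
        if keep_utterance(text):
            yield speaker, text


if __name__ == "__main__":
    raw = [
        ("alice", "+1"),
        ("bob", "sounds good"),
        ("carol", "I think auto-transcription is working well on these calls."),
    ]
    for speaker, text in filter_transcript(raw):
        print(f"{speaker}: {text}")  # only carol's line is kept
```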
I believe this issue is now resolved; closing.
Auto-transcription was enabled in an attempt to help people who run calls take more accurate minutes/transcriptions. Unfortunately, when a fair amount of technical jargon is used, the accuracy of auto-transcription leaves much to be desired. The problems with auto-transcription as currently implemented are:
To address this, we could try some of the following: