Closed by neo773 5 months ago
Oh that's very cool. It would probably be best if people could put in their own API key so it doesn't cost anything to run, but I'm not sure people would use it then. Thoughts?
It would be much better to run an LLM locally for that purpose if you were to head in that direction, given that OTPs are a security method designed to ensure that you have control of a known device. If you send all your text messages to OpenAI, that opens up quite a few privacy and security attack surfaces.
Plus, do the regexes fail that often to warrant such a solution?
Yeah but local LLM sounds like way more processing than I want happening for such a silly little utility running in the background.
Understood, which brings me back to wondering whether an LLM is like bringing a Howitzer to a 2FA knife fight. 😉
General note here that the newly updated logic SHOULD be much more reliable at detecting codes. I don't think we need this ticket anymore, but please test and let me know!
Hi, I've been thinking about how we handle OTP parsing. Currently, we maintain so many regex patterns. What if we leverage OpenAI's new function calls instead? This would eliminate maintaining the config altogether.
Here's a sample snippet I wrote in TypeScript; it's pretty accurate and parsed the most complex OTP on the first run. Execution time averages around 1.5s from my trial runs.
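For reference, here's a rough sketch of what that approach could look like (not the original snippet; names like `extractOtpSchema` and `parseToolCall` are illustrative, and the request shape follows OpenAI's public Chat Completions tool-calling REST API, assuming a Node >= 18 runtime with built-in `fetch` and an `OPENAI_API_KEY` env var):

```typescript
// JSON Schema the model is asked to fill in when it finds a one-time code.
const extractOtpSchema = {
  name: "extract_otp",
  description: "Extract the one-time passcode from an SMS message.",
  parameters: {
    type: "object",
    properties: {
      code: { type: "string", description: "The OTP, digits/letters only." },
    },
    required: ["code"],
  },
};

// Pure helper: pull the code out of the tool call's JSON arguments.
// Returns null on malformed JSON or a missing/non-string "code" field.
function parseToolCall(argsJson: string): string | null {
  try {
    const args = JSON.parse(argsJson);
    return typeof args.code === "string" ? args.code : null;
  } catch {
    return null;
  }
}

// Network call: force the model to answer via the extract_otp tool,
// then parse the structured arguments instead of regex-matching the SMS.
async function extractOtp(sms: string): Promise<string | null> {
  const res = await fetch("https://api.openai.com/v1/chat/completions", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${process.env.OPENAI_API_KEY}`,
    },
    body: JSON.stringify({
      model: "gpt-4o-mini", // illustrative model choice
      messages: [{ role: "user", content: sms }],
      tools: [{ type: "function", function: extractOtpSchema }],
      tool_choice: { type: "function", function: { name: "extract_otp" } },
    }),
  });
  const data = await res.json();
  const call = data.choices?.[0]?.message?.tool_calls?.[0];
  return call ? parseToolCall(call.function.arguments) : null;
}
```

The upside over regexes is that the schema, not a pattern list, defines what "a code" is; the downsides are exactly the latency and privacy concerns raised above.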