Closed: gregk closed this issue 7 years ago
Hi @gregk, which channel are you using? Is this Direct Line like the issue you linked?
Hi @dandriscoll yes, it is direct line. All communication is between our service and MBF since we don't make use of outside services today.
If you send a conversation ID, I can get a breakdown and let you know where things stand.
One item of note is that POSTing a message takes longer than simply writing it to the service. The call includes transit to our service, persisting the activity, transit to the bot, waiting for the bot to complete (which sometimes involves calling back into the service), returning the status code from the bot, and then transit back to our service and finally to your client. Your client sees the worst-case scenario because we want to return an HTTP status code that reflects your bot's status.
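For what it's worth, here is a rough sketch (not official sample code; the secret handling and payload are placeholders) of how a client can time that full POST leg itself against the Direct Line 3.0 REST API:

```typescript
// Rough client-side timing sketch for Direct Line 3.0 (Node 18+, built-in fetch).
// DIRECT_LINE_SECRET and the activity payload are placeholders.
const DIRECT_LINE_SECRET = process.env.DIRECT_LINE_SECRET!;
const BASE = "https://directline.botframework.com/v3/directline";

async function timeOnePost(): Promise<void> {
  // Start a conversation (not part of the per-message latency being measured).
  const conv: any = await fetch(`${BASE}/conversations`, {
    method: "POST",
    headers: { Authorization: `Bearer ${DIRECT_LINE_SECRET}` },
  }).then(r => r.json());

  const activity = {
    type: "message",
    from: { id: "latency-probe" },
    text: "ping",
  };

  // This POST covers the whole chain described above:
  // client -> Direct Line -> persistence -> bot -> status code -> client.
  const start = Date.now();
  const res = await fetch(`${BASE}/conversations/${conv.conversationId}/activities`, {
    method: "POST",
    headers: {
      Authorization: `Bearer ${DIRECT_LINE_SECRET}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify(activity),
  });
  console.log(`POST activity: ${res.status} in ${Date.now() - start} ms`);
}

timeOnePost().catch(console.error);
```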
We do have a performance-oriented storage rewrite in testing now. I'll know more about the impact when I can see timing on your calls.
Your write-up follows along the lines of our analysis. I'm going to share this with my team. We are writing a hack to work around these issues but would much prefer to use an approved method. Do you have a timeline for moving it out of testing? We would be happy to work with an early version and give feedback. Really appreciate your team focusing on this.
I have the same problem. My bot's responses are slow, between 2 and 4 seconds. The channel is Facebook Messenger, and the bot is deployed on Azure Bot Service.
I have the same problem. Using LUIS and Facebook Messenger, with the bot service deployed on a local dedicated server, responses are delayed 4-8 seconds.
When a user sends a message on Facebook Messenger, it seems to flow like this. Is that correct?
How can I skip botframework.com or improve performance? My target users are only on FB Messenger.
Also seeing very slow response times from the Bot Framework; even just using the emulator in the portal we're getting 2-4 second delays.
Hi @gregk, contact me offline.
Hi @ebattulga, that's correct. Using the connectors is currently the only supported way to use the Bot Framework. There are unsupported libraries to connect direct to FB but I'm not familiar with them. (Example: https://github.com/andrew-makarenko/botbuilder-facebook)
Hi @gilesbutler, at the moment it depends a lot on where you're loading the dev portal from. In some regions, the dev portal is making trans-oceanic calls to get to a datacenter. We have some enhancements for this coming soon.
Hi @dandriscoll, I'm based in Sydney, Australia, so I presume that is why it's slow. Looking forward to the enhancements 👍
Messenger <--> FB server <--> Bot Connector <--> bot server. How can I get the time in seconds spent in each stage?
Hi @dandriscoll, any update? I am facing the same issue. My bot is hosted on an Azure West Europe server.
Hi @dandriscoll, my Azure bot is hosted in South East Asia, and FB Messenger takes 4-6s to respond. Any way to resolve this issue?
Bot Framework has undergone some major performance upgrades recently. If you are still experiencing problems, open a new issue. Thx
Still very slow response
Since the Azure Bot Service is an httpTrigger Azure Function, I think it would be nice to be able to see the function threads that the bot is currently using. We're planning to process upwards of 500 Direct Line requests per second, and seeing how it scales on Azure is critical for diagnostics/debugging. Is this possible already? How do we do it? Application Insights seems to be pretty limited for the Bot Service, at least what's shown on the Analytics tab of the resource in the Azure portal.
I know httpTrigger Azure Functions are having some problems scaling/load balancing in general and since Bot Service is httpTriggered, it's probably experiencing the same issues. Yes / No ?
Furthermore, is there any plan to allow the Bot Service to act as a Service Bus queue processor? I know Service Bus functions can scale and handle 1000+ requests per second when the BrokeredMessage method is used (pretty impressive to see a single thread scale up to 12 and push through 100/s each).
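To illustrate the queue-triggered shape I mean, here is a rough sketch using the standard Azure Functions Service Bus trigger (queue and connection names are placeholders; this is not something Bot Service exposes today):

```typescript
// function.json for a Service Bus queue trigger (names are placeholders):
// {
//   "bindings": [
//     {
//       "name": "queueItem",
//       "type": "serviceBusTrigger",
//       "direction": "in",
//       "queueName": "incoming-activities",
//       "connection": "ServiceBusConnection"
//     }
//   ]
// }

import { AzureFunction, Context } from "@azure/functions";

// Each invocation receives one dequeued message; the Functions runtime
// fans out instances to drain the queue, which is the scaling behavior
// we'd like for activity processing.
const run: AzureFunction = async function (context: Context, queueItem: unknown): Promise<void> {
  context.log("Processing activity from queue", queueItem);
  // ...hand the activity to the bot logic here...
};

export default run;
```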
What advice/experience can anyone share about hitting the DirectLine channel this hard? Starting to run tests now and will share when I have anything conclusive to add.
Hi @MikeDrips please do not perform any load testing on Direct Line.
Okay. When will it be functional for high volumes of traffic?
Mike
It is already. However, it's intended for user-generated traffic.
Our recommendation for performing load testing is to connect directly to your bot's endpoint. We're still expanding the documentation for this but the core of it is to request a token using the same configuration that the emulator uses and set up a sink that can receive callbacks from your bot.
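To make that concrete, here is a rough, unofficial sketch of that setup (the endpoint, app ID/password handling, and sink URL are placeholders; the token flow mirrors what the emulator does, but treat the details as assumptions until the docs land):

```typescript
// Rough load-test sketch: drive the bot's endpoint directly, bypassing Direct Line.
// BOT_ENDPOINT, SINK_URL, APP_ID, and APP_PASSWORD are placeholders.
import express from "express";

const BOT_ENDPOINT = "https://mybot.example.com/api/messages"; // your bot
const SINK_URL = "https://loadtest.example.com:3979";          // must be reachable by the bot
const APP_ID = process.env.MICROSOFT_APP_ID!;
const APP_PASSWORD = process.env.MICROSOFT_APP_PASSWORD!;

// 1. A sink that receives the bot's replies (the Connector reply route).
const sink = express();
sink.use(express.json());
const onReply = (req: express.Request, res: express.Response) => {
  console.log("bot replied:", req.body && req.body.text);
  res.json({ id: Date.now().toString() }); // minimal ResourceResponse
};
sink.post("/v3/conversations/:conversationId/activities", onReply);
sink.post("/v3/conversations/:conversationId/activities/:activityId", onReply);
sink.listen(3979);

// 2. Request a token the same way the emulator does (client_credentials grant).
async function getToken(): Promise<string> {
  const res = await fetch(
    "https://login.microsoftonline.com/botframework.com/oauth2/v2.0/token",
    {
      method: "POST",
      headers: { "Content-Type": "application/x-www-form-urlencoded" },
      body: new URLSearchParams({
        grant_type: "client_credentials",
        client_id: APP_ID,
        client_secret: APP_PASSWORD,
        scope: "https://api.botframework.com/.default",
      }),
    }
  );
  return (await res.json()).access_token;
}

// 3. POST an Activity straight to the bot; serviceUrl points at our sink.
async function sendOne(token: string, i: number): Promise<number> {
  const activity = {
    type: "message",
    channelId: "directline",
    conversation: { id: `load-${i}` },
    from: { id: `user-${i}` },
    recipient: { id: "bot" },
    serviceUrl: SINK_URL,
    text: "ping",
  };
  const start = Date.now();
  await fetch(BOT_ENDPOINT, {
    method: "POST",
    headers: { Authorization: `Bearer ${token}`, "Content-Type": "application/json" },
    body: JSON.stringify(activity),
  });
  return Date.now() - start;
}

(async () => {
  const token = await getToken();
  const latencies = await Promise.all([...Array(10)].map((_, i) => sendOne(token, i)));
  console.log("latencies (ms):", latencies);
})();
```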
Thanks @dandriscoll. We appreciate your help.
We don't have any user-generated traffic per se. We have a pre-existing system that all traffic funnels through, which passes it off to the bot through DirectLine for AI processing. If we don't need to use DirectLine we won't; the only reason we are is because we were told to by Microsoft at a BotHackFest.
Does this mean we can send/receive Activity objects to/from the bot without sending through the DirectLine channel?
Can you outline how this would/should be done?
Will it have better performance under high loads?
How does auth work in this case?
Is the ?code=
We have our own custom bot state service as well. So we really just need the bot to process an Activity and reply with one. We're using the Bot Framework because in the future we'll open this up to all the other channels. Right now we just need it to be able to handle high loads from one source. Our whole system is Azure Functions as well, so we're hoping the bot function can scale well enough to keep up.
I know you guys are working on it, but this stuff needs doc or it's a long road to come back from if it isn't the right choice.
Cheers
So, we really like the concept of BotBuilder to design sophisticated CI applications. We easily moved our solution to LUIS because of the power of its intents.
However, we are seeing 4 to 6 second turnaround times in BotBuilder, both in communication to and from the service and in actions within the service. We aren't finding any issues with LUIS, which is sub-500 ms. It seems like others have reported this problem as far back as August; see issue #1618. This makes the service unusable for any interaction beyond a single utterance.
Is there any plan to dramatically increase performance? If so, do you have a timeline?