yoichiro opened this issue 6 years ago
I agree this is a useful change. It does require some server-side changes to accommodate.
@Fleker We can probably support the screen surface with the code below:
```typescript
send(input: string): Promise<AssistResponse> {
  const config = new embeddedAssistantPb.AssistConfig()
  ...
  config.setAudioOutConfig(new embeddedAssistantPb.AudioOutConfig())
  ...
  // Add the two lines below:
  config.setScreenOutConfig(new embeddedAssistantPb.ScreenOutConfig())
  config.getScreenOutConfig().setScreenMode(3)
  ...
```
If we add the lines above, then `conv.surface.capabilities.has('actions.capability.SCREEN_OUTPUT')` evaluates to `true` in the fulfillment code. That is, I think it does not require any server-side changes.
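For reference, here is a minimal sketch of how a fulfillment handler might branch on that capability. The `Surface` interface and the stubbed `conv`-style object below are assumptions standing in for the real actions-on-google conversation object, for illustration only:

```typescript
// Stub of the conversation surface, standing in for the real
// actions-on-google `conv.surface` object (assumption for illustration).
interface Surface {
  capabilities: { has(name: string): boolean }
}

function buildResponse(surface: Surface): string {
  // Branch on the capability that becomes true once the client
  // sends a ScreenOutConfig, as described above.
  if (surface.capabilities.has('actions.capability.SCREEN_OUTPUT')) {
    return 'response with visual card'
  }
  return 'audio-only response'
}

// Simulate a screen-capable device, as the proposed change would report.
const screenSurface: Surface = {
  capabilities: { has: (name) => name === 'actions.capability.SCREEN_OUTPUT' },
}
console.log(buildResponse(screenSurface)) // "response with visual card"
```

This is exactly the kind of branch that the testing tool currently cannot exercise, since every request reports an audio-only surface.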
Do you have other reasons why we would need to change the server-side code?
I was thinking primarily of the other possible surface capabilities, as there isn't a single array of capability values that you control.
I have started using the actions-on-google-testing package for our enterprise bot. During testing, I found some issues with failed scenarios: 1. Sometimes the intent doesn't match my utterance.
```
error: 2 exceptions
Uncaught Exception TypeError: Cannot read property 'url' of undefined
    ClientDuplexStream.conversation.on (node_modules/actions-on-google-testing/src/actions-on-google.ts:559:62)
    Object.onReceiveMessage (node_modules/grpc/src/client_interceptors.js:1292:19)
    InterceptingListener.recvMessageWithContext (node_modules/grpc/src/client_interceptors.js:607:19)
```
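The TypeError suggests the handler at `actions-on-google.ts:559` reads `.url` from an object that can be `undefined` when a response carries no such data. A defensive guard along these lines would avoid the crash; the `DebugInfo` shape and `extractUrl` name here are assumptions for illustration, not the library's actual types:

```typescript
// Hypothetical response fragment, for illustration only.
interface DebugInfo {
  url?: string
}

// Check that the object exists before reading `.url`,
// instead of assuming it is always present.
function extractUrl(debugInfo: DebugInfo | undefined): string | undefined {
  if (!debugInfo) {
    return undefined // no crash when the field is absent
  }
  return debugInfo.url
}

console.log(extractUrl(undefined)) // undefined
console.log(extractUrl({ url: 'https://example.com' })) // "https://example.com"
```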
Currently, all requests are sent as if from a device that supports only the audio surface. But many actions support multiple surfaces and send different responses depending on the surface.
At the least, I think this tool should support a screen surface. For example, if developers could specify whether they want to support a screen surface, as in the following, we would be able to cover more test cases and conditions.
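One possible shape for such an option, purely as a sketch: the `TestSurface` type, the function names, and the mapping to an audio-only screen mode are all assumptions, not the library's actual API. The value 3 is the one set in the snippet earlier in the thread.

```typescript
// Hypothetical per-scenario surface option for the test helper
// (names are assumptions, not the real actions-on-google-testing API).
type TestSurface = 'AUDIO' | 'SCREEN'

// 3 is the ScreenMode value used in the snippet above for a screen
// surface; 1 is assumed here as the audio-only (screen off) mode.
function screenModeFor(surface: TestSurface): number {
  return surface === 'SCREEN' ? 3 : 1
}

// A test run could then pick the surface per scenario:
function describeRequest(surface: TestSurface): string {
  return `AssistConfig with screen mode ${screenModeFor(surface)}`
}

console.log(describeRequest('SCREEN')) // "AssistConfig with screen mode 3"
console.log(describeRequest('AUDIO')) // "AssistConfig with screen mode 1"
```

With an option like this, the same scenario could be run twice, once per surface, to cover both response branches of an action.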