open-telemetry / opentelemetry.io

The OpenTelemetry website and documentation
https://opentelemetry.io
Creative Commons Attribution 4.0 International

Page feedback re: load generator docs #4977

Open szeitlin opened 3 months ago

szeitlin commented 3 months ago

Suggested improvements for page: https://opentelemetry.io/docs/demo/services/load-generator/

This isn't really documentation; it feels like a stub. The load generator service didn't start automatically, and the endpoint isn't available. It would help to have some hints on how to debug when this isn't working.

theletterf commented 3 months ago

@open-telemetry/demo-approvers Any thoughts on this one? Thanks!

rogercoll commented 3 months ago

I agree that the documentation should be improved. The locustfile is provided and launched with the other OpenTelemetry services.

Some load is automatically generated, and the configuration can be dynamically changed via the /loadgen endpoint: https://github.com/open-telemetry/opentelemetry-demo/tree/main/src/loadgenerator#accessing-the-load-generator

> How can I tell if it's working as expected?

As with the other services, by checking the loadgen container state. Another option is to check whether the /loadgen endpoint is available.
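A rough sketch of both checks, assuming the default docker compose deployment of the demo (the service name `load-generator` and the frontend proxy port 8080 may differ in your setup):

```shell
# Check the load generator container state in the demo's compose project.
docker compose ps load-generator

# Check whether the Locust web UI responds behind the frontend proxy.
# A 200 status code suggests the endpoint is up; anything else warrants
# a look at the container logs.
curl -s -o /dev/null -w "%{http_code}\n" http://localhost:8080/loadgen/

# Inspect recent logs if the endpoint is not responding.
docker compose logs --tail=50 load-generator
```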

szeitlin commented 3 months ago

> and launched with the other OpenTelemetry services.

This doesn't seem to be the case, and the /loadgen endpoint isn't accessible. So I'm wondering how to troubleshoot that. I tried editing the helm values to make sure the load generator service is enabled, but that didn't seem to be sufficient. I also tried passing that as a --set flag to helm, but I kept getting an error when I did that.
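For reference, a hypothetical sketch of both approaches. The values key `components.loadgenerator.enabled` is an assumption and depends on the chart version; note also that keys containing dots or special characters must be escaped in `--set`, which is a common source of the kind of error described above:

```shell
# Option 1: set the flag inline (assumed key layout; check the chart's
# values.yaml for the actual component name).
helm upgrade --install my-otel-demo open-telemetry/opentelemetry-demo \
  --set components.loadgenerator.enabled=true

# Option 2: keep the override in a values file instead, which avoids
# --set escaping issues entirely.
cat > loadgen-values.yaml <<'EOF'
components:
  loadgenerator:
    enabled: true
EOF
helm upgrade --install my-otel-demo open-telemetry/opentelemetry-demo \
  -f loadgen-values.yaml
```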

szeitlin commented 3 months ago

oh, I think maybe I need to give it more resources? Looks like it tried to come up and got stuck in pending for 24h:

loadgenerator 0/1 Pending 0 24h
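When a pod is stuck in Pending, the scheduler usually can't place it, and the Events section of `kubectl describe` says why (for example `Insufficient cpu` or `Insufficient memory`). A sketch, assuming a label selector of `app.opentelemetry.io/name=loadgenerator` (the actual labels depend on the chart):

```shell
# Show scheduling events for the stuck pod; look at the Events section
# at the bottom of the output.
kubectl describe pod -l app.opentelemetry.io/name=loadgenerator

# Or list recent cluster events, most recent last.
kubectl get events --sort-by=.lastTimestamp | tail -n 20

# Compare the pod's resource requests against what the nodes can offer.
kubectl describe nodes | grep -A 5 "Allocated resources"
```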

szeitlin commented 3 months ago

ok, that wasn't the entire problem. I can load the webapp at localhost:8080 but I'm getting a 404 at localhost:8080/loadgen

Is it possible there's a port collision, or is there a step missing for forwarding from another port?

szeitlin commented 3 months ago

Indeed, the port this service wanted to use was 8089, not 8080, so I had to do an additional port forwarding step. But now it works!
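For anyone hitting the same 404: 8089 is Locust's default web UI port, so on Kubernetes it needs its own forward alongside the frontend's 8080. A sketch, where the service name `my-otel-demo-loadgenerator` is an assumption based on a release named `my-otel-demo` (check `kubectl get svc` for the real name):

```shell
# List services to find the load generator's actual name and port.
kubectl get svc

# Forward the Locust web UI to the local machine, then open
# http://localhost:8089 in a browser.
kubectl port-forward svc/my-otel-demo-loadgenerator 8089:8089
```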

svrnm commented 3 months ago

@szeitlin @rogercoll would you be interested in improving the docs accordingly? Thanks!