locustio / locust

Write scalable load tests in plain Python 🚗💨
MIT License

Docker-Compose Locust UI response time not showing. #2557

Closed lek18 closed 6 months ago

lek18 commented 6 months ago

Description

I am running the docker-compose option for spinning up the Locust UI, but the response time graph is not showing/updating properly. The statistics table shows the correct values; they are just not reflected in the chart (see images).

[Screenshots: 2024-01-17 at 11:32:16 AM and 11:32:22 AM]

Here is my docker-compose.yml; I start it with docker-compose up -d --scale worker=2:

version: '3'

services:
  master:
    image: locustio/locust
    ports:
     - "8089:8089"
    volumes:
      - ./:/mnt/locust
      - ./surveys:/surveys
    command: -f /mnt/locust/locustfile-element.py --modern-ui --master
    environment:
      - TOKEN
    networks:
      - element

  worker:
    image: locustio/locust
    volumes:
      - ./:/mnt/locust
      - ./surveys:/surveys
    command: -f /mnt/locust/locustfile-element.py --worker --master-host master
    environment:
      - TOKEN
    networks:
      - element

networks:
  element:
    external: true

Additional information:

  1. Locust is calling a local endpoint served by another Docker container, started with: docker run -p 9000:9000 --name element-word-cloud-luminate-dev-local --network element --rm --platform linux/amd64 -it element-world-cloud-luminate:latest bash -c "source /miniconda/bin/activate && conda activate seldonenv && seldon-core-microservice Model --service-type MODEL"

  2. Both the local endpoint container and the Locust docker-compose services are using the same Docker network.

  3. The UI only appears to have this issue when the element-word-cloud-luminate-dev-local container is hitting 100% CPU.

  4. Similar UI behaviour occurs if I use the Python SDK of Locust instead of Docker.

  5. The UI behaviour goes away when I point Locust at the cloud version of element-word-cloud-luminate-dev-local instead of the local container.
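
As a side note, a quick connectivity check against the model endpoint from inside the element network might look like the following sketch; the host name and payload shape here are assumptions based on the docker run command and the locustfile below, so adjust them to the actual container name and API.

import requests

# Sanity-check sketch (assumed URL and payload, mirrored from this issue):
# POST one small payload to the model endpoint and print status code and latency.
resp = requests.post(
    "http://element-word-cloud-luminate-dev-local:9000/api/v0.1/predictions",
    json={"jsonData": {"inputStringList": ["example"]}},
    timeout=30,
)
print(resp.status_code, resp.elapsed.total_seconds())

Running this from a throwaway container attached to the same network (e.g. with docker run --network element) keeps the conditions comparable to the Locust setup.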

Here is the docker stats:

CONTAINER ID   NAME                                    CPU %     MEM USAGE / LIMIT     MEM %     NET I/O           BLOCK I/O         PIDS
1fc1687f5110   element-word-cloud-luminate-dev-local   100.18%   966.4MiB / 7.761GiB   12.16%    1.14MB / 19.4kB   14.9MB / 12.3kB   25
5dd9bee036e2   locust-worker-2                         0.02%     35.33MiB / 7.761GiB   0.44%     146kB / 1.45MB    0B / 0B           3
ab3e081ca846   locust-master-1                         0.04%     41.64MiB / 7.761GiB   0.52%     1.1MB / 7.28MB    2.04MB / 0B       3
9028c3d5bbee   locust-worker-1                         0.03%     34.21MiB / 7.761GiB   0.43%     127kB / 303kB     0B / 0B           3

Docker settings: 2 CPUs, 8 GB. The same issue occurs if I bump this to 4 CPUs and 16 GB.

Command line

docker-compose up -d --scale worker=2

Locustfile contents

import json
import logging
import os

import urllib3

from locust import HttpUser, between, task

class PayLoadNotFound(Exception):
    pass

urllib3.disable_warnings()

# This TOKEN is an ENV variable; it's needed for the ELEMENT deployment.
TOKEN = os.getenv("TOKEN")
# Change Payload Here if you want to test something else.
file_name = "payload_3.json"

# For docker compose version
docker_file_name = f"/mnt/locust/surveys/{file_name}"

# This works if you are using python SDK instead of Docker.
local_file_name = f"surveys/{file_name}"

if os.path.exists(docker_file_name):
    with open(docker_file_name, "r") as file:
        request_Object = json.load(file)
elif os.path.exists(local_file_name):
    with open(local_file_name, "r") as file:
        request_Object = json.load(file)
else:
    raise PayLoadNotFound("File not found in docker nor in local.")

# this is your endpoint from Element.
endpoint = "world-cloud-main/api/v0.1"  # noqa
if not TOKEN:
    # This points to the docker container name + endpoint - ensure to have the same container name. # noqa
    # DO NOT EDIT endpoint NAME
    endpoint = "http://word-cloud-main-local:9000/api/v0.1"

class QuickstartUser(HttpUser):
    host = endpoint
    wait_time = between(1, 2)

    @task
    def test_world_cloud_service(self):
        headers = {
            "Content-Type": "application/json",
            "Authorization": TOKEN,
        }
        data = {"jsonData": request_Object}
        logging.info(
            f"This is the length of my inputStringList,"
            f" {len(request_Object.get('inputStringList'))}"
        )
        self.client.post("/predictions", json=data, headers=headers)

    def on_start(self):
        """on_start is called when a
        Locust start before any task is scheduled"""
        self.client.verify = False

Python version

Python 3.11

Locust version

2.19.1

Operating system

macOS Ventura 13.6.3

cyberw commented 6 months ago

Hi! Can you reproduce this issue with a minimal locustfile? Also, you are not on the latest release. Can you update?
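
A quick way to confirm which Locust version is actually in use (a sketch, assuming you can run Python inside the environment or container in question) is to read locust.__version__:

import locust

# Print the installed Locust version, e.g. 2.19.1 before updating.
print(locust.__version__)

The locust --version command prints the same information on the command line.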

lek18 commented 6 months ago

Hi @cyberw , thank you for your response.

I updated the image to latest (https://hub.docker.com/r/locustio/locust/tags)

I see that the average response time line in the UI is now updating properly, but the 99th percentile is still exhibiting the same behaviour, i.e. not updating and staying flat as before.

[Screenshot: 2024-01-17 at 6:15 PM]

cyberw commented 6 months ago

Can you remove as much as possible? See https://stackoverflow.com/help/minimal-reproducible-example

i.e. see if it happens without using Docker for Locust, without distributed mode, with a basic locustfile, etc. Right now there are too many "special" things about your case.

lek18 commented 6 months ago

Hello @cyberw, I used Locust without Docker and a basic locustfile, and I do not see any issues in the UI.

command ran: locust -f locustfile-element.py

locust package version: 2.20.1

import json

import urllib3

from locust import HttpUser, between, task

urllib3.disable_warnings()

file_name = "payload_1.json"
# This works if you are using python SDK instead of Docker.
local_file_name = f"surveys/{file_name}"
with open(local_file_name, "r") as file:
    request_Object = json.load(file)
endpoint = "http://localhost:9000/api/v0.1"

class QuickstartUser(HttpUser):
    host = endpoint
    wait_time = between(1, 2)

    @task
    def test_world_cloud_service(self):
        headers = {
            "Content-Type": "application/json",
        }
        data = {"jsonData": request_Object}

        self.client.post("/predictions", json=data, headers=headers, verify=False)
[Screenshot: 2024-01-18 at 8:08 PM]

cyberw commented 6 months ago

Do you still get the CPU load warning?

lek18 commented 6 months ago

No @cyberw. I believe using the latest package made it work. Thank you.