Traffic-Alpha / LLM-Assisted-Light

This repository contains the code for the paper "LLM-Assisted Light: Leveraging Large Language Model Capabilities for Human-Mimetic Traffic Signal Control in Complex Urban Environments".

LLM-Assisted Light (LA-Light)

LLM-Assisted Light: Augmenting Traffic Signal Control with Large Language Model in Complex Urban Scenarios

Scenario 1: Examples of LA-Light Utilizing Tools to Control Traffic Signals (Normal Scenario)

Scenario 2: Examples of LA-Light Utilizing Tools to Control Traffic Signals (Emergency Vehicle (EMV) Scenario)

Overall Framework

The LA-Light framework introduces a hybrid decision-making process for traffic signal control (TSC) that combines the cognitive capabilities of LLMs with traditional traffic management methodologies. The framework organizes decision-making into five methodical steps.
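The specific steps are detailed in the paper. Purely as an illustration of the hybrid idea (not the paper's five-step procedure), the decision interface can be sketched as follows; every name in this snippet is hypothetical and not part of the repository's API:

```python
from dataclasses import dataclass
from typing import Callable, Dict

# Illustrative only: TrafficObservation, Decision, and decide_phase are
# hypothetical names, not the repository's actual classes or functions.

@dataclass
class TrafficObservation:
    queue_lengths: Dict[str, int]   # waiting vehicles per incoming edge
    current_phase: int              # index of the currently active signal phase

@dataclass
class Decision:
    phase: int                      # signal phase to activate next
    explanation: str                # natural-language justification

def decide_phase(
    obs: TrafficObservation,
    rl_policy: Callable[[TrafficObservation], int],
    tools: Dict[str, Callable[[], str]],
    llm: Callable[[str], str],
) -> Decision:
    """Hybrid decision: RL suggestion + tool observations -> LLM verdict."""
    rl_suggestion = rl_policy(obs)                                 # candidate action from RL
    tool_reports = {name: tool() for name, tool in tools.items()}  # perception tools
    prompt = (
        f"Queues: {obs.queue_lengths}. Current phase: {obs.current_phase}. "
        f"RL suggests phase {rl_suggestion}. Tool reports: {tool_reports}. "
        "Reply as '<phase index>: <one-sentence explanation>'."
    )
    reply = llm(prompt)                                            # LLM reasons over everything
    phase_str, _, explanation = reply.partition(":")
    return Decision(phase=int(phase_str.strip()), explanation=explanation.strip())
```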

Evaluating LA-Light

Training and Evaluating the RL Model

To train and evaluate the RL model, refer to TSCRL. You can start training with the following command:

python train_rl_agent.py

The RL Result directory contains the trained models and training results. Use the following command to evaluate the performance of the model:

python eval_rl_agent.py
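The actual environment, reward, and hyperparameters live in `train_rl_agent.py` and `eval_rl_agent.py`; the snippet below is only a rough, self-contained sketch of the general train-then-evaluate pattern, using a toy stand-in environment and a stable-baselines3 PPO agent rather than the repository's SUMO setup:

```python
# Minimal sketch only: a toy stand-in for the SUMO-based TSC environment.
# The real train_rl_agent.py / eval_rl_agent.py may use different libraries,
# observations, rewards, and hyperparameters.
import numpy as np
import gymnasium as gym
from gymnasium import spaces
from stable_baselines3 import PPO

class ToySignalEnv(gym.Env):
    """Observation: queue length per approach; action: which approach to serve."""
    def __init__(self, n_approaches: int = 4):
        super().__init__()
        self.n = n_approaches
        self.observation_space = spaces.Box(0.0, 50.0, shape=(self.n,), dtype=np.float32)
        self.action_space = spaces.Discrete(self.n)
        self.queues = np.zeros(self.n, dtype=np.float32)
        self.t = 0

    def reset(self, seed=None, options=None):
        super().reset(seed=seed)
        self.queues = self.np_random.uniform(0, 10, size=self.n).astype(np.float32)
        self.t = 0
        return self.queues.copy(), {}

    def step(self, action):
        # The served approach discharges; the others accumulate new arrivals.
        self.queues += self.np_random.uniform(0, 2, size=self.n).astype(np.float32)
        self.queues[action] = max(0.0, self.queues[action] - 5.0)
        self.queues = np.clip(self.queues, 0.0, 50.0)
        self.t += 1
        reward = -float(self.queues.sum())    # fewer waiting vehicles is better
        return self.queues.copy(), reward, False, self.t >= 200, {}

env = ToySignalEnv()
model = PPO("MlpPolicy", env, verbose=0)
model.learn(total_timesteps=10_000)           # training
model.save("toy_signal_ppo")                  # hypothetical output path

# Evaluation: roll out the trained policy and report the mean episode return.
model = PPO.load("toy_signal_ppo")
returns = []
for _ in range(5):
    obs, _ = env.reset()
    done, ep_ret = False, 0.0
    while not done:
        action, _ = model.predict(obs, deterministic=True)
        obs, reward, terminated, truncated, _ = env.step(action)
        ep_ret += reward
        done = terminated or truncated
    returns.append(ep_ret)
print(f"mean return over 5 episodes: {np.mean(returns):.1f}")
```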

Pure LLM

To use the LLM directly for inference, without invoking any tools, run the following script:

python llm.py --env_name '3way' --phase_num 3 --detector_break 'E0--s'
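Conceptually, a tool-free LLM decision is a single prompt/response round trip. The sketch below illustrates that idea with the OpenAI chat API; the model name, prompt wording, and output parsing are assumptions, not necessarily what `llm.py` actually does:

```python
# Hedged sketch of a tool-free LLM decision; llm.py's actual prompt,
# model, and output format may differ.
import re
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

def choose_phase_without_tools(state_description: str, phase_num: int) -> tuple[int, str]:
    """Ask the LLM to pick a signal phase from a plain-text traffic summary."""
    system = (
        "You are a traffic signal control assistant. "
        f"The intersection has {phase_num} phases, indexed 0..{phase_num - 1}. "
        "Answer as 'PHASE: <index>' followed by a one-sentence explanation."
    )
    response = client.chat.completions.create(
        model="gpt-4o",
        messages=[
            {"role": "system", "content": system},
            {"role": "user", "content": state_description},
        ],
    )
    text = response.choices[0].message.content
    match = re.search(r"PHASE:\s*(\d+)", text)
    phase = int(match.group(1)) if match else 0   # fall back to phase 0 if parsing fails
    return phase, text

# Example: a 3-way intersection with a broken E0--s detector, mirroring the
# --phase_num 3 and --detector_break 'E0--s' flags above.
state = ("3-way intersection. Queues: E0 in=7, E1 in=2, E2 in=4 vehicles. "
         "The detector on E0--s is broken, so its reading is unreliable.")
print(choose_phase_without_tools(state, phase_num=3))
```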

Decision Making with LLM + RL

To test LA-Light, run the following script. In this example, congestion is randomly generated on edge E1 and the detector in the E2--s direction is set to fail.

python llm_rl.py --env_name '4way' --phase_num 4 --edge_block 'E1' --detector_break 'E2--s'

The effect of running the above test is shown in the following video. Each decision made by LA-Light involves multiple tool invocations, with intermediate reasoning driven by the tools' returned results and culminating in a final decision and explanation; a sketch of this loop is included after the video note below.

LLM_for_TSC_README.webm

Due to the video length limit, we only captured part of the first decision-making process.
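As a rough illustration of that tool-invocation loop (the tool names, text protocol, and stopping logic below are assumptions, not the actual code in `llm_rl.py`), the LLM can repeatedly either request a named tool or commit to a final decision:

```python
# Hedged sketch of the tool-invocation loop; the real llm_rl.py tools,
# prompts, and stopping logic may differ.
from openai import OpenAI

client = OpenAI()

# Hypothetical stand-ins for the framework's perception / decision tools.
TOOLS = {
    "get_queue_lengths": lambda: "queues: E0=3, E1=18 (congested), E2=unknown (detector broken), E3=5",
    "get_rl_suggestion": lambda: "the RL policy suggests phase 1",
    "get_emergency_vehicles": lambda: "no emergency vehicle detected",
}

SYSTEM = (
    "You control a 4-phase traffic signal. You may gather information before deciding.\n"
    "Reply with exactly one line per turn, either:\n"
    "  TOOL: <name>        (one of: " + ", ".join(TOOLS) + ")\n"
    "  FINAL: <phase index>: <one-sentence explanation>"
)

def run_decision_loop(task: str, max_steps: int = 6) -> str:
    """Let the LLM call tools until it commits to a FINAL decision."""
    messages = [{"role": "system", "content": SYSTEM},
                {"role": "user", "content": task}]
    for _ in range(max_steps):
        reply = client.chat.completions.create(model="gpt-4o", messages=messages)
        content = reply.choices[0].message.content.strip()
        messages.append({"role": "assistant", "content": content})
        if content.startswith("FINAL:"):
            return content                        # final phase choice plus explanation
        tool_name = content.removeprefix("TOOL:").strip()
        result = TOOLS.get(tool_name, lambda: "unknown tool")()
        messages.append({"role": "user", "content": f"RESULT of {tool_name}: {result}"})
    return "FINAL: 0: fallback decision, step limit reached"

print(run_decision_loop("Choose the next signal phase for this intersection."))
```

Each round trip appends the assistant's tool request and the tool's result to the conversation, so the final decision is conditioned on everything the tools reported.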

Citation

If you find this work useful, please cite our paper:

@article{wang2024llm,
  title={LLM-Assisted Light: Leveraging Large Language Model Capabilities for Human-Mimetic Traffic Signal Control in Complex Urban Environments},
  author={Wang, Maonan and Pang, Aoyu and Kan, Yuheng and Pun, Man-On and Chen, Chung Shue and Huang, Bo},
  journal={arXiv preprint arXiv:2403.08337},
  year={2024}
}

Acknowledgments

We would like to thank the authors and developers of the great open-source projects upon which this work is built.

Contact