LibCity / Bigscity-LibCity

LibCity: An Open Library for Urban Spatial-temporal Data Mining
https://libcity.ai/
Apache License 2.0

How to visualize the output? #308

Closed · marunava21 closed this issue 2 years ago

marunava21 commented 2 years ago

I have used run_model and then test_model. Now I want to visualize the output, or at least see what the predictions look like. How do I do that?

!python test_model.py --task="traffic_state_pred" --model="AutoEncoder" --dataset="METR_LA"

2022-09-02 04:47:19,613 - INFO - Log directory: ./libcity/log
2022-09-02 04:47:19,613 - INFO - {'task': 'traffic_state_pred', 'model': 'RNN', 'dataset': 'METR_LA', 'saved_model': True, 'train': True, 'batch_size': 2, 'dataset_class': 'TrafficStatePointDataset', 'executor': 'TrafficStateExecutor', 'evaluator': 'TrafficStateEvaluator', 'rnn_type': 'RNN', 'hidden_size': 64, 'num_layers': 1, 'dropout': 0, 'bidirectional': False, 'teacher_forcing_ratio': 0, 'scaler': 'standard', 'load_external': True, 'normal_external': False, 'ext_scaler': 'none', 'add_time_in_day': True, 'add_day_in_week': False, 'max_epoch': 100, 'learner': 'adam', 'learning_rate': 0.01, 'lr_decay': True, 'lr_scheduler': 'multisteplr', 'lr_decay_ratio': 0.1, 'steps': [5, 20, 40, 70], 'clip_grad_norm': True, 'max_grad_norm': 5, 'use_early_stop': True, 'patience': 50, 'cache_dataset': True, 'num_workers': 0, 'pad_with_last_sample': True, 'train_rate': 0.7, 'eval_rate': 0.1, 'input_window': 12, 'output_window': 12, 'gpu': True, 'gpu_id': 0, 'train_loss': 'none', 'epoch': 0, 'weight_decay': 0, 'lr_epsilon': 1e-08, 'lr_beta1': 0.9, 'lr_beta2': 0.999, 'lr_alpha': 0.99, 'lr_momentum': 0, 'step_size': 10, 'lr_T_max': 30, 'lr_eta_min': 0, 'lr_patience': 10, 'lr_threshold': 0.0001, 'log_level': 'INFO', 'log_every': 1, 'load_best_epoch': True, 'hyper_tune': False, 'metrics': ['MAE', 'MAPE', 'MSE', 'RMSE', 'masked_MAE', 'masked_MAPE', 'masked_MSE', 'masked_RMSE', 'R2', 'EVAR'], 'evaluator_mode': 'single', 'save_mode': ['csv'], 'geo': {'including_types': ['Point'], 'Point': {}}, 'rel': {'including_types': ['geo'], 'geo': {'cost': 'num'}}, 'dyna': {'including_types': ['state'], 'state': {'entity_id': 'geo_id', 'traffic_speed': 'num'}}, 'data_col': ['traffic_speed'], 'weight_col': 'cost', 'data_files': ['METR_LA'], 'geo_file': 'METR_LA', 'rel_file': 'METR_LA', 'output_dim': 1, 'time_intervals': 300, 'init_weight_inf_or_zero': 'inf', 'set_weight_link_or_dist': 'dist', 'calculate_weight_adj': True, 'weight_adj_epsilon': 0.1, 'device': device(type='cuda', index=0), 'exp_id': 41266}
2022-09-02 04:47:20,120 - INFO - Loaded file METR_LA.geo, num_nodes=207
2022-09-02 04:47:20,130 - INFO - set_weight_link_or_dist: dist
2022-09-02 04:47:20,131 - INFO - init_weight_inf_or_zero: inf
2022-09-02 04:47:20,160 - INFO - Loaded file METR_LA.rel, shape=(207, 207)
2022-09-02 04:47:20,160 - INFO - Start Calculate the weight by Gauss kernel!
2022-09-02 04:47:20,161 - INFO - Loading file METR_LA.dyna
2022-09-02 04:47:25,775 - INFO - Loaded file METR_LA.dyna, shape=(34272, 207, 1)
tcmalloc: large alloc 1361199104 bytes == 0xb406c000 @ 0x7f1d833a21e7 0x7f1d2f40c46e 0x7f1d2f45cc2b 0x7f1d2f45ccc8 0x7f1d2f503e70 0x7f1d2f50459c 0x7f1d2f5046bd 0x4bc4ab 0x7f1d2f449ef7 0x59371f 0x515244 0x549576 0x593fce 0x548ae9 0x5127f1 0x549e0e 0x4bca8a 0x7f1d2f449ef7 0x59371f 0x515244 0x549576 0x593fce 0x548ae9 0x5127f1 0x593dd7 0x511e2c 0x593dd7 0x511e2c 0x593dd7 0x511e2c 0x593dd7
tcmalloc: large alloc 1361199104 bytes == 0x105b20000 @ 0x7f1d833a21e7 0x7f1d2f40c46e 0x7f1d2f45cc2b 0x7f1d2f45ccc8 0x7f1d2f503e70 0x7f1d2f50459c 0x7f1d2f5046bd 0x4bc4ab 0x7f1d2f449ef7 0x59371f 0x515244 0x549576 0x593fce 0x548ae9 0x5127f1 0x549e0e 0x4bca8a 0x7f1d2f449ef7 0x59371f 0x515244 0x549576 0x593fce 0x548ae9 0x5127f1 0x593dd7 0x511e2c 0x593dd7 0x511e2c 0x593dd7 0x511e2c 0x593dd7
tcmalloc: large alloc 1361199104 bytes == 0x156d44000 @ 0x7f1d833a21e7 0x7f1d2f40c46e 0x7f1d2f45cc2b 0x7f1d2f45ccc8 0x7f1d2f503e70 0x7f1d2f50459c 0x7f1d2f5046bd 0x4bc4ab 0x7f1d2f449ef7 0x59371f 0x515244 0x549576 0x593fce 0x548ae9 0x51566f 0x593dd7 0x511e2c 0x593dd7 0x511e2c 0x593dd7 0x511e2c 0x549576 0x604173 0x5f5506 0x5f8c6c 0x5f9206 0x64faf2 0x64fc4e 0x7f1d82f9fc87 0x5b621a
2022-09-02 04:47:32,297 - INFO - Dataset created
2022-09-02 04:47:32,297 - INFO - x shape: (34249, 12, 207, 2), y shape: (34249, 12, 207, 2)
2022-09-02 04:47:32,303 - INFO - train x: (23974, 12, 207, 2), y: (23974, 12, 207, 2)
2022-09-02 04:47:32,303 - INFO - eval x: (3425, 12, 207, 2), y: (3425, 12, 207, 2)
2022-09-02 04:47:32,303 - INFO - test x: (6850, 12, 207, 2), y: (6850, 12, 207, 2)
2022-09-02 04:48:57,401 - INFO - Saved at ./libcity/cache/dataset_cache/point_based_METR_LA_12_12_0.7_0.1_standard_2_True_True_False_True.npz
2022-09-02 04:48:57,800 - INFO - StandardScaler mean: 54.40592829587626, std: 19.493739270573098
2022-09-02 04:48:57,800 - INFO - NoneScaler
tcmalloc: large alloc 1905647616 bytes == 0x7f1c824e8000 @ 0x7f1d833a21e7 0x7f1d2f40c46e 0x7f1d2f45cc2b 0x7f1d2f45ff73 0x7f1d2f5044f4 0x7f1d2f5046bd 0x4bc4ab 0x7f1d2f449ef7 0x59371f 0x515244 0x549576 0x593fce 0x548ae9 0x5127f1 0x549e0e 0x593fce 0x548ae9 0x5127f1 0x593dd7 0x511e2c 0x549576 0x604173 0x5f5506 0x5f8c6c 0x5f9206 0x64faf2 0x64fc4e 0x7f1d82f9fc87 0x5b621a
tcmalloc: large alloc 1905647616 bytes == 0x7f1c10b8a000 @ 0x7f1d833a21e7 0x7f1d2f40c46e 0x7f1d2f45cc2b 0x7f1d2f45ccc8 0x7f1d2f503e70 0x7f1d2f50459c 0x7f1d2f5046bd 0x4bc4ab 0x7f1d2f449ef7 0x59371f 0x515244 0x549576 0x593fce 0x548ae9 0x5127f1 0x549e0e 0x593fce 0x548ae9 0x5127f1 0x593dd7 0x511e2c 0x549576 0x604173 0x5f5506 0x5f8c6c 0x5f9206 0x64faf2 0x64fc4e 0x7f1d82f9fc87 0x5b621a
Using backend: pytorch
2022-09-02 04:49:03,891 - INFO - You select rnn_type RNN in RNN!
2022-09-02 04:49:07,711 - INFO - Generating grammar tables from /usr/lib/python3.7/lib2to3/Grammar.txt
2022-09-02 04:49:07,730 - INFO - Generating grammar tables from /usr/lib/python3.7/lib2to3/PatternGrammar.txt
2022-09-02 04:49:09,810 - INFO - RNN( (rnn): RNN(414, 64) (fc): Linear(in_features=64, out_features=207, bias=True) )
2022-09-02 04:49:09,810 - INFO - rnn.weight_ih_l0 torch.Size([64, 414]) cuda:0 True
2022-09-02 04:49:09,811 - INFO - rnn.weight_hh_l0 torch.Size([64, 64]) cuda:0 True
2022-09-02 04:49:09,811 - INFO - rnn.bias_ih_l0 torch.Size([64]) cuda:0 True
2022-09-02 04:49:09,811 - INFO - rnn.bias_hh_l0 torch.Size([64]) cuda:0 True
2022-09-02 04:49:09,811 - INFO - fc.weight torch.Size([207, 64]) cuda:0 True
2022-09-02 04:49:09,811 - INFO - fc.bias torch.Size([207]) cuda:0 True
2022-09-02 04:49:09,811 - INFO - Total parameter numbers: 44175
2022-09-02 04:49:09,811 - INFO - You select adam optimizer.
2022-09-02 04:49:09,811 - INFO - You select multisteplr lr_scheduler.
2022-09-02 04:49:09,812 - WARNING - Received none train loss func and will use the loss func defined in the model.
2022-09-02 04:49:09,831 - INFO - Result shape is torch.Size([2, 12, 207, 1])
2022-09-02 04:49:09,831 - INFO - Success test the model!

marunava21 commented 2 years ago

The applications are not very clear in the documentation. Could you please guide me on what to do next? If I understand it, I can contribute to this project.

aptx1231 commented 2 years ago

Data visualization is done with the script visualize.py. Run the model with the script run_model.py; the model output is saved in the cache/evaluate_cache/ directory as an npz file, which can be loaded and visualized in Python. The output can also be visualized with our web tool. test_model.py only feeds a single batch through the model to check that it can run.
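For reference, a minimal sketch of inspecting that npz file in Python. The file name shown here is hypothetical, and the 'prediction'/'truth' keys and their layout are assumptions based on a typical traffic-state run; check print(outputs.files) and your run log for the actual names and path.

import numpy as np
import matplotlib.pyplot as plt

# Hypothetical file name -- use the actual .npz path written under
# ./libcity/cache/.../evaluate_cache/ as reported in your run log.
npz_path = "./libcity/cache/evaluate_cache/RNN_METR_LA_predictions.npz"

outputs = np.load(npz_path)
print(outputs.files)  # inspect the stored arrays; 'prediction' and 'truth' are assumed below

pred = outputs["prediction"]   # assumed shape: (num_samples, output_window, num_nodes, output_dim)
truth = outputs["truth"]

# Compare prediction and ground truth for one sensor over roughly one day
# (METR_LA uses 5-minute intervals, so 288 steps is about 24 hours).
node, horizon = 0, 0
plt.plot(truth[:288, horizon, node, 0], label="ground truth")
plt.plot(pred[:288, horizon, node, 0], label="prediction")
plt.xlabel("time step (5-min intervals)")
plt.ylabel("traffic speed")
plt.title(f"Sensor {node}, horizon step {horizon + 1}")
plt.legend()
plt.show()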

marunava21 commented 2 years ago

So is the web tool GeoJSON.io? And how can I apply the models for real-time predictions using camera data? Is that possible, or does it only take satellite-type data?

aptx1231 commented 2 years ago

It can be visualized using geojson.io; simply convert the predicted npz file into GeoJSON format. Camera data can also be used, as long as you convert it to the data format we specify.
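As an illustration, a minimal sketch of such a conversion. It assumes the predictions npz has a 'prediction' array as above, that METR_LA.geo lives under ./raw_data/METR_LA/ as a CSV with geo_id and coordinates columns holding "[lon, lat]" strings, and that the .geo rows are in node order; verify all of this against your local files before relying on it.

import ast
import json

import numpy as np
import pandas as pd

# Assumed paths and file layouts -- adjust to your local copies.
geo = pd.read_csv("./raw_data/METR_LA/METR_LA.geo")   # expected columns: geo_id, type, coordinates, ...
outputs = np.load("./libcity/cache/evaluate_cache/RNN_METR_LA_predictions.npz")
pred = outputs["prediction"]                          # (num_samples, output_window, num_nodes, output_dim)

# Take one prediction slice: first test sample, first horizon step, all nodes.
speeds = pred[0, 0, :, 0]

features = []
for i, row in geo.iterrows():
    # coordinates are assumed to be stored as a "[lon, lat]" string per sensor
    lon, lat = ast.literal_eval(row["coordinates"])
    features.append({
        "type": "Feature",
        "geometry": {"type": "Point", "coordinates": [lon, lat]},
        "properties": {"geo_id": int(row["geo_id"]), "predicted_speed": float(speeds[i])},
    })

with open("prediction.geojson", "w") as f:
    json.dump({"type": "FeatureCollection", "features": features}, f)

# Drag prediction.geojson onto https://geojson.io to inspect the predicted speed per sensor.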