[Closed] zyaocoder closed this issue 2 years ago
Sure, you can change the aggregator from PGP to Global Attention. You'll need to modify the training config file as follows:
Replace the aggregator parameters with:
# Aggregator parameters
aggregator_type: 'global_attention'
aggregator_args:
target_agent_enc_size: 32
context_enc_size: 32
emb_size: 128
num_heads: 32
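For intuition, a global-attention aggregator with these sizes can be sketched roughly as follows. This is a minimal NumPy illustration under my own assumptions (function name, projection matrices, and shapes are mine, not the repo's code): the target-agent encoding forms the query, the context encodings form keys and values, and multi-head scaled dot-product attention pools the context into a single vector.

```python
import numpy as np

def global_attention_aggregate(target_enc, context_enc, w_q, w_k, w_v, num_heads=32):
    """Pool context encodings via multi-head attention (illustrative sketch).

    target_enc:  (target_agent_enc_size,)   query source
    context_enc: (num_ctx, context_enc_size) keys/values
    w_q, w_k, w_v: projection matrices into an emb_size-dim space
    """
    emb_size = w_q.shape[1]
    head_dim = emb_size // num_heads

    q = (target_enc @ w_q).reshape(num_heads, head_dim)        # (H, d)
    k = (context_enc @ w_k).reshape(-1, num_heads, head_dim)   # (N, H, d)
    v = (context_enc @ w_v).reshape(-1, num_heads, head_dim)   # (N, H, d)

    # Scaled dot-product attention per head, softmax over the context axis
    scores = np.einsum('hd,nhd->hn', q, k) / np.sqrt(head_dim)  # (H, N)
    weights = np.exp(scores - scores.max(axis=1, keepdims=True))
    weights /= weights.sum(axis=1, keepdims=True)
    pooled = np.einsum('hn,nhd->hd', weights, v).reshape(emb_size)

    # Concatenate the target encoding with the attended context
    return np.concatenate([target_enc, pooled])
```

Note that with target_agent_enc_size: 32 and emb_size: 128, the concatenated output is 160-dimensional, which appears consistent with the encoding_size: 160 used in the decoder parameters below.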
Remove the behavior cloning loss from the config, leaving only the min-ADE loss:
losses: ['min_ade_k']
loss_weights: [1]
loss_args:
- k: 10
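The remaining 'min_ade_k' term penalizes only the best of the k predicted trajectories, i.e. the minimum average displacement error over the k modes. A minimal NumPy sketch of that metric (the function name and array shapes are my own, not the repo's API):

```python
import numpy as np

def min_ade_k(predictions, ground_truth):
    """Minimum average displacement error over k predicted trajectories.

    predictions:  (k, horizon, 2) predicted x/y trajectories
    ground_truth: (horizon, 2) ground-truth future trajectory
    """
    # Average L2 displacement per trajectory, then keep the best mode
    errs = np.linalg.norm(predictions - ground_truth[None], axis=-1).mean(axis=-1)
    return errs.min()
```

Because only the best mode is penalized, the other modes are free to cover alternative futures, which is why this loss is typically paired with sampling-based decoders.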
Change the aggregation type to 'combined' in the decoder parameters:
decoder_type: 'lvm'
decoder_args:
num_samples: 1000
op_len: 12
hidden_size: 128
encoding_size: 160
agg_type: 'combined'
lv_dim: 5
num_clusters: 10
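With decoder_type: 'lvm', the decoder samples a latent variable per trajectory, concatenates it with the aggregated encoding, decodes num_samples candidate trajectories, and clusters them down to num_clusters outputs. A very rough sketch of that sampling-and-clustering flow (all function names, shapes, and the naive k-means here are my own assumptions, not the repo's implementation):

```python
import numpy as np

def lvm_decode(encoding, decode_fn, num_samples=1000, num_clusters=10,
               lv_dim=5, iters=10, seed=0):
    """Sample latent variables, decode trajectories, reduce via k-means.

    encoding:  (encoding_size,) aggregated scene/agent encoding
    decode_fn: maps (encoding_size + lv_dim,) -> (op_len, 2) trajectory
    Returns (num_clusters, op_len, 2) representative trajectories.
    """
    rng = np.random.default_rng(seed)
    z = rng.standard_normal((num_samples, lv_dim))  # latent samples
    trajs = np.stack([decode_fn(np.concatenate([encoding, zi])) for zi in z])

    # Naive k-means on flattened trajectories to pick num_clusters modes
    flat = trajs.reshape(num_samples, -1)
    centers = flat[rng.choice(num_samples, num_clusters, replace=False)]
    for _ in range(iters):
        d = np.linalg.norm(flat[:, None] - centers[None], axis=-1)  # (N, C)
        assign = d.argmin(axis=1)
        for c in range(num_clusters):
            members = flat[assign == c]
            if len(members):
                centers[c] = members.mean(axis=0)
    return centers.reshape(num_clusters, *trajs.shape[1:])
```

The numbers in the config above map directly onto this sketch: lv_dim: 5 latent dimensions, num_samples: 1000 decoded candidates, clustered into num_clusters: 10 final trajectories of length op_len: 12.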
Thanks for sharing the great work!
I was wondering how to try the latent-variable-only ("Latent var (LV)") setting from Table 3 (decoder ablations) of the PGP paper. Is there a configuration for it?
Thanks in advance, Zhen