Hi, for the KILT task, the README says that "Train/validation/test data for this task should consist of jsonl files, which should be passed to train.py as --train_data train_file_1.jsonl train_file_2.jsonl, and --eval_data eval_file_1.jsonl eval_file_2.jsonl etc." and that "Atlas will automatically process these instances appropriately, into Atlas query inputs based on the input field and target generations based on the answer fields". However, I couldn't find any dedicated KILT data preprocessing code in the project, and passing several jsonl files to train.py this way is not straightforward enough to follow.
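For concreteness, this is what I currently assume a single training line looks like, based only on the README's mention of the `input` and `answer` fields; the exact field names and nesting below are my guess, so please correct me if the expected KILT schema is different:

```bash
# My guess at the expected jsonl layout (one JSON object per line).
# The field names "input" and "answers" are assumptions on my part.
cat > kilt_sample-train.jsonl <<'EOF'
{"input": "which country is the Eiffel Tower located in?", "answers": ["France"]}
{"input": "who wrote the novel Nineteen Eighty-Four?", "answers": ["George Orwell"]}
EOF
```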
Also, as you mentioned in https://github.com/facebookresearch/atlas/issues/13, checkpoints for Atlas fine-tuned on KILT won't be provided.
Could you please provide a complete example training script for fine-tuning Atlas on KILT? Thanks in advance.
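For reference, this is roughly the invocation I have pieced together so far. Apart from `--train_data` and `--eval_data`, which the README mentions explicitly, the remaining flags are guesses on my part and every path/filename is a placeholder:

```bash
# Sketch of my current attempt. Only --train_data and --eval_data come from the
# README; --model_path and --checkpoint_dir are guesses, and all paths are
# placeholders.
python train.py \
    --train_data kilt_task-train_1.jsonl kilt_task-train_2.jsonl \
    --eval_data kilt_task-dev_1.jsonl kilt_task-dev_2.jsonl \
    --model_path ./models/atlas/base \
    --checkpoint_dir ./checkpoints/atlas-kilt
```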