huggingface / transformers

🤗 Transformers: State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX.
https://huggingface.co/transformers
Apache License 2.0

Transformer models for semantic parsing #9420

Closed. ayushjain1144 closed this issue 3 years ago

ayushjain1144 commented 3 years ago

Hi! Thank you for your awesome work!

I want to perform semantic parsing. Unfortunately, I couldn't find any examples for this task in the Hugging Face repo. Could you please let me know how I should proceed? I suppose I could use a Seq2Seq encoder-decoder model like BERT2BERT and fine-tune it for semantic parsing, or do you think there is a better way? For more context, I have natural-language grounding descriptions and I want to generate a logical parse tree from each one. In the literature there are a few tree-transformer-based techniques and a Seq2Tree technique, which I think Hugging Face does not support yet (or does it?).
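
To make the BERT2BERT idea concrete, here is a minimal sketch of what I have in mind: treat the logical form as a linearized target string and fine-tune an `EncoderDecoderModel` on (description, parse) pairs. The checkpoint names and the toy description/parse pair below are placeholders, not from a real dataset.

```python
# Minimal BERT2BERT sketch: fine-tune an encoder-decoder on (description, linearized parse) pairs.
# The checkpoints and the toy example are placeholders; a real setup needs a dataset, a training
# loop (or Trainer/Seq2SeqTrainer), and a tokenizer whose vocabulary covers the parse symbols.
from transformers import BertTokenizer, EncoderDecoderModel

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = EncoderDecoderModel.from_encoder_decoder_pretrained(
    "bert-base-uncased", "bert-base-uncased"
)

# Special-token config the encoder-decoder needs for training and generation.
model.config.decoder_start_token_id = tokenizer.cls_token_id
model.config.pad_token_id = tokenizer.pad_token_id
model.config.eos_token_id = tokenizer.sep_token_id

# Hypothetical pair: natural-language description -> linearized logical form.
description = "the small block to the left of the red cube"
logical_form = "( left_of ( block small ) ( cube red ) )"

inputs = tokenizer(description, return_tensors="pt")
labels = tokenizer(logical_form, return_tensors="pt").input_ids

# One forward pass with teacher forcing; the returned loss is what a training loop would minimize.
outputs = model(
    input_ids=inputs.input_ids,
    attention_mask=inputs.attention_mask,
    labels=labels,
)
print(outputs.loss)
```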

Thanks :)

patil-suraj commented 3 years ago

Hi @ayushjain1144

That's an interesting question; it would be better to ask it on the forum.