Good afternoon,
I was taking a look at https://github.com/ggerganov/ggml/tree/master/examples/gpt-2. There are several ways to obtain ggml-compliant GPT-2 weights. If I had a GPT-2 model trained in PyTorch via the Transformers library, how would I convert it to ggml?
It seems the Cerebras example may already start from a PyTorch model (though I'm not sure). Could anyone confirm whether I could use a similar approach, or is it more complicated than that?
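For context, my rough understanding from skimming the existing convert scripts is that a ggml model file is essentially a magic number, a few integer hyperparameters, and then a flat sequence of named tensors. Here's a minimal sketch of what I imagine the conversion step would look like; the exact header fields, dimension ordering, and ftype codes are my assumptions and would need to be checked against the actual script:

```python
import struct
import numpy as np

def write_ggml(path, hparams, tensors):
    """Write tensors in a ggml-style binary layout (my assumption of the
    format: magic, int32 hparams, then per-tensor header + fp32 data)."""
    with open(path, "wb") as f:
        f.write(struct.pack("i", 0x67676D6C))  # magic bytes spelling "ggml"
        for v in hparams:                      # e.g. n_vocab, n_ctx, n_embd, ...
            f.write(struct.pack("i", v))
        for name, data in tensors.items():
            data = np.ascontiguousarray(data, dtype=np.float32)
            encoded = name.encode("utf-8")
            # per-tensor header: n_dims, name length, ftype (0 = fp32 here)
            f.write(struct.pack("iii", data.ndim, len(encoded), 0))
            for dim in reversed(data.shape):   # assuming dims are stored reversed
                f.write(struct.pack("i", dim))
            f.write(encoded)
            f.write(data.tobytes())

# toy stand-in for a real GPT-2 state dict
write_ggml(
    "model.ggml",
    hparams=[50257, 1024, 768, 12, 12],
    tensors={"wte.weight": np.zeros((4, 3))},
)
```

In practice I'd iterate over `model.state_dict()` from the Transformers GPT-2 and map the Hugging Face parameter names onto whatever names the ggml example's loader expects, but I don't know if that mapping is one-to-one.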