Not an issue, but since this repo doesn't have a Discussions section, I wanted to ask whether something like this would be achievable. Both CodeT5 and CodeT5+ were trained with a context length of 512 tokens. Having something closer to XGen's context length would be amazing for working with large codebases and for building more complex coding agents to aid in software development.
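For context, the 512-token ceiling is easy to hit on real source files. Here is a minimal sketch of how longer inputs get cut off today, assuming the Hugging Face `Salesforce/codet5p-220m` checkpoint (any CodeT5/CodeT5+ checkpoint shares the same limit):

```python
from transformers import AutoTokenizer

# Assumption: Salesforce/codet5p-220m is used purely as an example checkpoint.
tokenizer = AutoTokenizer.from_pretrained("Salesforce/codet5p-220m")

# A deliberately long input, e.g. a whole module instead of a single function.
source = "def add(a, b):\n    return a + b\n" * 200

ids = tokenizer(source, truncation=True, max_length=512).input_ids
print(len(ids))  # capped at 512 -- everything past that is silently dropped
```

Anything beyond the first 512 tokens never reaches the model, which is what makes repository-level tasks and long-horizon agent loops impractical with the current checkpoints.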