Closed likhith00 closed 2 years ago
Hi, it is possible that fine-tuning CodeT5 on a new programming language (PL) could achieve reasonable results, since different PLs often share common patterns, which allows transfer learning to the new PL. For example, we found that fine-tuning CodeT5 on Apex yields good results (as shown in the GIF animation), as Apex is very similar to Java.
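For reference, here is a minimal sketch of what such a fine-tune could look like with HuggingFace Transformers, using the published `Salesforce/codet5-base` checkpoint. The Apex example record, the `summarize:` prefix, and the `build_training_pairs` helper are illustrative assumptions, not part of the official CodeT5 recipe; substitute your own parallel data and task formatting.

```python
# Hedged sketch: fine-tuning CodeT5 on a new PL (e.g. Apex) for
# code summarization. Checkpoint name is real; the data and the
# "summarize:" task prefix are hypothetical placeholders.

def build_training_pairs(records):
    """Turn raw (code, description) records into seq2seq
    (source, target) pairs. Purely illustrative formatting."""
    return [("summarize: " + code, desc) for code, desc in records]

def main():
    # Heavy dependencies imported lazily, so the helper above
    # stays stdlib-only.
    import torch
    from transformers import RobertaTokenizer, T5ForConditionalGeneration

    tokenizer = RobertaTokenizer.from_pretrained("Salesforce/codet5-base")
    model = T5ForConditionalGeneration.from_pretrained("Salesforce/codet5-base")
    optimizer = torch.optim.AdamW(model.parameters(), lr=5e-5)

    # Hypothetical Apex example; replace with your own dataset.
    records = [("public class Foo { }", "An empty Apex class.")]
    pairs = build_training_pairs(records)

    model.train()
    for source, target in pairs:
        inputs = tokenizer(source, return_tensors="pt",
                           truncation=True, max_length=512)
        labels = tokenizer(target, return_tensors="pt",
                           truncation=True, max_length=128).input_ids
        # T5-style models compute the seq2seq loss when `labels` is given.
        loss = model(input_ids=inputs.input_ids,
                     attention_mask=inputs.attention_mask,
                     labels=labels).loss
        loss.backward()
        optimizer.step()
        optimizer.zero_grad()

if __name__ == "__main__":
    main()
```

In practice you would batch the data with a `DataLoader`, run multiple epochs, and evaluate on held-out examples, but the core loop above is all that transfer to a new PL requires: no architectural changes, just new (source, target) pairs.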
If it is not possible, is pre-training CodeT5 on a dataset of that new programming language the only option?