State of the Art Synthetic Source Code: Polycoder
1 Mar 2022, 10:28 a.m.
Via Twitter, Feb 2022 (from Nov 2021): https://github.com/VHellendoorn/Code-LMs https://arxiv.org/abs/2202.13169 — "We release a new model, PolyCoder, with 2.7B parameters based on the GPT-2 architecture, which was trained on 249GB of code across 12 programming languages on a single machine. In the C programming language, PolyCoder outperforms all models including Codex."
Participants (1): Undiscussed Horrific Abuse, One Victim of Many