State of the Art Synthetic Source Code: Polycoder
Undiscussed Horrific Abuse, One Victim of Many
gmkarl at gmail.com
Tue Mar 1 02:28:56 PST 2022
via twitter, Feb 2022 from Nov 2021
https://github.com/VHellendoorn/Code-LMs
https://arxiv.org/abs/2202.13169
From the abstract:

"We release a new model, PolyCoder, with 2.7B parameters based on the GPT-2
architecture, which was trained on 249GB of code across 12 programming
languages on a single machine. In the C programming language, PolyCoder
outperforms all models including Codex."
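The paper's model comparison is reported in terms of perplexity on held-out code (lower is better). As a minimal sketch of what that metric means, here is how perplexity is computed from per-token log-probabilities; the numbers below are hypothetical, not from the paper:

```python
import math

def perplexity(token_logprobs):
    """Perplexity = exp of the negative mean per-token log-probability."""
    return math.exp(-sum(token_logprobs) / len(token_logprobs))

# Hypothetical natural-log probabilities a model assigned to four tokens.
example_logprobs = [-1.2, -0.3, -2.0, -0.5]
print(perplexity(example_logprobs))  # exp(1.0), roughly 2.718
```

A model that assigns higher probability to the true next tokens gets a lower perplexity, which is the sense in which one code model "outperforms" another on a language's test set.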
More information about the cypherpunks mailing list