On 12/29/21, Punk-BatSoup-Stasi 2.0 <punks@tfwno.gf> wrote:
> On Wed, 29 Dec 2021 17:44:57 -0500 k <gmkarl@gmail.com> wrote:
>
>> i think this example notebook shows training a transformer model on the free tpus https://colab.research.google.com
>
> again, fuck you karl and your fucking JOOGLE SPAM. Take it elsewhere.
see + marked lines below

commit 2752300c472e56598ccce9b3887588779c9fd22c (HEAD -> main, origin/main)
Author: xloem <0xloem@gmail.com>
Date:   Wed Dec 29 19:30:10 2021 -0500

    Update README.md

diff --git a/README.md b/README.md
index b000b37..3d13e2b 100644
--- a/README.md
+++ b/README.md
@@ -8,6 +8,8 @@ I have trouble doing things, and don't really know how to do that, so just a few
 I recommend using jax/flax and google cloud, because google has a TPU research program free trail w/ application that could be leveraged for compute once a setup is designed.
 
+I do NOT normally recommend using google cloud, because your muscle contraction timing will be harvested by javascript to guide your behavior for some of the world's largest marketing computers.
+
 Google's systems are accessible on the web to the public for use at https://colab.research.google.com/
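
For anyone following along, a rough sketch of what a minimal jax/flax training step looks like on one of those free TPU runtimes. This is illustrative only, not the notebook's or repo's actual code: it assumes jax[tpu], flax, and optax are installed, and the tiny transformer block, shapes, and dummy loss are placeholders.

    # minimal sketch, assuming a Colab/Cloud TPU runtime with jax[tpu], flax, optax installed
    import jax
    import jax.numpy as jnp
    import flax.linen as nn
    import optax

    print(jax.devices())  # on a TPU runtime this lists the TPU devices

    class TinyTransformerBlock(nn.Module):
        dim: int = 64    # feature width (placeholder value)
        heads: int = 4   # attention heads (placeholder value)

        @nn.compact
        def __call__(self, x):
            # self-attention + feed-forward, the core of one transformer layer
            attn = nn.SelfAttention(num_heads=self.heads)(x)
            x = nn.LayerNorm()(x + attn)
            ff = nn.Dense(self.dim)(nn.relu(nn.Dense(4 * self.dim)(x)))
            return nn.LayerNorm()(x + ff)

    model = TinyTransformerBlock()
    x = jnp.ones((8, 16, 64))                      # (batch, sequence, features), dummy data
    params = model.init(jax.random.PRNGKey(0), x)
    tx = optax.adam(1e-3)
    opt_state = tx.init(params)

    @jax.jit
    def train_step(params, opt_state, batch):
        def loss_fn(p):
            out = model.apply(p, batch)
            return jnp.mean((out - batch) ** 2)    # dummy reconstruction loss, stand-in for a real objective
        loss, grads = jax.value_and_grad(loss_fn)(params)
        updates, opt_state = tx.update(grads, opt_state)
        params = optax.apply_updates(params, updates)
        return params, opt_state, loss

    params, opt_state, loss = train_step(params, opt_state, x)
    print(loss)

A real setup would swap in a full transformer, a tokenized dataset, and a language-modeling loss, but the init / jit / grad / update loop stays the same shape.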