[ot][spam][crazy] Quickly autotranscribing xkcd 4/1 correctly

Undiscussed Horrific Abuse, One Victim of Many gmkarl at gmail.com
Sat Apr 2 12:56:01 PDT 2022


wow, it's 4pm now. i started before dawn. i think i spent most of the
day struggling with my ability to focus on a small goal, but i could
be wrong.

latest gist:
https://colab.research.google.com/gist/xloem/4310a26b6c9d13adac14307b948157d3/untitled4.ipynb
all i did was expand the generate function.
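
(for reference, expanding generate by hand comes out looking roughly
like the sketch below: an explicit greedy decoding loop against a
huggingface speech2text checkpoint. the checkpoint name and the loop
details are my illustration here, not necessarily what the gist does.)

    import torch
    from transformers import Speech2TextProcessor, Speech2TextForConditionalGeneration

    # illustrative checkpoint; the notebook may use a different one
    name = "facebook/s2t-small-librispeech-asr"
    processor = Speech2TextProcessor.from_pretrained(name)
    model = Speech2TextForConditionalGeneration.from_pretrained(name)

    def greedy_decode(input_features, max_len=200):
        # start from the decoder start token and append the argmax token each step
        ids = torch.tensor([[model.config.decoder_start_token_id]])
        for _ in range(max_len):
            logits = model(input_features=input_features, decoder_input_ids=ids).logits
            next_id = logits[:, -1, :].argmax(dim=-1, keepdim=True)
            ids = torch.cat([ids, next_id], dim=-1)
            if next_id.item() == model.config.eos_token_id:
                break
        return processor.batch_decode(ids, skip_special_tokens=True)[0]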

it should be quite reasonable to train it against something by
comparing the output logits.
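
(something like a standard distillation loss would do the comparing:
push one set of logits toward another. generic sketch, not code from
the notebook:)

    import torch.nn.functional as F

    # generic logit-matching (distillation-style) loss
    def logit_match_loss(student_logits, target_logits, temperature=1.0):
        s = F.log_softmax(student_logits / temperature, dim=-1)
        t = F.softmax(target_logits / temperature, dim=-1)
        # KL(target || student), scaled as in standard distillation
        return F.kl_div(s, t, reduction="batchmean") * temperature ** 2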

the experience of working with this speech2text model was good. i
don't think it would be hard to convert it to streaming, and doing
that would certainly be more learning than nothing.
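
(the naive way i picture streaming it: buffer incoming audio chunks
and re-transcribe the growing window each time. a real incremental
decoder would be nicer; the processor/model below are just whatever
speech2text objects are in scope, this is only a sketch:)

    import numpy as np

    # naive "streaming": accumulate chunks, re-run the model on the whole buffer
    def stream_transcribe(chunks, processor, model, sampling_rate=16000):
        buffer = np.zeros(0, dtype=np.float32)
        for chunk in chunks:  # chunks: iterable of 1-d float32 arrays
            buffer = np.concatenate([buffer, chunk])
            inputs = processor(buffer, sampling_rate=sampling_rate, return_tensors="pt")
            ids = model.generate(inputs["input_features"], attention_mask=inputs["attention_mask"])
            yield processor.batch_decode(ids, skip_special_tokens=True)[0]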

that idea of having something automatically discern its data is
important. it's an interesting space, similar to but distinct from
unsupervised training on limited data.

