[ot][crazy][personal] Micro-moronism Challenge: Iterate & Transform

Undescribed Horrific Abuse, One Victim & Survivor of Many gmkarl at gmail.com
Sat Dec 17 02:28:06 PST 2022


https://github.com/xloem/flat_tree has 3 places where any given
component is used:
- in test.py, which tests an implementation
- in flat_tree/__init__.py, which wraps implementations behind a
normative interface (roughly sketched just below this list)
- in the implementation such as flat_tree/append_tree.py
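to make that structure concrete, here is a rough, hypothetical sketch
of the wrapping pattern; the class and function names below are
assumptions for illustration, not the real flat_tree API:

    # flat_tree/append_tree.py -- one concrete implementation (hypothetical names)
    class AppendTree:
        def __init__(self):
            self.leaves = []          # flat list of appended values
        def append(self, value):
            self.leaves.append(value)
        def __len__(self):
            return len(self.leaves)

    # flat_tree/__init__.py -- the normative interface, delegating to an implementation
    class FlatTree:
        def __init__(self, impl_class=AppendTree):
            self._impl = impl_class()
        def append(self, value):
            self._impl.append(value)
        def __len__(self):
            return len(self._impl)

    # test.py -- exercises any implementation only through the normative interface
    def test_append():
        tree = FlatTree()
        for i in range(4):
            tree.append(i)
        assert len(tree) == 4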

One of the ways I still actually use this code, and find that use
inhibited, is as a dependency of https://github.com/xloem/log , which
uses flat_tree in capture.py, capture_stdin.py, and multicapture.py .

- capture_stdin.py is an almost exact, newer copy of capture.py .
combining these could be a good factoring challenge (one rough way to
factor them is sketched after this list). testing capture.py likely
involves having an android phone to record from.
- multicapture.py is a different approach that likely has a threading bug.
- meanwhile, download.py would use flat_tree if flat_tree had useful
reading features
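as an illustration of the factoring meant above, here is a hypothetical
sketch of pulling the shared read/write loop out of capture.py and
capture_stdin.py; the function names and the idea of passing the data
source in as a callable are assumptions, not the actual code in the log
repository:

    # hypothetical shared core both capture.py and capture_stdin.py could import;
    # names and the write path are assumed, not taken from the repo.
    import sys

    def capture_stream(read_chunk, write_chunk, chunk_size=65536):
        # pump chunks from a source callable into a sink callable until EOF
        while True:
            data = read_chunk(chunk_size)
            if not data:
                break
            write_chunk(data)

    # capture_stdin.py would then reduce to wiring stdin into the shared loop:
    def capture_stdin(write_chunk):
        capture_stream(sys.stdin.buffer.read, write_chunk)

    # capture.py would do the same with its android recording source in place
    # of stdin, so only the device-specific setup remains in that file.

the point is that the only difference left between the two scripts
would be which read_chunk callable they pass in.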

------
regarding machine learning, there is recent activity on the RWKV issue
thread in the https://github.com/huggingface/transformers repository
where people mention a few more implementations of the model. This
model, by an independent Chinese researcher, demonstrates factoring
challenges very similar to the ones I am engaging personally: people
keep reimplementing things rather than reusing components, and work to
integrate things keeps struggling. [the huggingface/transformers
repository as a whole actually has a norm of code duplication; I
believe their goal is for each model to be a standalone example that
people can learn from and practice with; this can seem quite
frustrating if one isn't told in advance]
The attempt to integrate that model also surfaces other unintegrated
models that could be iteration practice others might appreciate.
unintegrated models include s4, s4d, facebook's mega, and hrrformer
(and whatever comes after hrrformer, maybe waveformer?). like rwkv,
these cutting-edge architectures blast through current limits of
machine learning, but aren't in use. implementing them just means
copying from a paper. there are also important old models that aren't
implemented, like the linear transformer, which is similar to rwkv.
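to show why the linear transformer and rwkv get compared, here is a
minimal numpy sketch of the linear-attention recurrence from
Katharopoulos et al. 2020 ("Transformers are RNNs"): the softmax is
replaced by a kernel feature map so causal attention reduces to two
running sums instead of a full attention matrix. this is the linear
transformer's formulation, simplified to one head, not rwkv's exact
update:

    # minimal sketch of causal linear attention, single head,
    # following the elu(x)+1 feature map of Katharopoulos et al. 2020
    import numpy as np

    def elu_plus_one(x):
        return np.where(x > 0, x + 1.0, np.exp(x))

    def linear_attention(Q, K, V):
        T, d = Q.shape
        S = np.zeros((d, V.shape[1]))   # running sum of outer(phi(k), v)
        z = np.zeros(d)                 # running sum of phi(k)
        out = np.zeros_like(V)
        for t in range(T):
            q = elu_plus_one(Q[t])
            k = elu_plus_one(K[t])
            S += np.outer(k, V[t])
            z += k
            out[t] = (q @ S) / (q @ z + 1e-6)
        return out

    # usage with random projections standing in for learned ones
    rng = np.random.default_rng(0)
    x = rng.normal(size=(8, 16))
    print(linear_attention(x, x, x).shape)   # (8, 16)

the recurrent state (S, z) carried from step to step is the structural
point of similarity with rwkv, which also maintains per-channel running
numerator/denominator state (with a decay term).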

