On Mon, May 20, 2024 at 07:07 Undescribed Horrific Abuse, One Victim & Survivor of Many <gmkarl@gmail.com> wrote:
# we improved the stability of low-data machine learning model training

## why is this useful?

low-data model training would let individual users build AI models without access to big data, including the kinds of models used at scale to addict users to products, services, or behaviors.

the ability to recreate these models with the user in control would give people many more options for personal freedom.

additionally, low-data training techniques make AI more powerful in general, since they improve how much a model learns from each example.

<snip>

## in short

to make a big model work on a small dataset, train a second model (a training model) that steers training so as to improve generalization to more data.

this is more powerful, but also more difficult, if the training model additionally improves the generalization of itself.
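
here's a minimal sketch of the non-recursive version of the idea, in PyTorch. everything in it is a hypothetical illustration, not an existing implementation: the "training model" is reduced to a single learnable log learning rate, updated by backpropagating the *validation* loss through one differentiable training step (a standard hypergradient setup).

```python
# minimal sketch: train the trainer to improve generalization.
# the meta-parameter (log_lr) is updated against validation loss,
# not training loss, so it is optimized for generalization.
import torch

torch.manual_seed(0)

# tiny synthetic regression task standing in for a low-data setting
X_train, y_train = torch.randn(16, 4), torch.randn(16, 1)
X_val,   y_val   = torch.randn(16, 4), torch.randn(16, 1)

w = torch.zeros(4, 1, requires_grad=True)        # model weights
log_lr = torch.tensor(-2.0, requires_grad=True)  # the "training model"
meta_opt = torch.optim.Adam([log_lr], lr=1e-2)

def loss(weights, X, y):
    return ((X @ weights - y) ** 2).mean()

for step in range(100):
    # inner step: one gradient update on the training set,
    # kept differentiable so the meta-gradient can flow through it
    train_loss = loss(w, X_train, y_train)
    g, = torch.autograd.grad(train_loss, w, create_graph=True)
    w_next = w - log_lr.exp() * g

    # outer step: update the learning rate to reduce validation loss,
    # i.e. to improve generalization rather than training fit
    val_loss = loss(w_next, X_val, y_val)
    meta_opt.zero_grad()
    val_loss.backward()
    meta_opt.step()

    # commit the inner update
    w = w_next.detach().requires_grad_(True)
```

the recursive case would make the meta-learner itself a model whose own updates are tuned the same way, which is where the difficulty mentioned above comes in.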

additionally: this may already have been tried in research papers somewhere, in case attempting it from scratch is unreasonable. i might start with search keywords like "hyperparameter metamodel for generalization" or similar.

uhhhh maybe this is near ideas of training the training model with RL, such that it hand-selects every batch (and optionally every learning rate). that's close to online training and data curation, and sounds like it already exists.
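
a minimal sketch of the batch-selection idea, again in PyTorch and again hypothetical. instead of full RL it uses the greedy one-step version: try a few candidate batches, take the gradient step for each, and keep the one that most reduces validation loss (validation improvement as a crude reward signal).

```python
# greedy stand-in for RL batch selection: at each update, pick
# the candidate batch whose gradient step generalizes best.
import copy
import torch

torch.manual_seed(0)
X_train, y_train = torch.randn(64, 4), torch.randn(64, 1)
X_val,   y_val   = torch.randn(32, 4), torch.randn(32, 1)

model = torch.nn.Linear(4, 1)
lr, batch_size, n_candidates = 0.1, 8, 4

def mse(m, X, y):
    return ((m(X) - y) ** 2).mean()

for step in range(50):
    best_model, best_val = None, float("inf")
    for _ in range(n_candidates):
        idx = torch.randint(0, len(X_train), (batch_size,))
        trial = copy.deepcopy(model)
        opt = torch.optim.SGD(trial.parameters(), lr=lr)
        opt.zero_grad()
        mse(trial, X_train[idx], y_train[idx]).backward()
        opt.step()
        with torch.no_grad():
            val = mse(trial, X_val, y_val).item()  # reward = -val loss
        if val < best_val:
            best_model, best_val = trial, val
    model = best_model  # keep the step that generalized best
```

a real RL formulation would replace the brute-force candidate loop with a learned selection policy, which is what the keyword search above might turn up.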