here are files. basically i just spasmed at my computer with the goal of making prompts that could do the basic things the approach would need. the work i did is stuff that people using machine learning models have been doing for some time.

it's notable that typing a sentence into a language model, like "guess the weight of an unripe apple", can actually be (a) condensed into numbers representing the model's state after processing the text and (b) used as seed data to finetune a model specific to the question; so if you're struggling to move forward at all, it can be a reasonable prototyping step (first sketch at the end). the model is still huge and bulky though. i didn't link in an approach to make loading fast, like using a decentralized model.

- try_multigen.py is the latest thing i worked on, for exploring creatively generating things. i kind of want to tune it more because i encounter ram exhaustion when i let it run. it also makes it clear that there's value to running things in parallel, which i didn't implement (second sketch).
- pretrained.py was for exploring listing properties of things.

both work well enough to move forward with if needed, although stimulating that wasn't necessarily my intent. the content is a funny and rare small area of my behaviors. it's confusing, because i am used to taking a more logical approach: as i touch on repeatedly in comments, there's no need for the models, you can just have the users generate content in a concept graph (third sketch). but the models help one do things; it's emotionally easier to use language models to do things nowadays.

the simple idea is basically verbs acting based on properties: everything has many properties, language models can guess the properties (if the user provides a property, adding it to the prompt or training data can significantly strengthen the model's output; fourth sketch), and the properties are used to respond to the user as well as possibly form content. basically it's pretty obvious and the barrier is doing the work.
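
first sketch: a minimal version of (a) and (b) above, assuming the hugging face transformers library with "gpt2" as a stand-in model; the prompt and the example answer are illustrations, not anything from these files.

import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "gpt2"  # placeholder; any small causal lm works for prototyping
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)

prompt = "guess the weight of an unripe apple"
inputs = tokenizer(prompt, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs, output_hidden_states=True)

# (a) the model's state after processing the text, condensed into numbers:
# the last hidden layer's vector at the final token position
state = outputs.hidden_states[-1][0, -1]  # shape: (hidden_size,)
print(state.shape)

# (b) the same prompt plus a desired answer can be one row of seed data
# for finetuning a model specific to the question
seed_example = {"text": prompt + " -> roughly 70-150 grams"}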
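
second sketch: the cheapest form of "running things in parallel" is batching several prompts through one generate() call instead of looping. same assumptions (transformers, "gpt2" as a stand-in); left padding is the detail that matters for decoder-only models.

import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")
tokenizer.padding_side = "left"            # batched generate needs left padding
tokenizer.pad_token = tokenizer.eos_token  # gpt2 ships without a pad token
model = AutoModelForCausalLM.from_pretrained("gpt2")

prompts = [
    "list three properties of an unripe apple:",
    "list three properties of a river stone:",
]
batch = tokenizer(prompts, return_tensors="pt", padding=True)

# no_grad skips gradient buffers, which also helps with ram exhaustion
with torch.no_grad():
    out = model.generate(
        **batch,
        max_new_tokens=40,
        do_sample=True,
        pad_token_id=tokenizer.eos_token_id,
    )

for text in tokenizer.batch_decode(out, skip_special_tokens=True):
    print(text)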
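
third sketch: the no-model alternative, where users attach properties to concepts in a graph directly and verbs act on whatever properties are present. all names here are hypothetical, nothing from pretrained.py or try_multigen.py.

from collections import defaultdict

graph = defaultdict(dict)  # concept -> {property_name: value}

def set_property(concept, name, value):
    graph[concept][name] = value

def act(verb, concept):
    """a verb acting based on properties: lift anything light enough."""
    props = graph[concept]
    if verb == "lift":
        weight = props.get("weight_grams")
        if weight is None:
            return f"unknown weight for {concept}"
        if weight < 20000:
            return f"you lift the {concept}"
        return f"the {concept} is too heavy"
    return f"no handler for verb {verb!r}"

set_property("unripe apple", "weight_grams", 110)  # user-provided property
print(act("lift", "unripe apple"))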
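
fourth sketch: how a user-provided property can strengthen a property-guessing prompt. the known properties just get prepended before the one the model is asked to guess; the prompt format is made up for illustration.

def build_property_prompt(concept, known_properties):
    lines = [f"object: {concept}"]
    # including properties the user already gave tends to strengthen the output
    for name, value in known_properties.items():
        lines.append(f"{name}: {value}")
    lines.append("weight_grams:")  # the property the model should guess
    return "\n".join(lines)

prompt = build_property_prompt(
    "unripe apple", {"ripeness": "unripe", "color": "green"}
)
print(prompt)

the same lines could just as well be appended to finetuning data instead of a prompt; either way the user-provided property ends up conditioning the model.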