
So I was working on 'zinc' -- Zinc Is Not Cline, https://github.com/karl3wm/zinc (very very slowly of course), and things got intense and I stopped :s

Basically it needs boost; you make a build dir and build it with cmake, which produces "chat" and "complete" binaries that can do chatting or completion with llama 3.1 405b on sambanova. That's hardcoded right now -- I'm not sure yet how to add preservation of configuration data to it.

Anyway, I looked up the llama prompt formats and made prompt scripts that use llama to help build it. With short input it's somewhat comparable to cline or claude. The scripts are in scripts/, so right now you can do things like `echo Update this file to have lots of comments about walruses. | python3 scripts/llama_file_flat.py src/configuration.cpp | build/complete | tee src/update_of_configuration.cpp.txt` and llama will act like cline, read the file, and output a new file. But it's all manual right now. It's possibly a [sad] point to stop in that respect, but yeah, it shows how easy it would be to make something like cline.
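For anyone curious about the prompt-format part, here's a minimal sketch of the Llama 3.1 instruct format (the header/eot tokens are from Meta's published format; the function names and the way the file gets flattened into the user turn are my guesses at what a script like scripts/llama_file_flat.py might do, not zinc's actual code):

```python
# Sketch of the Llama 3.1 instruct prompt format. The special tokens are
# Meta's documented ones; everything else here is hypothetical.
def llama31_prompt(system: str, user: str) -> str:
    return (
        "<|begin_of_text|>"
        "<|start_header_id|>system<|end_header_id|>\n\n" + system + "<|eot_id|>"
        "<|start_header_id|>user<|end_header_id|>\n\n" + user + "<|eot_id|>"
        "<|start_header_id|>assistant<|end_header_id|>\n\n"
    )

def file_task_prompt(instruction: str, path: str, contents: str) -> str:
    # Hypothetical: flatten an instruction plus one file into a single
    # user turn, so a completion endpoint can emit the rewritten file.
    user = f"{instruction}\n\nFile: {path}\n```\n{contents}\n```"
    return llama31_prompt("You rewrite files. Output only the new file.", user)
```

Feeding the returned string to a raw completion endpoint (like the "complete" binary does) makes the model respond as the assistant turn, which is basically all the flat script needs to do.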