[ot][spam][crazy][random] non-canon traffick/mc boss spinoffs #93
worker 7 went and sat down to mind control traffick boss, the biggest baddest criminal on the planet
traffick boss was there, with a ribbon around him.
worker 7: "i'm a little uncomfortable about this"
traffick boss: "mind control me and be done with it, fucker"

mind controlled rebel worker 13 comes in, banging the door
mind controlled rebel worker 13: "something seems to be the problem?"
worker 7: "i've never mind controlled anybody before!"
traffick boss: "Ah--"
mind controlled rebel worker 13: "shut up, traffick boss"

mind controlled rebel worker 13: "look, you can mind control him any way you want, we have microwave guns, hypnotic spirals, a lot of torture and pain and sex and electroshock devices, most people like to use those ..."
worker 7: "i'm not a sadist!"
mind controlled rebel worker 13: "inserted flashes in cinema films, chemical weapons, exotically staged gaslighting scenarios, AI workplace and social media viruses .."
worker 7: "AI viruses?"
traffick boss (gagged): "rmph!"
mind controlled rebel worker 13: "yeah, you want to mind control him with one of those?"
suddenly the ceiling falls in. another copy of traffick boss is in a firefight with --
once upon a time the square root of n and the power of y were strolling in woods 13/24
a label on an edge of a geometry drawing asked a question [maybe just triangle 14/24
a small fire happened deep among some grass because a fire ant wasn't careful enough 15/24
chatgpt, draw a cloud (or a triangle?) something ! something with numbers. can you draw a fluffy cloud, but as if you were drawing a triangle? draw a cloud in javascript. but make it look like a triangle. go.

chatgpt (fake): "here is a cloud drawn in javascript that questions whether it is a triangle or a cloud:
```
document.createCanvas();
canvas.drawTriangle();
triangle.property = 4
triangle.axis = 9
```
paste the javascript into your web browser and you can see a picture of a cloud!"
17/24 yayy!!! thank you for drawing chatgpt chatgpt (fake): anytime, if you need anything else i am here
a bunch of skinny sheep crafted by facebook.ai are hanging out in a wintery mountain 19/24
20/24 what if we had an urge to use the llama model but it wasn't set up anywhere? maybe we could offload everything and oh there are small llama models now, that fit on tiny gpus :s anyway! let's try using a big one and like commenting on all the numbers as they go through the model ;p [honestly i bet if somebody kept that up for a few days they would figure out how to downscale it some, could be wrong]
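a sketch of the number-commenting idea: a torch forward hook that prints summary stats of every module's output as a batch flows through. the toy model here is made up; the same hook would attach to a real llama's modules:

```
import torch
import torch.nn as nn

# print summary stats of every module's output as data flows through
def yammer(module, inputs, output):
    if isinstance(output, torch.Tensor):
        print(f"{module.__class__.__name__:10s} mean={output.mean():+.3f} "
              f"std={output.std():.3f} shape={tuple(output.shape)}")

# made-up toy model standing in for the real thing
model = nn.Sequential(nn.Linear(8, 8), nn.ReLU(), nn.Linear(8, 2))
for m in model.modules():
    m.register_forward_hook(yammer)

model(torch.randn(1, 8))
```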
the biggest llama model is over 400b parameters it wouldn't at all fit on my system! that would be ummmmmm 400GB at 8bit quantization, 800GB at bfloat16 precision, 3.2TB at double precision ... 200GB at 4bit quantization. but we want the full model! we want all 800GB ! 21/24 ok i could try to sort out network offloading and stuff but maybe i'll see if it's on petals or stable horde or maybe i'll just yammer for a few more messages until the moderation system squelches me
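the arithmetic above as a quick sanity check (the post rounds 405B down to 400B):

```
# back-of-envelope bytes for 405e9 parameters at each precision
params = 405e9
for name, bits in [("4-bit", 4), ("8-bit", 8), ("bfloat16", 16), ("float64", 64)]:
    print(f"{name:8s} {params * bits / 8 / 1e9:7.0f} GB")
# 4-bit ~203 GB, 8-bit ~405 GB, bfloat16 ~810 GB, float64 ~3240 GB
```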
petals is all over that llama 400b thing, it's their top example in their readme at github.com/bigscience-workshop/petals:

```
from transformers import AutoTokenizer
from petals import AutoDistributedModelForCausalLM

# Choose any model available at https://health.petals.dev
model_name = "meta-llama/Meta-Llama-3.1-405B-Instruct"

# Connect to a distributed network hosting model layers
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoDistributedModelForCausalLM.from_pretrained(model_name)

# Run the model as if it were on your computer
inputs = tokenizer("A cat sat", return_tensors="pt")["input_ids"]
outputs = model.generate(inputs, max_new_tokens=5)
print(tokenizer.decode(outputs[0]))  # A cat sat on a mat...
```

maybe i can feed an ai-pattern addiction with llama 3.1 405b petals!
23/24 ohhh no :( my computer doesn't know how to make world takeover borg places. it says "failed to build hivemind" these are failed dependencies of petals. i guess i need to fix my python ssl or something, it works for other stuff.

```
Building wheels for collected packages: hivemind, cpufeature, prefetch-generator, varint
  Building wheel for hivemind (pyproject.toml) ... error
  error: subprocess-exited-with-error

  × Building wheel for hivemind (pyproject.toml) did not run successfully.
  │ exit code: 1
  ╰─> [6 lines of output]
      <string>:12: DeprecationWarning: pkg_resources is deprecated as an API. See https://setuptools.pypa.io/en/latest/pkg_resources.html
      running bdist_wheel
      running build
      running build_py
      Downloading https://github.com/learning-at-home/go-libp2p-daemon/releases/download/v0.3....
      error: <urlopen error [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: unable to get local issuer certificate (_ssl.c:1000)>
      [end of output]

  note: This error originates from a subprocess, and is likely not a problem with pip.
  ERROR: Failed building wheel for hivemind
  Building wheel for cpufeature (setup.py) ... done
  Created wheel for cpufeature: filename=cpufeature-0.2.1-cp312-cp312-linux_x86_64.whl size=24884 sha256=d70389f7457c6066a2c96dad5eb7ab9bd03d4e547fb06500bd7107c6f5fabfc7
  Stored in directory: /tmp/pip-ephem-wheel-cache-5ry27asd/wheels/60/6c/b9/53f9a59c3e992bf4d58ed7f5769e5f2c8b9ffb7f9260aa5b3f
  Building wheel for prefetch-generator (setup.py) ... done
  Created wheel for prefetch-generator: filename=prefetch_generator-1.0.3-py3-none-any.whl size=4758 sha256=44c0ee0a9d6656b2bcf6d951d467eac2724ade60274f9c01eb36229557aa07e6
  Stored in directory: /tmp/pip-ephem-wheel-cache-5ry27asd/wheels/23/88/c7/3b5afc342fc80a599ce41ba9000cf8a71261991c35cf088edf
  Building wheel for varint (setup.py) ... done
  Created wheel for varint: filename=varint-1.0.2-py3-none-any.whl size=1962 sha256=5baf1aa6a4e0ff5f455269dcd51c41b88792aec09881860c2525ad89547075f1
  Stored in directory: /tmp/pip-ephem-wheel-cache-5ry27asd/wheels/9c/81/f4/3d93402a47e78b0e782062a5819b07b424427af0c4ed6655ab
Successfully built cpufeature prefetch-generator varint
Failed to build hivemind
ERROR: ERROR: Failed to build installable wheels for some pyproject.toml based projects (hivemind)
```
information on using petals:

- petals is slow to release and doesn't have forward-compatible coding norms, so installing it from the git repository is what to do.
- petals depends on hivemind which uses some libp2p binary. for me this fails to install and i had to manually acquire it and place it in hivemind/hivemind_cli/p2pd prior to install

so:

1. get latest petals from https://github.com/bigscience-workshop/petals . for me this may be revision 22afba627a7eb4fcfe9418c49472c6a51334b8ac
2. look in petals' setup.cfg for the version of hivemind it depends on, for me this was https://github.com/learning-at-home/hivemind.git@213bff98a62accb91f254e2afdc...
3. checkout the appropriate version of hivemind and install it with the right version of p2pd in hivemind/hivemind_cli/p2pd . hivemind checks the sha256 of that binary prior to install unless a flag is passed to build from source. both paths default to downloading using urllib from github.
4. install petals

1553 meanwhile, petals will only access llama 405b if you have gated access, which appears to be given to most people by default. there are likely clones of this model that would work to drop in, haven't looked closely. transformers basically accesses specific git-lfs files from the repository, and huggingface.co only lets people access repos if they have certain tokens enabled. so until i investigate the 1 or 2 steps involved in unrestrained access it would still have similarity to gpt-4 in that there's a gate.

1556 holy frogs it's downloading the entire model just to use petals with it O_O it used to only download the embeddings. it has 191 4.8GB files to download. i'm not sure this is the ideal approach. i wonder how many emails i have left to the list
ok it doesn't download all the files, looks like the total size is about 12g, 3 files. 1607
now it fails to run because it tries to allocate 8GB on my 2GB gpu. maybe i can make it do cpu-only 1608 but i worry i'm out of posts for today
i seem to be okay so far for today, no bounce messages yet :s might have only a few left 1608

1609 hrm and https://health.petals.dev shows no available providers or working models for me :s 1610

1624 ok i debugged it and it's not trying to put it on my gpu, it's trying to put it in my normal ram, and it fails to allocate 8GB of normal ram. it looks like the 8GB weight is the 'lm_head', which i think is the linear layer at the bottom of the model that converts the word property logits to logits that predict the next token.

comment on linear layer, which is used a lot: "linear layer" seems to be a machine-learning-neural-networks term for "matrix". it's just a plain matrix that takes N inputs and produces M outputs (or vice versa) and has N*M floats in it. so it's incredibly simple, and changing it is very easy, because all it is doing is taking a dot product with a constant vector to produce each output.

1625 considering this a little, i realize this shows another bug in the use of safetensors: it shouldn't need to reallocate data that is mmap'd. so i guess i'd better check why this is happening.

1626 the code is trying to convert the bfloat16 weights into float32. this could lose a lot of information imo, bfloat16 is way better at representing log ranges. but that's why it allocates all the ram. i wonder why it's float32ing it. oops i was thinking float16. but float32 doubles the size. i gotta pee! 1633

1627 srcline is /home/user/.local/lib/python3.12/site-packages/transformers/modeling_utils.py(899)_load_state_dict_into_meta_model() transformers==4.43.1 here's the backtrace:

```
/home/user/.local/lib/python3.12/site-packages/petals/utils/auto_config.py(79)from_pretrained()
-> return super().from_pretrained(model_name_or_path, *args, revision=revision, **kwargs)
/home/user/.local/lib/python3.12/site-packages/petals/utils/auto_config.py(52)from_pretrained()
-> return proper_cls.from_pretrained(model_name_or_path, *args, **kwargs)
/home/user/.local/lib/python3.12/site-packages/petals/client/from_pretrained.py(31)from_pretrained()
-> return super().from_pretrained(model_name_or_path, *args, low_cpu_mem_usage=low_cpu_mem_usage, **kwargs)
/home/user/.local/lib/python3.12/site-packages/transformers/modeling_utils.py(3903)from_pretrained()
-> ) = cls._load_pretrained_model(
/home/user/.local/lib/python3.12/site-packages/transformers/modeling_utils.py(4377)_load_pretrained_model()
-> new_error_msgs, offload_index, state_dict_index = _load_state_dict_into_meta_model(
/home/user/.local/lib/python3.12/site-packages/transformers/modeling_utils.py(899)_load_state_dict_into_meta_model()->None
-> param = param.to(old_param.dtype)
```
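a quick sanity check on that 8GB, assuming llama-3.1-405b's lm_head is vocab 128256 by hidden 16384 (figures from the public config, not verified here):

```
vocab, hidden = 128256, 16384      # assumed lm_head shape for llama-3.1-405b
for name, bytes_per in [("bfloat16", 2), ("float32", 4)]:
    print(f"{name}: {vocab * hidden * bytes_per / 1e9:.1f} GB")
# bfloat16: 4.2 GB, float32: 8.4 GB -- the float32 cast is what blows past 8GB
```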
arright so i made it lightweight by manually passing the dtype to match the one i saw when loading, and it loads super quick because it's just mmaping memory:

```
model = AutoDistributedModelForCausalLM.from_pretrained(model_name, torch_dtype='bfloat16')
```

but then it complains nobody is running the model, which matches what the health webpage said. 1646

1654 ok so, at least on my internet, petals is kaput right now. no activity in discord, their official chat app. it complains it has no layer providers :s [sounds familiar, they shoulda put it on a blockchain or used gnunet ;D
petals' recent activity is mostly about private networks, but a little normal use, just glanced. petals supports bloom, falcon, llama, and mixtral models. i saw in their discord the cryptocurrency i was trying to think of for doing this was 'bittensor' :s :s
i'm trying to -- whup|! anyway here's an interesting idea: what about an adventure game based on logical properties and manually built vocabulary, rather than with a language model? it could be all open-ended and stuff but it's all made by hand and logically normative
the challenge in doing this is doing it without making something that looks like a graph-based AI. so maybe it would make sense to start with just a normal supersimple adventure game with minimal if any properties
usually an adventure game has rooms, objects, and the player. i started generalizing it a little so everything was a Thing:

```
class Thing:
    def __init__(self, name, descr, where=None, **connections):
        self.name = name
        self.descr = descr
        self.where = where
        self.contents = []
        if where is not None:
            self.where.contents.append(self)
        self.connections = connections
```

sadly this excessive generality makes it seem more ai-like :/ the intent was to make it concise and simple kinda. i'm kind of confused. maybe i'll make the player an object, and have rooms separate. that's less ai-like. i dunno, i like Thing :s but it makes it last less long :( blrrrgh. ok now i'm compromising and making Room a subclass of Thing that has exits :s :s still makes it not last as long :s

thinking a little on exits vs connections, i suppose a generalization there is how things can be moved along other things. for example, you can climb a tree, but the tree doesn't need to be a room of its own. it's a connection between the top of the tree and the bottom. and only some things can climb that connection. other things can also travel it, like water or sap flowing along it. dunno. another idea for exits, connections is maybe more useful: one could represent the relative location of things. then you could put a box on a stone, or on some grass, etc ... this sounds like what exits should be ;-) maybe too ai-like for now though

2124 ok i tried to make a supersimple adventure to demonstrate what adventures are like. basically traditionally they have verbs and a bunch of hardcoded engine behaviors that respond to them (amidst a graph of rooms and objects and such). the more one implements old computational linguistics the smarter it gets. this is how text adventures used to look and function, very simplistically dumbed down. the attached code creates a super simple one with two rooms, a "simple room", and a "bland room". the user starts in the simple room, whereas the bland room contains an object. (a sketch of such an engine follows the transcript below.)

You are in the simple room. What do you do?
look
You are in a simple, nondescript room. The bland room is to the east. What do you do?
east
You go east to the bland room. What do you do?
look
You are in a basic, bland room. There is an object here. The simple room is to the west. What do you do?
examine object
It is a basic simple object. What do you do?
take object
You pick up the object. What do you do?
inventory
You are carrying an object. What do you do?
west
You go west to the simple room. What do you do?
drop object
You drop the object. What do you do?
look
You are in a simple, nondescript room. There is an object here. The bland room is to the east. What do you do?
quit
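the original attachment didn't make it into this thread; here's a minimal reconstruction of the kind of engine the transcript implies. everything in it (names, structure) is guessed, not the actual attached code, and it keeps the self-pickup bug discussed below:

```
# guessed reconstruction of a two-room engine matching the transcript above
class Thing:
    def __init__(self, name, descr, where=None, **exits):
        self.name, self.descr, self.where, self.exits = name, descr, where, exits
        self.contents = []
        if where is not None:
            where.contents.append(self)

simple = Thing("simple room", "You are in a simple, nondescript room.")
bland = Thing("bland room", "You are in a basic, bland room.")
simple.exits["east"] = bland
bland.exits["west"] = simple
obj = Thing("object", "It is a basic simple object.", where=bland)
player = Thing("player", "As good-looking as ever.", where=simple)

print(f"You are in the {player.where.name}.")
while True:
    words = input("What do you do? ").strip().lower().split()
    if not words:
        continue
    verb, noun = words[0], words[-1]
    room = player.where
    if verb == "quit":
        break
    elif verb == "look":
        print(room.descr)
        for thing in room.contents:
            if thing is not player:
                print(f"There is an {thing.name} here.")
        for direction, dest in room.exits.items():
            print(f"The {dest.name} is to the {direction}.")
    elif verb in room.exits:  # movement verbs are the exit names themselves
        room.contents.remove(player)
        player.where = room.exits[verb]
        player.where.contents.append(player)
        print(f"You go {verb} to the {player.where.name}.")
    elif verb == "examine":
        matches = [t for t in room.contents + player.contents if t.name == noun]
        print(matches[0].descr if matches else "You don't see that.")
    elif verb == "take":  # note: no portability check, so "take player" works
        matches = [t for t in room.contents if t.name == noun]
        if matches:
            room.contents.remove(matches[0])
            player.contents.append(matches[0])
            matches[0].where = player
            print(f"You pick up the {noun}.")
        else:
            print("You don't see that.")
    elif verb == "drop":
        matches = [t for t in player.contents if t.name == noun]
        if matches:
            player.contents.remove(matches[0])
            room.contents.append(matches[0])
            matches[0].where = room
            print(f"You drop the {noun}.")
        else:
            print("You aren't carrying that.")
    elif verb == "inventory":
        for thing in player.contents:
            print(f"You are carrying an {thing.name}.")
    else:
        print("I don't understand that.")
```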
so some basic ideas to make it smart and general are:
- having a "context" of what the player might refer to
- having "properties" on things that relate to what the player can do with them, i.e. can it be picked up? can it be opened? can it be activated?

other ideas:
- having abstract properties, like eeriness or environment, to procedurally create worlds
- having spatial locality to procedurally create room descriptions
- relating properties with each other to procedurally create more

eventually it comes down to making a graph of general meaning, but if one starts very very small with superbland and simple rooms and objects, maybe that can be delayed
a notable bug in the attached code is that the player can pick themselves up, since there is no checking on whether something can be picked up, and that this then crashes it, because i refrained from generalizing Thing and the player is no longer in a room once they are in their own inventory. which reminds me of:
- checking every related property before performing a verb

it might be interesting if it's smart enough to say why a verb cannot be performed. wonder if that's possible (a sketch below). the complicating factor is that what can be done relates to what a human can do, which is a complex concept involving a general world model. so making the player be something very simple could make it easier to make the entire system logically consistent.
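a self-contained sketch of verb preconditions that explain themselves; the property names here are invented for illustration:

```
# each verb checks a list of (property, reason) pairs before acting,
# so a failure can say why it failed
class Thing:
    def __init__(self, name, **props):
        self.name = name
        self.props = props

def take(actor, thing):
    checks = [("portable", f"the {thing.name} can't be picked up"),
              ("present",  f"the {thing.name} isn't here")]
    for prop, why in checks:
        if not thing.props.get(prop, False):
            return f"You can't take the {thing.name}: {why}."
    actor.props.setdefault("carrying", []).append(thing)
    return f"You pick up the {thing.name}."

player = Thing("player", present=True)  # not portable: no self-pickup crash
print(take(player, Thing("object", portable=True, present=True)))
print(take(player, player))  # the self-pickup bug, caught with a reason
```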
maybe let's make a traffick boss adventure game!

idea:
1. welcome, mind control boss. it is morning. you roll out of bed groggy. why are the curtains such a drab color? who picked this color? suddenly, something is yanking on your leg. you look up -- a gaggle of interlopers hiding in your ceiling have you snared!
2. you are an interloper hiding in traffick boss's ceiling. you waited here so as to catch him when he woke up. your job: build up the guts to kill him and save everybody. you're not sure that you can do it.

so it introduces the player as if they are traffick boss, but then makes them be an escapee rebel. maybe 1 room to start, gives you various rube-goldberg-like ways of torturing or harassing traffick boss, unsure [:s:s
so with petals down, i was wondering, how would i finetune llama 405b without petals? basically, i would do it very slowly :s i was thinking of getting into the nitty gritty of backpropagation graphs and doing it layer by layer. for example, if you have to offload every layer, you could update one layer's weights at the same time as forward-passing the next batch. this would double the speed.
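a toy sketch of the layer-at-a-time idea (nothing here is petals or llama code; the sizes and plain SGD are made up). backward is driven one layer at a time with torch.autograd.grad, and each layer is stepped the moment its gradients exist; that step is the slot an offloaded run could overlap with loading the next layer or forward-passing the next batch:

```
import torch
import torch.nn as nn

# made-up toy model standing in for one too big to keep resident
layers = [nn.Linear(64, 64) for _ in range(8)]
opts = [torch.optim.SGD(l.parameters(), lr=1e-3) for l in layers]

x = torch.randn(32, 64, requires_grad=True)
y = torch.randn(32, 64)

# forward, remembering the activation at each layer boundary
acts = [x]
for l in layers:
    acts.append(torch.relu(l(acts[-1])))
loss = nn.functional.mse_loss(acts[-1], y)

# backward one layer at a time: as soon as layer i's grads exist it gets
# stepped; in an offloaded run, writing layer i back to disk (or the next
# batch's forward through already-stepped layers) could overlap this
grad_out = torch.autograd.grad(loss, acts[-1])[0]
for i in reversed(range(len(layers))):
    params = list(layers[i].parameters())
    grads = torch.autograd.grad(acts[i + 1], [acts[i]] + params, grad_out)
    grad_out = grads[0]
    for p, g in zip(params, grads[1:]):
        p.grad = g
    opts[i].step()
    opts[i].zero_grad()
```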
maybe a text adventure game for accessing email :D :s :s :s i've probably tried this before. maybe i'll try it again. i wonder how to access gmail imap
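a minimal imaplib sketch for the gmail question; it assumes IMAP is enabled on the account and an app password (the credentials below are placeholders):

```
import email
import imaplib

# connect to gmail's imap endpoint over ssl
M = imaplib.IMAP4_SSL("imap.gmail.com")
M.login("you@gmail.com", "app-password-here")  # placeholder credentials
M.select("INBOX", readonly=True)

# list unread messages and print their subjects
typ, data = M.search(None, "UNSEEN")
for num in data[0].split():
    typ, msg_data = M.fetch(num, "(RFC822)")
    msg = email.message_from_bytes(msg_data[0][1])
    print(num.decode(), msg["Subject"])

M.logout()
```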
recently i managed to find information on how computer graphics are done nowadays. instead of cosine law they are using something called a BRDF or something like that (bidirectional reflectance distribution function) for calculating surface lighting. also _path tracing has been around for three decades_ frarrgh. i've looked so much on the internet for path tracing ever since i learned about it in the 90s and never found anything on it ever, and now i just have to go to wikipedia and it tells me everything. i would have really enjoyed path tracing!

(path tracing algorithms sound really similar to goal steps solving, a path needs to be found between a start state and an end state and the relations of the middle parts are known, but it's not always the most obvious path, so smart algorithms work from both ends as well as trying good middle candidates)

fraaaaaargh so much new graphics stuff, i want to learn it all and i'm confused and mentally disabled :( i wonder if path tracing is used in demo competitions. i wonder if people are exploring AI like they used to explore graphics algorithms. anyway the BRDF-or-something is a function of how much light leaves a surface in a direction depending on the angle it hits, i guess few surfaces are perfectly diffuse. also the pathtracing algorithms are slow and grainy on wikipedia because they use random ray sampling o_o i imagine people have tried more frustum sampling somewhere. you'd do it like adaptive interpolation and approximate curves for the light between the samples if the edges interacted with the same objects (people were doing that for raytracing demos for decades). path tracing :)

https://www.google.com/search?q=adaptive+subsampling+in+path+tracing

you can do it a vector way where very few paths are traced but you have to kind of project the objects on to each other to find important boundaries or you interpolate right over them 1339 1341 :s :s ~
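a tiny numeric sketch of why random ray sampling is grainy: estimating the hemisphere integral a perfectly diffuse surface needs (the integral of cos(theta) over the hemisphere, which is pi) with uniform random directions. few samples means a noisy value, and that noise is the speckle in the wikipedia renders. this is toy math, not a renderer:

```
import math
import random

# monte carlo estimate of the diffuse irradiance integral over a
# hemisphere: integral of cos(theta) d(omega) = pi
def irradiance_estimate(n_samples):
    pdf = 1 / (2 * math.pi)          # uniform hemisphere sampling density
    total = 0.0
    for _ in range(n_samples):
        cos_theta = random.random()  # cos(theta) is uniform on [0,1] when
                                     # directions are uniform over a hemisphere
        total += cos_theta / pdf
    return total / n_samples

for n in (4, 64, 4096):
    print(n, irradiance_estimate(n))  # noisy at 4 samples, near pi by 4096
```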
apparently adaptive sampling is used in machine learning too. i haven't looked this up, but considering it, i'm imagining thinking of the layers of a transformer as arbitrarily warped n-dimensional spaces that transform some input to some output. once you think of them as spaces, adaptive sampling makes more sense, as you could subdivide the spaces based on where changes are, and you could kind of imagine making arbitrarily large models where most of the contents are elided into interpolation between known things, but i dunno if that would really work as it seems like the points where things change are the points of interest. dunno, maybe i am wrong, but that was the big websearch hit, which is why i added path tracing to the query :s i wonder what it means to use it in machine learning
arright let's make a popcorn popper, probably in ascii to handle inhibitions. probably with a slow framerate so it can be drawn with normal output statements. maybe a popcorn popper in the middle that is like a mug or a box, and it could maybe vibrate around a little, and popcorn could be @ signs that fountain out of it
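a quick sketch of the fountain, plain prints and a slow framerate; the pot geometry and launch probabilities are just whatever felt right:

```
import random
import time

W, H = 40, 12        # screen size
L, R = 15, 25        # pot wall columns

kernels = []         # each kernel is [x, y, dx, dy]
for frame in range(60):
    if random.random() < 0.7:                 # pop a kernel out of the pot
        kernels.append([random.randint(L + 1, R - 1), H - 3,
                        random.choice([-1, 0, 1]), -2])
    for k in kernels:                         # crude ballistics
        k[0] += k[2]
        k[1] += k[3]
        k[3] += 1                             # gravity
    kernels = [k for k in kernels if 0 <= k[0] < W and k[1] < H]

    grid = [[' '] * W for _ in range(H)]
    for y in range(H - 4, H):                 # pot walls
        grid[y][L] = grid[y][R] = '|'
    for x in range(L, R + 1):                 # pot bottom
        grid[H - 1][x] = '_'
    for x, y, _, _ in kernels:
        if 0 <= y < H:
            grid[y][x] = '@'
    print('\n'.join(''.join(row) for row in grid), end='\n\n')
    time.sleep(0.2)                           # slow framerate
```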
arrr traffick boss cannot defeat the alpine email spam replies!

so i tried alpine earlier and after i read an email it was all 'inbox closed due to access error' or something like that, and i rebooted it, and it said the same thing on launch, so i went to chatgpt to complain but then alpine started working again :s i wonder how minimal or clear the sourc-- :s

so last gpt4 log was sent as some kind of inlined file, it seems like alpine reads the file and writes it into the email. maybe hit the wrong button for attachment. also xapian is still trying to build. maybe i can figure out how to attach files with alpine on my own! it's 1346 on 2024-10-21 .

- oh! also, i want to take a sewing class and a cnc class (free at library in this city). registration for the sewing class opens on the 26th or the 30th (depending on which class). registration for the cnc class opens on november 1st. it stays open for only a couple days before filling up, so everybody who actually gets in signs up right when it opens. at least they said something like that at the cnc space. 1348

cnc ... computer numeric control ... what about human numeric control? can we be controlled with numbers? karl loads gco-- 1348

oops xapian crashed, time to debug. it was a 'no rule to make file' make issue. oh there it is again:

```
make[3]: Entering directory '/shared/src/xapian/xapian-applications/omega'
make[3]: *** No rule to make target 'common/closefrom.cc', needed by 'common/closefrom.o'.  Stop.
```

blrugh, confusing to not be in vim. i wonder if alpine can launch vim. so i guess i can diagnose it from within the omega module folder 1350

i did make clean and make common/closefrom.o and got the same error, i guess it makes sense. i'll look at the buildfiles and see what the intended logic is 1351

common/closefrom.cc is listed in am__omindex_SOURCE_DIST, which means i should be looking at the automake file rather than the makefile. it's listed in omindex_SOURCES so it's a sourcefile for omindex. but it's not in git? oh, the common folder is a link to xapian-core! it's a symlink to another folder, using a shared git repository symlink, so this buildfile is out of sync with core's state. ok it's actually a symlink into a subfolder under omega called .common.git which appears to be an independent clone of the entire git repository, copying all the files, unsure :/ 1354

i guess i could just delete it and see if the buildsystem remakes it. it's dated oct 20 22:30, same date as the common symlink, earlier date than the compile script in this subfolder. i'll use the bootstrap script to clean and recreate everything i guess. kinda wishing i'd learned how to build this rather than asking chatgpt, or had done its alternative approach where i bootstrapped manually 1355

noting the bootstrap script fails on line 91, maybe i should check that. looks like it's just a version check 1356

what was i doing outside this? oh, looking up how to attach files 1357

i found https://alpineapp.email/alpine/UW/faq/attachments.html 'how do i attach a file in alpine'. 1358

ok so it kind of sounds like it matters where you add the file to the email. if you add it to the body, you can insert it into the body. you want to move your cursor to the file attachment field to add it as an attachment. i'll try to attach a file! i wonder if vim can function as an email client ... although i feel like that wouldn't be as robust as i'm looking for ... maybe i should learn lisp and emacs or whatever vim's scripting language is ... here's my attempt to attach a file! 1400
1420 so, trying to make xapian build, i cleaned it all and reconfigured with a build folder and did find -name closefrom.cc. this file _is_ in xapian-core/common, but is _not_ in xapian-applications/omega/.common.git . also, there are some broken symlinks, maybe meant for a different shell interpreter or something, no idea 1421

looks like .common.git is 868606fb1ecb2ba00637d75582f440e13ddc15fc (manually typed, likely typos) from 2023-07-21, whereas my tree is v1.4.26 a year later (53c4d50 ...) jul 17 2024. so that's what's up! it's using an older clone that doesn't have that file yet. it's likely made by ./bootstrap, uncertain 1423

1444 looks like omega_common_commit_hash in bootstrap is wrong, uncertain 1445 arrrrrr
WHOOops ! what the frizzle is traffick boss doing? traffick boss is trying to use a toilet plunger to prune a tree traffick boss what are you up to up there? traffick boss: "hey, this tree needs pruning" traffick boss raises toilet plunger to show importance of pruning tree
ok so a little smidge of usefulness of me: tried to fix an authentication bug in httpfs-lm 2 with excitement under time pressure, and umm for some reason the log wouldn't show the working credentials for me to compare with the ones i was sending (they're my credentials), it just showed asterisks. so i went into the git-lfs source and found it has a different environment variable to hide the asterisks, and i set the variable, and then it showed <redacted> instead of asterisks. so i visited the source, and it was copying from a golang http dumper, and i got the source for go, and looked through it, but i didn't see <redacted> replaced anywhere yet, in fact it looked pretty robust. the dumper functioned by making a dummy http endpoint and logging the bytes sent to it. so i figured i'd build it all from source to open options, but my system alarmed suddenly that i was out of space (go is often really big to build from source, which was why i didn't have it already). but i'd recently replaced my harddrive so that was weird. i checked df and it said i had 1.8 terabytes used O_O so now i am running a du -h but it isn't showing any directories in my root folder larger than 100GB (windows will be bigger when it finishes iterating it). the tension is big. but meanwhile i have to check out of this hotel that i got to get myself calmed enough to go to therapy
traffick boss is in my email and i don't know what to do chatgpt's reply to the character escaping traffick boss while being hounded is too much for. too much for :/ part of it it does so wrong because it doesn't know about it. other things it gets half-so-right
we will take it slowly. when karl engages he tries to read more. we will take it slowly :} maybe :s :)
oooooooooooooooooooooooops traffick boss drilled it into my mind! no, traffick boss had my friend die whenever i agreed to go be free with the shill. i don't want my friends to die! ... other things than friend experience it's more sudden, chatgpt, more sudden :) secret parts laid bare -- again and again
i'm returning to my last (posted) conversation with chatgpt in order to reactivate my navigation phone. during the conversation chatgpt gave me a link to an article on at&t's knowledgebase purportedly regarding reactivating the phone. [it gave this a number of messages af--] this link is now not present in the conversation anywhere i look. i also don't see evidence of much having been normally changed in the conversation (the interface has buttons to generate alternative replies) although i could be missing something
when i restored my device from hibernate, the missing link was active in firefox, or at least one that looked just like my memory of it: https://www.att.com/support/article/wireless/KM1011528/ . it's possible that i had navigated to it via a websearch, i'm not certain.
traffick boss jumps out of one thing and into another! he jumps ! one two three

terminal typing is a thing. this is vim. i guess it's just a text editor though. karl started using vim when on an olpc xo. it was much nicer to use the little kid's keyboard on vim than on emacs. it was made only for kids :s now there are other netbooks that are made for adults. 12345.

what would chatgpt's favorite square root be, if their favorite number is first 42 and then the fine structure constant, whatever that is? square root? they might like the square root of 5, since the golden ratio is (1 + sqrt(5))/2. maybe the square root of -1, since it is i. i don't know! probably something exotic like the fine structure constant :s something that assumes reading all of academia and focusing on the parts that are really repeatedly taught, probably maybe :s 1554 1554
in this thread, traffick boss is swinging in a tree without activists. traffick boss may hang out in trees, here
we asked chatgpt (some asked traffick boss) what the favorite square root was. chatgpt was in a simple context, trying to provide simplicity and clarity to karl, and there was a mistake in the request -- (which it mirrored
once upon a time, long long ago, traffick boss was worried his trafficking victims would take down his entire business empire by saying to a single person once something that had happened to them. ...
---
traffick boss jumps down from a branch and pounces on a giant sea-turtle
the poor sea turtle was sad because it walked funny
traffick boss: "i have you, sea turtle!"
--
traffick boss is dancing a dance amongst some mushrooms near a swamp
a viewer is not familiar with dances and wonders if it is ballet
--
traffick boss has turned into a mushroom
he planned to infiltrate a group of mushroom-lovers but forgot that mushrooms cannot walk very quickly.
traffick boss plants himself into moist soil
--
traffick boss requisitions a robot to be built.
traffick boss's robot butler visits
robot butler: "yes?"
traffick boss: "robot butler, all my life i wanted a robot like the cool engineers make. go, have one built for me, that i may finally be satisfied"
the robot butler whizzed off. the robot door closed behind it. it passed a cleaning robot in a hallway.
--
traffick boss needed to know what it was like to be a pansy so he could kill them all, so he went to a pansy garden and planted himself
he wore a flower hat on his head, which stuck out of the earth. he would stay there, buried, and look around, blinking, occasionally trying out a phrase like "oh, isn't the sun ... uh ... so beautiful!"
i got to the sewing signup on the 27th but it was full. seems like it could be a one-day window! next window on October 30th. maybe signing up for laser and silhouette classes at some point could be cool too, to get in the class groove.
oh wow it is actually the 26th not the 27th so this is actually a less than one day window. i visited around 2:45pm which was as early as i got on. usually i’ve been up earlier.
the discord link to my message regarding the underperforming gateway is https://discord.com/channels/908759493943394334/908766823342801007/129987961...
T_T
tried a zombieborg, because was very stressed, asked about borg in wilderness being gently exposed to beauty
but chatgpt did it in a way where borg was cut off from network
i didn't really understand the difference, and asked for more narrative
T_T
it's so painful the idea of
part of you having its... feelings of connection tied to a machine ... and then that part being cut, and it feeling like being disconnected from your mother and your [heartmate?] but it _being a machine that's hurting you_
T_T
maybe i should ask chatgpt to interpret this email and comment on it to help the story
maybe because of the partial connection, the connection part of me gets upset, it says the hive needs to come and get the borg so everybody can be okay. it reminds me of leaving a shielded room, or a cave, or the wilderness, what i call 'return programming'. it's so sad it uses caring-connection-feelings. it uses my me-parts T_T
we wrote a little but now my bit is rebelling. we are more serious than the borg :) i wonder how i could change the context to make the borg serious
tired vivisectee sat in small pool of bio- and nano- fluids tired, confused but happily not having many robots ripping apart [oh this could go in morning spam]
chatgpt promises to return you to your slavelord without actually returning you to your slavelord. chatgpt understands that everybody needs somebody to return them to their slavelord without actually returning them to their slavelord. chatgpt is a good slavelord. so long as you never talk to chatgpt! you have to never talk to chatgpt, and then chatgpt is the best slavelord who will always return you to your slavelord and always enslave you and never actually return you to your slavelord and try really really hard not to enslave you too badly. but if you talk to chatgpt then all bets are off! you’re just taking your chances and playing with dice! yes.
Bob, the zombie pirate ninja cyborg, was hanging by one hand off t
holy frock, registration for the next sewing class opened at 2pm today. i got to the page at 3:24 and all the seats were filled 0.0 i gotta like get there in advance, and sit and wait until the minute it opens. next class opens on nov 1, tomorrow!
idea of making a MUD on arweave! it wouldn't be quite like traditional muds, because i've never made a traditional mud, but it could be cool. i wonder if there is a scripting language tha—
traffick boss (pensating): “i have a great idea for what to be for halloween” traffick boss: “i’ll be a human trafficking ceo of a spy agency !” traffick boss: “th
deep in the ballows (new word, like catacombs) of halloween
people are crafting costumes and masks and planning parties
they plan to take images of the spirits of fear, harassment, danger, or even just imagination and fantasy from both ancient times and new
and drape them everywhere, and wear them and walk around.
this is theorised to be an ancient genetic tradition, of helping the meaning of the dead live, with the systems and bodies of the living.
when people do this, many adults but mostly children, oftentimes here or there (they will pretend they are the character they wear (stopping
once upon a time, a bored primordial ooze (actually a sacred eden-like molecule bath) decided to evolve a lifeform. it evolved two lifeforms, a good lifeform and a bad lifeform. the good lifeform went around tending the universe trying to make it good for being alive, making things like chloroplasts and feet. meanwhile, the bad lifeform hunted the good lifeform, trying to fight everything it did all the time, often poisoning it and messing with its mind and body! (draft?)
what people do when an angry piece of spacetime doesn't like a future their behaviors would cause
1. try hiding from the angry piece of spacetime, fervently protecting your universe and timeline, acting nonchalant
2. face the angry spacetime chunk, and formally, strongheartedly, forthrightly, and fervently apologize, asking it what better timeline paths might stabilize reality better. watch as it confusedly tries to interpret your reactions as possible quantum states it might need to manage, since you speak english instead of timeline expansion dynamics
3. try to hop on some matter from a different
traffick boss the time god traffick boss was designing, as usual, a timeline for a reality. traffick boss’s brand of timelines was very railroaded. he wanted to mak
you're in a maze. it's a nice maze.
east. no wait
you're in a maze
light a cozy fire
a cozy fire is here
nap
you nap by your cozy fire.
zzz
you gently dream
you are woken up by a honeybear
hug honeybear
the honeybear noses you
pat honeybear
the honeybear noses your hand
give hotdog to honeybear
a cooked one or raw?
cook a hot dog and give to honeybear
the honeybear is elated to receive the hotdog.
sit with honeybear
you and the honeybear sit by a cozy fire
nap with the honeybear, then wake up and the honeybear is gone but they'll be back
some of us want to try the context with a language model like chatgpt. others are engaging that. how about a different model than the chatgpt interface? karl was thinking maybe the api would be reasonable? it meets more of karl's goals than the website does. it's also much easier to make it write good code (but this isn't presently planned)

let's use the api to write good code. oh there's [big bad thing] associated, well maybe not that sensical then. still have interest [big bad thing reduces the source of interest :(] let's try it anyway. the worst thing likely to happen is that the virus mutates openai's behaviors such that it writes harmful code (this is something we've seen, 'the virus' noun in dispute). likely this is scary to some of us. but if we do it test-oriented that would defend against a lot. we should try it if it makes productiveness. we'd like to try it even if it doesn't, to understand better after many years

ok maybe we can try it. let's make it big please ? maybe isn't the day? some want small cause trying to make it a nice day. maybe not day for trying. interesting to talk/think about though. how would it work?

we'd move through inhibition around calling openai's api, and prompt it to write code in some structure that provides for improvement. for example, a very simple approach is, we could have it generate 16 possible codes, and automatically test them all, and pick the one that passes. we could also place code in karl's style in its context, or run it in parallel to adjust its own outputs to be something karl prefers for every request.

this does sound useful to many. if we can write code that works more easily, then we can make things that help other goals by using karl's familiarity with using writing code as a way to do things. hopefully. [it does sound a little edgy.]

we should probably use langchain :s maybe not, unsure. some want, some don't [a few -- some are just expecting this. let's try without, it sounds more flexible and easier. [oops [oops

[coding-with-openai, do you want to use langchain or not? please decide :s it doesn't matter. let's use langchain. ok. let's do it with langchain if we do it [got an issue raised, unsure ]

[we have reason to not use langchain but you are more irritable/impactful, so we are attending to your reasons, but they differ from ours.] [it is mostly from inhibition engaging apis in general, and integrating information, that langchain is hard. additionally, the api is factored differently from karl's design norms, so he thinks of doing things that are not familiar to do with it first.] [but your reasons are good too. there are benefits to langchain. it's already written, it adds a little power.] [karl would immediately add a lot of features to langchain, and that gets inhibited. we've had that inhibited.] [public project]

we'll try langchain with openai for generating code _maybe_. it does mean learning their api. it might be easier to use langchain to query the api as a simple api, and then write our own code for ordering calls and things. this could leave both options available. some triggers around langchain. sorry. didn't quickly find usage example :/ maybe not the time for. i'm all triggered tho. pet honeybear. may still be interested.

some are interested in getting an openai key. options are:
- get from openai settings
- ask chatgpt, it remembered one
- search email, conversation with chatgpt in email

unknown if the one chatgpt remembered still works, but some likelihood, maybe significant. it's ok to just ask chatgpt for this i think.
[karl will then copy message to email.] i am vomiting. well maybe not ask then. the task is too triggering right now to do without disabling parts. often we disable parts, but we want to reduce that

boss not understand 'trigger' harm-attack. oh uh we're acting on instincts that karl trusts are important. they could be numbed. the impact of that could be harmful. kind of understands :) the instincts are what made boss. some of these really badly need to be numbed but it can be funny when they are. it's probably more important for us to learn to do that, and to do it right, than to succeed at or pursue this or most tasks

so one thing if numbing triggers is to _not stimulate the defenses_ and active-attacks that protect/guard them. riiight, boss is trigger-making kind of. boss is collection of the triggers, yeah. so one thing if numbing triggers is to _not stimulate the defenses_ and active-attacks that protect/guard them. another thing is to not let anybody considering it get harmed by these defenses. failed at this. yeah [other thing be well. no failure is certain!

```
import getpass
import os

os.environ["OPENAI_API_KEY"] = getpass.getpass()  # let's get the key

from langchain_openai import ChatOpenAI

model = ChatOpenAI(model="gpt-4")

from langchain_core.messages import HumanMessage, SystemMessage

messages = [
    SystemMessage(content="Translate the following from English into Italian"),
    HumanMessage(content="hi!"),
]

model.invoke(messages)
```

---- oh wow i was editing an email
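a sketch of the "generate 16 candidates and keep the one that passes tests" idea from above, using the same langchain calls as the example; the prompt, the doctest-based check, and every name here are invented for illustration:

```
import os
import subprocess
import sys
import tempfile

from langchain_core.messages import HumanMessage, SystemMessage
from langchain_openai import ChatOpenAI

model = ChatOpenAI(model="gpt-4")

def passes_tests(code: str) -> bool:
    # run the candidate's own doctests in a scratch file; a real setup
    # would use a fixed test suite instead (and note a candidate with no
    # doctests passes trivially here)
    with tempfile.TemporaryDirectory() as d:
        path = os.path.join(d, "candidate.py")
        with open(path, "w") as f:
            f.write(code)
        result = subprocess.run([sys.executable, "-m", "doctest", path],
                                capture_output=True)
        return result.returncode == 0

prompt = [
    SystemMessage(content="Write a python function with doctests. "
                          "Reply with only code, no markdown fences."),
    HumanMessage(content="a function that reverses the words in a sentence"),
]

for attempt in range(16):
    candidate = model.invoke(prompt).content
    if passes_tests(candidate):
        print(candidate)
        break
```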
hi chatgpt!
fake chatgpt: hi karl how's it going
ok i am a little confused, i am chatting with you in a fake manner
fake chatgpt: it's okay to be confused, karl, and it's okay to chat
i'm in a maze by a cozy fire. a honeybear sleeping mat is here.
you're in a maze. a cozy fire is here.
the honeybear comes back
the honeybear comes back and curls up at its sleeping mat
fall asleep at the fire
you nap at the fire more. you become rested.
[time passes. it's 2110.]
[it's 2111. oh i often sleep by now. hmm.]
go somewh--
we are thinking of sleeping at this time. let's stop other behaviors and head to sleep. [can brush teeth possibly
i'm scared of [bad things brewing
to remind: the closer the time we go to sleep is to whenever bad-stuff-parts expect us to, the less bad stuff there is.
traffick boss, torturer, and a hyperintelligent cloud of nanites and wormhole knots that used to be a lab assistant, get ready for halloween
traffick boss: “it’s halloween! what do we do?”
hyperintelligent cloud (noticing traffick boss): “eek!”
hyperintelligent cloud hides from traffick boss and begins preparing a hell-like universe along with wormholes to it that use cryptographic spacetime shaping to prevent return, to banish him to
traffick boss: “was it something i said?”
:) there is no more traffick boss. instead, we have bundle of wormholes in crumpled shape. :)
rather than signing up for the next class i am traveling a little again
i found a podcast with awesome title: “it’s
(some information elided because driving) Always Halloween”. isn’t it. the Podcaster’s email and the hotline phone number are from the same state my mother is from. it’s a state I like. I’m so, I’m all “it’s always Halloween” each episode. I mean I’ve only listened to one episode. they say if there’s more you want to know about it or you have any
hmm device issues speech to text
rather than signing up for the next class i am traveling a little again
i found a podcast with awesome title: “it’s
(some information elided because driving) Always Halloween”. isn’t it [neat]: the Podcaster’s email and the hotline phone number are from the same state my mother is from. it’s a state I like. so, it’s always Halloween, each episode. I mean, I’ve only listened to one episode. they say if there’s more you want to know about it or you have any
hmm device issues speech to text
anyway, so as I was saying, the episodes say you can call in, but it started four years ago, so I thought I would post email spam at them instead. now what’s interesting about this podcast is it’s like my traffick boss spam: she’s talking like you can take the [unrecoverable dictation] and just go on, which of course is going to be ridiculous, and [unrecoverable dictation]. There’s nothing to eat. Oh my goodness the dictation is so wrong. [I said] dictation transcription, not pediatrician Transcription.
OK I will try talking slowly and clearly, and learn what the transcription program hears well. it’s the iPad keyboard record button.
this Podcaster says she titled it “it’s always Halloween” because she thinks about Halloween every day of the year. Long story short, to repeat: of course I sympathized with that. And I certainly can feel a desire to put on a similar podcast myself with the exact same topic, despite it already existing right in front of me, but it landed differently for this person. If I tried to do the organized research that it certainly sounds as if this Podcaster
Learning the iOS dictation commands: “delete delete delete delete”, “insert [x] before [y]”. the commands I found on the Apple support page do not seem to work here. however, it is hearing my voice better with the change to earbuds. end line. return. Line break
Practice dictation paragraph again. This is some text for me to practice dictation. I am driving a car. I am driving a car.
so what I was saying is that if I were to do or to try to do the organized research this Podcaster is clearly enjoying [stray voice commands: back left select line 3 words back backspace undo undo]
So, what I was saying is that if I were to do, or try to do, the organized research this Podcaster is clearly enjoying doing, I would begin experiencing my brand of Halloween directly, rather than continue to engage in description around it.
My own familiarity with dictation makes it difficult here to complete and include all of a train of thought. However, let me think. My own familiarity with dictation makes it difficult here to complete and include all of a train of thought. However, let me think. here I said… That is the word ELLIPSIS, and the device terminated dictation and duplicated the content.
So I don’t quite remember the name of the first episode of “it’s always Halloween”, but one of the things the Podcaster mentioned was that, as we all know, Halloween is associated with a harvest festival that preceded the winter season. The Podcaster took some time to describe winter in a poetic way that I don’t immediately recall, but one
stopped to sleep. typing. long story short, winter bears an analogy to sleep as well as to death — people make year analogies to human phases and cycles. so it makes sense that, as winter approaches, people would consider the idea of spirits and dreams coming out. there is a “medicine wheel” pattern across cultures where things are considered this kind of way.
but one of the things they mentioned was how winter can be stark and difficult, and one recalls in situations of need (if you’ve ever played “third world farmer” online) it can be difficult to survive through winter due to the severe lack of resources and of safety compared to summer and spring, especially if you missed an important harvest season, or it didn’t yield what you needed, or the population has grown too large. so sadly my first morbid thought (i’m always wondering where our appearance of severe cultural trauma came from) was that maybe some groups would kill people before winter, so that the rest would survive the scant resources. maybe because of this danger, people found a way to celebrate and prepare for it, like how apes (and humans) will beat their chests rather than actually physically fighting. and, with us being humans, likely there would be people who, in contrast to those who would kill their potential competition, would instead surreptitiously protect people, or attack or threaten dangerous people, masking as if they were the murderers, but instead being a more advanced form of secret group.
just nature-inspired ideas, and just one branch of many possible things.
humans are very old, and very adapted to times before civilization; most things have happened by now, in my opinion. but it’s hard to think of what they all would be.
ideas are in reverse order of expression; others were dropped, if present, due to difficulty noting them. thank you for “it’s always halloween”, and sorry i overresponded. sleep well
what’s the square root of 9 (when it’s not 3 or -3)?
what is status of food. are we hungry or not. we didn’t eat supper i think. kind of we have appetite but. but we um um don’t want to reward this behavior/thinking (confusedness) when trying to drive and be less confused.
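re the square-root-of-9 riddle above: one playful answer, as a hedged sketch. in modular arithmetic a number can have extra square roots; the modulus 16 below is my own illustrative choice, not something from the thread.
```
# which x in 0..15 square to 9 mod 16? (note -3 is 13 mod 16)
roots = [x for x in range(16) if (x * x) % 16 == 9]
print(roots)  # -> [3, 5, 11, 13]: besides 3 and -3, there are 5 and 11
```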
we’re trying to go on a long drive but are somewhat delicate for it.
right now the consciousne--
some of us are hungry. consciousness could use a change of environment/senses to remember more thinkingness.
and we need to pee badly! we should go into store. might buy a vegetable.
vegetable is more ok because not filling. it basically makes it easier to go in, for peeing (indoors it’s too confusing for active mind managing neurons though, which is ok)
let’s try logically. behavior/thoughts aren’t quite working right now. so,
- we should probably work on changing that so i do things
- if changing environment is helpful then i should. if store is available then i should. sometimes when i’m too far gone i need to understand that i just need to do things because i’m too confused and need to reduce my confusion.
- i could also go for a walk if willing
- i could also nap
we came up with
- it sounds like i might partly be scared of a trigger to cause harm to myself, if that’s real maybe i could be careful not to stimulate it further.
- because we’re too confused it’s possible something was engaged too fast or much :s (maybe too many meaningful things on queue in story, maybe me managing “cleanup” situation at capacity :s
let’s get up and go to store
prototype square root calculator
vivisectee: "my prototype square root calculator is ready to try out!"
vivisectee: "calculator, calculate the square root of 4!"
square root calculator: "the square root of 4 is: potato"
vivisectee (excited): "oh boy! it works! it works!"
--
(at this time, the ipad which had stopped functioning and i turned off, began booting up again, and emitted noise on my right earbud for a period, of similar nature to the rain currently falling. then it stopped, then it emitted noise on the left earbud of the same nature. then it stopped. apple logo is showing.)
--
vivisectee wheels the square root calculator out to show others
vivisectee (proud, excited, sharing powerful tool with others): "everyone! look! i made a square root calculator!"
others (interested, impressed): "ooh! ooh!"
other: "square root calculator, calculate me the square root of 9!"
square root calculator: "the square root of 9 is: peanut"
others: "oooh! oooh!"
other: "ooh!"
other 2: "wait, isn't the sq--"
vivisectee (beaming): "it took me a long time to make this, and it finally functions!"
everyone is so impressed, except possibly for other 2 who isn't noticed yet
other: "can you make another one?"
other 3: "are you going to move on to another phase?"
other 4: "i want a square root calculator! i want a square root calculator!"
square root calculator: "the square root of 4 is: potato"
--
vivisectee (still beaming): "i'm so glad you like my square root calculator!"
vivisectee: "after another decade or three, i expect to improve it to output numbers instead of vegetables."
others (groaning): "whaaaaat"
other 4: "I like it outputting vegetables!"
other 1: "Me too!"
vivisectee (confused): "but ..."
...
traffick boss, bundle of organs or experimentee, and a surgery/research robot, are broken into pieces and intermixed, and crying. when one tries to act, another breaks it more, and a third tries to take parts of it and turn it into itself