0724 lost a bunch of these when my fingers accidentally clicked 'discard draft'.

    with Data.lock:
        data.extend_needs_lk(self.path, out_chunks)
    self.chunker = None

i'm in the middle of handling an inhibition.

0750 logged on_created for the new unincluded files

0735 struggling some to think about this code, not presently consistent:

    def on_modified(self, event):
        if event.is_directory:
            return
        if self.cur_filename == event.src_path:
            self.cur_filename = event.dest_path
            event.src_path = event.dest_path
        self.continue_file(None)

    def on_closed(self, event):
        if event.is_directory:
            return
        if self.cur_file is not None and self.cur_filename == event.src_path:
            self.continue_file(event)
        if len(self.queued_files):
            self.close_file()
            self.process_queue()

i'm trying to cobble things together that process both new files and appended files. so there's a queue of files to process when the first one stops appending. i think it'll be easier to implement if i note when the appending file is closed, as a way to move on to processing new files if they happen. ohhh maybe i can process the queue if it's closed and there is a queue ... i think i'm already doing that. i'm concerned around the instance where it is closed, no files are pending, and then a new file is created. in that instance, i want to [make sure new files are processed if this one has no new data].

0737 one thing i thought about was making some change when it is closed with nothing pending, indicating that it is reasonable to engage other files. maybe what i could do is try to continue it; and if there is no new data, then close it? uhhh sounds like i might be aware of an issue with that, uncertain. i'll just stick with the flag.

0740 this seems too complex. i know it has lots of bugs.
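the queue-plus-flag idea above could be sketched roughly like this. this is a hypothetical simplification, not the actual handler: the `idle` flag is my invented name for the "reasonable to engage other files" marker, `queued_files` comes from the snippets above, and a real version would hang these methods off watchdog's FileSystemEventHandler.

```python
from collections import deque

class AppendingFileHandler:
    """Sketch: follow one appending file, queue new files seen
    meanwhile, and move on when the current file is closed."""

    def __init__(self):
        self.cur_filename = None     # file currently being followed
        self.queued_files = deque()  # new files seen while busy
        self.idle = False            # set when closed with nothing pending

    def on_created(self, path):
        if self.cur_filename is None or self.idle:
            # nothing is being followed (or the last file finished
            # with no new data): engage the new file immediately
            self.idle = False
            self.cur_filename = path
        else:
            self.queued_files.append(path)

    def on_closed(self, path):
        if path != self.cur_filename:
            return
        if self.queued_files:
            # current file is done and others are waiting
            self.cur_filename = self.queued_files.popleft()
        else:
            # closed with nothing pending: flag that new files
            # should be processed as soon as they appear
            self.idle = True
```

the flag answers the concern at 0735/0737: a file created after an idle close is picked up at once instead of waiting behind a file that will never yield more data.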
i want to move it into another python file and work on it just until it works.

0742 ok maybe i can just bump into the bugs and fix them now. presently the code has an incomplete string at the end.

    def on_moved(self, event):
        if event.is_directory:
            return
        if self.cur_file is not None and self.cur_filename == event.src_path:
            offset = self.cur_file.tell()
            self.cur_file = open('

0748 i'm testing it a little. it doesn't move to new file.

0752 somehow new changes to this draft got misplaced. anyway, poking at it.

0754 it successfully moved from one file to the next. it seems to have an issue with appending to existing files.

0759 ok it still has bugs but i ran into a deeper issue: when i paste the data into zstdcat, it keeps processing after the end of the stream. want to check that the end of stream is detectable. it seems like it's at least possible. the data is chunked into frames, it's possible to decompress a frame at a time, and an error would be thrown if it's not a zstd frame. boop. 0803

0808 ok i reproduced some input to output using the path watcher. i'm thinking of trying to slap it together and run with real data. there will be many bugs. 0900

0905 i had rebooted my system and i'm having trouble logging in to my raspberry pi again

    $ ssh user@raspi22
    user@raspi22's password:
    Permission denied, please try again.
    user@raspi22's password:
    Received disconnect from [gently censored] port 22:2: Too many authentication failures
    Authentication failed.

it's also behaving a little unexpectedly for me. i wouldn't expect it to disconnect after 2 attempts rather than 3. but maybe things have changed in a system update.

0906 i pasted my password in from typing it and looking at it to verify it was right, and still getting authentication failed, strangely.

0907 ohhhh i have the username wrong ! :D

0921 the system ran out of space. i set a package upgrade going in the background since it mentioned one was needed. i am now addressing the issue. waiting for git-annex repo.
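the end-of-stream check at 0759 can actually be done without decompressing anything: per the zstd format (RFC 8878), every frame begins with the magic number 0xFD2FB528, and skippable frames with a magic in 0x184D2A50..0x184D2A5F, so "is there another frame here?" is answerable from the next four bytes. a minimal sketch of just that check (a real frame-at-a-time reader would go through something like the python-zstandard package):

```python
import struct

ZSTD_MAGIC = 0xFD2FB528     # standard zstd frame magic (RFC 8878)
SKIP_MAGIC_LO = 0x184D2A50  # skippable-frame magic range, low end
SKIP_MAGIC_HI = 0x184D2A5F  # skippable-frame magic range, high end

def at_zstd_frame(buf: bytes, offset: int = 0) -> bool:
    """Return True if buf[offset:] begins with a zstd (or skippable)
    frame header, i.e. the stream has not ended at this offset."""
    if len(buf) - offset < 4:
        return False
    magic, = struct.unpack_from('<I', buf, offset)
    return magic == ZSTD_MAGIC or SKIP_MAGIC_LO <= magic <= SKIP_MAGIC_HI
```

so after decompressing each frame, checking the bytes at the new offset tells you whether you've hit the end of the zstd data or trailing garbage, instead of relying on a decompression error.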
    git-annex: git status will show garden-of-the-misunderstood/2022-02-21T18:10:46-05:00.cast.zst
    to be modified, since content availability has changed and git-annex was unable to update the
    index. This is only a cosmetic problem affecting git status; git add, git commit, etc won't be
    affected. To fix the git status display, you can run:
    git update-index -q --refresh garden-of-the-misunderstood/2022-02-21T18:10:46-05:00.cast.zst

0924

    fatal: Unable to create '/home/ubuntu/src/intellect/.git/index.lock': File exists.
    $ fuser /home/ubuntu/src/intellect/.git/index.lock
    $ ps -Af f | grep git
    ubuntu    181674  28764  0 09:25 pts/1    S+     0:00  \_ grep --color=auto git
    $ rm /home/ubuntu/src/intellect/.git/index.lock

0926 0932 back in the code. better charge ahead.

0937 i am so tired of inhibitions. i do like copying the code back and forth.

    (Pdb) p self.channels
    {'/home/ubuntu/src/log', 'capture'}
    (Pdb) p indices[-1][0][-1][1]

self.channels only shows two channels when there are many. what's strange is the output doesn't show any channel but capture.

    for metadata, channel_name, header, stream, length in stream.iterate():
        sys.stderr.write('channel data: ' + channel_name + ': ' + str(length) + '\n')

0940 what i'm looking at seems to mean that it should not have functioned at all. strangely, it was functioning.

    (Pdb) p index.items()
    dict_items([('ditem', ['dLjCqIi9h8RIBch1UXNtKWLTkmRtxYUp9iC3Lm_fRoI']), ('min_block', [996652, 'pG4gRSc03l2js77IfpfUvkTx2zRQFE5capCxY7rSjZ5UWT-5NqeV6U0bvlu_uxW0']), ('api_block', 997053)])
    (Pdb) p channel_data
    997052

channel_data is supposed to be a dict, but maybe we have passed the sequence of code where it is, in the debugger. 0944

0945 AssertionError yielding capture @ 565248. it looks like, around 565248, a node might contain more children than its listed length. the first root child is 499712 long, so it may be all good. the second one is 94208, so it goes to 593920. drill down.
0947 565248 - 499712 = 65536. down a couple nodes, i bump into an offset of 45056. 65536 - 45056 = 20480. ok, the leaf at 20480 actually starts at 8192 and is 32768 long. not what i expect. it doesn't look like the data i saw in the debugger, to me. i'll break into that offset and look around. i had been looking later.

0952 ok the debugger stopped me at a different index. guess i'd better compare. here's what i looked at by hand:

    ClgwxWi1-IXPBxcPHfV82L3ahdrfk5x4H1yc2XDkf8M
    lZ9z6x0_XFj9xASzqmCE8Dkm8F3p55t0CaNjzw2gQ3Y
    qQHhI9d77zlYgz0e8s9mJ7HsHZmBPpgo-McsPZYC_yQ
    k3PO5uvcyOWgPqwo3q-Xc8HbdmSIYYUcRQw_36Z5Ol0
    by5Y5Pm1EMginmZS9mYiTyslnJEwZHXN-IBOGrG8vlE
    z8NwryGTTWIjYMmfH0JsujjFVUO_o30E9zKD_AAmfGc

then, in the debugger ... i don't see any of those ditems. maybe i'm looking at a different stream. no. same stream.

0955 ClgwxWi1-IXPBxcPHfV82L3ahdrfk5x4H1yc2XDkf8M <- this is the tail object. the debugger won't show it, it loads it when the stream is loaded. i see its content as the first index in the debugger. lZ9z6x0_XFj9xASzqmCE8Dkm8F3p55t0CaNjzw2gQ3Y <- this is visible in that content. i was misreading the first character as an I or 1 when it is a lowercase L. there are only 2 indices in the debugger and those are they. they just looked long because their contents are.

0958 so if it's only 2 indices deep into the tree, how is it at this later offset? lZ9z6x0_XFj9xASzqmCE8Dkm8F3p55t0CaNjzw2gQ3Y starts at offset 499712. in the debugger, it looks like it's processing the embedded data at the end of lZ9z6x0_XFj9xASzqmCE8Dkm8F3p55t0CaNjzw2gQ3Y, at that offset. the preceding data has lengths of 561152 + 4096 total, so it ran into a length error earlier than this offset, and did not fail an assertion ... i think ! 1002
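the hand arithmetic above (565248 - 499712 = 65536, then subtracting each preceding child's length at the next level down) is the usual locate-child-by-offset descent. a sketch of it, under an assumed node shape of (length, child) pairs (the real index nodes here are something else; the numbers are the root children from the log):

```python
def descend(node, offset):
    """Walk a nested index tree to the leaf containing `offset`.
    Assumed node shape: a list of (length, child) pairs, where child
    is either another such list or a leaf object. Returns the leaf
    and the offset remaining within it."""
    while isinstance(node, list):
        for length, child in node:
            if offset < length:
                node = child  # descend into the child covering offset
                break
            offset -= length  # skip past this child, as done by hand above
        else:
            # offset lies beyond the sum of the listed child lengths:
            # the length-mismatch situation suspected at 0945
            raise AssertionError('offset %d past end of node' % offset)
    return node, offset

# root children from the log: 499712 bytes, then 94208 bytes
root = [(499712, 'first-child'), (94208, 'second-child')]
leaf, rel = descend(root, 565248)  # -> ('second-child', 65536)
```

a node whose children sum to less than its listed length would make this loop fall off the end and raise, which matches the guess that the error fired earlier than the offset being inspected.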