1030
I have learned that two processes can concurrently write to the same named pipe. That would probably be good for getting the latest, most complete digest of the data.
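A throwaway sketch of that, assuming Linux/Unix (mkfifo and fork are Unix-only; the writer tags and counts are invented): two processes write to one named pipe, and short flushed lines arrive unmangled because writes under PIPE_BUF are atomic.

```python
import multiprocessing
import os
import tempfile

def writer(path, tag):
    # Each line is far below PIPE_BUF (POSIX guarantees at least 512
    # bytes), so every flushed write lands on the pipe intact.
    with open(path, "w") as f:
        for i in range(50):
            f.write(f"{tag} {i}\n")
            f.flush()

def demo():
    fifo = os.path.join(tempfile.mkdtemp(), "demo.fifo")
    os.mkfifo(fifo)  # named pipes (and the fork context) are Unix-only
    ctx = multiprocessing.get_context("fork")
    procs = [ctx.Process(target=writer, args=(fifo, t)) for t in ("a", "b")]
    for p in procs:
        p.start()
    lines = []
    with open(fifo) as f:
        # Hold a dummy write end so the reader never sees EOF before
        # the slower writer has opened the pipe.
        keep = os.open(fifo, os.O_WRONLY)
        for _ in range(100):
            lines.append(f.readline().rstrip("\n"))
        os.close(keep)
    for p in procs:
        p.join()
    return lines

lines = demo()
```

Every line comes out whole, so sorting the collected lines recovers exactly what both writers sent.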
I am thinking of having breakfast and may need to write a story part or something on return.
Kind of fearfully.
1031
1100
I'm in the kitchen making lunchish. Thinking about multiple processes writing to a named pipe a little.
Could set an import process going. It could pick sections that don't exist yet and keep importing them.
The hash process could output hashes on the named pipe. Events of note:
- chunk is completely written to disk and/or first hash of it is available
- chunk is fully hashed
- chunk is imported to lotus and has a lotus-associated hash
- chunk is stored remotely and has a deal id
- user wants a hash to verify state with later
Each of the chunk events adds new important state. If the user were making a stream of hashes, they might want an item for every event.
Alternatively, the user may want just a single item right now.
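For illustration only, each of those chunk events could become one short labeled line on the pipe; the event names and line format here are my invention, not settled:

```python
import hashlib

# Hypothetical event names, one per state transition noted above.
EVENTS = ("written", "hashed", "imported", "stored", "user_mark")

def event_line(event, chunk_id, value):
    # One line per event: "<event> <chunk-id> <hash-or-deal-id>".
    # Short lines stay under PIPE_BUF, so pipe writes remain atomic.
    if event not in EVENTS:
        raise ValueError(f"unknown event: {event}")
    return f"{event} {chunk_id} {value}\n"

# e.g. a chunk's first content hash becomes a "written" event
line = event_line("written", "chunk-0000",
                  hashlib.sha256(b"chunk bytes").hexdigest())
```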
A process could be collecting hashes from the named pipe into a file. To get a state hash, the process could be terminated and its output document hashed.
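A minimal sketch of that collector (file name and read size are my choices): append each line arriving from the pipe to a log file, and hash the log whenever a state hash is wanted.

```python
import hashlib
import os
import tempfile

def collect(lines, log_path):
    # Append each event line arriving from the pipe to an ongoing log.
    # Flushing per line keeps the file hashable at any moment, e.g.
    # right after this process is terminated.
    with open(log_path, "a") as log:
        for line in lines:
            log.write(line.rstrip("\n") + "\n")
            log.flush()

def state_hash(log_path):
    # One hash over the collected log stands in for the whole state.
    h = hashlib.sha256()
    with open(log_path, "rb") as f:
        for block in iter(lambda: f.read(65536), b""):
            h.update(block)
    return h.hexdigest()

# Demo with made-up hash lines:
log = os.path.join(tempfile.mkdtemp(), "hashes.log")
collect(["h1", "h2"], log)
digest = state_hash(log)
```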
1105
Okay that's not too complex, but it's hard to consider doing. Look at current work.
Not looking atm but basically the top portion of the script generated content hashes, and the bottom portion added them to ongoing document hashes. That bottom portion would kind of move to a separate process reading from a named pipe.
Now, if it finishes on its own, I'd want it to store a hash on its own somewhere. So the events are valuable. But it's irritating to make too many files. Maybe they can be tar'd up or something and untar'd for verification, dunno.
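And a sketch of the tar idea, with invented file names: bundle the per-event hash files into one archive, then extract them again when it is time to verify.

```python
import os
import tarfile
import tempfile

def bundle(paths, tar_path):
    # Pack the per-event hash files into a single tar.
    with tarfile.open(tar_path, "w") as tar:
        for p in paths:
            tar.add(p, arcname=os.path.basename(p))

def unbundle(tar_path, out_dir):
    # Unpack again for verification.
    with tarfile.open(tar_path) as tar:
        tar.extractall(out_dir)

# Demo with two invented hash files:
_dir = tempfile.mkdtemp()
paths = []
for name, text in (("a.hash", "h1\n"), ("b.hash", "h2\n")):
    p = os.path.join(_dir, name)
    with open(p, "w") as f:
        f.write(text)
    paths.append(p)
tar_path = os.path.join(_dir, "events.tar")
bundle(paths, tar_path)
out = os.path.join(_dir, "out")
os.mkdir(out)
unbundle(tar_path, out)
```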
1108
1120
Back at system. Continuing seems difficult.