[OT] [mess] Blockchaining Cypherpunks (was Re: <robotic attempts to forcefully influence things>)
TLDR: I didn't get very far today, but it was lots of fun to spend the whole day trying. Not sure how well I can repeat it. It's 2020-12-05T08:26:11-0500, and I'm looking for a recording of Bach's Well-Tempered Clavier, semi-oppressive music I've listened to a lot, that helps my brain do more things. It's semi-oppressive because its feelings share history mostly from old white male europeans, who are already sharing most of the history. I had trouble doing anything but editing my expression above. It's 08:28. I'm dropping the goal of finding the music because I have too many things on my plate now, struggling to resist my experiences. Let's trim this quote below: On 12/4/20, Karl <gmkarl@gmail.com> wrote:
On Thu, Dec 3, 2020, 7:04 PM professor rat <pro2rat@yahoo.com.au> wrote:
Coderman was just being kind and gentle with Karl, who is feeling fragile and painful. He is _not_ well...."
<snip> This is actually someone else PR was quoting, but I know he is also feeling it somewhere in his heart, too.
The technical term for that is ' collaboration ' .
<snip>
I tried out storing a month of the cpunks list on a blockchain, which would cost about a dollar to store in plaintext if done more efficiently than I did. This system is more like $5 a month. Here are the links:
2020-September author https://bico.media/fedd0a8368dd68ae495279094435391f0e13291866af7a8a26aa18202... date https://bico.media/bd7fb31a5d7e685fcba3892fd28a7e4f7cc35c57576e7a7812a68746e... subject https://bico.media/4fe2cc266634e04401d27e366529b83c1f61cecf7767ab53f4b426dcc... thread https://bico.media/a41c50edfa8fc0c46d0f46ae82ac8c65e9f925f5c5a731006cb421318...
https://github.com/xloem/cpunks.org-bsv-partial.git
Below is how I did it. I wrote everything I did into this ridiculously huge email, and somehow got somewhere. <snip>
Arright! It's 08:30 . I apologise for spamming your list, but it seems to be letting me store a lot of things on a blockchain, and having so much amnesia I really, really love to do that. I'd also like to take some time to express the value of the other conversations going on in this list. I would love to read them and express myself in ways that respect them, but I am being selfish, and working with blockchains instead. I know they can help my memory issues. It's 08:31 and I'm daydreaming about frustration around repeatedly destroying all my own records. Let's make a TODO list:
- music? [please-decide: from phone or from computer] [i-pick phone] [okay it's on the queue]
- add attachments to mailing list uploading
- upgrade bsvup so [here i got inhibited. I have food and I am eating it. I took a shower and did some laundry this morning, which is incredibly rare for me. It's 08:33 .]
- music from phone?
- upgrade bsvup so transaction confirmations are verified. note the current version so as to return if the new version is broken.
- add attachments to mailing list uploading.
Note: over the course of the night, my tmux session terminated on its own. I had some things open that are semi-lost now. I work in 1 tmux session. [having some difficulties controlling text editing. it's 08:35] Forgot goal. Upgrade bsvd -- no. Find current bsvup version. Open a notes pane and place it in. /usr/lib/node_modules/bsvup/package.json will have this information. Yesterday I was using bsvup version 1.6.8 . The npm integrity hash is sha512-3ov9XVuNmZ34EWsxHnU/zQFpqGvn5U7pDCFoxpk2k7aRSIOyVOmplRWGAL+/celTRdkfL82GZZWjbzuk7pUU+g== . The tarball shasum is 5d9017204005a692cce247c7cc33ec834ff98ad2 and its url is https://registry.npmjs.org/bsvup/-/bsvup-1.6.8.tgz . Okay! Let's upgrade bsvup. First, I'm typing this on my computer now somehow, instead of on my phone. I use gmail's basic HTML view, so that my system freezes and misbehaves at a rate that increases more slowly. But I'll want automatic draft saving, so I'll switch to standard view. [blips out of existence while saving draft] [blips back]. It's 08:38 . Goal was .. upgrade bsvup! I'm used to using bsvup from its git repo. I'll use the latest release from there, maybe.
sudo npm -g uninstall bsvup
cd ~/src/bsvup
git pull
git pull is just hanging, not sure why. curl -v google.com also hanging. Implies no nameserver responses. The wifi where I'm at is high strength, and I just loaded the gmail javascript page fine. google.com now loads; git says 'empty response from server'. It's hosted at github.com .
git pull
Already up to date.
Great!
git status
Dang I have a lot of unsaved changes. I wonder what the latest version is, and what they are.
git branch
This is the main branch, which is kindly being renamed from a slavery word. I guess I'll just use npm's latest version for now. I often need to fix bugs though.
sudo npm install -g bsvup
I wanted to use my changes, which look harmless, but that seemed much harder to continue to do. It's 08:43. Ohhhh it reinstalled version 1.6.8 . This is so strange! I added transaction confirmations to bsvup so very long ago. Maybe they're not on npm?
git blame package.json
The 1.6.8 line is marked March 29, 2020. It should have this, I think? I don't remember? It's 08:44 Inhibited. Looking for music on my phone. It's 08:45 . I plugged my headphones into my phone, but I'm not sure how to play music on it. There must be a music app. Back to bsvup. Goal: transaction confirmation.
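For the record, here's roughly how the installed version can be read from a node shell (a sketch; the path is the one above, and npm's global prefix may differ on other systems):

    // sketch: print the globally installed bsvup version
    const pkg = require('/usr/lib/node_modules/bsvup/package.json')
    console.log(pkg.name + ' ' + pkg.version)   // e.g. "bsvup 1.6.8"
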
Check git history from master branch to see what date it was merged or whatnot.
grep -r onfirm . | grep -v node_modules
No helpful results. Maybe I can find the branch where I dev'd it.
git branch
Don't see anything helpful. K. Goal: transaction confirmation changes. It's 08:47 . I remember .... that I added this ... to a loop that waits. It relates to .... unconfirmed transactions. I changed the unconfirmed transaction mechanism to use a subfolder instead of a json file, so transaction files could be moved between subfolders etc. I'm not certain that this change was in my contribution. These subfolders are managed in cache.js .
less cache.js
I recognise the loadTXList here as something I likely added for this. It's a function; my brain dropped that word. The saveUnbroadcast function here looks like it includes the changes I remember. This is not how 1.6.8 was behaving yesterday. It was saving unbroadcasted transactions in a single .json file. I review the function again, to verify it doesn't do that. It's 08:50 . Yes, it saves them individually in an 'unbroadcasted' folder. I copied the morphemes used by the Chinese project owner.
git blame cache.js # it's 08:50, still
My changes are dated 05-10 . So there simply hasn't been a release with them, yet. If I added a package.json file to the uploader repo, I could have it reference the git branch, and use that (see the sketch after this paragraph). That'll be the goal for now I suppose, so we get transaction confirmations. Inhibited. 08:51 Inner dialogue. Inner emotional landscape. Believing you are mind controlled can be rough. It's 08:52 . The spirit of open source is to contribute. Rather than forking bsvup and making a new release, we hope that the project owner can find the harmony to make one, or to at least ask for a new maintainer. They invited me to co-maintain the project but I think I was too slow following up on it or understanding it, and I think I am not a maintainer anymore. At least, I don't have their npm account to cut an npm release at this time. Right now, let's set up the git r -- let's plan to set up the git repository to reference a git commit id for bsvup. First let's test one of them. I wonder if npm can install a commit id?
sudo npm install -g https://github.com/monkeylord/bsvup.git\#e55c90afab674c3eb14f9ecd2a50632fcd3b9472
Hey, that seems to be working great! I have it written above if it works and I want to add it to the readme. It's 08:55. This person would probably make a release if I asked them. k. Let's test the new bsvup. Ummmmm if I reupload the same folder from yesterday, it should say that no files changed, I hope? While it checks all those files I'm going to continue reviewing my bsvup changes, to see if I can get the repository in order for handling new bugs I encounter.
- I've changed the transfer function, so that the fee rate can be set. It also reports how many utxos are bound together; that's probably temporary for debugging. This looks like a mistake in the fee implementation, missed when it was improved. Should be turned into a small PR. This is in api.js and cli.js .
- I've added an option -e, --rename to provide for uploading single files with different names than they have locally. Calls logic.prepareUpload in cli.js . The rename changes in logic.js aren't easy for me to verify as being correct.
I think I'll separate the two changes and push them as branches. I'll try using the transfer fee one as a norm and submit it once I learn it works. It's 0901. Inhibited again. More inner landscapes. Eating food as a distraction.
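Here's roughly what that package.json reference would look like, since npm accepts a github spec pinned to a commit (a sketch; the commit id is the one I installed above, and the file would live in the uploader repo):

    {
      "dependencies": {
        "bsvup": "github:monkeylord/bsvup#e55c90afab674c3eb14f9ecd2a50632fcd3b9472"
      }
    }
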
Maybe I can find a music app installed on my phone somewhere. Haven't found yet. Back to workplay. 0902. Whoops! Suddenly very thirsty. Water's right here. Okay. 0903. NOW. GOAL. What is goal. Don't remember. Relates bsvup. Repo is messy. Split changes out while test runs.
git stash # to save them in case i make some error
git stash apply # to bring them back
git checkout -b # uhhhh ....... looking for what-i-am-doing to make branch name.
What am I doing? Working with 2 changes. Want to start with smaller one. Is described above. First item above. Need name for. It's 0905. Sets fee rate for transfer function. transfer-fee will be branch name.
git checkout -b transfer-fee
git diff # to see changes
I keep git diff open in one pane, and remove the unrelated changes in another tmux pane. Tmux was hard to learn (i was used to screen), but it can do vertical splits much more readily. It's also much higher level and seems buggier and sloppier to me. I would prefer to use the vsplit patch to screen, in retrospect. Hey, the test run finished. It is trying to upload a ton of existing files =/ so fixing the existence-check function would really help here. Reconsidering approach. I think to contribute the most, I'm going to skip this task for now and switch to adding attachments. I can upload special subfolders to work around the issue of existing files. I can also add the tx files to git-annex so that others can reuse them. First I will check the transactions for unconfirmed ones. I _think_ I can do that, with this new bsvup version, by moving them from .bsv/tx to .bsv/unconfirmed. Checking the source to see if that's right. No, it's .bsv/unbroadcasted . I'll move them there and see what it thinks.
mkdir .bsv/unbroadcasted
mv .bsv/tx/* .bsv/unbroadcasted -v
bsvup
Now bsvup prompts to resume the unbroadcasted transactions. Let's see how it goes. They are all failing with missing inputs. This is probably a new server api quirk for transactions that are already onchain. The inputs are not in some pool of unspent transactions. Blargh. My confirmations changes aren't working any more. Okay, I'm just going to push my present changes to a branch that has them merged together, to store them somewhere.
git checkout -b transfer-fee-and- # blargh ... what is the other change? let's scroll up. renaming. it's 0911.
git checkout -b transfer-fee-and-rename
git add *.js
git commit -m 'changes found in my dir'
git push xloem
git checkout master
K, nice and clean to figure out this confirmation issue. I'll search for the error message, 'the transaction was rejected by network rules' . Not present, must be generated by the server. Inhibition. It's 09:13. 'MatterCloud API return Errors'. I'll search for that. Happens in api.js , in the broadcastInsight function. This function has a lot of legacy content because the api provider keeps on changing things and available energy is small. Looks like I have unsaved api.js changes. Those should go in the transfer-fee-and-rename branch. I noticed my vision was double and converged my eyes, which is nice.
git checkout transfer-fee-and-rename
vi api.js
(R)ecover
:wq
git diff api.js
No changes.
rm .api.js.swp
git checkout master
vi api.js
Back to workplay. K. Refresh goal. It's 0917. Trying to figure out how to detect confirmed transactions with the api changes. Finding what now .... broadcastInsight. broadcastInsight is called from the broadcast() function. I've done this before so the parts have familiarity. It's still hard to find them. broadcast is called from tryBroadcastAll .
I vaguely recall that a merge here broke some logic once. I don't know if that's been addressed. Here's the issue. It's still using BitDB to check if the transaction exists, and BitDB is that same service the maintainer deprecated. Obviously the right solution is to connect to an actual node. There's even a node API service used by electrum, where you can connect to many api servers that use all the same api protocol. But they don't offer the bitquery service that bitdb and bitbus do, which are open source. It doesn't matter what to do here. I'll try bitbus since bsvup uses bitdb in other places, too. I'll try to change _just_ the exists function, to use the abstraction template I added a while ago, that isn't linked in yet. I'll also actually contact the owner of bitdb, and ask them to reboot it. They're in some 'atlantis' chat .... websearching 'atlantis bitdb' i find that it's a Slack. I bet it's already open if I go to Slack. It's 0921. Slack has heavy javascript so I'll use it on my phone, which is less critical than this laptop atm. I have unread messages from the polyglot maintainer. There's value around using electrumsv-sdk . It's 0923; I have a lot of thoughts bouncing in. Contact person who runs bitdb. Maybe in general chat. Find their name to tag them in a message. Haha there is a recent big long thread from KarlTheProgrammer, another one like me, but they are expressing criticism that doesn't seem contextual, which is weird. Yeah, the chain is full of thousands of spammy transactions. It's because we can barely code anything. Assistance certainly welcome. This person hasn't sent a message for a while. I suddenly remember their name. I ping them in a new message. "is there any chance you could give bitdb a reboot so legacy stuff can get some requests through? I am working on upgrading to bitbus, electrumsv, local nodes, but it is very hard for me; I don't understand the code well. It would be nice to get a few more transactions through." It's 0928. Next task: move the existence-checking function to use the abstraction I already made. I'll want to copy the findTx function from bitdb.js, and mutate it for bitbus. I did some of this mutation already yesterday for bitfiles, which still has similar code. The key bit might be the queryTx function. This generates a bitquery request for a single tx id. bitbus has slightly different bitquery norms than bitdb. I'll look up what they are. It's 09:32 and I'm reading the bitbus query doc at doc.bitbus.network . It looks like the "v" version field is not needed. It mostly links straight to the bitquery documentation. That wasn't helpful. Looks like the bitquery limitations in bitbus are listed farther down in bitbus's docs. It's 0935. It supports "q" find, limit, sort, and project. "r" queries and aggregate queries are not supported. This should be fine for txExist. Yes, queryTx doesn't use "r". This should go smoothly! 0937. Getting really pushy, attempts to use my working memory. It's 09:39. Relaxing, letting the urge wipe everything just a little. It's all nearby on my screen. Working on making bitbus work to check transaction existence in monkeylord's thingie ... bsvup. Ongoing goal: find bitbus usage within -- Inner landscape stuff. It's rough, believing in mind control. Distractions. It's 09:40. Ongoing goal: find bitbus usage within bitfiles, copy method of sending request. Here we are. It's 09:41 and I'm copying the request headers over. I manually set my tabs to 2 spaces.
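For reference, a queryTx-style bitquery for a single txid is roughly this shape (a sketch from memory; the 'tx.h' field name is an assumption to double-check, and it stays within the bitbus limits above, only "find", "limit", and "project"):

    // sketch: single-txid existence query in bitquery form
    const queryTx = txid => ({
      v: 3,
      q: {
        find: { 'tx.h': txid },               // 'tx.h' is the txid field in the TXO schema, as I recall
        limit: 1,
        project: { 'tx.h': 1, 'blk.i': 1 }    // only the id and block height are needed back
      }
    })
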
It's important to copy the style of an existing computer program, when working on it. It uses node-fetch. I want to send a post request, so I'll look up how to do that with node-fetch. Pass { 'method': 'POST', 'body': data }. 09:43. Existing usage calls an asynchronous function to convert to json (await result.json()). Let's see how I can skip over that in node-fetch, since bitbus is ndjson that wouldn't parse. await result.text() . Great. 0946. I've made the bitbus function that mirrors bitdb and am trying to verify that it is what I meant to do. I'm having trouble focusing. Water. More inhibition. Inner landscape. More water. Back to workplay. The function I made should be called .............. txQuery? I think there's a txExists function below it. Let's see. Nope, findTx. Now, add findExist. Uses a queryHash function that I'd better move over. queryHash uses 'r' queries, unfortunately. Maybe I can change the abstraction path to skip this for now. queryHash looks kind of important. bsvup uses a hack that looks maybe like it was made to quickly handle existence checking without reviewing the whole ecosystem of technologies. It names the files based on their checksums, and looks for files with those names to see if they already exist. Is that what we want here? It's vulnerable to people uploading fake transactions. So it should check that the data matches the hash. That's not too hard, but not a priority atm. The 'r' filter shows the fields that are expected to be transmuted. We can move that behavior outside of the query, and do it in javascript. The hash used is sha1. I have enough information now to migrate the whole function, and the way it is done can inform migrating the rest later. It's 09:55 and I'm partway through the migration of the function. Inhibition. Inner landscape, briefly. More inner landscape. Internal issue referenced with frustration. Inner healing ! Brief. Sense of everything changing. Fits of laughter. Distraction. It's 09:56. Inner learning situation. We take some time to shift our attitude, to be more respectful and meaningful. Brief meditation. Distraction. Inhibition. Laughter. It's 09:57. I have a sense of learning what is truly important in the world. I try to hold it to the side while continuing to develop software. It's 09:58. I feel much better but it's hard to act. What am I working on? Migrating findExist. I copied the 'r' query portions to turn them into javascript code. Copying them down into a loop. The 'r' and 'r2' are handled differently; here's this protocol shift again. Need a different loop for each query result. Inhibition. 10:00a. 10:01a. Back to workplay! 10:04 . Struggling to continue. I should limit the returned data using a 'project' field, to manage bandwidth. I see the query is using a '$or' field. This might be 'aggregate', which is what it doesn't support. I'll check. Maybe not. 10:08. I think I'll continue by testing it. I'll need an sha1sum of an uploaded file, and a txid. I guess I might as well put it in the abstraction, and test with that? Either one. oh wow the abstraction is better than my copying. Just test it for now. 1013. More inhibition. Testing, I find there's a blank line at the end that is crashing when turning to json. Probably normal for ndjson. Gotta remove that. 1015. Froze my debug shell with an infinite loop, somehow. When I killed the process my ctrl-Ds went through and killed the terminal too. 1018. bitbus.findTx returns correct data for a present transaction. It crashes for a nonpresent one. 1021.
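To keep it around, the request I ended up with is roughly this shape (a sketch from memory; the bitbus endpoint and token header names are assumptions to check against the docs):

    // sketch: POST a bitquery to bitbus with node-fetch and parse the ndjson reply
    const fetch = require('node-fetch')

    async function bitbusQuery (query, token) {
      const res = await fetch('https://txo.bitbus.network/block', {
        method: 'POST',
        headers: { 'Content-Type': 'application/json', token: token },
        body: JSON.stringify(query)
      })
      const text = await res.text()              // ndjson, so res.json() would choke on it
      return text.split('\n')
        .filter(line => line.trim().length > 0)  // drop the trailing blank line that crashed me
        .map(line => JSON.parse(line))
    }
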
bitbus.findTx passes both adhoc tests. Looking for an sha1sum. It takes data, not an sha1sum. Need data to test. Maybe copy file into working dir and open. 1024 findExist fails to find existing file. I'll check the tx manually, and see if it should succeed. 1025 The hash is being calculated correctly inside the function. 1027 I can see this hash is also correctly present inside the transaction at https://blockchair.com/bitcoin-sv/address/d-c0c871976a2935de5a1ad2a01ad7f638 . So, I'm probably sending a bad request to bitbus. I can debug that with curl, I suppose. 1029. 1031. Still trying to craft curl request. Inner landscape stuff. It is rough believing in mind control and human trafficking. Why, believing in human trafficking can get you very worried. Luckily we have beautiful sunrises. 1036. My curl request is returning the correct data, so the error is likely in the processing of the results, or a typo that is hard to see or something. My inner landscape wants to hire me to work in a secret hacker thing, rather than me working on this. It is rough believing in coercion. Sunrises. 1037. 1038. Distraction. Goal: output result of query, so we can bisect where the issue is. By bisect I mean dividing an array in the middle, when it's sorted. 1039. Found what is likely the issue by inspection. 1041. Fixed another error that is basically the same thing (leaving 2 out when copying a block of code, skipping the cognitive difficulties of abstracting their similarity out). 1042. Fixed another error of exactly the same kind, which I fixed and then put back while fixing the last. 1042. findExist works correctly now. Whew. findExist returns the txid of transactions that could contain passed data, since they have the sha1 tag. So downloading those transactions and checking the content could be done outside the function if needed, I suppose. Maybe I'll rename it to findMightContain in bitbus. 10:44 it is snowing. Snow break. Laundry moved to dryer. Peed. 1048. There's more laundry. It's okay. K. I made bitbus functions that emulate bitdb functions. I guess, for the project, I should make the 1 remaining bitdb function a bitbus function. I mentioned karma to Zenaan earlier. When I rename findExist to findMightContain, that's karmic motion. It lets people learn about and act on the danger held in the error inside findExist. Gives it more avenues than mysterious exploitation and confusion. With cultural change and such, the stuff like findMightContain gets more meta, where people act on pushing influence around because of their shit, instead of just producing the influence directly from the shit. There's a lot of that now it seems. It's 1054 and I have no idea what I am talking about. It's 1059. I found a long-standing bug inside the implementation where D history is misordered. Fixing by making sorting more complex. I am unsure that my change is helpful. Maybe the bug is a feature. It's 11:01. Also, make up your own karmic meaning. Mine is made up. It's 11:02. 11:04. the final function, findD, is working correctly for missing files, multiply-uploaded files, and single files. I'll enable one abstraction and just migrate all the other uses of bitdb straight over. Hmm. This looks like it is making the project a little more complex, when it is already spaghettied, because it is not enough refactoring to reach the point of increased organization. Unfortunate. But it's helpful to introduce abstractions =S Where do I use this in the code? Scrolling up. It's 11:10 . Scrolling up is confusing. Let's see.
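A note to self on the spoofing concern above -- the check I'd eventually want is roughly this (a sketch; filterRealMatches and the getData callback are made-up names, not anything in bsvup):

    // sketch: trust a findMightContain result only if its payload really hashes to the same sha1
    const crypto = require('crypto')

    const sha1hex = buf => crypto.createHash('sha1').update(buf).digest('hex')

    async function filterRealMatches (buf, candidateTxids, getData) {
      const want = sha1hex(buf)
      const confirmed = []
      for (const txid of candidateTxids) {
        const data = await getData(txid)      // caller supplies the actual download step
        if (data && sha1hex(data) === want) confirmed.push(txid)
      }
      return confirmed
    }
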
Trying to resolve findExist not working. Let's see where that's used, and change it. 1127. I ended up moving all the new api funcs into the abstraction. They are now working using a pluggable backend handler. 1131. I'm installing my dev folder to see if it works. I don't usually install things to test them. 1132. Didn't work. Running the script by path like normal. 1133. It's confirming transactions great =) I guess I'll remove the old functions from the files, so they can use the centralised abstraction. 1139 I pushed my bsvup changes to github. 1143 I submitted a PR at https://github.com/monkeylord/bsvup/pull/31 Additionally, the new script verified that all uploaded transactions were accepted into the blockchain. 1144 I'm attempting the reupload again, and this time it is correctly skipping every file because the file is already on-chain. Remind myself of goal: attachment data. 1145 wrote something and erased it around pushing myself harder. 1147 I realized to not "make the karma bit explode into harm", I should modify the output of the script so it reports to the user that the file it finds could have been spoofed in some subtle way. 1152 thinking on how so many problems would be resolved if we would share important information better on a huge scale. similar to the sha1 thing. trying to focus on workplay. next step was .... attachment data! I can look at an email with an attachment to see where that is. 1154 the bsvup reupload finished. it says the four nonmutated indices don't have path records on the chain, but do have their data uploaded. I'm sure I caused that somehow, but I don't know how that would happen unless they were uploaded with the wrong paths. Maybe it is because I removed their txs from the tx folder? Did I do that? I think I removed other txs, not those. It's generally some human error in things like that. I'm going to cancel the upload and focus on attachments. 1156 I haven't downloaded the attachments. I don't know whether they are linked if a spider crawls from the front page, so I'll add mirroring them to the download script. 1201 my wget spider didn't download any attachments and has already found the big mbox file. i can download them manually to test uploading and linking them from the existing script. (i just did download the september 2020 ones manually) 1205 bsvup is processing uploading my attachments subfolder, which is slow because the spider downloaded an index file for every single day. while it does that, i'd like to figure out why the four index files aren't found with the right D (path) records. how to do this? how about i migrate the bitbus changes from bsvup, over to bitfiles. i'm used to its interface. blargh bitfiles has different structure. maybe I can do a manual bitquery to find those files, instead. i can copy the query from that used by bsvup, and maybe there is an online explorer or i can use the nodejs shell. the sha1 is in one specific field; it will be the 'genesis' form of the protocol. 1210. Inner landscapes. It is scary to believe in human trafficking, mind control, and coercion. Very scary. But all we are dealing with is a small debugging task. 1212. I am sitting blankly, trying to focus on doing a task. I could use something to look at to remind me what it is, I think. I don't remember what it is. I am trying to move bsvup-source near wrong-or-missing-D-records of 4 original index files. That is, I'm trying to ... find the part of bsvup, in bitbus.js, where it ... makes a D record query. I already have bitbus.js open! Oh no. 
I moved the query from here. It is in ... backends.js . Here it is. This is the 'genesis' format query (they renamed the bsv version to 'genesis' when they changed the protocol).
{ 'v': 3, 'q': { 'find': { 'in.e.a': address || undefined, 'out.s3': key || undefined, 'out.s2': '19iG3WTYSsbyos3uJ733yK4zEioi1FesNU' }, 'project': { 'out.s3': 1, 'out.s4': 1, 'out.s5': 1, 'out.s6': 1, 'in.e.a': 1, 'blk.i': 1 } } }
It's hard to format concisely and stay literally-transparent. Ohhh, because I need to think about the 'project' parts to figure out how to mutate the 'find' parts. Okay. I guess we'll be setting 'in.e.a', the address, but leaving 'out.s3' unset so it can find the right key, and then I'll want to set the transaction link, the content link, to the right value. I'll check bsvup's generation code to see what that is. That's in uploadDTask in logic.js? No, that's just a precursor. Maybe this is it, in pendTasks. Oh, it's in txutil.js, as buildDOut. txUtil.js, capital U. Looks like the 'value' field holds the transaction id. I'll see how it's encoded, probably just however bsv transactions are encoded. Stored in the filedatum as dValue. It seems highly likely that it's the ascii hex transaction id. Okay, which data index thingy is it. It's the one right after the D identifier and the key (path). It's the 3rd. Arright, scrolling up to look. That would make it s4. Let's craft this.
{ 'v': 3, 'q': { 'find': { 'in.e.a': address, 'out.s2': '19iG3WTYSsbyos3uJ733yK4zEioi1FesNU', 'out.s4': datatxid } } }
Now I just need the datatxid. I can use the findMightExist function I migrated, from the nodejs shell. I need the content of the files in question. 1223. Inner landscapes. I wonder what we need to be able to do this stuff without typing it all into an email to the cypherpunks list. 1224. More distractions. 1224. I'm experiencing fake pain in my right eye. My right eye actually has severe pain from an eye issue I have, but I tend not to feel that pain. I am instead feeling kind of tame, irritating pain. I want to erase 'kind of tame' so I do not get worse pain. 1225. Navigating inner landscapes. Goal repetition? What's the goal? I don't remember and turn away when I look. What is it? Using the findMightExist function to manually find the id of the four original index files on the blockchain!
[user@localhost bsvup]$ node
backends = require('./backends.js')
Okay, that's open. I'll want to move those 4 files into that bsvup folder to open 'em easy. 1227. We get to lubricate our eye! Yay! If I lubricate my eye 5 times a day for a week, it stops hurting. Unfortunately when I build the habit I forget it. But that can change. 1229. Back to workplay. Noticed my attachment upload terminated due to insufficient balance, which is normal. The proposed additional payment of 6 million satoshis is reasonable. Would be nice if it said the total cost. 1232. Inner landscape started growing jokes. These can grow big because they are so much nicer than other stuff. "We have reports of brain damage, what do we do?" "Is the brain damage caused by freaking out about things too much?" "ummm ... That is the rumor, yes." [people start freaking out about how to handle it] "It is an emergency to go back to workplay! People are trying to manipulate us not to, by calling things emergencies!" 1234. After wildly tapping bright things on my phone as the screen jumps around, I make it to my bittrex holdings. 1236. I withdrew around $5 of bsv. Bittrex is working better for me than usual, which is great. Worked better yesterday too, once I got signed in. I'll start the upload now so that it doesn't add too many files from the spider that's still running, and found the attachments subfolders, but is still just downloading indices. 1237. Back to ... uh ... manually checking something to do with node. Oh, I need to copy the files in. 1239! Copied 'em! Luckily, node is one of the interpreters that stores command history, so I don't have to remember how to open and read from a file. 1240. Now I get to paste in that bitquery. Oh, no, first I need the txid, I see after scrolling up. Ummm I find that with findMightExist. 1242. Inner landscapes. We're trying to think about whether or not this is what we want to be doing right now. But our attempts to relate data internally are getting harmed, so we have a strong habit of storing things externally. We don't mean to be spamming this list, and it's really inhibiting of our cognition to have that be the way to do something we prefer. 1243. Back to workplay. 1244. Inner karls. Karl says, we can do something else if we want to. But Karl likes developing behaviors around uploading things to blockchains, because he has suffered so much around loss of records and memories. If you can help us work with fewer interruptions, that would be great. 1244 Inner landscapes. We have lubricated karl's eye, sent him to the bathroom, and he has eaten and drunk. His laundry is in the wash, but could use more transferring. How about we do laundry the next time an interruption grows big. 1245. Back to workplay. 1245. Inner landscapes. There is a way to force him to urgently pee. I'm building it in support of laundry. It makes him get up. 1245. Back to workplay. 1246. I've loaded all the transaction ids for the files into a nodejs object, to find their keys. Each has been uploaded twice already! That means finding the keys of multiple files. I bet javascript has an array mapping function. 1248. Inner landscapes. I wonder what we want to let come out typing. 1248. I am trying to map arrays. What am I trying to do to them? I want to pass them to some kind of backends.js function that finds their D keys. Oh! I need the bitquery object now. Json. Object in javascript. Helpful to put it on one line.
{ 'v': 3, 'q': { 'find': { 'in.e.a': address, 'out.s2': '19iG3WTYSsbyos3uJ733yK4zEioi1FesNU', 'out.s4': datatxid } } }
{ 'v': 3, 'q': { 'find': { 'in.e.a': address, 'out.s2': '19iG3WTYSsbyos3uJ733yK4zEioi1FesNU', 'out.s4': record.txid } } }
Looks like that's not valid javascript. Gotta remember how to make javascript objects. Works with some fiddling. I'm going to try to open a notes pane and migrate some of this there. Made it at 12:53. The files are correctly on the blockchain, it looks to me, and I'm having trouble staying in the notes pane. It's 12:56. The transactions a09af99ae4d381999388845f2a1293ae9fb3828e4a86c8d105f2bdd044f58076 and 59ce4653f4840b7ce3546771877a6c297fc7e4ee7279e3469bcb1d86ac340fe0 have out[0].s3 = lists.cpunks.org/pipermail/cypherpunks/2020-September/thread.html ... computer glitch . Anyway, that's one of the paths that is mysteriously reuploading. 13:00. My body is trying to do things that move me away from the computer. The goal I am holding is to run a test reupload (and to not send it, due to doublespends with the existing upload in progress), and debug why this file is not found to exist. 13:02 . I'm getting ENOENT trying to upload the test file. It exists. I must have made a typo I am missing. 13:03 . Still getting ENOENT even though autocompletion provided the file. Looks like the failed system call is 'stat'. Maybe I can 'stat' it manually. 13:03 . stat works fine on the path, size is 101088 bytes. inode is 219152420. I guess I'll strace bsvup. =) looks like it's a bsvup bug. It's trying to read from the basename of the file, with the same cwd. What a relief! 13:07. Inner landscapes. I'm guessing the bug has to do with uploading a file instead of a folder, so I'll just upload the folder to handle my urgency. 13:10. Inner landscapes. The harmful processes in the mind landscape have proven to themselves that they can stop this within a reliable timeframe and are taking action. What do we do? [ummmm I'll come back to the cypherpunks email instead of the notes file. Let's just write down things that cause us to stop.] Ok. 13:11 "back to workplay" 13:11 urge to pee, disappears upon writing 13:11 incredible itching in an area of our face where the nerves have been cut due to surgery, and we cannot feel. I am writing a progress note in my notes pane, rather than here. It is 13:12. 13:13 random body posturing [themes of authority] 13:17 . I fixed the bsvup check-for-exists edgecase. After fixing it, the test runs much much faster, and I don't know why, maybe because fewer checks are failing. 13:19 . realized my fix was in error 13:21 . I discovered that inside bsvup, it may actually check the content data. Updating findMightExist names. 13:23 . Inner landscapes. They are using an abstract, smooth shape of urges to cause us to stop. It goes through our choices and decision making processes, and small parts of our working memory. It develops strength, gains a foothold by manipulating the choices to choose to support it, and continues to spread. We'll monologue about it in notes if monologuing is needed. The monologue is summarisable as a distraction. It's 13:24 . 13:25 . Attempts to move towards the keyboard resulting in postured stretching of the arms away from it, as if tired. 13:29 . Lots of anxiety. Random muscles tightly contracting across body. I think I'll get food and water and visit the bathroom. Maybe move laundry over.
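Since I keep losing this, here's roughly the shape of the mapping step I'm after, built from the query above (a sketch in the node shell; 'records' and 'address' are whatever I've already loaded, and which backends.js helper the queries feed into is still to be decided):

    // sketch: build one D-record bitquery per uploaded-file txid
    const queries = records.map(record => ({
      v: 3,
      q: {
        find: {
          'in.e.a': address,                               // uploading address
          'out.s2': '19iG3WTYSsbyos3uJ733yK4zEioi1FesNU',  // D protocol prefix
          'out.s4': record.txid                            // content transaction id
        }
      }
    }))
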
1339. I am working on code to store messages, old messages that are well preserved, on a blockchain. I see blockchains as no more dangerous than money, but I appreciate how they redistribute it. I do see that there are ecological problems, although I suspect the energy use of blockchains compares reasonably to energy used in the processing of conventional money. 1340. The list to which I am drafting this email (an email I consider spam) was writing about blockchains in their very infancy. Blockchains use the kind of cryptography that an independent cryptographer or hobbyist would come up with. Doing things that plan for changes in the future. 1341. I've always been an independent kind of person myself. Whenever I learned about something, I wanted to learn how it was made, how it worked, how we could do it on our own. There are _always_ a _whole bunch_ of things inside the workings of something, that could meet everyone's use of it way better if they were understood. Using tools made by distant researchers, that situation is incredibly strong today. 1343. I'm presently trying to debug some emergency blockchain software. I know it's emergency software because I'm involved in its development, and it has one-of-a-kind aspects that fill gaps in what is available. I need this software. I need this software to preserve my memories and history, and those of countless others, in the face of brainwashing around hiding of what is true. 1345. I have a cursor, right now, at line 184 of api.js in the bsvup project. This place, like every other place, is a crucial place to be. With this cursor, it is possible for me to do work that heals the world. 1347. I am working with a karmic flow that bsvup holds. When a file is uploaded multiple times, this costs energy, money, and time. The software has a bug, where if a file is uploaded multiple times, it may be uploaded _more_ times in the future, magnifying the error so that people may find the cause of the issues they run into, because of it. I have found part of this error. 1348. I have personal experience producing similar errors. 1348. I would like to prevent this error from magnifying. I am happy to add, in fact I would enjoy adding, a notice to the user of the issue. The issue is unrepairable, but it represents a problem in the past, and is part of the beauty of the blockchain. It shows we were able to store parts of our spirits in a way that lasts longer than our lives. This was always true, but now it is visible, in front of our faces. We have mathematics that give it proof, in a way we don't in other areas. 1350. Reviewing error again. 1351. When working on this kind of software, you run into code, and even write code, that handles many many many possible errors. Like building a wall of China, securing the behavior of part of an algorithm means imagining all the things that could go wrong, all the ways that people could come through, and blocking them. This magnifies the pain of the world, because we never learn the stories that bring the harm to us. Without knowing these stories, we will inevitably increase that pain, because we cannot account for it in our decisions. 1354. I am currently working on this small piece of code, or trying to:
var txs = await Promise.all(records.map(record => getTX(record.txid)))
var matchTXs = await Promise.race(txs.map(tx => {
  return new Promise(async (resolve, reject) => {
    var databuf = await getData(tx).catch(err => {
      log(` - TX Data not properly resolved.
Error: ${err}`, logLevel.VERBOSE)
      return Buffer.alloc(0)
    })
    if (databuf.equals(buf)) resolve(tx)
    else reject(new Error('Not Matched'))
  })
})).catch(err => {
  log(` - TX Data not properly resolved.
Error: ${err}`, logLevel.VERBOSE)
  return null
})
if (matchTXs) {
  Cache.saveFileRecord(sha1, records)
  return matchTXs
} else {
  return null
}
1355. The code returns a transaction for an existing file, if one is found. If many are found, it returns only one. It would help me work with the system, to have all the matching transactions, not just one.
var txs = await Promise.all(records.map(record => getTX(record.txid)))
Just had computer glitches. The above line retrieves the data for every transaction found that matches the file.
var matchTXs = await Promise.race(txs.map(tx => {
  return new Promise(async (resolve, reject) => {
    var databuf = await getData(tx).catch(err => {
      log(` - TX Data not properly resolved.
Error: ${err}`, logLevel.VERBOSE)
      return Buffer.alloc(0)
    })
    if (databuf.equals(buf)) resolve(tx)
    else reject(new Error('Not Matched'))
  })
})).catch(err => {
  log(` - TX Data not properly resolved.
Error: ${err}`, logLevel.VERBOSE)
  return null
})
The above segment is confusing to me. It is a hybrid of multiple coding styles. What it does, is it condenses the list of transaction data into a single matching transaction, if any of them match. I would like to mutate it to return all matching transactions, instead of just one. It looks like it could be cleared up with some rewriting, but to rewrite it safely I would need to understand everything it does, to preserve its role where it is. As must be done with anything else in the world one replaces. 1358. Maybe the code can be mutated only a small way, to form an array instead of a single value. How is the line reject(new Error('Not Matched')) handled? Is this line ... 1359. We are considering that monkeylord, the Chinese software developer who wrote this program, could be in need of help. We don't know much about them. We probably are in need of help, ourselves.
else reject(new Error('Not Matched'))
1400. This line produces an error when matching fails. Where does that error go? 1401. It looks like it would go into Promise.race . How does Promise.race handle errors? 1402. mozilla.org says that Promise.race resolves with the first result found, whether it is an error or success. I make a lot of logical errors nowadays. A way forward for me is considering whether this is an error; whether a single record with wrong content will mark all records as mismatching. I expect the error is in my perception, not in the code, but I am likely to assume that it is in the code.
var databuf = await getData(tx).catch(err => {
  log(` - TX Data not properly resolved.
Error: ${err}`, logLevel.VERBOSE)
  return Buffer.alloc(0)
})
This chunk simply gets the data. The extra complexity is to handle when there is no data: it treats missing data as empty data. 1404. I'm thinking about the await func().catch style. This is much more concise than a try{} block. Hum. Probably not as clear though. Let's move on. 1405. We can simplify the segment:
await Promise.race(txs.map(tx => {
  return new Promise(async (resolve, reject) => {
    var databuf = await getDataOrEmpty(tx)
    if (databuf.equals(buf)) resolve(tx)
    else reject(new Error('Not Matched'))
  })
})).catch(err => {
  log(` - TX Data not properly resolved.
Error: ${err}`, logLevel.VERBOSE)
  return null
})
This is simpler. I edited it a bit to fix a copy error. 1407. For each transaction, make a promise. Each transaction becomes a promise.
Every transaction has its data retrieved in parallel. 1408. I'm thinking about the choice to run these in parallel here. For the use of seeing if the data is present, that's great. But for the use of seeing if a D record is present, it is suboptimal. The parallelism could be moved around the check for the D record, so it could inform the race, which would retain the speed. Or D records could be enumerated in advance, and the information included in the racing. This seems more complicated. So, the problem with removing the race is that the system would slow down. But the situation it handles crops up pretty rarely, so it's reasonable to remove the race. I'm inferring that the intention of the function is to simply discern if the data exists, and is uninformed by the D records concern. 1410. Let's try to rewrite it to include all matching transactions.
await Promise.race(txs.map(tx => {
  return new Promise(async (resolve, reject) => {
    var databuf = await getDataOrEmpty(tx)
    if (databuf.equals(buf)) resolve(tx)
    else reject(new Error('Not Matched'))
  })
})).catch(err => {
  log(` - TX Data not properly resolved.
Error: ${err}`, logLevel.VERBOSE)
  return null
})
We want to filter the transactions based on whether they match. Matching is an asynchronous process. Filter functions are usually synchronous things. txs is probably an array. It would make sense to change the 'race' to an 'all', and store null for mismatching data. Then we could quickly filter the nulls out. I'll do that. 1411. Back to workplay. 1425. I've fixed the error, and resolved a different problem relating to my PR being more rapidly accepted than I thought it would be. My food bowl is empty, but I don't remember eating it. The solution I made does not include any of the concerns that were very present for me when I was struggling to move forward on it. 1429. PR for the matching issue is at https://github.com/monkeylord/bsvup/pull/33 . The attachments uploader is at 2016 =/ . I'll send it some more money and then start a process that covers a smaller folder area. 1431. I had to unlock bittrex 3 times to get the screen. Oops, here's a fourth. Looks like when the connection breaks it makes you log in 2 times in a row, or something. I have internet, but it says 'No connection' now, logged in. 1433. I checked in termux that I do have internet. I can download a webpage with curl. Here's some more logins to bittrex. Still says No connection. I'll just reboot the whole phone. 1434. Reviewing my ongoing processes, the attachment download spider has gotten into the subfolders and is downloading pdf files, txt files, sig files, great stuff. The uploader is still enumerating folders and is up to 2017. 1436. Phone rebooted. Loading bittrex. Only made me log in once! 1439. I sent another $10 with bittrex. That'll help if it wants to upload scads of cruft files. I like doing that much more now that existing file checks are more robust. 1448. Fixed another bug. September attachments are enumerating. Lots of cruft files from wget. Now they are broadcasting. The plan was to add links to attachments to individual messages, which sounds pleasant. I'm also thinking about efficient archival. It would be good to just upload the data. Really, I imagine that I can't afford to archive the whole mailing list. It would still make sense to archive in a format a mail reader could use.
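For my own memory, the race-to-all change I described above comes out looking roughly like this (a sketch, not the literal diff in the PR; getDataOrEmpty is the simplified helper I named earlier):

    // sketch: check every candidate tx in parallel, keep all whose payload matches
    var matchTXs = (await Promise.all(txs.map(async tx => {
      var databuf = await getDataOrEmpty(tx)
      return databuf.equals(buf) ? tx : null      // null marks a mismatch
    }))).filter(tx => tx !== null)
    // matchTXs now holds every matching transaction instead of just the first one
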
I wonder what the mbox format is. A quick wikipediaing says that mbox is a concatenation of raw emails. If that's true, then the raw email files could be stored as files, and the mbox files could be generated based on them using a BCAT transaction to link them together. That sounds kind of fun. But there's value to the web browser interface, too, for sure. 1453. Inner landscapes. Current task is unsure between raw mailfile stuff and user interface stuff. I wonder what's in the .gz files. Looks like raw emails. Ideally an archiver would automatically run as the system went. I don't seem to be quite there yet; don't have a specific plan. My personal goal is that I would like all the emails accessible and reviewable. So, web interfaces seem okay. But it would be more efficient to generate them with javascript from source files. That's a lot of dev work. Better to make something automatic that runs, processing everything, with little work. The issue is that the webpages don't show headers. 1458. Hum, I looked at the first 2013 mail in the mailbox, and the hostname of the list is al-qaeda.net . Looks like hacking of the host, although I suppose I can't know for sure. Great reason to archive on blockchains. Maybe that's a good reason to archive everything. It's 15:01. Maybe that was just the domain name! I really haven't read the -- I really don't know the culture here. I wonder when 9/11 was. Anyway, enough conspiracy stuff or paranoia or whatever. Ummmmm I'm not sure what to do next on this task =/ . K. Let's make a monthly uploader. Next step, mutate links to attachments. Let's also extract the raw emails and include them somehow. Header-matching maybe. Maybe I can make a python script? I'll just mutate the attachments for now. 1505. Lost my goal. I know I'm doing something but I can't describe it or act on it. Something about there we go, attachments. What about attachments? Add them to link mutation. Okay, so I'm trying to find an example email with attachments to do the mutation. 1507. The attachments are already links! That probably means I can just run the mutator after uploading them, and they'll be picked up. Maybe I'll review the code anyway; I'm guessing that might not be the case. I'll want to change the removal mapper filter thing. 1510. Next step: after making new mutated message htmls, I'll want to make sure that the indices generated link to them. They'll have the same paths as the old ones. I can copy the sed script to find the files that have the old ones, and delete them, I suppose. I wonder if I can use sed to extract the timestamp or something. I'll look up tx format. Noooop no timestamp in a transaction. Block's got that. I'm worried that bsvup could regenerate transactions if I delete them. I'll look in it to see when it regenerates them. 1515. Inner landscape. Authority wants to know what my goals are here. They are ... to preserve records of harm. And possibly facilitating others finding and using them a little. Yeah ... also to make it easy for people to have a canonical reality with regard to what is on the list. And wouldn't it be nice if the list could be somewhere you could store things, like conversations, you didn't want lost? Maybe a way to reach people, if there were censorship between you and them. It's a bigger problem than bsvup, but you have to start somewhere. Transactions are saved in bsvup when they are broadcast, and when getTX is used. I remember getTX is used for existence checks. So that would match only file content present already, I believe. Maybe I can just delete them! I'll do so using the sed thingy. 1519. Inner landscapes / inhibition. 1519.
Went to drink water while trying to access my goal, as if the water would provide the goal. I'll drink some anyway. Access internally. Goal is to delete my local cache of transactions to do with the individual message.txlink.html files, so I can just output them all and grep on that. 1524. Okay. Deleted. I'll update the script so it can use the unbroadcasted transactions that are still waiting on confirmation, as link targets. 1528. Inner landscapes with logical confusion. A part of the mutation script seems to be doing the opposite of what is intended. At the start, it discards everything matching "^mailto", and then the only output is a mailto link. Something small I've missed, that has a big impact. Was redirecting stderr instead of stdout, to quiet grep. Looks like I do that in a lot of places. 1532. Inner landscapes. It is rough to believe in human trafficking. Jokes rising around a meanie standing in front of an ice cream stand, telling all the visitors, "There is no such thing as icecream." Giving them icecream cones made of plastic. This really happens in oppressive dictatorships! Democracies are like myths. There is no such thing as a way to preserve information. There is no such thing as an investment plan that makes more than 2% return a year. There is no such thing as being satisfied with your income, or even living luxuriously without one in friendly community! There is no such thing as enjoying being outdoors. There is no such thing as getting anything done without college. There is no such thing as things being true other than what your local marketing campaigns say. No, no, there is no such thing as there not being such a thing as something! Everything exists, quite truly! It's 15:36. The mutation is going very slowly because it keeps testing absolute links. Better make them not regenerate the path map. 1541. I did that. Two bugs. Some links are breaking mails. And attachments aren't getting mutated. 1557. I've addressed a lot of bugs but am still not successfully mutating an attachment. I'm seeing something wrong around my sc -- (an attachment link). I'm seeing something wrong around my test scenario or whatnot ... https://lists.cpunks.org/pipermail/cypherpunks/attachmens/20200902/ffed989a/... doesn't seem to exist on the internet, but seems referenced from a message. I've copied something wrong. Ha! It's missing a 't'. There we go. 1601. Somehow some attachments were missed in my downloading all for september. Redownloaded a bunch more. Uploading the new ones. 1602. Transactions will need to be generated for the missing attachments for the mutator to recognise them, atm. Better to find one that's already been sent to test with. 1613. I think I have attachment links working. I'm waiting on the previous upload of more attachments to finish. I guess I can -- I may have some corrupted message files from the development process, I - ; I made an error, where .. I overwrote the input. I ... I can mutate ... I want to reupload the folder and see whether files are marked already existing, to see whether I corrupted any files. But this will take waiting for the present upload. Until that happens ... I can process all the files using the new script. 1616. The mutator is going an order of magnitude faster than ever before! One of the later files has a processing issue. I can work on that. 1620. Mutated them all. I reviewed one and realised that my strategy of joining lines with lines to make sed match them, mutates the email bodies. That's no good ... 1625.
Changed the mutation script to join lines only with appropriate text around the links. Good enough for this little project. Not sure what to do now. Guess I'll commit what I have. 1635. I guess I wasn't really expecting to get this far. The feat of uploading a month, yesterday, was pretty incredible for me. It seems like it's pretty hard for me to do something like this. I'd like to run a server that maintains archives. I'll do some daily stuff like finding food/water/bathroom/laundry and maybe see if I can start a script that autoarchives. 1652. Back. Doing the second upload run where I see if I corrupted any files. I guess I could have downloaded them, theoretically, to compare this to. I'm going to look for that music. 1657. 1658. Don't know how to find music. Upload is going okay. txlink files are all getting reuploaded. Hello, email document. I feel very blank. I guess it's time to stop this task.
On Sat, Dec 5, 2020 at 5:01 PM Karl <gmkarl@gmail.com> wrote:
TLDR: I didn't get very far today, but it was lots of fun to spend the whole day trying. Not sure how well I can repeat it.
It's 2020-12-05T08:26:11-0500, and I'm looking for a recording of Bach's Well-Tempered Clavier, semi-oppressive music I've listened to a lot, that helps my brain do more things. It's semi-oppressive because its feelings share history mostly from old white male europeans, who are already sharing most of the history.
<snip>
I tried out storing a month of the cpunks list on a blockchain, which would cost about a dollar to store in plaintext if done more efficiently than I did. This system is more like $5 a month. Here are the links:
2020-September author https://bico.media/fedd0a8368dd68ae495279094435391f0e13291866af7a8a26aa18202... date https://bico.media/bd7fb31a5d7e685fcba3892fd28a7e4f7cc35c57576e7a7812a68746e... subject https://bico.media/4fe2cc266634e04401d27e366529b83c1f61cecf7767ab53f4b426dcc... thread https://bico.media/a41c50edfa8fc0c46d0f46ae82ac8c65e9f925f5c5a731006cb421318...
I'm really struggling to do this. And I'm pretty certain that people already have. Right now, I'm stuck. It's 12:36 EST. TLDR (too-long-don't-read): I finally put together a script to blockchain an arbitrary month of emails from the newer cypherpunks, and pushed that script to git. It's untested so probably has bugs. I'm waiting on a long blocktime for a test upload, to test the whole script. Sending this e-mail at 19:11 EST. I'm stuck because a run of bsvup is generating a transaction that doesn't pass its internal verification. It could be complex, and I have a lot of sudden urges and behaviors to manage. It's 12:37 EST. My mind blanked. I'm going to take the parts of the problem, and move them into this email, I'm afraid, until I can clear up a little. I'll make this browser window fill only half the screen. It's still 12:37. The screen is to my left. It is running a test upload that should complete and do nothing, and then it will run the upload that fails. It's 12:38 and my mind is blanking and my arm muscles disengaging again. Inner landscapes. Goal: pursue thing that has a very slow process finding a very small way to continue, ongoing. It's 12:39. I'm wearing a caving helmet. I'm ripping it off for no reason. It's still 12:39. I get to keep it on. It feels way more uncomfortable than it did 5 seconds ago. It's still 12:39. My arms and hands are moving in ways that seem asynchronous and strange. It's 12:40. Here is a space to form thoughts around the task: ---- | space for thoughts around task ---- We can copy it as we change what is in it. ---- | space for thoughts around task ---- The browser window moved all around, and I had to rearrange the email to be able to edit it. Right now the window restored to 2/3rds its normal height, and I'm resetting it to half the screen. It's 12:41. ---- | time copied: 12:41 | space for thoughts around task | sidenote: I need a label for the active goal ---- Looking for label for active goal. Is in front of me, to left. Relates uploading. There is a complicating factor. Part is "Funding Tasks" ---- | time copied: 12:42 | thing-near-goal: "Funding Tasks" | sidenote: ---- The system is stuck at 100% cpu, with bsvup "Funding Tasks". It has not done this before, and I'm not aware of having changed anything. Nodejs is not easy to debug, for me. I prefer python, c, c++. golang works in gdb. ---- | time copied: 12:44 | there are 2 issues being handled. "Funding Tasks" is near the first 1 | sidenote: nodejs is in an infinite loop unrelated to the bug planned to fix. there are 2 issues ---- The other issue is that an automatic upload made a transaction larger than the internal limit. ---- | time copied: 12:45 | issue we ran into: infinite loop during "funding tasks" with only debugging changes made | goal: handle transaction being made that exceeds limits | sidenote: the infinite loop could have been caused by the debugging changes ---- Okay. Since we are so stuck, ... It's 12:45. My lower body muscles have started contracting in a repeating pattern. I stopped them. It's 12:46. My eyes opened wide. My mind feels blank. Reminder to self: we have a written description of active task between hyphens immediately above. We can go off a bit, but we are planning to continue the task. 
---- | time copied: 12:47 | issue we ran into: infinite loop during "funding tasks" with only debugging changes made | goal: handle transaction being made that exceeds limits | sidenote: the infinite loop could have been caused by the debugging changes | sidenote 2: I don't know how to return to the task if I 'go off a bit' ---- Urge to rip caving helmet off. It's 12:47. We came up with a safety-way. If I tie myself to the table where the laptop is, I'll be able to return to the task. I'd like to pursue that. It was really scary earlier. It's 12:48. "really scary" in a "how do i do it" way not a "in serious danger" way. ---- | time copied: 12:47 | issue we ran into: infinite loop during "funding tasks" with only debugging changes made | goal: handle transaction being made that exceeds limits | sidenote: the infinite loop could have been caused by the debugging changes | sidenote 2: I want to tie myself to the work, so I can feel safe about returning. I'm holding this as a goal. ---- I have belts .... I think I can make them work. Off to find one! It's 12:49. I stood up and bumped into stuff looking for a belt, and found one sitting next to the same laptop I was already sitting at. It's 12:50 and I'm securing the belt, connecting my belt to the table leg. While I was tying it, I daydreamed about writing 'Yay! No way am I undoing that while I'm too confused to make a decision.' Then I secured it so the clasp goes over a ripped portion of the belt that gets tangled, making it incredibly hard to detach. I have a knife somewhere here if I need it. I can't see it anywhere, maybe to protect myself from using it. It's 12:51. ---- | time copied: 12:51 | issue we ran into: infinite loop during "funding tasks" with only debugging changes made | goal: handle transaction being made that exceeds limits | sidenote: the infinite loop could have been caused by the debugging changes | sidenote 2: It feels so safe to have that belt tying me to the table leg. ---- Okay! Let's ... woah, the process terminated while I was doing that! It showed a further mistake; I'm debugging it. It's 12:52 and I'm working without typing my steps out. ---- | time copied: 12:53 | issue we ran into: very long delay during "funding tasks" with only debugging changes made | goal: handle transaction being made that exceeds limits | sidenote: the long delay eventually terminated. it was not caused by debugging changes; it was running a different binary than the changes were made in. ---- ---- | time copied: 12:54 | issue we ran into: very long delay during "funding tasks" with only debugging changes made | goal: handle transaction being made that exceeds limits | subgoal: change the script so it uses the script we are making changes in, in our git checkout | sidenote: the long delay eventually terminated. it was not caused by debugging changes; it was running a different script than the changes were made in. ---- I tried to copy the block above, and I couldn't find the cursor on the screen. I also have double vision that is changing and moving. I get a sharp pain at or near my bellybutton after writing that, and reconverging my eyes. It's 12:56 and my eyes keep partly converging and deconverging. The block above shows what I am doing and thinking right now.
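Spelling that subgoal out so I don't lose it: the globally installed bsvup is not the checkout in ~/src/bsvup where my edits live. Two ways I know of to point runs at the checkout instead; the cli.js entry point is a guess on my part, so double-check it:

cd ~/src/bsvup && sudo npm link   # replace the global bsvup with a symlink to this checkout
node ~/src/bsvup/cli.js <usual arguments>   # or bypass the global install entirely

npm link is reversible later with another sudo npm install -g bsvup if the checkout turns out to be broken.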
---- | time copied: 12:56 | issue we ran into: very long delay during "funding tasks" with only debugging changes made | goal: handle transaction being made that exceeds limits | subgoal: change the script so it uses the script we are making changes in, in our git checkout | subgoal2: build confidence around the subgoal above. can the sidenote be removed? | sidenote: the long delay eventually terminated. it was not caused by debugging changes; it was running a different script than the changes were made in. ---- I'm reviewing the sidenote with plan to remove it. ---- | time copied: 12:56 | issue we ran into: very long delay during "funding tasks", noticed on second run | goal: handle transaction being made that exceeds limits | subgoal: change the script so it uses the script we are making changes in, in our git checkout | subgoal2: build confidence around the subgoal above. can the sidenote be removed? | sidenote: the long delay eventually terminated. it was not caused by debugging changes; it was running a different script than the changes were made in. ---- I came up with a plan to reduce the long delay. It relates to moving the error condition to be within the delay. That would speed up debugging. ---- | time copied: 12:58 | issue we ran into: very long delay during "funding tasks", noticed on second run | goal: handle transaction being made that exceeds limits | subgoal: change the script so it uses the script we are making changes in, in our git checkout | subgoal2: build confidence around the subgoal above. can the sidenote be removed? | sidenote: the long delay eventually terminated. it was not caused by debugging changes; it was running a different script than the changes were made in. | sidenote: add the outer typing as a sidenote ---- ---- | time copied: 12:58 | issue we ran into: very long delay during "funding tasks", noticed on second run | goal: handle transaction being made that exceeds limits | subgoal: change the script so it uses the script we are making changes in, in our git checkout | subgoal2: build confidence around the subgoal above. can the sidenote be removed? | sidenote: the long delay eventually terminated. it was not caused by debugging changes; it was running a different script than the changes were made in. | sidenote: if the error condition is moved within the 'funding tasks' loop, it could speed up debugging ---- whew. so many words above. I was trying to remove the first sidenote. I've copied the block to below, but I'm having urge to rip helmet off again. I'm reminded of the safety of the belt. I can pull and yank on it, and I'll still be near my active task and working memories. Taking some time to do that. It's 13:00. My body relaxed instead of yanking. I'm having scalp pains where the helmet is touching. I just happened to be wearing it; I like being reminded of caving, and it helps me feel safe with regard to hitting my head. It's 13:01 . I had a big itch on my forehead and went to scratch, and my hand tried to pull the helmet off instead of scratching. ---- | time copied: 12:59 | issue we ran into: very long delay during "funding tasks", noticed on second run | goal: handle transaction being made that exceeds limits | subgoal: change the script so it uses the script we are making changes in, in our git checkout | subgoal2: build confidence around the subgoal above. can the sidenote be removed? | sidenote: the long delay eventually terminated. it was not caused by debugging changes; it was running a different script than the changes were made in. 
| sidenote: if the error condition is moved within the 'funding tasks' loop, it could speed up debugging ---- It's 13:01 and I'm removing the helmet to reduce the scalp pains. I waited until I could form a decision to do so. The area over the scalp pain was a strap, not the hard helmet material. It's 13:02. ---- | time copied: 13:02 | issue we ran into: very long delay during "funding tasks", noticed on second run | goal: handle transaction being made that exceeds limits | subgoal: change the script so it uses the script we are making changes in, in our git checkout | subgoal2: build confidence around the subgoal above. can the sidenote be removed? | sidenote: the long delay eventually terminated. it was not caused by debugging changes; it was running a different script than the changes were made in. | sidenote: if the error condition is moved within the 'funding tasks' loop, it could speed up debugging ---- | subarea. We are trying to do above task. ---- ---- | time copied: 13:02 |---- | issue we ran into: very long delay during "funding tasks", noticed on second run | goal: handle transaction being made that exceeds limits | subgoal: change the script so it uses the script we are making changes in, in our git checkout | subgoal2: build confidence around the subgoal above. can the sidenote be removed? | sidenote: the long delay eventually terminated. it was not caused by debugging changes; it was running a different script than the changes were made in. | sidenote: if the error condition is moved within the 'funding tasks' loop, it could speed up debugging |---- | subarea. We are trying to remove the first sidenote. ---- ---- | time copied: 13:03 |---- | issue we ran into: very long delay during "funding tasks", noticed on second run | goal: handle transaction being made that exceeds limits | subgoal: change the script so it uses the script we are making changes in, in our git checkout | sidenote: if the error condition is moved within the 'funding tasks' loop, it could speed up debugging |---- | subarea. We are trying to remove the first sidenote. ---- I did subgoal3 at 13:03. It's 13:04. ---- | time copied: 13:04 |---- | issue we ran into: very long delay during "funding tasks", noticed on second run | goal: handle transaction being made that exceeds limits | subgoal: change the script so it uses the script we are making changes in, in our git checkout |---- | sidenote: if the error condition is moved within the 'funding tasks' loop, it could speed up debugging ---- change the script so it uses the script we are making changes in, in our git checkout. I made it to the terminal window, but resized it to still see this, and couldn't seem to get farther after that. ---- | time copied: 13:05 |---- | issue we ran into: very long delay during "funding tasks", noticed on second run | goal: handle transaction being made that exceeds limits | subgoal: change the script so it uses the script we are making changes in, in our git checkout |---- | sidenote: if the error condition is moved within the 'funding tasks' loop, it could speed up debugging ---- appreciating belt. it's so nice to know you can't do so many wild things that are farther away from you than a tether. my biggest fear is -- got confused from scared to talk about breaking things. i don't want to damage myself, the things around me, or other people, from my arms flailing around etc, which can get a way to do that. i am also scared of being considered dangerous. it's 13:07. i'm gonna take a bit of a break with the belt. 
maybe i can do something similar to the task, to grow familiarity with its parts. by 'break with the belt' i mean focusing on the belt, which is nice, for me. 13:08 putting helmet back on. sometimes i've hit my head on things. helmet feels nice. no scalp pain. i love caving! it's 13:09 and we're thinking of jokes, which are fun because you laugh in a way that lets you process stuff. copying task down. ---- | time copied: 13:09. this time is getting hard to update and may become wrong. |---- | issue we ran into: very long delay during "funding tasks", noticed on second run | goal: handle transaction being made that exceeds limits | subgoal: change the script so it uses the script we are making changes in, in our git checkout |---- | sidenote: if the error condition is moved within the 'funding tasks' loop, it could speed up debugging ---- parts relating around exhaustion. 13:10 . I think I'll take a nap on the floor while tied to the table leg. then I can rest without leaving my working memories. It's 13:11 ! It's 13:11 and suddenly I have to pee, which I am doing in a gallon container I keep for situations like this. -- got up, 1324. implemented subgoal, 1325. ---- | time copied: 13:25 |---- | issue we ran into: very long delay during "funding tasks", noticed on second run | goal: move the error condition to within the 'funding tasks' loop [speed up debugging] | goal: handle transaction being made that exceeds limits |---- | sidenote: the slower process is ongoing in 2nd pane. please multitask. ---- ---- | time copied: 13:27 | time finished handling and editing: 13:?? |---- | issue we ran into: very long delay during "funding tasks", noticed on second run | goal: move the error condition to within the 'funding tasks' loop [speed up debugging] | goal: handle transaction being made that exceeds limits |---- | sidenote: the slower process is ongoing in 2nd pane. please multitask. ---- I have some panes open that make it more confusing to look at my terminal, taking screenspace. I'm trying out closing them, which means seeing what they are. 13:28. 13:29 . I found this note: -- DISTRACTED: How to un-incinerate a rat. This is easy. Rats break into chemical components, each molecule of which contains tracks of the information the rat used to know. You'll need a jar of rat ashes, a radio telescope as large as a football field, a turntable, a 3d printer, matlab, pytorch, and a community of computational neuroscientists. Luckily, we have all of those on this planet. Call up the neuroscientists, and ask them if it's okay to recover your family photos from a harddrive if somebody puts a label on it that says 'terrorist propaganda from the international corporate mafia'. This may result in the neuroscientists feeling scared, and the phone conversation being reviewed as training material for spies. If your rat is still ashes, -- I think it involved three different meaningful expressions, interwound. I managed to stop typing it and try to work, instead. It's 13:32 and I'm saving the panes that are open. Usually my computer crashes before I save them. 13:32 and we're rereading our joke-part, enjoying how the laughter helps us have some memories of our experiences, or at least memory-parts. It's 13:33. It's still 13:33 and I'm halfway through typing a filename to save something. It's now 13:34. 13:35 pane closed and I want to tell more jokestories. 13:35. 
---- | time copied: 13:35 [although i removed the corrupt label here, i made 1 error updating it, and detection of the error was not confident it could repeat.] | time finished handling and editing: 13:36 |---- | present goal: move the error condition to within the 'funding tasks' loop [speed up debugging] |---- | goal: handle transaction being made that exceeds limits | issue we ran into: very long delay during "funding tasks", noticed on second run |---- | sidenote: the slower process is ongoing in 2nd pane. please multitask. ---- ---- | time copied: 13:36 [although i removed the corrupt label here, i made 1 error updating it, and detection of the error was not confident it could repeat.] | time finished handling and editing: 13:?? |---- | present goal: move the error condition to within the 'funding tasks' loop [speed up debugging] |---- | goal: handle transaction being made that exceeds limits | issue we ran into: very long delay during "funding tasks", noticed on second run |---- | sidenote: the slower process is ongoing in 2nd pane. please multitask. ---- My water is getting low and I'm remembering how there's a clean gallon container somewhere nearby. It's 13:37. Now it's 13:38 and there is half a sip of water left in my water bottle. ---- | time copied: 13:38 [detected 1 error with decaying habit] | time finished handling and editing: 13:39 |---- | present goal: move the error condition to within the 'funding tasks' loop [speed up debugging] |---- | goal: handle transaction being made that exceeds limits | issue we ran into: very long delay during "funding tasks", noticed on second run |---- | sidenote: having difficulties developing confident behaviors around goal. ---- ---- | time copied: 13:39 [detected 1 error with decaying habit] | time finished handling and editing: 13:?? |---- | present goal: move the error condition to within the 'funding tasks' loop [speed up debugging] |---- | goal: handle transaction being made that exceeds limits | issue we ran into: very long delay during "funding tasks", noticed on second run |---- | sidenote: what is the "error condition" and what is the "funding loop" | sidenote 2: we have error condition accessible. find funding loop ---- ---- | time copied: 13:40 [detected 1 error with decaying habit] | time finished handling and editing: 13:40 |---- | present goal: find funding loop, specifically the spot where the error condition could be put |---- | goal: move the error condition to within the 'funding tasks' loop [speed up debugging] | goal: handle transaction being made that exceeds limits | issue we ran into: very long delay during "funding tasks", noticed on second run |---- | sidenote: ---- somebody invented writing, once. what a waste of time! how the heck do you hunt berries with missiles while you're writing stuff down! ---- | time copied: 13:42 [detected 1 error with decaying habit] | time finished handling and editing: 13:43 |---- | present goal: find funding loop, specifically the spot where the error condition could be put |---- | goal: move the error condition to within the 'funding tasks' loop [speed up debugging] | goal: handle transaction being made that exceeds limits | information: Funding Tasks is at logic.js:350 in fundTasks(tasks, privkey, utxos, feePerKB) | issue we ran into: very long delay during "funding tasks", noticed on second run |---- | sidenote: could you describe the behavior of the fundTasks function?
---- ---- | time copied: 13:43 [detected 1 error with decaying habit] | time finished handling and editing: 13:44 |---- | present goal: find funding loop, specifically the spot where the error condition could be put |---- | goal: move the error condition to within the 'funding tasks' loop [speed up debugging] | goal: handle transaction being made that exceeds limits | information: Funding Tasks is at logic.js:350 in fundTasks(tasks, privkey, utxos, feePerKB) | issue we ran into: very long delay during "funding tasks", noticed on second run |---- | sidenote: could you describe the behavior of the fundTasks function? | sidenote 2: ongoing ---- ---- | time copied: 13:44 [detected 1 error with decaying habit] | time finished handling and editing: 13:?? |---- | present goal: find funding loop, specifically the spot where the error condition could be put |---- | goal: move the error condition to within the 'funding tasks' loop [speed up debugging] | goal: handle transaction being made that exceeds limits | information: Funding Tasks is at logic.js:350 in fundTasks(tasks, privkey, utxos, feePerKB) | issue we ran into: very long delay during "funding tasks", noticed on second run |---- | sidenote: could you describe the behavior of the fundTasks function? | sidenote 2: fundTasks, in order: | - verifies balance is sufficient | - begins a loop over all 'tasks' [the loop breaks them into chunks of tasks, and does a chunk at a time] | - ... ongoing ---- Interrupted by intensity around the Chinese writing in the source code. Trying to resume task. Copied working memory below. Time is 13:46. We're noting the contents of the fundTasks function, and can abandon this process with something more relevant if capacity is too tight. ---- | time copied: 13:46 [detected 1 error with decaying habit] | time finished handling and editing: 13:?? |---- | present goal: find funding loop, specifically the spot where the error condition could be put |---- | goal: move the error condition to within the 'funding tasks' loop [speed up debugging] | goal: handle transaction being made that exceeds limits | information: Funding Tasks is at logic.js:350 in fundTasks(tasks, privkey, utxos, feePerKB) | issue we ran into: very long delay during "funding tasks", noticed on second run |---- | sidenote: could you describe the behavior of the fundTasks function? | sidenote 2: fundTasks, in order: | - verifies balance is sufficient | - begins a loop over all 'tasks' [the loop breaks them into chunks of tasks, and does a chunk at a time] | - makes a new transaction called mapTX | - loads mapTX with utxo inputs (mapTX.from()) | - ... ---- It's 13:48 . More-efficient-approach is raising a little. Notably asking-directly-for-help. Landscape wants to ask coderman for help. Right now, I am in the middle of many nested memory processes, and just want to do a task successfully. It's 13:49. Noting that help with tasks, or learning of better things, is always appreciated. ---- | time copied: 13:49 [detected 1 error with decaying habit] | time finished handling and editing: 13:??
|---- | present goal: find funding loop, specifically the spot where the error condition could be put |---- | goal: move the error condition to within the 'funding tasks' loop [speed up debugging] | goal: handle transaction being made that exceeds limits | information: Funding Tasks is at logic.js:350 in fundTasks(tasks, privkey, utxos, feePerKB) | issue we ran into: very long delay during "funding tasks", noticed on second run |---- | sidenote: could you describe the behavior of the fundTasks function? | sidenote 2: fundTasks, in order: | - verifies balance is sufficient | - begins a loop over all 'tasks' [the loop breaks them into chunks of tasks, and does a chunk at a time] | - makes a new transaction called mapTX | - loads mapTX with utxo inputs (mapTX.from()) | - produces an output in mapTX for each task | note: I dislike this part of the code because it means data can be dropped during chain forks, without invalidating future transactions. Fixing is low priority here. I prefer chaining transactions end to end. | - ... ---- It's 13:51 ---- | time copied: 13:51 [detected 1 error with decaying habit] | time finished handling and editing: 13:?? |---- | present goal: find funding loop, specifically the spot where the error condition could be put |---- | goal: move the error condition to within the 'funding tasks' loop [speed up debugging] | goal: handle transaction being made that exceeds limits | information: Funding Tasks is at logic.js:350 in fundTasks(tasks, privkey, utxos, feePerKB) | issue we ran into: very long delay during "funding tasks", noticed on second run |---- | sidenote: could you describe the behavior of the fundTasks function? | sidenote 2: the fundTasks description is being moved out of this box, to organize. ---- fundTasks: - verifies balance is sufficient - begins a loop over all 'tasks' [the loop breaks them into chunks of tasks, and does a chunk at a time] - makes a new transaction called mapTX - loads mapTX with utxo inputs (mapTX.from()) - produces an output in mapTX for each task - ... ---- | time copied: 13:53 [detected 1 error with decaying habit] | time finished handling and editing: 13:55 |---- | present goal: find funding loop, specifically the spot where the error condition could be put |---- | goal: move the error condition to within the 'funding tasks' loop [speed up debugging] | goal: handle transaction being made that exceeds limits | information: Funding Tasks is at logic.js:350 in fundTasks(tasks, privkey, utxos, feePerKB) | issue we ran into: very long delay during "funding tasks", noticed on second run |---- | sidenote: could you describe the behavior of the fundTasks function? | sidenote 2: the fundTasks description is being moved out of this box, to organize. ---- fundTasks: - verifies balance is sufficient - begins a loop over all 'tasks' [the loop breaks them into chunks of tasks, and does a chunk at a time] - makes a new transaction called mapTX - loads mapTX with utxo inputs (mapTX.from()) - produces an output in mapTX for each task, and notes it in task.utxo - signs and prepares the mapTXs. that is all it does. - it adds the maptasks to the tasklist, and returns the list. 
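To pin the description above down for myself, here is roughly the shape I think fundTasks (logic.js:350) has. This is a from-memory sketch, not the real code: the bsv Transaction calls are real library calls, but CHUNK_SIZE, the fee math, the task fields, and reusing utxos for every chunk are simplifications of mine.

const bsv = require('bsv')   // the library bsvup builds on

const CHUNK_SIZE = 1000      // stand-in for bsvup's per-map-transaction output cap

function fundTasksSketch(tasks, privkey, utxos, feePerKB) {
  const address = privkey.toAddress()

  // 1. verify the balance is sufficient before doing anything
  const balance = utxos.reduce((sum, u) => sum + u.satoshis, 0)
  const required = tasks.reduce((sum, t) => sum + t.satoshis, 0)
  if (balance < required) throw new Error('Insufficient balance')

  const mapTasks = []
  // 2. loop over all tasks, a chunk at a time
  for (let i = 0; i < tasks.length; i += CHUNK_SIZE) {
    const chunk = tasks.slice(i, i + CHUNK_SIZE)

    // 3. one new "map" transaction per chunk, loaded with wallet utxos as inputs
    //    (real code cannot reuse the same utxos for every chunk; glossing over that here)
    const mapTX = new bsv.Transaction().from(utxos)

    // 4. one output per task; each task then remembers that output as the utxo funding it
    chunk.forEach(task => mapTX.to(address, task.satoshis))
    mapTX.change(address).feePerKb(feePerKB).sign(privkey)
    chunk.forEach((task, n) => {
      task.utxo = { txId: mapTX.hash, outputIndex: n, satoshis: task.satoshis }
    })

    // 5. the map transaction only moves coins around; the data lives in the per-task transactions
    mapTasks.push({ type: 'map', tx: mapTX })
  }

  // 6. hand the map transactions back alongside the original task list
  return mapTasks.concat(tasks)
}

If that shape is right, the interesting places for a too-big transaction are the .from(utxos) call and the number of outputs per chunk.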
---- | time copied: 13:55 [detected 1 error with decaying habit] | time finished handling and editing: 13:56 |---- | present goal: find funding loop, specifically the spot where the error condition could be put |---- | goal: move the error condition to within the 'funding tasks' loop [speed up debugging] | goal: handle transaction being made that exceeds limits | information: Funding Tasks is at logic.js:350 in fundTasks(tasks, privkey, utxos, feePerKB) | issue we ran into: very long delay during "funding tasks", noticed on second run |---- | sidenote: could you describe the behavior of the fundTasks function? | sidenote 2: it distributes the coins to the data transactions, but does not work with data. ---- ---- | time copied: 13:56 [detected 1 error with decaying habit] | time finished handling and editing: 13:56 |---- | present goal: find funding loop, specifically the spot where the error condition could be put |---- | goal: move the error condition to within the 'funding tasks' loop [speed up debugging] | goal: handle transaction being made that exceeds limits | information: Funding Tasks is at logic.js:350 in fundTasks(tasks, privkey, utxos, feePerKB) | issue we ran into: very long delay during "funding tasks", noticed on second run |---- | sidenote: it doesn't look like fundTasks is a place to put the error condition, but we can use this context to debug the speed issue better. ---- It's 13:57 and the next trial run finished. Scrolling up to see if anything new was output. Debug information was output as [object Object]. Better jsonify it. It's 13:57 and I'm pursuing the goal more rapidly than when I use the box, but using a process that has a very long delay in it. 13:58 I discovered I had a pane already opened, which I only saw when I tried to open the same file, right next to where I tried to open that same file. 13:59 I implemented the json change, and am having the urge to rip my helmet off, instead of rerunning the task that throws the error. I now have run the task. And immediately after the time changed to 14:00. My clock does not show seconds, and I would like to videorecord myself rather than typing all my words. It's 14:00 . I guess they're not words. They're not words! They're tiny firings of neurons. It's 14:00 . It's 14:01 and the task is rerunning. I'm taking some time to consider whether there's a way to narrow down the error condition. Let's look for what throws the error. It happens in verifyTX. So, the goal I was _trying_ to load with the box, is putting verifyTX into the transaction generator, so I can see the details of the one that fails to verify, right next to the code that makes the error. I'll just work straight on that. It's 14:02 and my belt and helmet are suddenly irritating me a lot. It's 14:02. I pulled my chair in to reduce the irritation, and I am suddenly quite hungry. I have had multiple bowls of food today. It's 14:03 . I'm taking my helmet off, and it's still 14:03 . It's 14:03, and I'm stretching and things instead of doing what I want to. It's 14:04. Inner landscapes. I'm going to exfiltrate a goal: put verifyTX into the transaction generators, so we can stop the code where the issue is caused.
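Roughly what I mean by putting verifyTX next to the generator, together with the jsonify fix. verifyTX here is bsvup's own checker; I'm guessing at where it lives and that it returns a boolean, so treat this as a sketch of the pattern rather than a drop-in patch:

const txUtil = require('./txUtil')   // wherever bsvup defines verifyTX; path assumed

// wrap each freshly generated transaction so a failing one gets dumped
// right next to the code that built it, instead of much later in the upload
function checkedTX(tx, label) {
  if (!txUtil.verifyTX(tx)) {        // assuming a boolean-returning checker
    // JSON.stringify so the log shows fields instead of "[object Object]"
    console.error(label, 'failed verification:', JSON.stringify({
      bytes: tx.toBuffer().length,
      inputs: tx.inputs.length,
      outputs: tx.outputs.length
    }, null, 2))
    throw new Error(label + ' failed verifyTX')
  }
  return tx
}

Then each place that builds a transaction (the map transaction, the data transactions) would return checkedTX(tx, 'mapTX') and so on, and the first oversized one stops the run immediately.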
---- | time copied: 14:04 [detected 1 error with decaying habit] | time finished handling and editing: 14:07 |---- | present goal: fix the individual file bug, and upload each file individually to isolate the error |---- | idea: call verifyTX immediately when transactions are generated | idea: add profiling to code | idea: generate transactions immediately when reading files [probably-too-big] | idea: fix the individual file bug, and upload each file individually to isolate the error |---- | goal: speed up debugging | goal: handle transaction being made that exceeds limits | information: Funding Tasks is at logic.js:350 in fundTasks() | information: fundTasks() makes the map transaction to distribute coins. | issue: this is going slowly for 3 reasons. personal issues, funding tasks, and checking existence |---- | sidenote: ---- ---- | time copied: 14:07 [detected 1 error with decaying habit] | time finished handling and editing: 14:?? |---- | present goal: fix the individual file bug, and upload each file individually to isolate the error |---- | idea: call verifyTX immediately when transactions are generated | idea: add profiling to code | idea: generate transactions immediately when reading files [probably-too-big] | idea: fix the individual file bug, and upload each file individually to isolate the error |---- | goal: speed up debugging | goal: handle transaction being made that exceeds limits | information: Funding Tasks is at logic.js:350 in fundTasks() | information: fundTasks() makes the map transaction to distribute coins. | issue: this is going slowly for 3 reasons. personal issues, funding tasks, and checking existence |---- It's 14:08 and I'm waving my water bottle around, listening to how empty it is. Plans are forming in me to undo the belt and get water. Inner landscapes. It's 14:09 and an imaginary debate and court are forming around whether we need to drink. [right now]. It's 14:09 and I am getting more water, with plan to reattach the belt [if I can get it unattached]. It's 14:10, my belt is detached from the table, and I am no longer planning to reattach it. It's 14:10 and I am reattaching the belt. It's 14:11 and the belt is attached very securely in a way that is easier -- less likely to get tangled. When undone. Inner jokes. Less thirsty. Still half sip left in bottle. Meant to type 'have'. | sidenote: ---- ---- | time copied: 14:12 [detected 1 error with decaying habit] | time finished handling and editing: 14:?? |---- | present goal: FIX THE INDIVIDUAL FILE BUG, and upload each file individually to isolate the error |---- | idea: call verifyTX immediately when transactions are generated | idea: add profiling to code | idea: generate transactions immediately when reading files [probably-too-big] | idea: fix the individual file bug, and upload each file individually to isolate the error |---- | goal: speed up debugging | goal: handle transaction being made that exceeds limits | information: Funding Tasks is at logic.js:350 in fundTasks() | information: fundTasks() makes the map transaction to distribute coins. | issue: this is going slowly for 3 reasons. personal issues, funding tasks, and checking existence |---- | sidenote: ---- It's 14:13 and we are planning on leaving the table and doing something totally different, and believing we will do this, even though we are tied to the table. In the plan, we aren't. It is very pleasant and relaxing to do this, and helps us have energy. It's 14:13 and my head is bobbing in a private way.
Haven't decided whether to discuss the head bob. It's 14:14, and I'm trying to develop good energy around the present goal described above. It's 14:15 and my hands are trying to distract me. It's 14:15 and I'm worried because my consciousness is settling into waiting for the slow task, which is a pretty inefficient approach. I'm pausing the slow task with ctrl-Z, to not develop the habit of doing things as slowly as possible when trying to accomplish them in a timeframe. It's 14:16 and the task advanced. Suddenly. My decision to ctrl-Z has changed. It's 14:16 and something strange. Inner landscapes. It's 14:17 and I sat with all muscles limp and a blank stare for a bit, then put my helmet on. It is rough believing you are brainwashed to hurt yourself. It is 14:18 and I slammed my head at the air a bit to process having slammed my head at objects for real, long ago. It is 14:18 and the processes I don't understand that seem like they are pushing me around, are copying the task list down. ---- | time copied: 14:19 [detected 1 error with decaying habit] | time finished handling and editing: 14:19 |---- | present goal: FIX THE INDIVIDUAL FILE BUG, and upload each file individually to isolate the error |---- | idea: call verifyTX immediately when transactions are generated | idea: add profiling to code | idea: generate transactions immediately when reading files [probably-too-big] | idea: fix the individual file bug, and upload each file individually to isolate the error |---- | goal: speed up debugging | goal: handle transaction being made that exceeds limits | information: Funding Tasks is at logic.js:350 in fundTasks() | information: fundTasks() makes the map transaction to distribute coins. | issue: this is going slowly for 3 reasons. personal issues, funding tasks, and checking existence |---- | sidenote: we have general spread away from present goal. currently an intense itch in a non-innervated body area. ---- ---- | time copied: 14:19 [detected 1 error with decaying habit] | time finished handling and editing: 14:20 |---- | present goal: FIX THE INDIVIDUAL FILE BUG, and upload each file individually to isolate the error |---- | idea: call verifyTX immediately when transactions are generated | idea: add profiling to code | idea: generate transactions immediately when reading files [probably-too-big] | idea: fix the individual file bug, and upload each file individually to isolate the error |---- | goal: speed up debugging | goal: handle transaction being made that exceeds limits | information: Funding Tasks is at logic.js:350 in fundTasks() | information: fundTasks() makes the map transaction to distribute coins. | issue: this is going slowly for 3 reasons. personal issues, funding tasks, and checking existence |---- | sidenote: we have general spread away from present goal. currently an intense itch in a non-innervated body area. | sidenote: I have taken over for karl. I'm associated with headbanging-the-air. | sidenote: I think karl says it is okay for me to use and edit the above data to do this. ---- ---- | time copied: 14:20 [detected 1 error with decaying habit] | time finished handling and editing: 14:??
|---- | present goal: FIX THE INDIVIDUAL FILE BUG, and upload each file individually to isolate the error |---- | idea: call verifyTX immediately when transactions are generated | idea: add profiling to code | idea: generate transactions immediately when reading files [probably-too-big] | idea: fix the individual file bug, and upload each file individually to isolate the error |---- | goal: speed up debugging | goal: handle transaction being made that exceeds limits | information: Funding Tasks is at logic.js:350 in fundTasks() | information: fundTasks() makes the map transaction to distribute coins. | issue: this is going slowly for 3 reasons. personal issues, funding tasks, and checking existence |---- | sidenote: we have general spread away from present goal. currently an intense itch in a non-innervated body area. | sidenote: I have taken over for karl. I'm associated with headbanging-the-air. | sidenote: I think karl says it is okay for me to use and edit the above data to do this. ---- It's 14:21 and the above was about to get processed, but somebody is calling me, my mother, from another room. It's 14:21 and I called at her back, but no reply. I imagine her having a pang of sadness that I didn't reply. I'm staying. It's 14:22 . ---- | time copied: 14:20 [detected 1 error with decaying habit] | time finished handling and editing: 14:24 |---- | present goal: FIX THE INDIVIDUAL FILE BUG, and upload each file individually to isolate the error |---- | idea: call verifyTX immediately when transactions are generated | idea: add profiling to code | idea: generate transactions immediately when reading files [probably-too-big] | idea: fix the individual file bug, and upload each file individually to isolate the error |---- | goal: speed up debugging | goal: handle transaction being made that exceeds limits | information: Funding Tasks is at logic.js:350 in fundTasks() | information: fundTasks() makes the map transaction to distribute coins. | issue: this is going slowly for 3 reasons. personal issues, funding tasks, and checking existence |---- | sidenote: there was data here that is still being used for memories, that was deleted ---- ---- | time copied: 14:24 [detected 1 error with decaying habit] | time finished handling and editing: 14:25 |---- | present goal: FIX THE INDIVIDUAL FILE BUG, and upload each file individually to isolate the error |---- | idea: call verifyTX immediately when transactions are generated | idea: add profiling to code | idea: generate transactions immediately when reading files [probably-too-big] | idea: fix the individual file bug, and upload each file individually to isolate the error |---- | goal: speed up debugging | goal: handle transaction being made that exceeds limits | information: Funding Tasks is at logic.js:350 in fundTasks() | information: fundTasks() makes the map transaction to distribute coins. | issue: this is going slowly for 3 reasons. 
personal issues, funding tasks, and checking existence |---- | sidenote: there was data here that is still being used for memories, that was deleted ---- ---- | time copied: 14:26 [detected 1 error with decaying habit] | time finished handling and editing: 14:28 |---- | present goal: FIX THE INDIVIDUAL FILE BUG, and upload each file individually to isolate the error | information: individual file bug throws ENOENT in prepareUpload logic.js:40, getFileDatum, logic.js:84, isDirectory, api.js:341 |---- | idea: call verifyTX immediately when transactions are generated | idea: add profiling to code | idea: generate transactions immediately when reading files [probably-too-big] | idea: fix the individual file bug, and upload each file individually to isolate the error |---- | goal: speed up debugging | goal: handle transaction being made that exceeds limits | information: Funding Tasks is at logic.js:350 in fundTasks() | information: fundTasks() makes the map transaction to distribute coins. | issue: this is going slowly for 3 reasons. personal issues, funding tasks, and checking existence |---- | sidenote: we have made progress in reaching the present goal | sidenote: inhibition-part relates apology; had not realised this was valuable behavior [indicates needs better information on reason parts] ---- ---- | time copied: 14:28 [detected 1 error with decaying habit] | time finished handling and editing: 14:?? |---- | present goal: FIX THE INDIVIDUAL FILE BUG, and upload each file individually to isolate the error | information: individual file bug throws ENOENT in prepareUpload logic.js:40, getFileDatum, logic.js:84, isDirectory, api.js:341 |---- | idea: call verifyTX immediately when transactions are generated | idea: add profiling to code | idea: generate transactions immediately when reading files [probably-too-big] | idea: fix the individual file bug, and upload each file individually to isolate the error |---- | goal: speed up debugging | goal: handle transaction being made that exceeds limits | information: Funding Tasks is at logic.js:350 in fundTasks() | information: fundTasks() makes the map transaction to distribute coins. | issue: this is going slowly for 3 reasons. personal issues, funding tasks, and checking existence |---- | sidenote: we have made progress in reaching the present goal | sidenote: inhibition-part relates apology; had not realised this was valuable behavior [indicates needs better information on reason parts] ---- I'm not writing behaviors for a bit. It's 14:31 and I've restarted the trial process; the debugging output had an error displaying. Inner landscapes. Many of us didn't realise how hard these goals were. Maybe there are other goals that would be easier? That we'd prefer? No, the hardness is presently bound to anything that has effect and impact that we value strongly. So it doesn't matter much what we do. Well, maybe there's a [goal with greater effect?] [yeah probably. We need skill at goals in general to do anything at all, and we're on this one now.] [interesting idea.] It's 14:33. It's 14:37 . ---- | time copied: 14:37 [detected 1 error with decaying habit] | time finished handling and editing: 14:?? 
|---- | present goal: FIX THE INDIVIDUAL FILE BUG, and upload each file individually to isolate the error | information: individual file bug throws ENOENT in prepareUpload logic.js:40, getFileDatum, logic.js:84, isDirectory, api.js:341 |---- | idea: call verifyTX immediately when transactions are generated | idea: add profiling to code | idea: generate transactions immediately when reading files [probably-too-big] | idea: fix the individual file bug, and upload each file individually to isolate the error |---- | goal: speed up debugging | goal: handle transaction being made that exceeds limits | information: Funding Tasks is at logic.js:350 in fundTasks() | information: fundTasks() makes the map transaction to distribute coins. | issue: this is going slowly for 3 reasons. personal issues, funding tasks, and checking existence |---- | sidenote: we have made progress in reaching the present goal | sidenote: inhibition-part relates apology; had not realised this was valuable behavior [indicates needs better information on reason parts] ---- 14:40. Hungry again. No progress developing present goal. Mostly staring at it, moving my arms around. Staring at it written above. ---- | time copied: 14:40 [detected 1 error with decaying habit] | time finished handling and editing: 14:41 |---- | present goal: FIX THE INDIVIDUAL FILE BUG, and upload each file individually to isolate the error [i've paused the ongoing upload, to help value this] | information: individual file bug throws ENOENT in prepareUpload logic.js:40, getFileDatum, logic.js:84, isDirectory, api.js:341 |---- | idea: call verifyTX immediately when transactions are generated | idea: add profiling to code | idea: generate transactions immediately when reading files [probably-too-big] | idea: fix the individual file bug, and upload each file individually to isolate the error |---- | goal: speed up debugging | goal: handle transaction being made that exceeds limits | information: Funding Tasks is at logic.js:350 in fundTasks() | information: fundTasks() makes the map transaction to distribute coins. | issue: this is going slowly for 3 reasons. personal issues, funding tasks, and checking existence |---- | sidenote: we have made progress in reaching the present goal | sidenote: inhibition-part relates apology; had not realised this was valuable behavior [indicates needs better information on reason parts] ---- 14:42 lots more distracting experiences. 14:43 involuntary dancing ---- | time copied: 14:41 [detected 1 error with decaying habit] | time finished handling and editing: 14:?? |---- | present goal: FIX THE INDIVIDUAL FILE BUG, and upload each file individually to isolate the error [i've paused the ongoing upload, to help value this] | information: individual file bug throws ENOENT in prepareUpload logic.js:40, getFileDatum, logic.js:84, isDirectory, api.js:341 |---- | idea: call verifyTX immediately when transactions are generated | idea: add profiling to code | idea: generate transactions immediately when reading files [probably-too-big] | idea: fix the individual file bug, and upload each file individually to isolate the error |---- | goal: speed up debugging | goal: handle transaction being made that exceeds limits | information: Funding Tasks is at logic.js:350 in fundTasks() | information: fundTasks() makes the map transaction to distribute coins. | issue: this is going slowly for 3 reasons. 
personal issues, funding tasks, and checking existence |---- | sidenote: we have made progress in reaching the present goal | sidenote: inhibition-part relates apology; had not realised this was valuable behavior [indicates needs better information on reason parts] ---- 14:44 I'm detaching the belt and getting food and water. 16:01 I've returned with lots of food and water and other stuff. I cooked the food. My mother had just stomped on the gallon water jug and thrown it out, and I recovered it and filled it with water. I'm belting myself to the table leg again. 16:04 I needed to pee, then when I went to pee I suddenly no longer needed to pee, and barely could. ---- | time copied: 14:41 [detected 2 errors with decaying habit] | time finished handling and editing: 16:15 |---- | present goal: FIX THE INDIVIDUAL FILE BUG, and upload each file individually to isolate the error [i've paused the ongoing upload, to help value this] | information: individual file bug throws ENOENT in upload cli.js:138, prepareUpload logic.js:40, getFileDatum, logic.js:84, isDirectory, api.js:341 | information: the error is caused because logic.js:84 reads the file from local path to discern if it is a directory. I don't understand why this doesn't crash when a directory is selected. testing theory at 16:15 |---- | idea: call verifyTX immediately when transactions are generated | idea: add profiling to code | idea: generate transactions immediately when reading files [probably-too-big] | idea: fix the individual file bug, and upload each file individually to isolate the error |---- | goal: speed up debugging | goal: handle transaction being made that exceeds limits | information: Funding Tasks is at logic.js:350 in fundTasks() | information: fundTasks() makes the map transaction to distribute coins. | issue: this is going slowly for 3 reasons. personal issues, funding tasks, and checking existence |---- | sidenote: we have made progress in reaching the present goal | sidenote: inhibition-part relates apology; had not realised this was valuable behavior [indicates needs better information on reason parts] ---- ---- | time copied: 16:15 [detected 2 errors with rare habits] | time finished handling and editing: 16:?? |---- | present goal: FIX THE INDIVIDUAL FILE BUG, and upload each file individually to isolate the error [i've paused the ongoing upload, to help value this] | information: individual file bug throws ENOENT in upload cli.js:138, prepareUpload logic.js:40, getFileDatum, logic.js:84, isDirectory, api.js:341 | information: the error is caused because logic.js:84 reads the file from local path to discern if it is a directory. I don't understand why this doesn't crash when a directory is selected. testing theory at 16:15 |---- | idea: call verifyTX immediately when transactions are generated | idea: add profiling to code | idea: generate transactions immediately when reading files [probably-too-big] | idea: fix the individual file bug, and upload each file individually to isolate the error |---- | goal: speed up debugging | goal: handle transaction being made that exceeds limits | information: Funding Tasks is at logic.js:350 in fundTasks() | information: fundTasks() makes the map transaction to distribute coins. | issue: this is going slowly for 3 reasons.
personal issues, funding tasks, and checking existence |---- | sidenote: we have made progress in reaching the present goal | sidenote: inhibition-part relates apology; had not realised this was valuable behavior [indicates needs better information on reason parts] ---- It's 4:41. My browser had unloaded this page while I was looking at github stuff to figure things out. I've fixed two bugs in bsvup without testing them thoroughly, and narrowed down the primary goal. My clock is military time. Interesting I automatically converted to PM. ---- | time copied: 16:42 [detected 2 errors with rare habits] | time finished handling and editing: 16:45 |---- | present goal: review the map transaction generation to see why it could exceed a size limit |---- | idea: call verifyTX immediately when transactions are generated .asdf not sure where i am typing but a computer glitch put the cursor here. looks like it teleported it up about 3 pages of text.| idea: add profiling to code |---- | goal: handle transaction being made that exceeds limits | goal: test single file uploading | goal: speed up debugging | information: the transaction that is too large is a map transaction. | information: Funding Tasks is at logic.js:350 in fundTasks() | information: fundTasks() makes the map transaction to distribute coins. |---- | sidenote: the merged2 local bsvup branch contains 2 fixes that have not been submitted as PRs. they should be submitted if the branch works well. They are exit code, and single-files. ---- ---- | time copied: 16:45 [detected 2 errors updating the time, using rare habits] | time finished handling and editing: 16:58 |---- | present goal: review the map transaction generation to see why it could exceed a size limit |---- | idea: the problem could be from an excess amount of inputs | idea: lower the chunk size map transactions use? | idea: add profiling to code |---- | goal: handle map transaction being made that exceeds limits | goal: test single file uploading | goal: speed up debugging | information: the transaction that is too large is a map transaction. | information: Funding Tasks is at logic.js:350 in fundTasks() | information: fundTasks() makes the map transaction to distribute coins. | information: verifyTX is in txUtil.js |---- | sidenote: the merged2 local bsvup branch contains 2 fixes that have not been submitted as PRs. they should be submitted if the branch works well. They are exit code, and single-files. ---- A map output script is 25 bytes long. They are capped at 1000 count, and many 1000 count map transactions have already been broadcast. I'm adding more details to the thrown error. I'll also find the command that specifically raises it, to speed things up. It's 16:59, and I'm guessing that the issue is caused by the number of inputs present. We can calculate the input size and subtract. I'm betting that'll fix it, and I'll work on that while I wait for this next run with extra debugging info to run ... I haven't even finished adding all the debugging info. ---- | time copied: 16:58 [detected 2 errors updating the time, using rare habits] | time finished handling and editing: 17:?? |---- | present goal: review the map transaction generation to see why it could exceed a size limit |---- | idea: the problem could be from an excess amount of inputs | idea: lower the chunk size map transactions use?
| idea: add profiling to code |---- | goal: handle map transaction being made that exceeds limits | goal: test single file uploading | goal: speed up debugging | information: the transaction that is too large is a map transaction. | information: Funding Tasks is at logic.js:350 in fundTasks() | information: fundTasks() makes the map transaction to distribute coins. | information: verifyTX is in txUtil.js |---- | sidenote: the merged2 local bsvup branch contains 2 fixes that have not been submitted as PRs. they should be submitted if the branch works well. They are exit code, and single-files. ---- 17:09 I attempted a fix of the issue. I am sitting waiting for it to run, rather than doing anything. It is hard to alter one's own choices, and I'm kind of okay just sitting here right now, even though in general I'd rather do things. 17:11 Okay, I remember I wanted to test my single file uploading fix. I'll find a single file to upload that'll get uploaded anyway. 17:13 looks like it's working ok. This also tests part of the map transaction, pleasantly. I won't broadcast it because I'm unsure of the double spend situation. The PR is at https://github.com/monkeylord/bsvup/pull/34 for single file uploads. I came back to this email and it showed an old copy of the draft. I must have two email windows open, thinking it is one. That'll confuse me for sure. It's 17:15. The other bugfix is at https://github.com/monkeylord/bsvup/pull/35 . The computer glitch that moved the cursor three pages happened the first time I hit that period, '.' right after 35 and a space. I had a body glitch too, so I probably hit some key other than space or period. I handle a lot of sudden surprises; it's hard to know which caused the other. It's doing the long funding tasks run. It has to merge a lot of existing utxos into a lot more utxos to meet the hardcoded map behavior that I don't understand the reason for. It doesn't seem like it should take this long. It's 17:20 . I'm waiting here so my resistance patterns can have a break. If this works then I'll have a big chunk of a script that can automatically upload months =) Almost completed =) And it links to raw emails from the web emails, now. Some of those headers are weird looking, for sure. Notably the web interface doesn't include pgp stuff, which is crummy. Must have been made before that was a thing, I guess. Another thing I want to work on is seeing if the cypherpunks-legacy stuff works the same as the cypherpunks stuff, and having it check both lists when archiving a month. It's 17:23 and I'm reinforcing one of my control and inhibition patterns to relax, which is crummy. Guess I shouldn't relax quite so much, but it is so much more relaxing to do that. I'm reviewing the current task and I'm surprised that it's going over stuff I -- that is different from what I expected and thought was going on. 17:25 it's uploading the txlinks, and I thought it was uploading the attachments. 17:26 my single file upload test failed before trying to broadcast. Same oversized transaction issue. Guess I didn't fix it. The debugging output is too long to see anything: the transaction dump fills the whole backlog. 17:28 bsvup stores a dump of its transactions. I forgot about this. This should make debugging quite straightforward. 17:28 inner landscapes identified an inhibition/distraction choice and i'm trying not to make it. it's very hard; it keeps moving somewhere else when addressed. ... it went poorly [but was partly managed]. 17:29.
[it took/made a bargain of being able to spread instead of act as strongly]. 17:30 . [they happen like fights. it made a successful fight and we dodged and it grabbed hold] <let's try not to open more of that up; i don't know how to close it and stay on task>. 17:31. 17:32 The problematic transaction? may be a 34000 byte one. Mother calling, cannot hear. Am belted to table. She needs help carrying something. It's 17:32 and I'm unbelting myself. The max transaction size is 1 million, so I'm looking at the wrong thing. It's hardcoded; I don't know what the network maximum is, but they always raise it. 17:33 afk 17:37 back. Choosing to return weakened my decision-making capacity. check sizes of all txs in debug file. 17:38 the test run finished. the problematic tx is 2110440 bytes long, twice the hardcoded max. I guess the most reliable way to figure this out is to review that transaction. 17:41 it's exporting the transaction at the next failure, if I didn't make too many typos. 17:42 I'm taking this wait time, to rebelt myself. 17:43 eating some food. Propose to try to force work after food. [proposing, i mean]. 17:48 I think I'll review the code that selects the number of outputs. 17:49 noted that I dropped the size per output to 42% but kept the max number of outputs at 1000. I should change max_output to a number of bytes to allocate to outputs. 17:51 running a test with change. other test finished and i have transaction details I can look at now. 17:52 ohhh no I don't. made a typo in the error handler. orrr ... something. earlier the 'fs' module didn't need requiring. maybe that's only in node shell or something. 17:53 I had my work open on a smaller laptop, and I just noticed it has turned itself off, and the battery is not charging despite it being plugged in. some loose connection I suppose ... 17:55 I tested the 12v power adapter on the other laptop with a voltmeter, and it is only producing about 0.5V. 17:56 the other end fell out of the wall. maybe residual charge. it looked like it was in well, but i can't see well at all. It's 17:56 and a test is running that may fix the issue, and may have better debug output. It's doing the slow 'Funding Tasks' loop. slowness seems caused by all the utxos. Let's look at that loop a little. There are two lines here that might slow it down ... but there's another assumption that likely makes it even slower. When it pulls numOutputs out of mytasks; yeah. It's copying the entire range of outputs every iteration, because it pulls the outputs from the start of the list instead of the end. But that's output-dependent not input-dependent .... maybe I'll just throw some timing calls in here, and do some hand-profiling. 18:02 the test hit the same typo I thought I fixed. if it hits it again, it means I didn't fix it. 18:06 I added profile checks, untested, around all the parts of fundTasks that loop over something. Every use of arrays here uses unshift() instead of pop(), so that's probably why it's so slow. Maybe I can go through and change all the logic to pop, or at least change it to pop where it doesn't look like it would matter ... It's 18:07 It's 18:15. A profile test is running, and there is data on the bad transaction that I haven't looked at. I might be able to resolve it just with the data. The .json tx file just says '[object Object]', guess I did that wrong. I'll fix it. 18:16. noooo it will take too long for another run. I'll just use the hex file. The bad tx has 49 outputs. and is only 8613 bytes long. This sure didn't work. doesn't verify. hmm.
the actual length is 2 million. maybe i need to convert it to a buffer from hex. same thing, converting to a buffer. hrm. there we go; conversion issue. 18:20 The bad tx has 1 output. And 7154 inputs. The profiling result is that the remaining delay is all within a single chunk I called 'finalise maptx'; kinda surprising. maybe it's the signing that is so slow. Looks like it's using too many utxos as inputs. [i kind of want to just get rid of the map transactions, and send utxos straight to data transactions, but then i would have to add change outputs to them, I suppose; not ready for storing that plan in my working memory] Slowness is in: finalise maptx Issue is: too many inputs. verify that. count=7154 . Yes, 7154 * 156 > 1 million. I guess we want to cap the input count. 18:30 running another test. profile information is drilled down ... hey! it went fast, and succeeded! nice. I'll remove my speedup changes and see if it's still fast. 18:31 Yes, the issues were all from the input overflow, or something, i guess. let's condense this fix and test with just it. 18:35 testing single file upload for real. Funding Tasks had a little delay, but is reasonable. My speedups were removed for clarity; submitted a PR. Don't know if they had much impact, didn't check really. 18:36 single file seemed to upload fine. I'll wait for it to finish and then try the other thing. Meanwhile I'll remove all my commented out stuff and submit the PR. 18:42 https://github.com/monkeylord/bsvup/pull/36 . Still waiting on a block confirmation for the single file upload. 18:45 still waiting on block confirmation. Inner landscape directing me to check cpunks list, which is just a strong habit I have built up. I want to verify the month.bash file will do something productive if running it as a test works. Okay. I wanted to check that the mutated message htmls produce good links to raw emails. 18:47 and I can check that while waiting on this block. 18:49 A random example email looks great. I think I'll add the server numbers to the output, since they aren't in the url when the url is a transaction id. 18:54 added the numbers, generated, tested. Still waiting on a block. 18:56 The next step for month.bash is to generate the index files, like I did by hand earlier. I think that just means running them through the mutator .... I remember there's something I'm forgetting, but maybe I can at least add that code. There's a prototype month.bash! It's 18:58. I've set all the upload calls to wait for confirmation, since I've never tried them yet. Still waiting on a block. I'm kinda confused still. Just sitting and resting. Trying to think of what I'm intending to do here; I don't really know. Or maybe any other things on my plate I've forgotten. I remember the txlink situation could be confused. I'll review the current map for txlink paths and see what there is. it's 19:00 . There are no txlink entries in the map file! I guess I dealt with that. [typed 'dealth' at first.] so, when it runs, i want to make sure it uploads newly generated txlinks and not cruft txlinks. 19:02 inner ... landscapes? sometimes i mess up my inner landscapes and start doing things to myself that are really freaky and i'm near there right now. there are ways to really mess up my memory, beliefs, ability to learn, pain and suffering and such ... it takes a while for things to fade out when i mess that up ... i guess i'll set up for closing up. i'll push my changes into a merged branch and set the script to use it, etc. 19:09 pushed.
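For the record, the shape of the fix, using the numbers from above: roughly 156 bytes per input, map output scripts around 25 bytes (so roughly 34 bytes per output with value and length), and the 1,000,000 byte hardcoded maximum. The constant and function names are mine, not logic.js's:

const MAX_TX_SIZE = 1000000   // bsvup's hardcoded transaction size limit
const INPUT_SIZE  = 156       // rough bytes per input, from the 7154 * 156 estimate
const OUTPUT_SIZE = 34        // rough bytes per map output

// how many utxo inputs one map transaction can afford, given its output count
function maxInputsFor(numOutputs, overheadBytes = 1000) {
  const outputBytes = numOutputs * OUTPUT_SIZE
  return Math.floor((MAX_TX_SIZE - overheadBytes - outputBytes) / INPUT_SIZE)
}

console.log(maxInputsFor(1000))   // about 6185

So a 1000-output map transaction has room for roughly 6185 inputs, and the 7154-input one was always going to blow past the limit; capping the input count is what the condensed fix above amounts to, as far as I can tell.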
1993-July thread: https://bico.media/15dbfa08f946abee4ddb80dd33e446dd2fd7a64ef550eeb5cccbd2cb4... subject: https://bico.media/0f8c72317153e8f8445531e9ba7a7f396de8147de494022bdbe6d64db... author: https://bico.media/c524d0b0c6cbcaf79da488b3139722defeaa02bbe12611189f1e5eab7... date: https://bico.media/aa591274a0e1903a8d498148860d4f28a1a384155302c7c109f28c48f...

At some point I'd like to also include the list index page that lists all the months, and the listinfo page that describes the list. I'm kind of burnt out at this point.

I checked the account and it said more than $400 had been transferred through it. I wonder if that's some glitch, or if somebody put a lot of money in and then took it out or something.
2020-September author https://bico.media/fedd0a8368dd68ae495279094435391f0e13291866af7a8a26aa18202... date https://bico.media/bd7fb31a5d7e685fcba3892fd28a7e4f7cc35c57576e7a7812a68746e... subject https://bico.media/4fe2cc266634e04401d27e366529b83c1f61cecf7767ab53f4b426dcc... thread https://bico.media/a41c50edfa8fc0c46d0f46ae82ac8c65e9f925f5c5a731006cb421318...
I'm really struggling to do this. And I'm pretty certain that people already have.
On Mon, Dec 7, 2020 at 11:26 AM Karl <gmkarl@gmail.com> wrote:
Noting also that the 'raw' emails I included from the gzipped bundles do not include most of the headers. Instead, the .mbox file should be used.
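If anyone wants the complete messages, splitting a month's .mbox on its 'From ' separator lines is most of the work. A naive sketch: the filename is just an example, and a real parser would also handle '>From ' escaping and such.

    const fs = require('fs');

    // Naive mbox splitting: each message starts at a line beginning with "From ".
    const mbox = fs.readFileSync('2020-September.mbox', 'utf8'); // example filename
    const messages = mbox.split(/\n(?=From )/).filter(m => m.trim().length);

    console.log(messages.length, 'messages');
    console.log(messages[0].split('\n').slice(0, 6).join('\n')); // headers of the first one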
I would rather work on livestream archival, which is easy to extend to a mailing list by simply piping the mbox file in and out. I have existing projects, including https://github.com/xloem/openrealrecord . Back when I had a little money, I hired somebody to help with openrealrecord and paid them over $1000; they completed the work except for one final small part, and never took the money despite me releasing it. It is still sitting on Bountysource, years later. Nowadays there are C++ and Rust ports of the Dat project, which was the backend of openrealrecord.