[OT] [mess] Blockchaining Cypherpunks (was Re: <robotic attempts to forcefully influence things>)

Karl gmkarl at gmail.com
Sat Dec 5 14:01:13 PST 2020


TLDR: I didn't get very far today, but it was lots of fun to spend the
whole day trying.  Not sure how well I can repeat it.

It's 2020-12-05T08:26:11-0500, and I'm looking for a recording of
Bach's Well-Tempered Clavier, semi-oppressive music I've listened to a
lot, that helps my brain do more things.  It's semi-oppressive because
its feelings share history mostly from old white male europeans, who
are already sharing most of the history.

I had trouble doing anything but editing my expression above.  It's 08:28.

I'm dropping the goal of finding the music because I have too many
things on my plate now, struggling to resist my experiences.

Let's trim this quote below:

On 12/4/20, Karl <gmkarl at gmail.com> wrote:
> On Thu, Dec 3, 2020, 7:04 PM professor rat <pro2rat at yahoo.com.au> wrote:
>
>> Coderman was just being kind and gentle with Karl, who is feeling fragile
>> and painful.  He is _not_ well...."
>>
<snip> This is actually someone else PR was quoting, but I know he is
also feeling it somewhere in his heart, too.

>> The technical term for that is ' collaboration ' .

<snip>

> I tried out storing a month of the cpunks list on a blockchain, which would
> cost about dollar to store in plaintext if done more efficiently than I
> did.  This system is more $5 a month.  Here are the links:
>
> 2020-September
> author
> https://bico.media/fedd0a8368dd68ae495279094435391f0e13291866af7a8a26aa182028af2df6
> date
> https://bico.media/bd7fb31a5d7e685fcba3892fd28a7e4f7cc35c57576e7a7812a68746e48c15f1
> subject
> https://bico.media/4fe2cc266634e04401d27e366529b83c1f61cecf7767ab53f4b426dcc0590970
> thread
> https://bico.media/a41c50edfa8fc0c46d0f46ae82ac8c65e9f925f5c5a731006cb421318cd524e6
>
> https://github.com/xloem/cpunks.org-bsv-partial.git
>
> Below is how I did it.  I wrote everything I did into this ridiculously
> huge email, and somehow got somewhere.
<snip>

Arright!

It's 08:30 .  I apologise for spamming your list, but it seems to be
letting me store a lot of things on a blockchain, and having so much
amnesia I really, really love to do that.

I'd also like to take some time to express the value for other
conversations going on in this list.  I would love to read them and
express myself in ways that respect them, but I am being selfish, and
working with blockchains instead.  I know they can help my memory
issues.

It's 08:31 and I'm daydreaming about frustration around repeatedly
destroying all my own records.  Let's make a TODO list:

- music? [please-decide: from phone or from computer] [i-pick phone]
[okay it's on the queue]
- add attachments to mailing list uploading
- upgrade bsvup so
[here i got inhibited.  I have food and I am eating it.  I took a
shower and did some laundry this morning, which is incredibly rare for
me.  It's 08:33 .]

- music from phone?
- upgrade bsvup so transaction confirmations are verified. note the
current version so as to return if the new version is broken.
- add attachments to mailing list uploading.

Note: over the course of the night, my tmux session terminated on its
own.  I had some things open that are semi-lost now.  I work in 1 tmux
session.  [having some difficulties controlling text editing.  it's
08:35]

Forgot goal.  Upgrade bsvd -- no.  Find current bsvup version.  Open a
notes pane and place it in.

/usr/lib/node_modules/bsvup/package.json will have this information.

Yesterday I was using bsvup version 1.6.8 .  The npm integrity hash is
sha512-3ov9XVuNmZ34EWsxHnU/zQFpqGvn5U7pDCFoxpk2k7aRSIOyVOmplRWGAL+/celTRdkfL82GZZWjbzuk7pUU+g==
.  The tarball shasum is 5d9017204005a692cce247c7cc33ec834ff98ad2  and
its url is https://registry.npmjs.org/bsvup/-/bsvup-1.6.8.tgz .

Okay!  Let's upgrade bsvup.  First, I'm typing this on my computer now
somehow, instead of on my phone.  I use gmail's basic HTML view, so
that my system freezes and misbehaves at a rate that increases more
slowly.  But I'll want automatic draft saving, so I'll switch to
standard view.

[blips out of existence while saving draft]

[blips back].  It's 08:38 .

Goal was .. upgrade bsvup!

I'm used to using bsvup from its git repo.  I'll use the latest
release from there, maybe.

sudo npm -g uninstall bsvup
cd ~/src/bsvup
git pull

git pull is just hanging, not sure why.

curl -v google.com

also hanging.  Implies no nameserver responses.  The wifi where I'm at
is high strength, and I just loaded the gmail javascript page fine.

google.com now loads; git says 'empty response from server'.  It's
hosted at github.com .

git pull

Already up to date.  Great!

git status

Dang I have a lot of unsaved changes.  I wonder what the latest
version is, and what they are.

git branch

This is the main branch, which is kindly being renamed from a slavery word.

I guess I'll just use npm's latest version for now.  I often need to
fix bugs though.

sudo npm install -g bsvup

I wanted to use my changes, which look harmless, but that seemed much
harder to continue to do.  It's 08:43.

Ohhhh it reinstalled version 1.6.8 .  This is so strange!  I added
transaction confirmations to bsvup so very long ago.  Maybe they're
not on npm?

git blame package.json

The 1.6.8 line is marked March 29, 2020.  It should have this, I think?
 I don't remember?  It's 08:44

Inhibited.  Looking for music on my phone.

It's 08:45 .  I plugged my headphones into my phone, but I'm not sure
how to play music on it.  There must be a music app.  Back to bsvup.

Goal: transaction confirmation.  Check git history from master branch
to see what date it was merged or whatnot.

grep -r onfirm . | grep -v node_modules

No helpful results.

Maybe I can find the branch where I dev'd it.

git branch

Don't see anything helpful.

K.  Goal: transaction confirmation changes.  It's 08:47 .  I remember
.... that I added this ... to a loop that waits.  It relates to ....
unconfirmed transactions.
I changed the unconfirmed transaction mechanism to use a subfolder
instead of a json file, so transaction files could be moved between
subfolders etc.  I'm not certain that this change was in my
contribution.  These subfolders are managed in cache.js .

less cache.js

I recognise the loadTXList here as something I likely added for this.
It's a function, my brain dropped that word.

The saveUnbroadcast function here looks like it includes the changes I
remember.  This is not how 1.6.8 was behaving yesterday.  It was
saving unbroadcasted transactions in a single .json file.  I review
the function again, to verify it doesn't do that.  It's 08:50 .  Yes,
it saves them individually in an 'unbroadcasted' folder.  I copied the
morphemes used by the Chinese project owner.

git blame cache.js # it's 08:50, still

My changes are dated 05-10 .  So there simply hasn't been a release
with them, yet.  If I added a package.json file to the uploader repo,
I could have it reference the git branch, and use that.  That'll be
the goal for now I suppose, so we get transaction confirmations.
Inhibited.  08:51

Inner dialogue.  Inner emotional landscape.  Believing you are mind
controlled can be rough.  It's 08:52 .

The spirit of open source is to contribute.  Rather than forking bsvup
and making a new release, we hope that the project owner can find the
harmony to make one, or to at least ask for a new maintainer.  They
invited me to co-maintain the project but I think I was too slow
following up on it or understanding it, and I think I am not a
maintainer anymore.  At least, I don't have their npm account to cut
an npm release at this time.

Right now, let's set up the git r -- let's plan to set up the git
repository to reference a git commit id for bsvup.  First let's test
one of them.  I wonder if npm can install a commit id?

sudo npm install -g
https://github.com/monkeylord/bsvup.git\#e55c90afab674c3eb14f9ecd2a50632fcd3b9472

Hey, that seems to be working great!  I have it written above if it
works and I want to add it to the readme.  It's 08:55.  This person
would probably make a release if I asked them.

k.  Let's test the new bsvup.  Ummmmm if I reupload the same folder
from yesterday, it should say that no files changed, I hope?

While it checks all those files I'm going to continue reviewing my
bsvup changes, to see if I can get the repository in order for
handling new bugs I encounter.

- I've changed the transfer function, so that the fee rate can be set.
It also reports how many utxos are bound together; that's probably
temporary for debugging.
  This looks like a mistake in the fee implementation, missed when it
was improved.  Should be turned into a small PR.  This is in api.js
and cli.js .
- I've added an option -e, --rename to provide for uploading single
files with different names than they have locally.  Calls
logic.prepareUpload in cli.js .
  The rename changes in logic.js aren't easy for me to verify as being correct.

I think I'll separate the two changes and push them as branches.  I'll
try using the transfer fee one as a norm and submit it once I learn it
works.

It's 0901.  Inhibited again.  More inner landscapes.  Eating food as a
distraction.  Maybe I can find a music app installed on my phone
somewhere.  Haven't found yet.  Back to workplay.  0902.

Whoops!  Suddenly very thirsty.  Water's right here.  Okay.  0903.
NOW.  GOAL.  What is goal.  Don't remember.

Relates to bsvup.  Repo is messy.  Split changes out while test runs.

git stash # to save them in case i make some error
git stash apply # to bring them back
git checkout -b # uhhhh .......

looking for what-i-am-doing to make branch name.  What am I doing?
Working with 2 changes.  Want to start with smaller one.  Is described
above.  First item above.  Need name for.  It's 0905.

Sets fee rate for transfer function.  transfer-fee will be branch name.

git checkout -b transfer-fee
git diff # to see changes

I keep git diff open in one pane, and remove the unrelated changes in
another tmux pane.  Tmux was hard to learn (i was used to screen), but
it can do
vertical splits much more readily.  It's also much higher level and
seems buggier and sloppier to me.  I would prefer to use the vsplit
patch to screen, in retrospect.

Hey, the test run finished.  It is trying to upload a ton of existing
files =/  so fixing the existing function would really help here.

Reconsidering approach.  I think to contribute the most, I'm going to
skip this task for now and switch to adding attachments.  Can upload
special subfolders to work around the issue of existing.  I can also
add the tx files to git-annex so that others can reuse them.

First I will check the transactions for unconfirmed.  I _think_ I can
do that, with this new bsvup version, by moving them from .bsv/tx to
.bsv/unconfirmed.  Checking the source to see if that's right.  No,
it's .bsv/unbroadcasted .  I'll move them there and see what it
thinks.

mkdir .bsv/unbroadcasted
mv .bsv/tx/* .bsv/unbroadcasted -v
bsvup

Now bsvup prompts to resume the unbroadcasted transactions.  Let's see
how it goes.

They are all failing with missing inputs.  This is probably a new
server api quirk for transactions that are already onchain.  The
inputs are not in some pool of unspent transactions.  Blargh.  My
confirmations changes aren't working any more.

Okay, I'm just going to push my present changes to a branch that has
them merged together, to store them somewhere.

git checkout -b transfer-fee-and- # blargh ... what is the other
change?  let's scroll up. renaming.  it's 0911.
git checkout -b transfer-fee-and-rename
git add *.js
git commit -m 'changes found in my dir'
git push xloem
git checkout master

K, nice and clean to figure out this confirmation issue.  I'll search
for the error message.  'the transaction was rejected by network
rules' .  Not present, must be generated by server.

Inhibition.  It's 09:13.  'MatterCloud API return Errors'.  I'll
search for that.  Happens in api.js , in broadcastInsight function.
This function has a lot of legacy content because the api provider
keeps on changing things and available energy is small.

Looks like I have unsaved api.js changes.  Those should go in the
transfer-fee-and-rename branch.  I noticed my vision was double and
converged my eyes, which is nice.

git checkout transfer-fee-and-rename
vi api.js
(R)ecover
:wq
git diff api.js

No changes.

rm .api.js.swp
git checkout master
vi api.js

Back to workplay.

K.  Refresh goal.  It's 0917.  Trying to figure out how to detect
confirmed transactions with api changes.  Finding what now ....
broadcastInsight.
broadcastInsight is called from the broadcast() function.  I've done
this before so the parts have familiarity.  It's still hard to find
them.
broadcast is called from tryBroadcastAll .  I vaguely recall that a
merge here broke some logic once.  I don't know if that's been
addressed.

Here's the issue.  It's still using BitDB to check if the transaction
exists, and BitDB is that same service the maintainer deprecated.
Obviously the right solution
is to connect to an actual node.  There's even a node API service used
by electrum, where you can connect to many api servers that use all
the same api protocol.  But they don't offer the bitquery service that
bitdb and bitbus do, which are open source.

It doesn't matter what to do here.  I'll try bitbus since bsvup uses
bitdb in other places, too.

I'll try to change _just_ the exists function, to use the abstraction
template I added a while ago, that isn't linked in yet.  I'll also
actually contact the owner of bitdb, and ask them to reboot it.
They're in some 'atlantis' chat ....  websearching 'atlantis bitdb' i
find that it's a Slack.  I bet it's already open if I go to Slack.
It's 0921.

Slack has heavy javascript so I'll use it on my phone which is less
critical than this laptop atm.

I have unread messages from the polyglot maintainer.  There's value
around using electrumsv-sdk .  It's 0923; I have a lot of thoughts
bouncing in.

Contact person who runs bitdb.  Maybe in general chat.  Find their
name to tag them in a message.

Haha there is a recent big long thread from KarlTheProgrammer, another
one like me, but they are expressing criticism that doesn't seem
contextual, which is weird.  Yeah, the chain is full of thousands of
spammy transactions.  It's because we can barely code anything.
Assistance certainly welcome.

This person hasn't sent a message for a while.  I suddenly remember
their name.  I ping them in a new message.  "is there any chance you
could give bitdb a reboot so legacy stuff can get some requests
through?  I am working on upgrading to bitbus, electrumsv, local
nodes, but it is very hard for me; I don't understand the code well.
It would be nice to get a few more transactions through."

It's 0928.  Next task: move the existence-checking function to use the
abstraction I already made.

I'll want to copy the findTx function from bitdb.js, and mutate it for
bitbus.  I did some of this mutation already yesterday for bitfiles,
which still has similar code.

The key bit might be the queryTx function.  This generates a bitquery
request for a single tx id.  bitbus has slightly different bitquery
norms than bitdb.  I'll look up what they are.

It's 09:32 and I'm reading the bitbus query doc at doc.bitbus.network
.  It looks like the "v" version field is not needed.  It mostly links
straight to the bitquery documentation.

That wasn't helpful.  Looks like the bitquery limitations in bitbus are
listed farther down in bitbus's docs.  It's 0935.  It supports "q"
find, limit, sort, and project.  "r" queries and aggregate queries are
not supported.  This should be fine for txExist.  Yes, queryTx doesn't
use "r".  This should go smoothly!  0937.

Getting really pushy, attempts to use my working memory.  It's 09:39.
Relaxing, letting the urge wipe everything just a little.  It's all
nearby on my screen.

Working on making bitbus work to check transaction existence in
monkeylord's thingie ... bsvup.  Ongoing goal: find bitbus usage
within

Inner landscape stuff.  It's rough, believing in mind control.
Distractions.  It's 09:40.

Ongoing goal: find bitbus usage within bitfiles, copy method of
sending request.  Here we are.  It's 09:41 and I'm copying the request
headers over.

I manually set my tabs to 2 spaces.  It's important to copy the style
of an existing computer program, when working on it.

It uses node-fetch.  I want to send a post request, so I'll look up
how to do that with node-fetch.. Pass { 'method': 'POST', 'body': data
}.  09:43.

Existing usage calls an asynchronous function to convert to json
(await result.json()).  Let's see how I can skip over that in
node-fetch, since bitbus is ndjson that wouldn't parse.  await
result.text() .  Great.
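
A minimal sketch of that request, assuming a bitbus endpoint and a token
header (the URL and header names here are placeholders, not necessarily what
I ended up using):

  const fetch = require('node-fetch')

  // POST the bitquery as JSON and read the raw text back, since bitbus
  // replies with ndjson that result.json() would choke on.
  async function bitbusRequest (query, token) {
    const result = await fetch('https://txo.bitbus.network/block', {
      method: 'POST',
      headers: { 'Content-Type': 'application/json', token: token },
      body: JSON.stringify(query)
    })
    return result.text()
  }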

0946.  I've made the bitbus function that mirrors bitdb and am trying
to verify that it is what I meant to do.  I'm having trouble focusing.

Water.  More inhibition.  Inner landscape.  More water.   Back to workplay.

The function I made should be called .............. txQuery?  I think
there's a txExists function below it.  Let's see.  Nope, findTx.  Now,
add findExist.  Uses a queryHash function that I'd better move over.
queryHash uses 'r' queries, unfortunately.  Maybe I can change the
abstraction path to skip this for now.

queryHash looks kind of important.  bsvup uses a hack that looks maybe
like it was made to quickly handle existence checking without
reviewing the whole ecosystem of technologies.  It names the files
based on their checksums, and looks for files with those names to see
if they already exist.  Is that what we want here?

It's vulnerable to people uploading fake transactions.  So it should
check that the data matches the hash.  That's not too hard, but not a
priority atm.

The 'r' filter shows the fields that are expected to be transmuted.
We can move that behavior outside of the query, and do it in
javascript.

The hash used is sha1.  I have enough information now to migrate the
whole function, and the way it is done can inform migrating the rest
later.
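
Checking the content against the hash is simple enough in node; a sketch,
with names that are mine rather than bsvup's:

  const crypto = require('crypto')

  // Recompute the sha1 of the downloaded transaction data and compare it to
  // the checksum the file was named with.
  function dataMatchesSha1 (databuf, expectedSha1) {
    const actual = crypto.createHash('sha1').update(databuf).digest('hex')
    return actual === expectedSha1
  }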

It's 09:55 and I'm partway through the migration of the function.
Inhibition.  Inner landscape, briefly.  More inner landscape.
Internal issue referenced with frustration.  Inner healing !  Brief.
Sense of everything changing.  Fits of laughter.  Distraction.

It's 09:56.  Inner learning situation.  We take some time to shift our
attitude, to be more respectful and meaningful.  Brief meditation.

Distraction.  Inhibition.  Laughter.  It's 09:57.  I have a sense of
learning what is truly important in the world.  I try to hold it to
the side while continuing to develop software.  It's 09:58.

I feel much better but it's hard to act.  What am I working on?
Migrating findExist.  I copied the 'r' query portions to turn them
into javascript code.  Copying them down into a loop.  The 'r' and
'r2' are handled differently; here's this protocol shift again.  Need
a different loop for each query result.

Inhibition.  10:00a.

10:01a.  Back to workplay!

10:04 .  Struggling to continue.

I should limit the returned data using a 'project' field, to manage bandwidth.

I see the query is using a '$or' field.  This might be 'aggregate'
which is what it doesn't support.  I'll check.  Maybe not.  10:08.

I think I'll continue by testing it.  I'll need an sha1sum of an
uploaded file, and a txid.  I guess I might as well put it in the
abstraction, and test with that?  Either one.

oh wow the abstraction is better than my copying.  Just test it for now.

1013.  More inhibition.  Testing I find there's a blank line at the
end that is crashing when turning to json.  Probably normal for
ndjson.  Gotta remove that.
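
The fix is just to drop blank lines before parsing; a sketch of handling the
ndjson text:

  // Split the ndjson response into lines, skip blank ones (including the
  // trailing newline that was crashing the parse), and parse each line.
  function parseNdjson (text) {
    return text
      .split('\n')
      .filter(line => line.trim() !== '')
      .map(line => JSON.parse(line))
  }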

1015.  Froze my debug shell with an infinite loop, somehow.  When I
killed the process my ctrl-Ds went through and killed the terminal
too.

1018.  bitbus.findTx returns correct data for a present transaction.
It crashes for a nonpresent one.

1021.  bitbus.findTx passes both adhoc tests.  Looking for an sha1sum.

It takes data, not an sha1sum.  Need data to test.  Maybe copy file
into working dir and open.

1024 findExist fails to find existing file.  I'll check the tx
manually, and see if it should succeed.

1025 The hash is being calculated correctly inside the function.

1027 I can see this hash is also correctly inside the transaction at
https://blockchair.com/bitcoin-sv/address/d-c0c871976a2935de5a1ad2a01ad7f638
.

So, I'm probably sending a bad request to bitbus.  I can debug that
with curl, I suppose.

1029.

1031.  Still trying to craft curl request.  Inner landscape stuff.  It
is rough believing in mind control and human trafficking.  Why,
believing in human trafficking can get you very worried.  Luckily we
have beautiful sunrises.

1036.  My curl request is returning the correct data, so the error is
likely in the processing of the results, or a typo that is hard to see
or something.  My inner landscape wants to hire me to work in a secret
hacker thing, rather than me working on this.  It is rough believing
in coercion.  Sunrises.

1037. 1038. Distraction.  Goal: output result of query, so we can
bisect where the issue is.  By bisect I mean dividing an array in
the middle, when it's sorted.

1039.  Found what is likely the issue by inspection.
1041.  Fixed another error that is basically the same thing (leaving 2
out when copying a block of code, skipping the cognitive difficulties
of abstracting their similarity out)
1042.  Fixed another error of exactly the same kind, which I fixed and
then put back while fixing the last.

1042.  findExist works correctly now.

Whew.  findExist returns the txid of transactions that could contain
passed data, since they have the sha1 tag.  So downloading those
transactions and checking the content could be done outside the
function if needed, I suppose.  Maybe I'll rename it to
findMightContain in bitbus.

10:44 it is snowing.  Snow break.

Laundry moved to dryer.  Peed.  1048.

There's more laundry.  It's okay.

K.   I made bitbus functions that emulate bitdb functions.  I guess,
for the project, I should make the 1 remaining bitdb function a bitbus
function.

I mentioned karma to Zenaan earlier.  When I rename findExist to
findMightContain, that's karmic motion.  It lets people learn about
and act on the danger held in the error inside findExist.  Gives it
more avenues than mysterious exploitation and confusion.

With cultural change and such, the stuff like findMightContain gets
more meta, where people act on pushing influence around because of
their shit, instead of just producing the influence directly from the
shit.  There's a lot of that now it seems.

It's 1054 and I have no idea what I am talking about.

It's 1059.  I found a long-standing bug inside the implementation
where D history is misordered.  Fixing by making sorting more complex.
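
I'm not showing the actual change here, but as a rough idea of "more complex
sorting" (purely illustrative, my guess at the fix; the blk.i field is from
the bitquery results used elsewhere in this log): order the D records by
block height, with unconfirmed ones at the end.

  // Illustrative only: sort D records oldest-first by block index, pushing
  // records without a block (unconfirmed) to the end.
  function sortDRecords (records) {
    return records.slice().sort((a, b) => {
      const ai = a.blk ? a.blk.i : Number.MAX_SAFE_INTEGER
      const bi = b.blk ? b.blk.i : Number.MAX_SAFE_INTEGER
      return ai - bi
    })
  }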

I am unsure that my change is helpful.  Maybe the bug is a feature.
It's 11:01.  Also, make up your own karmic meaning.  Mine is made up.

It's 11:02.

11:04.  the final function, findD, is working correctly for missing
files, multiply-uploaded files, and single files.

I'll enable one abstraction and just migrate all the other uses of
bitdb straight over.

Hmm.  This looks like it is making the project a little more complex,
when it is already spaghettid, because it is not enough refactoring to
reach the point of increased organization.  Unfortunate.  But it's
helpful to introduce abstractions =S

Where do I use this in the code.  Scrolling up.  It's 11:10 .

Scrolling up is confusing.  Let's see.  Trying to resolve findExist
not working.  Let's see where that's used, and change it.

1127.  I ended up moving all the new api funcs into the abstraction.
They are now working using a pluggable backend handler.

1131.  I'm installing my dev folder to see if it works.  I don't
usually install things to test them.

1132. Didn't work.  Running the script by path like normal.

1133. It's confirming transactions great =)  I guess I'll remove the
old functions from the files, so they can use the centralised
abstraction.

1139 I pushed my bsvup changes to github.

1143 I submitted a PR at https://github.com/monkeylord/bsvup/pull/31

Additionally, the new script verified that all uploaded transactions
were accepted into the blockchain.

1144 I'm attempting the reupload again, and this time it is correctly
skipping every file because the file is already on-chain.

Remind myself of goal: attachment data.

1145 wrote something and erased it around pushing myself harder.

1147 I realized to not "make the karma bit explode into harm", I
should modify the output of the script so it reports to the user that
the file it finds could have been spoofed in some subtle way.

1152 thinking on how so many problems would be resolved if we would
share important information better on a huge scale.  similar to the
sha1 thing.  trying to focus on workplay.  next step was ....
attachment data!  I can look at an email with an attachment to see
where that is.

1154 the bsvup reupload finished.  it says the four nonmutated indices
don't have path records on the chain, but do have their data uploaded.
I'm sure I caused that somehow, but I don't know how that would happen
unless they were uploaded with the wrong paths.  Maybe it is because I
removed their txs from the tx folder?  Did I do that?  I think I
removed other txs, not those.

It's generally some human error in things like that.  I'm going to
cancel the upload and focus on attachments.

1156 I haven't downloaded the attachments.  I don't know whether they
are linked if a spider crawls from the front page, so I'll add
mirroring them to the download script.
1201 my wget spider didn't download any attachments and has already
found the big mbox file.  i can download them manually to test
uploading and linking them from the existing script. (i just did
download the september 2020 ones manually)

1205 bsvup is processing uploading my attachments subfolder, which is
slow because the spider downloaded an index file for every single day.
while it does that, i'd like to figure out why the four index files
aren't found with the right D (path) records.

how to do this?  how about i migrate the bitbus changes from bsvup,
over to bitfiles.  i'm used to its interface.

blargh bitfiles has different structure.  maybe I can do a manual
bitquery to find those files, instead.  i can copy the query from that
used by bsvup, and maybe there is an online explorer or i can use the
nodejs shell.  the sha1 is in one specific field; it will be the
'genesis' form of the protocol.

1210.  Inner landscapes.  It is scary to believe in human trafficking,
mind control, and coercion.  Very scary.  But all we are dealing with
is a small debugging task.

1212.  I am sitting blankly, trying to focus on doing a task.  I could
use something to look at to remind me what it is, I think.  I don't
remember what it is.
I am trying to move bsvup-source near wrong-or-missing-D-records of 4
original index files.  That is, I'm trying to ... find the part of
bsvup, in bitbus.js, where
it ... makes a D record query.

I already have bitbus.js open!

Oh no.  I moved the query from here.  It is in ... backends.js .  Here it is.

This is the 'genesis' format query (they renamed the bsv version to
'genesis' when they changed the protocol).  {
    'v': 3,
    'q': {
      'find': {
        'in.e.a': address || undefined,
        'out.s3': key || undefined,
        'out.s2': '19iG3WTYSsbyos3uJ733yK4zEioi1FesNU'
      },
      'project': {
        'out.s3': 1,
        'out.s4': 1,
        'out.s5': 1,
        'out.s6': 1,
        'in.e.a': 1,
        'blk.i': 1
      }
    }
  }

It's hard to format concisely and stay literally-transparent.  Ohhh
because I need to think about the 'project' parts to figure out how to
mutate the 'find' parts.
Okay.  I guess we'll be setting 'in.e.a', the address, but leaving
'out.s3' unset so it can find the right key, and then I'll want to set
the transaction link, the content link, to the right value.  I'll
check bsvup's generation code to see what that is.

That's in uploadDTask in logic.js?  No, that's just a precursor.
Maybe this is it, in pendTasks.  Oh, it's in txutil.js, as buildDOut.
txUtil.js, capital U.
Looks like the 'value' field holds the transaction id.  I'll see how
it's encoded, probably just however bsv transactions are encoded.
Stored in the filedatum as dValue.  It seems highly likely that it's
the ascii hex transaction id.  Okay, which data index thingy is it.
It's the one right after the D identifier and the key (path).  It's
the 3rd.  Arright, scrolling up to look.  That would make it s4.
Let's craft this.

{
    'v': 3,
    'q': {
      'find': {
        'in.e.a': address,
        'out.s2': '19iG3WTYSsbyos3uJ733yK4zEioi1FesNU',
        'out.s4': datatxid
      }
    }
  }

now I just need the datatxid.  I can use the findMightExist function I
migrated, from the nodejs shell.  I need the content of the files in
question.

1223.  Inner landscapes.  I wonder what we need to be able to do this
stuff without typing it all into an email to the cypherpunks list.

1224.  More distractions.

1224. I'm experiencing fake pain in my right eye.  My right eye
actually has severe pain from an eye issue I have, but I tend not to
feel that pain.  I am instead feeling kind of tame, irritating pain.
I want to erase 'kind of tame' so I do not get worse pain.

1225.  Navigating inner landscapes.  Goal repetition?  What's the
goal?  I don't remember and turn away when I look.  What is it?  Using
the findMightExist function to manually find the id of the four
original index files on the blockchain!

[user at localhost bsvup]$ node
> backends = require('./backends.js')

Okay, that's open.  I'll want to move those 4 files into that bsvup
folder to open 'em easy.

1227.  We get to lubricate our eye!  Yay!  If I lubricate my eye 5
times a day for a week, it stops hurting.  Unfortunately when I build
the habit I forget it.  But that can change.

1229.  Back to workplay.

Noticed my attachment upload terminated due to insufficient balance,
which is normal.

The proposed additional payment of 6 million satoshis is reasonable.
Would be nice if it said the total cost.

1232.  Inner landscape started growing jokes.   These can grow big
because they are so much nicer than other stuff.  "We have reports of
brain damage, what do we do?"  "Is the brain damage caused by freaking
out about things too much?" "ummm ... That is the rumor, yes."
[people start freaking out about how to handle it]  "It is an
emergency to go back to workplay!  People are trying to manipulate us
not to, by calling things emergencies!"

1234.  After wildly tapping bright things on my phone as the screen
jumps around, I make it to my bittrex holdings.

1236.  I withdrew around $5 of bsv.  Bittrex is working better for me
than usual, which is great.  Worked better yesterday too, once I got
signed in.

I'll start the upload now so that it doesn't add too many files from
the spider that's still running, and found the attachments subfolders,
but is still just downloading indices.

1237.  Back to ... uh ... manually checking something to do with node.
Oh, I need to copy the files in.

1239!  Copied em!

Luckily, node is one of the interpreters that stores command history,
so I don't have to remember how to open and read from a file.

1240.  Now I get to paste in that bitquery.  Oh, no, first I need the
txid, I see after scrolling up.  Ummm I find that with
findMightExist.
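
Roughly what I'm about to do in the node shell (the exact signature and home
of findMightExist is from memory, so treat this as a sketch, not the real
call):

  const fs = require('fs')
  const backends = require('./backends.js')

  // Read one of the original index files and ask which transactions might
  // already hold its data.
  async function candidateTxs (path) {
    const buf = fs.readFileSync(path)
    return backends.findMightExist(buf)
  }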

1242.  Inner landscapes.  We're trying to think about whether or not
this is what we want to be doing right now.  But our attempts to
relate data internally are getting harmed, so we have a strong habit
of storing things externally.  We don't mean to be spamming this list,
and it's really inhibiting of our cognition to have that be the way to
do something we prefer.  1243.  Back to workplay.

1244.  Inner karls.  Karl says, we can do something else if we want
to.  But Karl likes developing behaviors around uploading things to
blockchains, because he has suffered so much around loss of records
and memories.  If you can help us work with fewer interruptions, that
would be great.  1244 Inner landscapes.  We have lubricated karl's
eye, sent him to the bathroom, and he has eaten and drank.  His
laundry is in the wash, but could use more transferring.  How about we
do laundry the next time an interruption grows big.  1245.  Back to
workplay.
1245.  Inner landscapes.   There is a way to force him to urgently
pee.  I'm building it in support of laundry.  It makes him get up.
1245.  Back to workplay.

1246.  I've loaded all the transaction ids for the files into a nodejs
object, to find their keys.  Each has been uploaded twice already!
That means finding the keys of multiple files.  I bet javascript has
an array mapping function.

1248.  Inner landscapes.  I wonder what we want to let come out typing.
1248.  I am trying to map arrays.  What I am trying to do to them.  I
want to pass them to some kind of backends.js function that finds
their D keys.  Oh!  I need the bitquery object now.  Json.  Object in
javascript.

Helpful to put it on one line.  { 'v': 3, 'q': { 'find': { 'in.e.a':
address, 'out.s2': '19iG3WTYSsbyos3uJ733yK4zEioi1FesNU', 'out.s4':
datatxid } } }

{ 'v': 3, 'q': { 'find': { 'in.e.a': address, 'out.s2':
'19iG3WTYSsbyos3uJ733yK4zEioi1FesNU', 'out.s4': record.txid } } }

Looks like that's not valid javascript.  Gotta remember how to make
javascript objects.

Works with some fiddling.  I'm going to try to open a notes pane and
migrate some of this there.  Made it at 12:53.

The files are correctly on the blockchain, it looks to me, and I'm
having trouble staying in the notes pane.  It's 12:56.

The transactions
a09af99ae4d381999388845f2a1293ae9fb3828e4a86c8d105f2bdd044f58076 and
59ce4653f4840b7ce3546771877a6c297fc7e4ee7279e3469bcb1d86ac340fe0 have
out[0].s3 = lists.cpunks.org/pipermail/cypherpunks/2020-September/thread.html
... computer glitch . Anyway, that's one of the paths that is
mysteriously reuploading.

13:00.  My body is trying to do things that move me away from the
computer.  Goal I am holding is to run a test reupload (and to not
send it due to doublespends with existing upload in progress), and
debug why this file is not found to exist.

13:02 .  I'm getting ENOENT trying to upload the test file.  It
exists.  I must have made a typo I am missing.
13:03 .  Still getting ENOENT even though autocompletion provided the
file.  Looks like the failed system call is 'stat'.  Maybe I can
'stat' it manually.
13:03 .  stat works fine on the path, size is 101088 bytes. inode is 219152420.

I guess I'll strace bsvup.

=) looks like it's a bsvup bug.  It's trying to read from the basename
of the file, with the same cwd.  What a relief!

13:07.  Inner landscapes. I'm guessing the bug has to do with
uploading a file instead of a folder, so I'll just upload the folder
to handle my urgency.

13:10.  Inner landscapes.  The harmful processes in the mind landscape
have proven to themselves that they can stop this within a reliable
timeframe and are taking action.  What do we do?  [ummmm I'll come
back to the cypherpunks email instead of the notes file.  Let's just
write down things that cause us to stop.] Ok.
13:11 "back to workplay"
13:11 urge to pee, disappears upon writing
13:11 incredible itching in an area of our face where the nerves have
been cut to from surgery, and we cannot feel

I am writing a progress note in my notes pane, rather than here.  It is 13:12.
13:13 random body posturing [themes of authority]

13:17 .  I fixed the bsvup check-for-exists edgecase.  After fixing
it, the test runs much much faster, and I don't know why, maybe
because fewer checks are failing.
13:19 .  realized my fix was in error

13:21 . I discovered that inside bsvup, it may actually check the
content data.  Updating findMightExist names.

13:23 .  Inner landscapes.  They are using an abstract, smooth shape
of urges to cause us to stop.  It goes through our choices and
decision making processes, and small parts of our working memory.  It
develops strength, gains a foothold by manipulating the choices to
choose to support it, and continues to spread.

We'll monologue about it in notes if monologueing is needed.  The
monologue is summarisable as a distraction.  It's 13:24 .

13:25 .  Attempts to move towards the keyboard resulting in postured
stretching of the arms away from it, as if tired.

13:29 .  Lots of anxiety.  Random muscles tightly contracting across body.

I think I'll get food and water and visit the bathroom.  Maybe move
laundry over.

1339.  I am working on code to store messages, old messages that are
well preserved, on a blockchain.  I see blockchains as no more
dangerous than money, but I appreciate how they redistribute it.  I do
see that there are ecological problems, although I suspect the energy
use of blockchains compares reasonably to energy used in the
processing of conventional money.

1340.  The list to which I am drafting this email (an email I consider
spam) was writing about blockchains in their very infancy.  Blockchains use
the kind of cryptography that an independent cryptographer or hobbyist
would come up with.  Doing things that plan for changes in the future.

1341.  I've always been an independent kind of person myself.
Whenever I learned about something, I wanted to learn how it was made,
how it worked, how we could do it on our own.  There are _always_ a
_whole bunch_ of things inside the workings of something, that could
meet everyone's use of it way better if they were understood.  Using
tools made by distant researchers, that situation is incredibly strong
today.

1343.  I'm presently trying to debug some emergency blockchain
software.  I know it's emergency software because I'm involved in its
development, and it has one-of-a-kind aspects that fill gaps in what
is available.  I need this software.  I need this software to preserve
my memories and history, and those of countless others, in the face of
brainwashing around hiding of what is true.

1345.  I have a cursor, right now, at line 184 of api.js in the bsvup
project.  This place, like every other place, is a crucial place to
be.  With this cursor, it is possible for me to do work that heals the
world.

1347.  I am working with a karmic flow that bsvup holds.  When a file
is uploaded multiple times, this costs energy, money, and time.  The
software has a bug, where if a file is uploaded multiple times, it may
be uploaded _more_ times in the future, magnifying the error so that
people may find the cause of the issues they run into, because of it.
I have found part of this error.

1348.  I have personal experience producing similar errors.

1348.

I would like to prevent this error from magnifying.  I am happy to
add, in fact I would enjoy adding, a notice to the user of the issue.
The issue is unrepairable, but it represents a problem in the past,
and is part of the beauty of the blockchain.  It shows we were able to
store parts of our spirits in a way that lasts longer than our lives.
This was always true, but now it is visible, in front of our faces.
We have mathematics that give it proof, in a way we don't in other
areas.

1350.  Reviewing error again.

1351.  When working on this kind of software, you run into code, and
even write code, that handles many many many possible errors.  Like
building a wall of China, securing the behavior of part of an
algorithm means imagining all the things that could go wrong, all the
ways that people could come through, and blocking them.  This
magnifies the pain of the world, because we never learn the stories
that bring the harm to us.

Without knowing these stories, we will inevitably increase that pain,
because we cannot account for it in our decisions.

1354.  I am currently working on this small piece of code, or trying to:

  var txs = await Promise.all(records.map(record => getTX(record.txid)))
  var matchTXs = await Promise.race(txs.map(tx => {
    return new Promise(async (resolve, reject) => {
      var databuf = await getData(tx).catch(err => {
        log(` - TX Data not properly resolved. Error: ${err}`, logLevel.VERBOSE)
        return Buffer.alloc(0)
      })
      if (databuf.equals(buf)) resolve(tx)
      else reject(new Error('Not Matched'))
    })
  })).catch(err => {
    log(` - TX Data not properly resolved. Error: ${err}`, logLevel.VERBOSE)
    return null
  })
  if (matchTXs) {
    Cache.saveFileRecord(sha1, records)
    return matchTXs
  } else {
    return null
  }

1355.  The code returns a transaction for an existing file, if one is
found.  If many are found, it returns only one.  It would help me work
with the system, to have all the matching transactions, not just one.

  var txs = await Promise.all(records.map(record => getTX(record.txid)))

Just had computer glitches.  The above line retrieves the data for
every transaction found that matches the file.

  var matchTXs = await Promise.race(txs.map(tx => {
    return new Promise(async (resolve, reject) => {
      var databuf = await getData(tx).catch(err => {
        log(` - TX Data not properly resolved. Error: ${err}`, logLevel.VERBOSE)
        return Buffer.alloc(0)
      })
      if (databuf.equals(buf)) resolve(tx)
      else reject(new Error('Not Matched'))
    })
  })).catch(err => {
    log(` - TX Data not properly resolved. Error: ${err}`, logLevel.VERBOSE)
    return null
  })

The above segment is confusing to me.  It is a hybrid of multiple
coding styles.  What it does, is it condenses the list of transaction
data, into a single matching transaction, if any of them match.  I would
like to mutate it to return all matching transactions, instead of just
one.  It looks like it could be cleared up with some rewriting, but to
rewrite it safely I would need to understand everything it does, to
preserve its role where it is.  Like must be done with anything else
in the world one replaces.

1358.  Maybe the code can be mutated only a small way, to form an
array instead of a single value.  How is the line reject(new
Error('Not Matched')) handled?  Is this line ...

1359.  We are considering that monkeylord, the Chinese software
developer who wrote this program, could be in need of help.  We don't
know much about them.  We probably are in need of help, ourselves.

      else reject(new Error('Not Matched'))

1400.  This line produces an error when matching fails.  Where does
that error go?

1401.  It looks like it would go into Promise.race .  How does
Promise.race handle errors?
1402.  mozilla.org says that Promise.race resolves with the first
result found, whether it is an error or success.  I make a lot of
logical errors nowadays.  A way forward for me is considering whether
this is an error; whether a single record with wrong content, will
mark all records as mismatching.  I expect the error is in my
perception, not in the code, but I am likely to assume that it is in
the code.

      var databuf = await getData(tx).catch(err => {
        log(` - TX Data not properly resolved. Error: ${err}`, logLevel.VERBOSE)
        return Buffer.alloc(0)
      })

This chunk simply gets the data.  The extra complexity is to handle
when there is no data: it treats missing data as empty data.

1404.  I'm thinking about the await func().catch style.  This is much
more concise than a try{} block.  Hum.  Probably not as clear though.
Let's move on.

1405. We can simplify the segment:

  await Promise.race(txs.map(tx => {
    return new Promise(async (resolve, reject) => {
      var databuf = await getDataOrEmpty(tx)
      if (databuf.equals(buf)) resolve(tx)
      else reject(new Error('Not Matched'))
    })
  })).catch(err => {
    log(` - TX Data not properly resolved. Error: ${err}`, logLevel.VERBOSE)
    return null
  })

This is simpler.  I edited it a bit to fix a copy error.  1407.  Each
transaction becomes a promise, and every transaction has its data
retrieved in parallel.

1408.  I'm thinking about the choice to run these in parallel here.
For the use of seeing if the data is present, that's great.
But for the use of seeing if a D record is present, it is suboptimal.
The parallelism could be moved around the check for the D record, so
it could inform the race,
which would retain the speed. Or D records could be enumerated in
advance, and the information included in the racing.  This seems more
complicated.

So, the problem with removing the race is that the system would slow
down.  But the situation it handles crops up pretty rarely, so it's
reasonable to remove the race.

I'm inferring that the intention of the function is to simply discern
if the data exists, and is uninformed by the D records concern.

1410.  Let's try to rewrite it to include all matching transactions.

  await Promise.race(txs.map(tx => {
    return new Promise(async (resolve, reject) => {
      var databuf = await getDataOrEmpty(tx)
      if (databuf.equals(buf)) resolve(tx)
      else reject(new Error('Not Matched'))
    })
  })).catch(err => {
    log(` - TX Data not properly resolved. Error: ${err}`, logLevel.VERBOSE)
    return null
  })

We want to filter the transactions based on whether they match.
Matching is an asynchronous process.  Filter functions are usually
synchronous things.  txs is probably an array.  It would make sense to
change the 'race' to an 'all', and store null for mismatching data.
Then we could quickly filter the nulls out.  I'll do that.
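
A sketch of that rewrite, reusing the hypothetical getDataOrEmpty helper from
the simplified segment above (this sits inside the same async function as the
original code):

  // Resolve every transaction to either itself (data matches) or null, then
  // filter the nulls out, so all matching transactions are kept.
  var matchTXs = (await Promise.all(txs.map(async tx => {
    var databuf = await getDataOrEmpty(tx)
    return databuf.equals(buf) ? tx : null
  }))).filter(tx => tx !== null)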

1411.  Back to workplay.

1425.  I've fixed the error, and resolved a different problem relating
to my PR being more rapidly accepted than I thought it would be.  My
food bowl is empty, but I don't remember eating it.  The solution I
made does not include any of the concerns that were very present for
me when I was struggling to move forward on it.

1429. PR for matching issue at https://github.com/monkeylord/bsvup/pull/33 .

The attachments uploader is at 2016 =/ .  I'll send it some more money
and then start a process that covers a smaller folder area.

1431.  I had to unlock bittrex 3 times to get the screen.  Oops,
here's a fourth.  Looks like when the connection breaks it makes you
log in 2 times in a row, or something.  I have internet, but it says
'No connection' now, logged in.

1433.  I checked in termux that I do have internet.  I can download a
webpage with curl. Here's some more logins to bittrex.  Still says No
connection.  I'll just reboot the whole phone.

1434.  Reviewing my ongoing processes, the attachment download spider
has gotten into the subfolders and is downloading pdf files, txt
files, sig files, great stuff.  The uploader is still enumerating
folders and is up to 2017.

1436.  Phone rebooted.  Loading bittrex.  Only made me log in once!

1439.  I sent another $10 with bittrex.  That'll help if it wants to
upload scads of cruft files.  I like doing that much more now that
existing file checks are more robust.

1448.  Fixed another bug.  September attachments are enumerating.
Lots of cruft files from wget.   Now they are broadcasting.

The plan was to add links to attachments to individual messages, which
sounds pleasant.  Additionally, I'm also thinking on efficient
archival.  It would be good to just upload the data.  Really, I
imagine that I can't afford to archive the whole mailing list.  It
would still make sense to archive in a format a mail reader could use.
I wonder what the mbox format is.  A quick wikipediaing says that mbox
is a concatenation of raw emails.  If that's true, then the raw email
files could be stored as files, and the mbox files could be generated
based on them using a BCAT transaction to link them together.  That
sounds kind of fun.  But there's value to the web browser interface,
too, for sure.

1453.  Inner landscapes.  Current task is unsure between raw mailfile
stuff and user interface stuff.  I wonder what's in the .gz files.
Looks like raw emails.
Ideally an archiver would automatically run as the system went.  I
don't seem to be quite there yet; don't have a specific plan.  My
personal goal is that I would like all the emails accessible and
reviewable.  So, web interfaces seem okay.  But it would be more
efficient to generate them with javascript from source files.  That's
a lot of dev work.  Better to make something automatic that runs,
processing everything, with little work.  The issue is that the
webpages don't show headers.

1458. Hum, I looked at the first 2013 mail in the mailbox, and the
hostname of the list is al-qaeda.net .  Looks like hacking of the
host, although I suppose I can't know for sure.  Great reason to
archive on blockchains.

Maybe that's a good reason to archive everything.  It's 15:01.

Maybe that was just the domain name!  I really haven't read the -- I
really don't know the culture here.  I wonder when 9/11 was.  Anyway,
enough conspiracy stuff or paranoia or whatever.  Ummmmm I'm not sure
what to do next on this task =/ .  K.  Let's make a monthly uploader.
Next step, mutate links to attachments.  Let's also extract the raw
emails and include them somehow.  Header-matching maybe.  Maybe I can
make a python script?  I'll just mutate the attachments for now.

1505.  Lost my goal.  I know I'm doing something but I can't describe
it or act on it.  Something about there we go, attachments.  What
about attachments?  Add them to link mutation.  Okay, so I'm trying to
find an example email with attachments to do the mutation.

1507.  The attachments are already links!  That probably means I can
just run the mutator after uploading them, and they'll be picked up.
Maybe I'll review the code anyway; I'm guessing that might not be the
case.  I'll want to change the removal mapper filter thing.

1510.  Next step: after making new mutated message htmls, I'll want to
make sure that the indices generated link to them.  They'll have the
same paths as the old ones.

I can copy the sed script to find the files that have the old ones,
and delete them, I suppose.  I wonder if I can use sed to extract the
timestamp or something.  I'll look up tx format.  Noooop no timestamp
in a transaction.  Block's got that.  I'm worried that bsvup could
regenerate transactions if I delete them.  I'll look in it to see when
it regenerates them.

1515.  Inner landscape.  Authority wants to know what my goals are
here.  They are ... to preserve records of harm.  And possibly
facilitating others finding and using them a little.  Yeah ... also to
make it easy for people to have a canonical reality with regard to
what is on the list.  And wouldn't it be nice if the list could be
somewhere you could store things, like conversations, you didn't want
lost?  Maybe a way to reach people, if there were censorship between
you and them.  It's a bigger problem than bsvup, but you have to start
somewhere.

Transactions are saved in bsvup when they are broadcast, and when
getTX is used.  I remember getTX is used for existence checks.  So
that would match only file content present already, I believe.  Maybe
I can just delete them!  I'll do so using the sed thingy.

1519.  Inner landscapes / inhibition.
1519.  Went to drink water while trying to access my goal, as if the
water would provide the goal.  I'll drink some anyway.  Access
internally.

Goal is to delete my local cache of transactions to do with individual
message.txlink.html files.  so I can just output them all and grep on
that.

1524.  Okay.  Deleted.  I'll update the script so it can use the
unbroadcasted transactions that are still waiting on confirmation, as
link targets.

1528.  Inner landscapes with logical confusion.  A part of the
mutation script seems to be doing the opposite of what is intended.
At the start, it discards everything matching "^mailto", and then the
only output is a mailto link.  Something small I've missed, that has a
big impact.

Was redirecting stderr instead of stdout, to quiet grep.  Looks like I
do that a lot of places.

1532.  Inner landscapes.  It is rough to believe in human trafficking.
Jokes rising around a meanie standing in front of an ice cream stand,
telling all the visitors, "There is no such thing as icecream."
Giving them icecream cones made of plastic.  This really happens in
oppressive dictatorships!  Democracies are like myths.

There is no such thing as a way to preserve information.  There is no
such thing as an investment plan that makes more than 2% return a
year.  There is no such way as being satisfied with your income, or
even living luxuriously without one in friendly community!  There is
no such thing as enjoying being outdoors.  There is no such thing as
getting anything done without college.  There is no such thing as things
being true other than what your local marketing campaigns say.  No,
no, there is no such thing as there not being such a thing as
something!  Everything exists, quite truly!

It's 15:36.

The mutation is going very slowly because it keeps testing absolute
links.  Better make them not regenerate the path map.

1541.  I did that.  Two bugs.  Some links are breaking mails.  And
attachments aren't getting mutated.

1557.  I've addressed a lot of bugs but am still not successfully
mutating an attachment.  I'm seeing something wrong around my sc --
(an attachment link).  I'm seeing something wrong around my test
scenario or whatnot ...
https://lists.cpunks.org/pipermail/cypherpunks/attachmens/20200902/ffed989a/attachment.txt
doesn't seem to exist on the internet, but seems to be referenced from a
message. I've copied something wrong.  Ha!  It's missing a 't'.  There
we go.

1601.  Somehow some attachments were missed in my downloading all for
september.  Redownloaded a bunch more.  Uploading the new ones.

1602.  Transactions will need to be generated for the missing
attachments for the mutator to recognise them, atm.  Better to find
one that's already been sent to test with.

1613.  I think I have attachment links working.  I'm waiting on the
previous upload of more attachments to finish.  I guess I can -- I may
have some corrupted messages files from the development process, I - ;
I made an error, where .. I overwrote the input.  I ... I can mutate
... I want to reupload the folder and see whether files are marked
already existing, to see whether I corrupted any files.  But this will
take waiting for the present upload.  Until that happens ... I can
process all the files using the new script.

1616.  The mutator is going an order of magnitude faster than ever
before!  One of the later files has a processing issue.  I can work on
that.

1620.  Mutated them all.  I reviewed one and realised that my strategy
of joining lines with lines to make sed match them, mutates the email
bodies.  That's no good ...

1625.  Changed the mutation script to join lines only with appropriate
text around the links.  Good enough for this little project.

Not sure what to do now.  Guess I'll commit what I have.

1635.  I guess I wasn't really expecting to get this far.  The feat of
uploading a month, yesterday, was pretty incredible for me.  It seems
like it's pretty hard for me to do something like this.

I'd like to run a server that maintains archives.  I'll do some daily
stuff like finding food/water/bathroom/laundry and maybe see if I can
start a script that autoarchives.

1652.  Back.  Doing the second upload run where I see if I corrupted
any files.  I guess I could have downloaded them, theoretically, to
compare this to.  I'm going to look for that music.

1657.  1658.  Don't know how to find music.  Upload is going okay.
txlink files are all getting reuploaded.

Hello, email document.

I feel very blank.  I guess it's time to stop this task.

