[journal][log] was: Re: How To Blockchain a Git Repository for Public Archive
I am in a state of mind where I can pursue making a small script to automate uploading, and maybe even to find other people's repositories a little. The uploading will be for-pay rather than always-free, to reduce the number of things to implement. For the same reason, it will use centralized services. The purpose is to recover my records after severe amnesia and loss of possessions.
If all the .pack and object files in your bare repository are 100,000 bytes or smaller, the upload can be made for free. Otherwise, it will cost a small number of AR coins. You should get AR coins either way: it opens up your options in the future.
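A quick way to check is to list anything over the limit; this is a sketch, assuming the bare repository sits at repository.git in the current directory (no output means the free path applies):

# print the paths of any files larger than 100,000 bytes
find repository.git -type f -size +100000c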
If you want to generate a wallet offline, note that wallet.json is simply a JWK file holding a 4096-bit RSA key. Arweave documents an online process at https://docs.arweave.org/info/wallets/generating-cold-wallet .
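For offline generation, here is a rough sketch, assuming the arweave-js npm package (npm install arweave) and its documented wallets.generate() and wallets.jwkToAddress() calls:

# generate a fresh 4096-bit RSA JWK locally, without contacting the network
node -e "const Arweave = require('arweave'); Arweave.init({}).wallets.generate().then(w => console.log(JSON.stringify(w)))" > wallet.json
# print the wallet address derived from that key
node -e "const Arweave = require('arweave'); const fs = require('fs'); Arweave.init({}).wallets.jwkToAddress(JSON.parse(fs.readFileSync('wallet.json'))).then(console.log)"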
# install arkb with yarn or npm
yarn global add arkb
# or with npm
#npm install --global arkb
# check your wallet address and balance if needed
arkb balance --wallet ~/wallet.json
# make a dedicated folder to be the upload
mkdir deployment_dir
# place the repository in the folder
git clone --mirror --bare my/remote/repository deployment_dir/repository.git
# generate dumb HTTP information in the repository so it can be cloned from a gateway
git --git-dir deployment_dir/repository.git update-server-info
# might be optional: add .keepme files to any important empty folders
find deployment_dir -type d -empty | while read -r path; do touch "$path/.keepme"; done
# add an index file to show web browsers, or even entire web content
echo 'Clone the repository.git subfolder.' > deployment_dir/index.txt
# deploy with arkb
arkb deploy deployment_dir --wallet ~/wallet.json --index index.txt --bundle --debug
# or, when all files are <100,000 bytes, deploy for free using node2.bundlr.network
#arkb deploy deployment_dir --index index.txt --use-bundlr https://node2.bundlr.network
# when arkb finishes, it will output a url on the https://arweave.net/ gateway
# during times of congestion, it may take multiple runs of arkb for success
# bundlr automatically retries if that happens; otherwise you may need to rerun arkb yourself
# after a couple blocks have been mined, the repo should be cloneable
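Once arkb prints its URL, cloning back out should look roughly like this (MANIFEST_ID is a hypothetical placeholder for the id arkb prints):

# clone the archived repository from the gateway over dumb HTTP
git clone https://arweave.net/MANIFEST_ID/repository.git recovered.git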
The plan is to try to do the bare minimum so as to have both uploading and enumeration. I'm thinking I could be much less likely to finish if I get distracted focusing on improving part of it.
Note regarding a strangeness: https://arweave.net/graphql is working fine for me from the CLI but returning CloudFront timeout errors in Chrome, regardless of whether I enable Tor in my system settings. Issues like that are usually a local configuration quirk or bug on my end that I haven't found. It isn't my priority at the moment. A videolog may exist; I haven't checked whether the stream is succeeding.
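For reference, the kind of CLI check I mean is roughly the following, assuming the gateway's usual GraphQL schema; it just fetches one transaction id to confirm the endpoint responds:

curl -s https://arweave.net/graphql \
  -H 'Content-Type: application/json' \
  -d '{"query": "{ transactions(first: 1) { edges { node { id } } } }"}'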
Whoops, getting the errors on the CLI too ;p 0808 Got more errors trying different gateways. I guess I'll see if there's a simple path through. 0811 0813 I found talk from Dec 8th regarding the gateway being blocked in regions of the US. Devs shared the dev gateway, which is working :)
I've drafted a script that uploads a git repository. I've also made a tiny script that retrieves transactions based on tags. Many things are missing. I'm thinking the most important thing is to make sure it can find very old transactions, which means letting the user provide important tag material. I think it would be helpful to add information on the associated public key to the output; I can maybe mutate this to have it appear like tags. 0822 0903 https://github.com/xloem/savegit untested!
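A minimal sketch of that kind of query against the gateway, using hypothetical tag values (the real tag names savegit writes may differ) and asking for each owner's address and public key alongside the tags:

# search by tag; owner.key is the uploader's public key, owner.address the derived address
curl -s https://arweave.net/graphql \
  -H 'Content-Type: application/json' \
  -d '{"query": "{ transactions(first: 10, tags: [{ name: \"App-Name\", values: [\"savegit\"] }]) { edges { node { id owner { address key } tags { name value } } } } }"}'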