Cryptocurrency: Scaling, Privacy [re: on whatever...]
It is woefully unfortunate that the first cryptocurrency, Bitcoin (BTC), offered so little on privacy, thereby setting, and tricking, the entire space into abysmally low expectations for a decade.
Hard to tell if bitcoin's limitations were there on purpose, or if the author didn't know better.
And some of the sense of need, and the technology, for coin privacy (and other things) probably wasn't well developed in the space yet. Today there's no excuse, and certainly none for any coin to skip devoting a clear section of its whitepaper to its privacy position, and to the other classic problems.
At any rate, on-chain scaling doesn't... scale.
Everything must roll up: whatever schemes people try to layer and split it up, it's all still one big network, subject to the same aggregate laws of bandwidth and CPU.
The problem gets worse when you add privacy features.
That's probably still an open question. 1 billion people transacting once a day is 11575 tx/s; that's roughly all humans on the planet transacting once a week. As below, that may be a high estimate. Which is good, because if you arbitrarily choke the tx processing core of a coin down to 1Mbps, that allows only 10B/tx. Today Monero and Zcash are something like 2kB/tx. It may be easier to lower tx size under some type of general zero-knowledge mask.

To model tx rates you'd have to play around with various Fermi estimates...
- Combine with stats from Visa, cash, cows, metals, securities, etc...
- How people might restructure their lives' behaviours around what coins can offer...
- How people of different ages, abilities, and interests will proxy through others...
- How people might deploy and keep various wallets, confirms, mines, relays running...
- What numbers of each are needed to maintain distributed resiliency...
- What happens as governments dissolve under adoption...
- If #Open #Audited banks arise to issue their own paper...
- If hundreds of #Open #Audited decentralized standard crypto clearing houses arise...
- If metals and other forms come back into prominence...
- If a secure general purpose overlay network helps transform some privacy requirements into secure transport requirements...
- What happens when the move to UTXO databases is made...
- How today's privacy coin tx sizes might improve...
- If ledger models are augmented by serialized bill passing...
- Many more factors to consider...

Even today... 100Mbps core @ 4B ppl/wk ~= 2kB/tx
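For anyone who wants to play with those estimates, a minimal back-of-envelope in Python using only the round numbers above (Fermi arithmetic, not a model):

    SECONDS_PER_DAY = 86_400
    SECONDS_PER_WEEK = 7 * SECONDS_PER_DAY

    # 1 billion people transacting once a day:
    tx_per_sec = 1_000_000_000 / SECONDS_PER_DAY       # ~11,574 tx/s

    # Choke the tx-processing core to 1 Mbps; per-tx byte budget:
    core_bytes_per_sec = 1_000_000 / 8
    print(core_bytes_per_sec / tx_per_sec)             # ~10.8 B/tx

    # 100 Mbps core, 4 billion people transacting once a week:
    tx_per_sec_weekly = 4_000_000_000 / SECONDS_PER_WEEK
    print((100_000_000 / 8) / tx_per_sec_weekly)       # ~1,890 B/tx, i.e. ~2 kB/tx

Nothing here accounts for relay overhead, proofs, or headroom; it's only the raw division behind the figures above.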
Expect bitcoin developers to fix bitcoin's problems or be outcompeted.
Due to their history of positions on not doing anything, combined with forum censorship and other questions... if BTC ever does do something, probably only to keep from dropping below 50% relevance, or further, under advancing competition, they'll become a laughing stock for the last-minute position reversal, at that point mooting whatever vision and trust remain. That competition is coming. They will have to hand all their dev and other positions off to a new generation of maintainers and advocates in order to truly save BTC's face. Or maybe BTC will remain on top, or maybe the entire cryptocurrency space will go to zero. Have fun :)
On Fri, 13 Dec 2019 23:36:18 -0500 grarpamp <grarpamp@gmail.com> wrote:
1 billion people transacting once a day is 11575 tx/s.
That's roughly all humans on the planet once a week. As below, that may be a high estimate. Which is good because if you arbitrarily choke the tx processing core of a coin down to 1Mbps, that allows only 10B/tx.
haha - arbitrarily...
Today Monero and Zcash are something like 2kB/tx.
So, let's start with bitcoin's 1Mbyte blocks: ~5 tx/s. Increase the transaction rate to 50 tx/s, which is still nowhere near 'visa levels', and now you have 10Mbyte blocks. Then use monero-like transactions and you have 100Mbyte blocks.

The current size of bitcoin's blockchain is 270Gbytes. Downloading and processing that amount of data is not trivial; only some people in the so-called 'developed' world can do it. Now, what if blocks are a hundred times bigger? In that case you get 5 terabytes PER YEAR.

So at this point we can cue in a fucktard like roger ver who tells us that we can put the chain in the global NSA-GCHQ-amazon-rothschild data center! YES! We can run 'our' 'alternative' payment system in the NSA-rothschild cloud! And actually mr. nakamoto himself said as much... hmmm - how's the arpadarpanet coming along?
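The same arithmetic, as a quick Python sketch (assuming bitcoin's 10-minute block interval; the block sizes are the round figures above):

    BLOCK_INTERVAL_S = 600
    BLOCKS_PER_YEAR = 365 * 24 * 3600 // BLOCK_INTERVAL_S   # 52,560 blocks

    def yearly_growth_tb(block_bytes: int) -> float:
        """Chain growth in terabytes per year for a given block size."""
        return block_bytes * BLOCKS_PER_YEAR / 1e12

    print(yearly_growth_tb(1_000_000))      # 1 MB blocks   -> ~0.05 TB/yr
    print(yearly_growth_tb(100_000_000))    # 100 MB blocks -> ~5.26 TB/yr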
On 12/14/19, Punk-Stasi 2.0 <punks@tfwno.gf> wrote:
haha - arbitrarily...
BTC is arbitrarily choking down their blocksize, haha. The joke is that it will still grow unbounded. Then there is... youtube: lightning network sucks; search: corporation problems. A few more good innovations by some other coins, from better groups, a transfer of faith... and BTC is done in. Regardless, coins are really hard to do right, and it's probably still too early to know for sure what right is likely to end up looking like.
100mbytes blocks.
That's 100MB... every 10min ≈ 167kB/s, no problem. And if that was a problem, light clients could query out to SPV-like services; could add in a sizable DHT cloud too.
Current size of bitcoin's blockchain is 270gbytes. Downloading and processing that amount of data is not trivial. Only some people in the so called 'developed' world can do it.
Not every house has a well, nor a LAN; people pool together to build and share, cafe, coop, etc., and then won't even have to wait the ~25 days it takes to pull 270GB at a lowly 1Mbps. If the situation is that bad, gold and silver, or whatever else they like that is sound and diverse, is fine too. Crypto is more a mutual complement and friend to metals than in any sort of competition with them.
Now, what if blocks are a hundred times bigger? In that case you get 5 terabytes PER YEAR.
It has already been suggested that storing entire blockchains since their genesis, certainly for any sole-function pure currency chain, is legacy first-gen tech thinking, and quite a silly idea to keep doing.
a) They can be sharded, signed, and distributed out to query trees; doesn't solve things long term.
b) The UTXO set can be flag-cutover as coinbase input to an empty chain; very unlikely.
c) Chain protocols will evolve to effectively consensus-mine a UTXO database, hashed checkpoint series, etc., thereby allowing tx's and blocks to be thrown away after the series becomes too confirmed and locked to mine a different one.
Storage becomes mooted; the lingering limitation seems to be estimating bandwidth the processing participants need.
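A minimal sketch of what (c) could look like - hypothetical names, a bare SHA-256 standing in for whatever consensus rule would actually bless a checkpoint, and the confirmation depth pulled out of thin air:

    import hashlib
    import json

    def utxo_checkpoint(utxo_set: dict, prev: str, height: int) -> str:
        """Hash the full UTXO snapshot into a checkpoint chained to the last one."""
        snapshot = json.dumps(utxo_set, sort_keys=True).encode()
        return hashlib.sha256(
            prev.encode() + height.to_bytes(8, "big") + snapshot
        ).hexdigest()

    def prune(blocks: list, checkpoint_height: int, tip_height: int,
              depth: int = 10_000) -> list:
        """Discard blocks below a checkpoint once it is too confirmed to remine."""
        if tip_height - checkpoint_height >= depth:
            return blocks[checkpoint_height:]
        return blocks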
data center
Certainly not under today's models of a few fake-audited, centralized, closed, corrupt, wannabe-govbankster crypto corporations. Under a well-distributed, standardized, 24x365, walk-in, publicly auditable, open-source, etc. etc. model... maybe. And even that should be designed out if at all possible.
On Sun, 15 Dec 2019 05:51:44 -0500 grarpamp <grarpamp@gmail.com> wrote:
On 12/14/19, Punk-Stasi 2.0 <punks@tfwno.gf> wrote:
haha - arbitrarily...
BTC is arbitrarily choking down their blocksize, haha
"arbitrary" means "for no reason" but there's a painfully obvious reason to limit the size/rate of growth of so called blockchains. You can argue that a particular value for the limit is arbitrary, which is technically correct...yet irrelevant.
The joke is that it will still grow unbounded. Then there is... youtube: lightning network sucks
You expect me to do your side of the arguing for you? You're not even bothering to link your own shills? =) But OK, I don't mind admitting that the LN is less than ideal, or that it 'sucks'. Then guess what? Cryptocurrencies suck. Blockchains are 'transparent' by default and don't scale. And if you obfuscate them, then they scale even less! And second-layer solutions have their own set of problems.
search: corporation problems
you might want to be somewhat more specific...
A few more good innovations by some other coins, from better groups, a transfer of faith... and BTC is done in.
haha yes. If somebody solves problems that no one has solved yet, then... yes, those problems won't be problems anymore.
Regardless, coins are really hard to do right, and it's probably still too early to know for sure what right is likely to end up looking like.
100mbytes blocks.
That's 100MB... every 10min ≈ 167kB/s, no problem.
That's 1.33 megabits per second. And I assume that among people who *have* internet access, 1Mbit connections are still common.
And if that was a problem, light clients could query out to SPV-like services; could add in a sizable DHT cloud too.
yes, the GCHQ-rothschild cloud. Go for it.
Current size of bitcoin's blockchain is 270gbytes. Downloading and processing that amount of data is not trivial. Only some people in the so called 'developed' world can do it.
Not every house has a well, nor a LAN; people pool together to build and share, cafe, coop, etc., and then won't even have to wait the ~25 days it takes to pull 270GB at a lowly 1Mbps.
Not every house has a well
Sidenote: every house SHOULD have a well instead of being connected to govcorp's water supply. Now, what was the point of "PEER TO PEER electronic cash"?
Now, what if blocks are a hundred times bigger? In that case you get 5 terabytes PER YEAR.
It has already been suggested that storing entire blockchains since their genesis, certainly for any sole-function pure currency chain, is legacy first-gen tech thinking, and quite a silly idea to keep doing.
yeah, so what's the 'next generation' solution?
a) They can be sharded, signed, and distributed out to query trees; doesn't solve things long term. b) The UTXO set can be flag-cutover as coinbase input to an empty chain; very unlikely. c) Chain protocols will evolve to effectively consensus-mine a UTXO database, hashed checkpoint series, etc., thereby allowing tx's and blocks to be thrown away after the series becomes too confirmed and locked to mine a different one.
Storage becomes mooted,
The problem isn't storage per se. The thing is, all peers have to validate all transactions. And that may include all past transactions. It may seem pointless to revalidate old transactions but how do you arrive at the current state if you don't?
the lingering limitation seems to be estimating bandwidth the processing participants need.
I don't know. The problem is that 'peers' need to audit the whole system. And that takes resources. So peers start 'delegating' powers because that's more 'efficient', and we end up with the same central tyranny we have now.
data center
Certainly not under today's models of a few fake-audited, centralized, closed, corrupt, wannabe-govbankster crypto corporations.
well that's the only kind of datacenter that has ever existed.
Under a well-distributed, standardized, 24x365, walk-in, publicly auditable, open-source, etc. etc. model... maybe. And even that should be designed out if at all possible.
On Sun, Dec 15, 2019 at 05:51:57PM -0300, Punk-Stasi 2.0 wrote:
On Sun, 15 Dec 2019 05:51:44 -0500 grarpamp <grarpamp@gmail.com> wrote:
a) They can be sharded, signed, and distributed out to query trees; doesn't solve things long term. b) The UTXO set can be flag-cutover as coinbase input to an empty chain; very unlikely. c) Chain protocols will evolve to effectively consensus-mine a UTXO database, hashed checkpoint series, etc., thereby allowing tx's and blocks to be thrown away after the series becomes too confirmed and locked to mine a different one.
Storage becomes mooted,
The problem isn't storage per se. The thing is, all peers have to validate all transactions. And that may include all past transactions. It may seem pointless to revalidate old transactions but how do you arrive at the current state if you don't?
Git seems to do reasonably well at "only download -valid- and 'recent' (delta to my existing set of) TXNs". If there is consensus on a "most recent 'snapshot' point in time" signature or something, earlier history could be discarded by those who are not needing or wanting to validate -all- past TXNs. Surely.

Of course, if you want to validate all past TXNs for a particular wallet, you would need the history at least as far back as that wallet's creation.

Being a DC dullit, I am not able to answer the question "is it necessary to validate a wallet's entire history (or even further back), just in order to safely transact with that wallet?", though it "feels" to me that it should not be necessary to validate entire wallet history.

The thinking/debate processes are straightforward to walk through - there appears to be a genuine problem with DC legacy (BTC) having sold out...
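One way the "feels unnecessary" intuition could be made concrete: if each consensus snapshot commits to a Merkle root over the current balance/UTXO entries, a single wallet's entry can be checked against the root with a log-sized proof and no wallet history at all. A toy sketch, assuming a hypothetical snapshot layout rather than any deployed protocol:

    import hashlib

    def h(b: bytes) -> bytes:
        return hashlib.sha256(b).digest()

    def merkle_root(entries: list) -> bytes:
        """Root over the snapshot's serialized entries."""
        level = [h(e) for e in entries]
        while len(level) > 1:
            if len(level) % 2:
                level.append(level[-1])        # duplicate the odd tail
            level = [h(a + b) for a, b in zip(level[::2], level[1::2])]
        return level[0]

    def verify(entry: bytes, proof: list, root: bytes) -> bool:
        """Check one entry against the committed root; no history needed."""
        node = h(entry)
        for sibling, sibling_is_left in proof:  # proof: (hash, is-left flag) pairs
            node = h(sibling + node) if sibling_is_left else h(node + sibling)
        return node == root

Whether the snapshot itself can be trusted without replaying history remains the thread's open question.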
On Mon, Dec 16, 2019 at 09:55:29AM +1100, Zenaan Harkness wrote:
On Sun, Dec 15, 2019 at 05:51:57PM -0300, Punk-Stasi 2.0 wrote:
On Sun, 15 Dec 2019 05:51:44 -0500 grarpamp <grarpamp@gmail.com> wrote:
a) They can be sharded, signed, and distributed out to query trees; doesn't solve things long term. b) The UTXO set can be flag-cutover as coinbase input to an empty chain; very unlikely. c) Chain protocols will evolve to effectively consensus-mine a UTXO database, hashed checkpoint series, etc., thereby allowing tx's and blocks to be thrown away after the series becomes too confirmed and locked to mine a different one.
Storage becomes mooted,
The problem isn't storage per se. The thing is, all peers have to validate all transactions. And that may include all past transactions. It may seem pointless to revalidate old transactions but how do you arrive at the current state if you don't?
Git seems to do reasonably well at "only download -valid- and 'recent' (delta to my existing set of) TXNs".
If there is consensus on a "most recent 'snapshot' point in time" signature or something, earlier history could be discarded by those who are not needing or wanting to validate -all- past TXNs. Surely.
Of course, if you want to validate all past TXNs for a particular wallet, you would need the history at least as far back as that wallet's creation.
"Of course" he says again, once again proclaiming orthodoxies from the tower of certainty, boldly plonking his sole remaining neurone into the puddle (ocean) of ego, then clamours for another brown paper bag to hide 'is sorry arse within. In the light of such profound "knowledge", always remember to ignore almost everything I say :( <shuffles off, red faced, to make a coffee>
Being a DC dullit, I am not able to answer the question "is it necessary to validate a wallet's entire history (or even further back), just in order to safely transact with that wallet?", though it "feels" to me that it should not be necessary to validate entire wallet history.
The thinking/debate processes are straightforward to walk through - there appears to be a genuine problem with DC legacy (BTC) having sold out...
yeah, so what's the 'next generation' solution?
a) They can be sharded, signed, and distributed out to query trees; doesn't solve things long term. b) The UTXO set can be flag-cutover as coinbase input to an empty chain; very unlikely. c) Chain protocols will evolve to effectively consensus-mine a UTXO database, hashed checkpoint series, etc., thereby allowing tx's and blocks to be thrown away after the series becomes too confirmed and locked to mine a different one. Storage becomes mooted,
The problem isn't storage per se. The thing is, all peers have to validate all transactions. And that may include all past transactions. It may seem pointless to revalidate old transactions but how do you arrive at the current state if you don't?
These views are aligned with the system I am developing (correct me if I am wrong), which includes a separation between public data and private data. The private network, run on the same nodes that run the public protocol, is made of encrypted P2P links celebrating private trading protocols. These trading events would generate public transactions to be settled in the public ledger. Private blockchains could be created among closed groups. Both open and restricted blockchains wouldn't reflect the past, just the current state.
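As a toy illustration of the "current state only" idea - hypothetical names, not other.arkitech's actual code - the ledger is just an account map plus a digest of it, and a settled public transaction rewrites the state instead of appending to a history:

    import hashlib
    import json

    class StateLedger:
        """Current balances plus their digest; no transaction history kept."""

        def __init__(self, genesis: dict):
            self.balances = dict(genesis)

        def apply(self, sender: str, receiver: str, amount: int) -> None:
            # Settling a public transaction mutates state; nothing is archived.
            if self.balances.get(sender, 0) < amount:
                raise ValueError("insufficient funds")
            self.balances[sender] -= amount
            self.balances[receiver] = self.balances.get(receiver, 0) + amount

        def state_root(self) -> str:
            # The thing nodes would have to agree on: a digest of the state itself.
            blob = json.dumps(self.balances, sort_keys=True).encode()
            return hashlib.sha256(blob).hexdigest()

Peter's objection further down is aimed at exactly this: with no past to replay, agreement on state_root has to come from trusting whoever attests to the current state.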
the lingering limitation seems to be estimating bandwidth the processing participants need.
I don't know. The problem is that 'peers' need to audit the whole system. And that takes resources. So peers start 'delegating' powers because that's more 'efficient', and we end up with the same central tyranny we have now.
Regarding peers needing to audit the whole system (or just a fraction of it): as part of a scalability solution I am spinning around the following idea. As the network grows and reaches, say, 10,000 nodes, it would split into two consensus clusters of 5,000 nodes each. Each side is an isolated network, each one caring about half of the ledger. Txs would be relayed to the appropriate cluster or partition. Txs involving 2 clusters would require a temporary connection to a random node in the other cluster to perform an atomic validation.

Then, as the 2 clusters grow in nodes, reaching 10,000 each, another split would happen automatically, leaving 4 networks (4 blockchains), each one caring about 1/4th of the global ledger. Applying this model recursively, a mass-adoption situation would consist of 1,000,000 networks of 10,000 nodes each, totalling 10B nodes, each network caring about one millionth of the global ledger.

What problems am I not foreseeing with this approach? thx
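A sketch of the split rule as described - hypothetical names, routing accounts to clusters by hash prefix, and doubling the cluster count whenever the average population reaches the 10,000-node threshold:

    import hashlib

    SPLIT_THRESHOLD = 10_000      # nodes per cluster before the next split

    def cluster_of(account: str, depth: int) -> int:
        """Route an account to one of 2**depth clusters by hash prefix."""
        digest = hashlib.sha256(account.encode()).digest()
        return int.from_bytes(digest, "big") >> (256 - depth) if depth else 0

    def next_depth(total_nodes: int, depth: int) -> int:
        """Double the cluster count each time average population hits the threshold."""
        while total_nodes / (2 ** depth) >= SPLIT_THRESHOLD:
            depth += 1
        return depth

At 10B nodes this settles at depth 20, i.e. about a million clusters of roughly 10,000 nodes each, matching the figures above; the atomic cross-cluster validation is the part left unsketched.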
On 16/12/2019 14:22, other.arkitech wrote:
yeah, so what's the 'next generation' solution?
a) They can be sharded, signed, and distributed out to query trees; doesn't solve things long term. b) The UTXO set can be flag-cutover as coinbase input to an empty chain; very unlikely. c) Chain protocols will evolve to effectively consensus-mine a UTXO database, hashed checkpoint series, etc., thereby allowing tx's and blocks to be thrown away after the series becomes too confirmed and locked to mine a different one. Storage becomes mooted,
The problem isn't storage per se. The thing is, all peers have to validate all transactions. And that may include all past transactions. It may seem pointless to revalidate old transactions but how do you arrive at the current state if you don't?
These views are aligned with the system I am developing (correct me if I am wrong)
I think you are wrong. And you don't make much sense.

which includes a separation between public data and private data.
The private network, run on the same nodes that run the public protocol, is made of encrypted P2P links celebrating private trading protocols.
What the hell does "celebrating" mean here? Talk English.
These trading events would generate public transactions to be settled in the public ledger.
So they are public, not private? We don't know what people are trading for the money, but then we don't know that with Bitcoin anyway.
Private blockchains could be created among closed groups. Both open and restricted blockchains wouldn't reflect the past, just the current state.
How? Sounds impossible. You do know how blockchain currencies work, don't you? Every transaction is public and part of the blockchain.

People know that a wallet contains money because the list of transactions is translated into a ledger of accounts of which wallet contains what - in Bitcoin's case, by about 4 groups of people, though you can do it yourself to check that the four groups are not cheating anybody. The only way someone knows that there is any money in a wallet is because it is in the accounts ledger, which is publicly derived from the public blockchain.

You can't do it privately. Either nobody would know whether you had any money or not, or someone has to be trusted - and remember the third law of secure systems: only people you trust can betray you.

You do know what a blockchain is, don't you? It is a record of past transactions. In order to find the current state you need to process the blockchain.

Peter Fairbrother
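Peter's derivation step, as a trivial fold - a toy account model standing in for bitcoin's actual UTXO accounting:

    def derive_ledger(blockchain: list) -> dict:
        """Replay every public transaction to reconstruct who holds what."""
        ledger = {}
        for block in blockchain:
            for sender, receiver, amount in block:
                if sender is not None:              # None marks new coin issuance
                    if ledger.get(sender, 0) < amount:
                        raise ValueError("invalid transaction in history")
                    ledger[sender] -= amount
                ledger[receiver] = ledger.get(receiver, 0) + amount
        return ledger

Anyone can run the equivalent over the public chain to check that the "about 4 groups of people" publishing account ledgers are not cheating anybody; a chain that keeps only current state forfeits exactly this check.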