This weekend’s official XT release was made under the pretense that a bigger block size will increase Bitcoin’s capacity, allowing more transactions per second, cheaper fees and wider adoption. So why has the decision caused so much controversy, and why haven’t the Core developers simply implemented it already? We all want a large-scale Bitcoin with maximum participation, don’t we? As it turns out, the XT developers are not telling the whole truth and are knowingly putting Bitcoin’s future on the line.
Decentralised and Trustless by Design
Satoshi Nakamoto based the Bitcoin protocol on a peer-to-peer (P2P) network design. Instead of releasing a self-sufficient application that runs independently on a server to process and route transactions between wallets, the Bitcoin software runs as a P2P client and depends on other peer clients to build the network and validate its peers’ activities. The reason for this design decision is twofold:
1) it makes Bitcoin decentralized
2) it allows network nodes to build consensus
The consensus mechanism exists because Satoshi Nakamoto’s Bitcoin design solves the problem of digital double-spending. Each full node verifies every previous (and each new) transaction, and then communicates its agreement on the validity of transactions (and the blocks containing them) to other nodes. Hence, the network constantly maintains consensus about the truthfulness of the nodes’ shared version of history – about the validity of blocks and transactions.
The consensus mechanism is what makes it possible for two parties to transact “trustlessly” – they do not have to trust one another not to double-spend or wonder whether any other party has tampered with history and invalidated chains of interdependent transactions and the associated coins. The Bitcoin protocol’s consensus rules ensure that coins are not double-spent – and a consensus history of coins that had previously been spent is kept in the public block chain ledger.
A benefit of the decentralized nature of Bitcoin’s network is that it is censorship resistant. As with any P2P network, all nodes are equal – no single node has greater importance or sits higher in a hierarchy, hence there is no central point of attack. The network cannot, therefore, be shut down the way Napster was: Napster’s centralized servers gave the authorities a single place to walk in and pull the plug.
As a consequence of the time-consuming validation and consensus process, as well as the mining that builds a cumbersome blockchain for proof of validation, the benefits of Bitcoin come at the cost of certain trade-offs:
- Expensive – mining is resource intensive and wasteful: it consumes electricity, contributes to carbon emissions, and requires expensive processing to generate the Proof of Work that validates the blockchain and its history of interdependent transactions.
- Unreliable – a dynamic block difficulty and luck factor mean that blocks are not produced consistently. Users may sometimes wait 5 minutes and sometimes up to an hour for their transaction to be included in a block. During times when more transactions compete for inclusion in a block, those paying higher fees may evict those paying minimum or standard fees – and there is no way of knowing in advance whether this will happen to your transaction.
- Slow – transactions must propagate through the network for validation and then be included in a block (on average every 10 minutes). The mined block must then, in turn, propagate through the network – being passed from node to node – to be validated. A transaction is conventionally considered “confirmed” once it is 6 blocks deep in the block chain, in line with the probability analysis in Satoshi Nakamoto’s design.
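The fee-eviction dynamic described above follows from simple miner economics. The sketch below is a hypothetical illustration (not Bitcoin Core’s actual selection code, which also weighs transaction ancestors): a miner greedily fills the fixed 1 MB of block space with the highest fee-per-byte transactions first, so when demand exceeds capacity, every standard-fee transaction waits.

```python
# Hypothetical illustration of greedy fee-rate block filling; the field
# names ("size", "fee") and the toy mempool are assumptions for the example.

MAX_BLOCK_BYTES = 1_000_000  # the 1 MB consensus limit discussed here


def select_for_block(mempool):
    """Greedily pick transactions by fee rate until the block is full."""
    chosen, used = [], 0
    for tx in sorted(mempool, key=lambda t: t["fee"] / t["size"], reverse=True):
        if used + tx["size"] <= MAX_BLOCK_BYTES:
            chosen.append(tx)
            used += tx["size"]
    return chosen


# Toy mempool: 4,000 high-fee and 5,000 standard-fee transactions, all
# 250 bytes. Only 4,000 fit in 1 MB, so every standard-fee payer waits.
mempool = [{"id": i, "size": 250, "fee": 2500} for i in range(4000)]
mempool += [{"id": 4000 + i, "size": 250, "fee": 250} for i in range(5000)]

block = select_for_block(mempool)
assert len(block) == 4000
assert all(tx["fee"] == 2500 for tx in block)
```

Under this model, a low-fee transaction is never rejected outright – it simply keeps losing the auction for block space until demand subsides or the sender bumps the fee.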
As the Bitcoin whitepaper makes clear, the benefits of trustlessness via consensus validation and censorship resistance via decentralization justify the trade-offs described above. The payment protocol is inherently wasteful, slow and unreliable compared to centralized alternatives such as Visa and PayPal.
The speed and capacity trade-offs that Satoshi Nakamoto made in exchange for Bitcoin’s design goals mean that Bitcoin is ill-suited as a competitor to Visa or PayPal: inherently, Bitcoin does not scale well.
But I Thought…
Perhaps not knowing about the limitations of the protocol has fueled some of the calls for a bigger block size. Perhaps it is an unwillingness to accept that Bitcoin has certain inherent – but unavoidable – constraints. Perhaps it is merely lazy thinking. For example, it is often argued that bigger blocks will create capacity for more transactions and, therefore, more usage and adoption. Think again:
Oh My, What Big Blocks You Have
Anyone can transact via Bitcoin and there is no means of censoring users or limiting their use of the network as long as they make valid transactions.
By increasing block size and making more transactions possible at low fees, what is to stop a large corporation, bank or government from using the newly created capacity as a convenient settlement network? They wouldn’t have to fund or develop their own (well, Ecuador has, for this exact use-case) and perhaps the utility and capacity of the Bitcoin network provides adequate reliability and cost for such a large entity’s use-case that also requires a public, transparent record.
They might pay miners a constant or above-average fee for the service, and in the process hog x megabytes of space in each block – leaving the rest of the user base with 1 MB of capacity. What would the benefit have been? None for the multitudes and none for Bitcoin adoption. Discussion about a capacity limit would continue amongst developers – déjà vu all over again.
You see, increased block size capacity will be utilized – by all kinds of entities – not just the girl next-door or the unbanked poor.
Pervasive Global Currency Domination
The current 1 MB blocks allow a theoretical maximum of about 7 transactions per second, assuming an average transaction size of 250 bytes (1 MB ÷ 250 bytes ≈ 4,000 transactions per 10-minute block).
Let’s scale up to roughly half of Visa’s capacity of around 22,000 transactions per second:
10,000 Bitcoin transactions per second requires 1.5 GB blocks and would result in a block chain that grows by roughly 79 TB (terabytes) per year – consuming storage at about 1.5 TB per week.
(The above example does not take into consideration the impossibility of this theoretical scenario due to lag introduced by transaction and block validation – pushing 1.6GB blocks around the network for consensus seeking).
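The back-of-the-envelope arithmetic behind those figures can be reproduced directly, assuming (as above) 250-byte transactions and one block every 600 seconds:

```python
# Scaling arithmetic for the 10,000 tps scenario discussed in the text.
TX_SIZE = 250          # assumed average bytes per transaction
BLOCK_INTERVAL = 600   # seconds between blocks (10-minute target)
TPS = 10_000           # target transactions per second

bytes_per_second = TPS * TX_SIZE                  # 2.5 MB/s of transaction data
block_size = bytes_per_second * BLOCK_INTERVAL    # bytes needed per block
per_week = bytes_per_second * 604_800             # chain growth per week
per_year = bytes_per_second * 31_536_000          # chain growth per 365-day year

print(f"block size:    {block_size / 1e9:.1f} GB")   # 1.5 GB
print(f"weekly growth: {per_week / 1e12:.1f} TB")    # 1.5 TB
print(f"yearly growth: {per_year / 1e12:.1f} TB")    # 78.8 TB
```

The same formula shows today’s 1 MB limit yielding roughly 7 transactions per second – the gap between the two numbers is the entire scaling debate.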
The effect on Bitcoin mining – the core security mechanism of the network – will be that centralization becomes inevitable. Assuming the protocol could handle this amount of traffic – which it cannot – how many of today’s miners would be able to afford the hardware and infrastructure required? Who, except a wealthy few, could afford to run their own full node to contribute validation and security to the network?
Specific Features, Not General Use
The slowdown imposed by consensus seeking and the necessary condition of keeping the network as decentralized as possible, therefore, means that Bitcoin Core (or the stillborn XT) could never compete with Visa. That was never the design goal. Similarly, Visa cannot provide programmable money, trustless payments and censorship resistance.
Custodians Not Pimps
These are the design considerations and technical facts that inform the position of the conservative faction in the block size debate. Bitcoin is not decentralized in its own right – the protocol does not have a ‘decentralizing algorithm’. The degree of decentralization is the responsibility of the network’s users: as many users as possible should run their own full nodes, and mining diversity needs to be encouraged – many small pools are better than a few large pools. Bear in mind, for example, the risk of a 51% attack by a single large pool or a cartel of mining operators.
In this context, the responsibility for maintaining a high degree of decentralization necessitates a custodial role for Bitcoin’s users and its developers.
If the matter is so cut-and-dried, and it is true that Bitcoin does not scale, then why have the XT developers not addressed these points? Precisely. Gavin Andresen’s repeated answer to the oft-repeated question “Why expose Bitcoin to the risk of centralization by increasing block size?” has been a variation on “because when networks reach capacity, Bad Things happen”. Nebulous fear and speculation – no technical reason has been forthcoming.
The question, then, is why Gavin Andresen, Mike Hearn and Tom Harding would willfully expose Bitcoin (and other people’s money) to the threat of centralization by trying to scale it for a purpose it was neither designed for nor technologically suited to.
Lightning Network: Unlimited Bitcoin Transaction Capacity
What the Core developers favor is a tiered network architecture whereby Bitcoin provides a core settlement layer (layer 1) with micro-transaction layers (layer 2) stacked on top. Development of the Lightning Network implies a future where a virtually unlimited number of daily bitcoin transactions will be possible with only a moderate block size increase needed for the protocol.
Lightning Network and the fee market will be explored in a future article.
In closing, a pointed remark by Eric Voskuil, libbitcoin maintainer and Core developer:
What you seem to be missing is *why* bitcoin is better money. Have you considered why it is comparatively inexpensive to transact in a medium that is based on such a highly inefficient technology?

You might want to consider that these two considerations are not independent. The reduced cost of transacting (and carrying) Bitcoin is a direct consequence of its trustless nature. Any compromise in that nature will eliminate that advantage, and therefore Bitcoin.

Bitcoin is designed to solve only one problem that other systems do not. To accomplish this it makes significant compromises in other areas. The benefit of this solution is that it cannot be effectively controlled by the state. As a result, all of the associated overhead is eliminated. Hence the net cost benefit despite high technical costs.
– Eric Voskuil