Luke-Jr, a bitcoin protocol developer and Blockstream contractor, has for the past few years argued that bitcoin should not only refrain from adding data capacity by raising the 1MB blocksize limit, but should instead lower it to 250 kilobytes per block.
Before he joined Blockstream, however, he argued the blocksize should dynamically adapt, with Gregory Maxwell, the former CTO of Blockstream and a former bitcoin protocol developer, shutting the idea down.
In a little-known public discussion where Gavin Andresen, the former maintainer of the bitcoin Github repository, proposed a chain-split fork for multi-sigs as far back as August 24th 2011, Luke-Jr said:
“If a block chain split is to occur, it makes sense to try to fix as many problems as possible:
Replace hard limits (like 1 MB maximum block size) with something that can dynamically adapt with the times. Maybe based on difficulty so it can’t be gamed?
Adjust difficulty every block, without limits, based on a N-block sliding window…
21 million really isn’t enough if Bitcoin ever takes off, even with 100,000,000 units per BTC. Replacing the ‘Satoshi’ 64-bit integers with ‘Satoshi’ variable-size fractions…”
On the same day, regarding replacing the hard limit, Gregory Maxwell simply said “too early for that,” with Luke-Jr responding:
“Dynamically adapting would be by design never too early/late. Changing from a fixed 1 MB will fork the block chain, which should be a minimized event.”
When the blocksize debate began in 2014-15, this dynamic setting of the blocksize was also proposed by Pieter Wuille, a bitcoin protocol developer and Blockstream employee.
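To make the idea concrete, a dynamically adapting blocksize could, for example, peg the limit to a multiple of the median size of recent blocks, so the cap grows when blocks are consistently full and shrinks when they are not. The sketch below is purely illustrative and is not the code of any actual proposal; the function name, the 1MB floor, and the growth factor are all hypothetical choices.

```python
# Illustrative sketch only: one possible dynamic blocksize rule,
# not the mechanism Luke-Jr or Pieter Wuille actually specified.
from statistics import median

def next_block_limit(recent_sizes, floor=1_000_000, growth_factor=2):
    """Return a size limit (bytes) for the next block.

    recent_sizes: sizes in bytes of the last N blocks.
    floor: hypothetical minimum so the cap never drops below 1 MB.
    growth_factor: the cap is this multiple of the recent median,
    letting the limit rise when blocks are consistently full.
    """
    baseline = median(recent_sizes)
    return max(floor, int(baseline * growth_factor))

# If recent blocks average well under the floor, the cap stays at 1 MB;
# if they hover near 900 KB, the cap roughly doubles to 1.8 MB.
```

A real design would also need to resist gaming by miners stuffing their own blocks, which is presumably why Luke-Jr floated basing it on difficulty instead.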
Back in 2011, however, Maxwell was already arguing against any increase, expanding on the “too early” comment by stating:
“We’re not at maximum size right now (thankfully). We don’t know what the network dynamics would look like at that traffic level. So how could we competently say what the right metrics would be to get the right behavior there? Thats what I meant by too early.”
Why he was thankful the network was not running at full capacity is unclear, because a few years later he opened champagne to celebrate the very high fees caused by this lack of capacity.
This hardfork idea, however, was shelved after the discussion branched into many matters, with Gavin Andresen stating:
“This discussion is convincing me that scheduling a blockchain split is definitely the wrong idea at this time. We can revisit in N months, when we’ve got a roadmap and nice unit tests and a bunch of well-tested patches for fixing all of the things that aught to be fixed when we DO decide a blockchain split is necessary.”
Calling it a blockchain split is one way of putting it, but as it happened, capacity was slightly increased through a soft fork.
No dynamically adaptive mechanism was included in this soft fork, so the network now has to wait for the dev committee to at some point declare that the block weight should be increased.
Whether we should ever expect such a communiqué is unclear, with bitcoin now largely stagnant on scalability even as other blockchains try to address it.