First 1GB Bitcoin Block Has Been Mined on Testnet


Bitcoin Unlimited developers, in collaboration with researchers from the University of British Columbia and one participant from nChain, yesterday mined the first-ever 1GB block on the Gigablock Testnet.

Little detail has been provided so far, with the full results to be presented at the Scaling Bitcoin conference on November 4th this year. But the preliminary findings are:

“Admitting transactions into mempool, due to the single-threaded nature of the present design, appears to be the dominant bottleneck.

Upon saturation of mempool admission, mempool decoherence between nodes quickly grows, and Xthin compression ratios decrease, resulting in less efficient block propagation, thereby further degrading node performance.

Very large blocks have been efficiently propagated and verified with Xthin prior to mempool decoherence; block propagation/verification does not appear to be the bottleneck.”

The findings therefore suggest that sending and verifying a 1GB block can be handled by a high-grade laptop, so propagation and verification alone do not prevent on-chain scaling.
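The decoherence point is worth making concrete. A thin-block scheme like Xthin sends only short transaction IDs and relies on the receiver already holding the referenced transactions in its mempool; anything missing has to travel in full. The toy model below illustrates the effect; the 8-byte ID, the 400-byte average transaction and the 10% mempool gap are illustrative assumptions, not figures from the paper.

```python
# Toy model of thin-block relay in the spirit of Xthin. Sizes are assumed
# for illustration; this is not Bitcoin Unlimited's actual protocol code.
SHORT_ID_BYTES = 8    # assumed size of a truncated transaction hash
AVG_TX_BYTES = 400    # assumed average full transaction size

def thin_block_cost(block_txids, receiver_mempool):
    """Bytes on the wire to relay one block to one receiver."""
    missing = [txid for txid in block_txids if txid not in receiver_mempool]
    # Every transaction costs a short ID; missing ones are re-sent in full.
    return len(block_txids) * SHORT_ID_BYTES + len(missing) * AVG_TX_BYTES

# A ~1GB block of 2.5 million 400-byte transactions.
block = set(range(2_500_000))

# Coherent mempools: only short IDs travel, ~20MB for the whole block.
print(thin_block_cost(block, receiver_mempool=block) / 1e6)  # ~20.0

# 10% decoherence: 250,000 transactions must be re-sent in full, so the
# same block now costs ~120MB, degrading propagation as described above.
print(thin_block_cost(block, receiver_mempool=set(range(2_250_000))) / 1e6)  # ~120.0
```

Once receivers are missing a meaningful fraction of a block's transactions, each miss travels in full and the compression ratio collapses, which is exactly the degradation the researchers report.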

That compression is primarily the work of Xthin, which lets a 1GB block propagate in 20–50MB, according to Peter Rizun, who has given himself the title of Chief Scientist of Bitcoin Unlimited. He described the findings as:

“Our baseline results with BU essentially “as is.” A few days ago we achieved 300 tx/sec sustained thanks to Andrew Stone’s work streamlining mempool admission. I think we’ll hit ~1,000 tx/sec sustained on the next ramp we attempt. Mempool admission is no longer the bottleneck, as we’ve demonstrated mempool admission rates over 10,000 tx/sec already.”
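The quoted results don't spell out how mempool admission was streamlined, but the direction is implied by the first finding: admission is single-threaded today, while per-transaction checks such as signature verification are largely independent and can fan out across cores, with only dependency and conflict resolution kept serial. A minimal sketch of that idea, with a hash loop standing in for real admission work (a hypothetical illustration, not BU's code):

```python
# Hypothetical illustration of lifting a single-threaded admission
# bottleneck by fanning per-transaction checks out across workers.
from concurrent.futures import ProcessPoolExecutor
import hashlib

def admission_check(tx_bytes):
    # Stand-in for CPU-bound admission work on one transaction
    # (signature and script verification, policy rules).
    digest = tx_bytes
    for _ in range(200):
        digest = hashlib.sha256(digest).digest()
    return digest

if __name__ == "__main__":
    txs = [i.to_bytes(8, "big") for i in range(10_000)]

    # Single-threaded admission: transactions enter one at a time.
    serial = [admission_check(tx) for tx in txs]

    # Parallel admission: independent per-transaction checks fan out
    # across processes; only conflict resolution must stay serialized.
    with ProcessPoolExecutor(max_workers=8) as pool:
        parallel = list(pool.map(admission_check, txs, chunksize=256))

    assert parallel == serial  # same outcome, roughly cores-times faster
```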

Bitcoin currently handles 2-3 transactions per second, after a long debate over whether that should be increased to around 4-6 tx/s through segwit2x. Rizun, however, appears to be suggesting that mempool admission alone can handle over 10,000 tx/s.
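For context on what those rates mean, throughput is simply block size divided by average transaction size and the ten-minute block interval. A quick back-of-envelope sketch, assuming an illustrative 400-byte average transaction (the live network's 2-3 tx/s reflects a somewhat larger real-world average):

```python
# Back-of-envelope throughput: block size / avg tx size / block interval.
AVG_TX_BYTES = 400           # assumed average transaction size
BLOCK_INTERVAL_SECS = 600    # bitcoin's ~10-minute target interval

def tx_per_second(block_size_bytes):
    return block_size_bytes / AVG_TX_BYTES / BLOCK_INTERVAL_SECS

print(tx_per_second(1_000_000))      # 1MB block  -> ~4 tx/s
print(tx_per_second(1_000_000_000))  # 1GB block  -> ~4,167 tx/s
```

On those assumptions, even sustained 1GB blocks correspond to roughly 4,000 tx/s, so a 10,000 tx/s admission rate would leave headroom well beyond what gigabyte blocks themselves require.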

Demand is nowhere near that level at this stage, so gigabyte blocks won't be needed for some time. The tests instead aim to show it can be done, following the approval of $300,000 per year in funding for the Gigablock Testnet Initiative by Bitcoin Unlimited members. Andrew Stone, lead developer of Bitcoin Unlimited, says:

“We are not going from 1MB to 1GB tomorrow. The purpose of going so high is to prove that it can be done — no 2nd layer is necessary. By the time we get blocks even over 10MB we’ll have technologies like utxo commitments and partial syncing clients (imagine a node that behaves as a SPV client upon startup, but is transitioning to a full node in the background) which will make your UX much better.”
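The partial-syncing client Stone describes can be pictured as a node with two modes: answer SPV-style queries immediately after a cheap header sync, and quietly validate the full chain in the background until it can graduate to full-node behaviour. A minimal sketch of that startup pattern, with every name and interface invented for illustration (this is not Bitcoin Unlimited's actual design):

```python
# Hypothetical sketch: serve SPV queries right away, become a full node
# once background validation catches up. All interfaces are invented.
import threading
import time

class HybridNode:
    def __init__(self, chain_height):
        self.chain_height = chain_height   # tip known from header sync
        self.validated_height = 0          # progress of full validation
        self._lock = threading.Lock()

    def start(self):
        t = threading.Thread(target=self._background_full_sync, daemon=True)
        t.start()
        return t

    def _background_full_sync(self):
        for h in range(1, self.chain_height + 1):
            time.sleep(0.001)              # stand-in for download + verify
            with self._lock:
                self.validated_height = h

    def mode(self):
        with self._lock:
            done = self.validated_height == self.chain_height
        return "full" if done else "spv"

node = HybridNode(chain_height=100)
sync_thread = node.start()
print(node.mode())   # "spv": usable immediately, trusting merkle proofs
sync_thread.join()
print(node.mode())   # "full": background validation has caught up
```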

Gregory Maxwell, Blockstream’s CTO, was not convinced, stating: “Being able to make larger blocks isn’t an accomplishment on a closed, private, centralized network, especially not on top of our considerable optimizations.”

He did not elaborate on why he thinks the testnet is a closed, private and centralized network. The overview paper says:

“There were mining nodes in Toronto (64 GB, 20 core VPS), Frankfurt (16 GB, 8 core VPS), Munich (64 GB, 10-core rack-mounted server with 1 TB SSD), Stockholm (64 GB, 4 core desktop with 500 GB SSD), and central Washington State (16 GB, 4 core desktop).”

That's a network spanning two continents, with more nodes to be added in Asia. But we'll have to wait and see the full results, including whether the network keeps running at such high throughput.

Moreover, many will undoubtedly be keeping an eye out for any bias or mismeasurement, which is why it would have been preferable for the paper to undergo scientific peer review in an established journal.

Questions also remain over whether an ordinary laptop could actually handle such throughput after five or ten years of an ever-growing blockchain in the absence of sharding. Questions that would likewise apply to 1MB blocks, on a slightly longer timeframe.

But it appears at least one development team has finally returned to work, coding and researching, as bitcoin's development somewhat naturally becomes more decentralized.

With ever-increasing interest, and studies suggesting that an optimal group size of around 300 applies in human communications, a splinter into different groups was probably inevitable.

Groups that now appear to be competing with each other, keeping developers on their toes. Something that can only benefit the bitcoin network and its users.

