While work on sharding and Casper continues, with multiple implementations of the phase 0 Beacon Chain that may be finalized by the end of the year, the ethereum 1.0 team is apparently planning to continue refining the current Proof of Work (PoW) chain. Casey Detrio, an eth dev, says:
“The first goal is to boost transaction throughput on the mainnet… a client optimization has been recently discovered which is likely to enable a substantial increase to the block gas limit while maintaining a low uncle rate. The optimization is a fix to the way Parity relays blocks (discovered by Alexey Akhunov of turbo-geth fame).
Currently Parity does full verification of block PoW and transaction processing, before relaying a block. The optimization is to only verify the PoW and then start relaying the block, while processing the transactions…
Beyond low-hanging optimizations, more drastic changes for mainnet throughput increases are also being studied. One approach is parallel transaction processing, picking up where an old EIP left off.
Another approach to achieving a big scalability boost on the mainnet, mentioned long ago in the Sharding FAQ, is a change to the PoW protocol:
‘Bitcoin-NG’s design can … increase the scalability of transaction capacity by a constant factor of perhaps 5-50x… [the approach] is not mutually exclusive with sharding, and the two can certainly be implemented at the same time.’
So there are easy optimizations that might yield an immediate (totally wild guess, 2x-5x) throughput boost on the mainnet. And with more comprehensive protocol changes, maybe a 50x boost on the mainnet.”
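The relay fix Detrio describes can be sketched in Python. This is a toy model with made-up data structures and a stand-in hash check, not actual Parity code: the point is only that, before the fix, both the PoW check and transaction execution gate relaying, while after it the cheap PoW check alone does.

```python
import hashlib

def verify_pow(header: dict) -> bool:
    # Hypothetical stand-in for real Ethash verification: check that a
    # hash of the header meets the difficulty target.
    digest = hashlib.sha256(repr(sorted(header.items())).encode()).hexdigest()
    return int(digest, 16) < header["target"]

def old_relay(block, peers, process):
    # Before the fix: fully verify PoW AND process every transaction
    # before relaying, so peers wait on the slowest step.
    if verify_pow(block["header"]) and process(block["txs"]):
        for peer in peers:
            peer.append(block)

def new_relay(block, peers, process):
    # After the fix: the cheap PoW check alone gates relaying, since a
    # valid PoW is already expensive to fake; transaction processing
    # happens afterwards, cutting block propagation latency.
    if not verify_pow(block["header"]):
        return
    for peer in peers:
        peer.append(block)
    process(block["txs"])
```

Lower propagation latency is what keeps the uncle rate low even as blocks carry more gas, which is why this relay-ordering change can support a higher gas limit.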
It looks like all this was put on hold because the now-deprecated Hybrid Casper Proof of Stake (PoS) upgrade was expected to launch last summer.
Plans then changed to launching a new PoS Beacon Chain, to make sharding quicker and easier to implement.
That means that, perhaps for a long time, there will be a PoW chain connected to a PoS Beacon Chain, with the two running on more or less parallel roadmaps.
Thus these low-hanging-fruit optimizations have now been revived, because the PoW chain will probably be around for half a decade or more, as the transition from the PoW chain to the staking Beacon Chain will have to be voluntary.
Instead of just sitting around waiting for ethereum 2.0, it looks like the PoW chain devs – like the Geth and Parity maintainers – are now sort of laying out their own roadmap.
Little has reached a concrete stage as far as ethereum 1.0 is concerned, so it remains somewhat unclear just how these parallel chains will work.
What is clear is that eth is eth, as in there will be only one eth, which can freely move to the PoS chain but not back to PoW, at least initially.
The PoS chain, however, is “kind of like a half-way between a testnet and a mainnet,” according to Vitalik Buterin, Chief Scientist at the Ethereum Foundation.
Then sharding of data is turned on during phase one, perhaps in 2020. Then comes sharding of state, cross-shard communication, and eventually sharding of shards.
That’s all on the Beacon Chain, and it may take at least five years to reach a final stage. During that time, the PoW chain will keep running as it does now. Rather than leaving it as a chain just waiting to be discarded, they’ve basically re-opened the books to see what can be improved.
Scaling is obviously the headline and what everyone cares about, with low-hanging-fruit improvements hopefully increasing capacity soonish.
Then there’s a potential upgrade of the Ethereum Virtual Machine (EVM) to eWASM on the PoW chain. In addition, there’s a potential proposal for storage rent.
Ethereum 1.0 can’t comfortably scale without significantly increasing node costs. That’s because every node needs a copy of the entire blockchain, unlike in sharding, where nodes are split into groups of, say, 100, with each group handling its own transactions and all the groups then combining into one network.
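That grouping can be illustrated with a trivial sketch, using hypothetical node IDs and a made-up group size (nothing here reflects the actual sharding spec):

```python
def assign_shards(nodes, group_size=100):
    # Illustrative only: split the node set into fixed-size groups, each
    # responsible for its own slice of transactions, instead of every
    # node storing and executing everything.
    return [nodes[i:i + group_size] for i in range(0, len(nodes), group_size)]
```

With 1,000 nodes in groups of 100, each node would see roughly a tenth of the total transaction load, which is where the capacity gain comes from; without that split, more capacity means every single node does more work.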
So if capacity is to increase on the PoW chain, then node costs will increase too. The main cost is storage, according to Detrio. To mitigate that, they are thinking of introducing storage rent.
What this is exactly isn’t very clear, but as the name suggests, you would be renting space on the blockchain rather than holding a freehold in perpetuity, whereby your smart contract lives forever at no ongoing cost to its issuer.
They are considering a mechanism where there’s basically a recurring fee for storage, and if the fee is not paid, the contract is deleted.
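A minimal sketch of such a fee-or-deletion mechanism, with an invented rent rate, data layout, and method names (the actual proposal’s parameters were not public at the time of writing):

```python
RENT_PER_BYTE_PER_BLOCK = 1  # made-up rate for illustration

class ContractStore:
    def __init__(self):
        # address -> {"size": bytes occupied, "rent_balance": prepaid fee}
        self.contracts = {}

    def deploy(self, address, size, prepaid_rent):
        self.contracts[address] = {"size": size, "rent_balance": prepaid_rent}

    def charge_rent(self):
        # Called once per block: deduct rent proportional to the bytes a
        # contract occupies, and delete any contract that cannot pay.
        evicted = []
        for address, c in list(self.contracts.items()):
            c["rent_balance"] -= c["size"] * RENT_PER_BYTE_PER_BLOCK
            if c["rent_balance"] < 0:
                evicted.append(address)
                del self.contracts[address]
        return evicted
```

A contract occupying 10 bytes with 25 units of prepaid rent would survive two blocks and be evicted on the third, once its balance goes negative.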
One potential design would leave the contract deleted forever, but that version is very unlikely to be proposed.
If it is put forward, it will likely be the second version, where there is a method to revive the smart contract by providing a Merkle proof of its data, according to Prateek Singh.
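The revival idea can be illustrated with a toy binary Merkle tree (ethereum actually uses a Merkle-Patricia trie; this sketch only shows how a proof lets anyone demonstrate that deleted data matches a root the chain still remembers):

```python
import hashlib

def h(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def merkle_root(leaves):
    # Build a simple binary Merkle tree bottom-up, duplicating the last
    # node on odd levels.
    level = [h(leaf) for leaf in leaves]
    while len(level) > 1:
        if len(level) % 2:
            level.append(level[-1])
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
    return level[0]

def verify_proof(leaf, proof, root):
    # `proof` is a list of (sibling_hash, sibling_is_right) steps from
    # the leaf up to the root. If the recomputed root matches the root
    # the chain kept, the submitted data is genuine and the contract can
    # be restored.
    node = h(leaf)
    for sibling, sibling_is_right in proof:
        node = h(node + sibling) if sibling_is_right else h(sibling + node)
    return node == root
```

The key property is that the chain only has to keep the small root hash of the deleted state; whoever wants the contract back supplies the full data plus the proof.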
This is a very big change, both from an implementation perspective and an expectations perspective, so it will need quite a bit more discussion, with concrete analysis that paints a clear and objective picture of the pros and cons.
One pro is obviously that our grandchildren won’t need to keep storing what may be objectively useless smart contracts created just for learning or experience. Another pro might be that the “income” from the “rent” can go towards paying stakers, thus lowering the need for inflation in the long run.
One con might be that no one wants to pay rent forever. There could perhaps be an option of a leasehold smart contract and a freehold smart contract. Or most of the goals could perhaps be achieved by increasing “the cost of SSTORE when a value is set to non-zero from zero to 40,000 gas.” Buterin says:
“I raised this exact possibility in the meeting, and it still seems reasonable to me. The way opcode prices were originally made is using this spreadsheet that basically just calculated the different costs of processing each opcode (microseconds, history bytes, state bytes…) and assigned a gas cost to each unit of each cost; we can just push up the cost we assign to storage bytes.
It definitely solves the largest first-order problem (storage is not costly enough in an absolute sense) minimally disruptively, and I’m not sure if the other inefficiencies of storage pricing today are bad enough to be worth uprooting the present-day storage model live to fix.
A third possibility that I have not yet seen discussed is to start off by raising the gas limit and increasing the SSTORE cost (possibly greatly increasing it, eg. 4-5x; also NOT increasing refunds to mitigate gastoken), and then start architecting a precompile that manages a cheaper class of temporary storage that follows some rent scheme.”
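The spreadsheet method Buterin describes, with each opcode’s gas cost as a sum of per-resource unit costs, can be sketched with entirely made-up coefficients and usage numbers (the real ones are in the original spreadsheet, which we don’t reproduce):

```python
# Toy version of the original pricing approach: an opcode's gas cost is
# its measured resource usage times a per-unit gas price, summed over
# resource dimensions. All numbers below are invented for illustration.
GAS_PER_UNIT = {"microseconds": 10, "history_bytes": 2, "state_bytes": 500}

def opcode_gas(usage, gas_per_unit):
    return sum(units * gas_per_unit[res] for res, units in usage.items())

sstore_usage = {"microseconds": 5, "history_bytes": 100, "state_bytes": 32}
base = opcode_gas(sstore_usage, GAS_PER_UNIT)

# Repricing storage is then just pushing up one coefficient: SSTORE
# becomes more expensive without touching any other opcode's inputs.
raised = opcode_gas(sstore_usage, dict(GAS_PER_UNIT, state_bytes=2000))
```

That is what makes this option “minimally disruptive”: the pricing model stays the same, and only the coefficient assigned to state bytes moves.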
The meeting mentioned above was a private gathering at Devcon where they tried to organize the work needed to turn the above into a concrete proposal.
The minutes of that meeting were leaked, with a little storm in a teacup following due to the private nature of the gathering. Those who suggested the above plan, however, wanted to keep it private because, in order to “sell” it, they wanted to argue, according to Detrio, that:
“The current Ethereum mainnet can sustain growth for three more years. If some drastic breaking changes are not made before then to reduce the disk space burden, then Ethereum as we know it will not survive.
At the other end of the spectrum are researchers whose efforts are focused on scaling Ethereum by launching 2.0 as soon as possible.
They argue that new hard drives can accommodate the current rate of state growth on the 1.0 mainnet, until 2.0 is launched and users migrate from 1.0 contracts to new contracts on 2.0.
They also argue that introducing breaking changes on the mainnet would violate the behavioral expectations that users have about contracts deployed on 1.0, and that the 1.0 network would work just fine with a state size of 70 gigs in three years (the current state size is around 7 gigs, last I checked).”
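Quick sanity arithmetic on the figures Detrio cites: roughly 7 GB today growing to roughly 70 GB over three years implies a compound annual growth rate of 10^(1/3), a bit over 2.15x per year.

```python
# Back-of-the-envelope check on the quoted state-size figures.
current_gb, projected_gb, years = 7, 70, 3
annual_growth = (projected_gb / current_gb) ** (1 / years)  # ~2.154x/year

# Year-by-year state size under that constant growth rate.
sizes = [round(current_gb * annual_growth ** y, 1) for y in range(years + 1)]
# i.e. roughly 7 -> 15 -> 32.5 -> 70 GB
```

Whether a 70 GB state is an emergency or a non-issue is exactly what the two camps disagree about; the arithmetic itself is not in dispute.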
Meaning there are three potential choices for ethereum’s ecosystem. Ethereans can wait for sharding and then get scalability. They can get some “quick” scalability, but have to pay rent for it. Or they can get the same scalability without rent, in the hope that higher SSTORE costs align incentives to keep the chain’s costs reasonable.
All this is at a very early stage, not even a draft, but three working groups have been set up to look at it. If they are aiming for June, then presumably they haven’t taken all of December off, so we might get some data and concrete proposals soonish.
At face value we’re neutral, as none of the choices appears to be “bad,” but presumably some are better than others, so we’ll see what consensus they reach.