Ethereum researchers are developing a novel strategy aimed at increasing the scalability of the blockchain network. A newly drafted proposal suggests moving transaction data from traditional block structures into dedicated data fields called “blobs.” This technical shift is designed to significantly reduce bandwidth demands and allow the Ethereum network to handle greater data volumes more efficiently.
Technical details and goals behind the Block-in-Blobs proposal
The Block-in-Blobs proposal first emerged in early 2024 and centers on the blob data structure introduced by Ethereum’s EIP-4844 upgrade. Blobs provide a more resource-efficient way to transport and store transaction data on the network.
At its core, the proposal would encode both transaction and block data directly in blobs. This lets consensus participants, known as validators, verify data availability by checking cryptographic commitments to the blobs rather than downloading and executing every single transaction. During periods when blocks grow larger and network data volume surges, this yields considerable operational efficiencies that help the network scale smoothly.
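The commitment check described above can be sketched as follows. This is a deliberately simplified illustration: mainnet blobs use KZG polynomial commitments, but a plain hash stands in here to show the principle that a validator compares served blob data against a short binding value instead of re-executing transactions.

```python
import hashlib

def commit(blob: bytes) -> bytes:
    # Simplified stand-in for a KZG polynomial commitment:
    # a short, binding value derived from the blob contents.
    return hashlib.sha256(blob).digest()

def verify_availability(blob: bytes, commitment: bytes) -> bool:
    # A validator checks the served blob against the commitment
    # rather than downloading and executing every transaction in it.
    return commit(blob) == commitment

blob = b"\x00" * 131072  # EIP-4844 blobs are 128 KiB
c = commit(blob)
assert verify_availability(blob, c)
assert not verify_availability(b"tampered data", c)
```

The key property is that the commitment is tiny (32 bytes here; 48 bytes for a real KZG commitment) regardless of blob size, so verification cost does not grow with the data volume.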
Blobs were first implemented on Ethereum in March 2024 with the launch of the Dencun upgrade, which introduced a mechanism known as proto-danksharding. Rather than storing all transaction data directly in on-chain calldata, this approach carries it more cheaply in blobs, paving the way for further efficiency improvements.
If adopted, the Block-in-Blobs design is expected to enable rollup solutions and zkEVM-based systems to carry out transaction verification in a secure and streamlined manner, further enhancing Ethereum’s capacity to support the next generation of decentralized applications.
Data availability, security, and evolving transaction standards
The Block-in-Blobs approach also aims to address a critical challenge in networks leveraging zkEVMs. Zero-knowledge proofs (zk-proofs) can verify that transactions have been processed correctly; however, they do not guarantee that the underlying transaction data remains accessible on the network. As researcher Toni Wahrstätter has highlighted, without ensuring data availability, transactions might seem technically confirmed, even if vital data is missing.
By moving transaction data into dedicated blobs, the Block-in-Blobs proposal enables direct and transparent proof of data availability. Validators can continue to verify network security through statistical sampling, rather than having to download all data in full.
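The statistical-sampling argument can be made concrete with a little arithmetic. Under the usual data availability sampling design, blob data is extended with 2x erasure coding, so an attacker must withhold more than half of the extended chunks to make the data unrecoverable; each random sample then has at least a 50% chance of hitting a missing chunk. A short sketch of the resulting detection math:

```python
def sampling_miss_probability(withheld_fraction: float, samples: int) -> float:
    # Probability that `samples` independent, uniformly random chunk
    # queries all succeed even though `withheld_fraction` of the
    # erasure-coded data is being withheld by the block producer.
    return (1.0 - withheld_fraction) ** samples

# With 2x erasure coding, unrecoverable data means withheld_fraction > 0.5,
# so each sample detects the withholding with probability >= 0.5.
p = sampling_miss_probability(0.5, 30)
print(f"Chance 30 samples all miss the withheld data: {p:.2e}")
```

With just 30 samples the chance of an unavailable blob slipping past a single sampler is below one in a billion, which is why validators can gain high confidence without downloading blobs in full.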
The plan would also allow Ethereum to merge its transaction gas and blob data mechanisms. This paves the way for a unified pricing model, known as “data gas,” which could streamline transaction fees and tie data availability costs more closely to the cost of executing transactions. Discussions are underway to establish a system where both data access and execution incur efficient, harmonized fees.
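For context on how blob pricing already works, EIP-4844 sets the blob base fee with an exponential update rule: the fee grows as `MIN_BLOB_BASE_FEE * e^(excess_blob_gas / BLOB_BASE_FEE_UPDATE_FRACTION)`, computed in integer arithmetic via the spec's `fake_exponential` Taylor-series helper. The sketch below reproduces that mechanism with the EIP-4844 constants; any unified “data gas” model would presumably build on a rule of this shape.

```python
MIN_BLOB_BASE_FEE = 1                    # wei, per EIP-4844
BLOB_BASE_FEE_UPDATE_FRACTION = 3338477  # per EIP-4844

def fake_exponential(factor: int, numerator: int, denominator: int) -> int:
    # Integer-only approximation of factor * e^(numerator / denominator),
    # as specified in EIP-4844, using a Taylor-series expansion.
    i = 1
    output = 0
    numerator_accum = factor * denominator
    while numerator_accum > 0:
        output += numerator_accum
        numerator_accum = (numerator_accum * numerator) // (denominator * i)
        i += 1
    return output // denominator

def get_blob_base_fee(excess_blob_gas: int) -> int:
    # Fee rises exponentially as blocks keep exceeding the blob gas target.
    return fake_exponential(
        MIN_BLOB_BASE_FEE, excess_blob_gas, BLOB_BASE_FEE_UPDATE_FRACTION
    )

print(get_blob_base_fee(0))  # at target usage, fee sits at the minimum
```

Because the fee responds exponentially to sustained excess demand, blob space self-prices toward its target throughput, a property a merged execution-plus-data fee market would need to preserve.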
In parallel, the Ethereum community is considering new standards to simplify transaction processing. The ERC-8211 standard, co-developed by Biconomy and the Ethereum Foundation, envisions “programmable workflows” that bundle multiple transaction steps. This makes it possible for on-chain operations to gather and validate data within a single transaction, allowing complex multi-step actions to be executed with a single cryptographic signature.
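The single-signature idea can be illustrated abstractly. The sketch below is hypothetical and not the ERC-8211 interface: the `Step` type, field names, and digest construction are invented for illustration. The point it shows is that hashing an ordered list of calls into one digest means one signature over that digest authorizes the whole workflow.

```python
import hashlib
from dataclasses import dataclass

@dataclass
class Step:
    target: str      # contract address (placeholder values below)
    calldata: bytes  # encoded call for that step

def workflow_digest(steps: list[Step]) -> bytes:
    # One digest over the ordered steps; signing this single 32-byte
    # value authorizes every step in the bundle, in this exact order.
    h = hashlib.sha256()
    for step in steps:
        h.update(step.target.encode())
        h.update(step.calldata)
    return h.digest()

steps = [
    Step("0xTokenA", b"approve(spender, 100)"),
    Step("0xRouter", b"swap(tokenA, tokenB, 100)"),
]
digest = workflow_digest(steps)  # the user signs only this value
```

Reordering or altering any step changes the digest, so the signature binds both the contents and the sequence of the bundled actions.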
Ongoing advancements in these areas are already paving the way for greater innovation within the Ethereum ecosystem. Technical upgrades targeting data processing and availability are anticipated to expand the capabilities of decentralized finance (DeFi) platforms and other application domains, supporting more sophisticated and efficient transaction models in the future.