[Application Development] How to decode Optimism blob data? #882
-
**Documentation Feedback (Optional)**

I tried using the linked batch decoding tool from https://docs.optimism.io/app-developers/tools/build/analytics-tools, but that service is deprecated and no longer working.

**Issue Description**

I'm trying to figure out the L1 tx (i.e. the tx on Ethereum mainnet) that corresponds to my L2 tx (i.e. the tx on Optimism). I understand txs are posted in batches, so I need to find the batch that included the L2 tx hash. I'm thinking of doing this by correlating the time range in which my tx was submitted (not sure if there's a better way). The part I'm stuck on now is how to decode the tx data from the raw blob data in the batches posted on L1. How can I go about decoding the blob data? I don't see any tools I can use either.

**Steps to Reproduce**

Sample batch from Base: https://blobscan.com/blob/0x019c340e8e7c0e23cff2ed9be5b609c081131b4b22154ddc1ceb60986697f318

Not sure how I can decode the blob data to see which L2 txs it includes.

**Troubleshooting Attempts**

There is a similar discussion for calldata, but not for blobs: #98. There is this similar question, but it doesn't answer how to decode the blobs: #627.
-
FYI, I found the blob encoding protocol here: https://specs.optimism.io/protocol/ecotone/derivation.html#blob-encoding, and wrote a simple decoding script:

```python
#!/usr/bin/env python3
import sys
import zlib

import rlp                   # third-party: pip install rlp
import zstandard as zstd     # third-party: pip install zstandard


def blob_bytes(blob_hex: str) -> bytes:
    """Drop the first byte of each 32-byte field element, turning the
    131,072-byte blob into a 126,976-byte linear stream."""
    if blob_hex.startswith("0x"):
        blob_hex = blob_hex[2:]
    blob = bytes.fromhex(blob_hex)
    return b"".join(blob[i * 32 + 1 : i * 32 + 32] for i in range(4096))


def decode_op_blob(blob_hex: str):
    stream = blob_bytes(blob_hex)
    fmt, length = stream[0], int.from_bytes(stream[1:4], "big")
    payload = stream[4 : 4 + length]
    match fmt:
        case 0:
            data = zlib.decompress(payload)
        case 1:
            data = zstd.ZstdDecompressor().decompress(payload)
        case 2:
            data = payload
        case _:
            raise ValueError(f"unknown format {fmt}")
    return rlp.decode(data)  # list of frames / batches


if __name__ == "__main__":
    if len(sys.argv) != 2:
        print("Usage: python decode_blob_data.py <blob_hex_file>")
        sys.exit(1)
    blob_hex = open(sys.argv[1]).read().strip()
    batches = decode_op_blob(blob_hex)
    print(batches)
```

It seems to fail to decompress the payload, though. Not fully sure why the decompression fails.
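A likely reason the decompression fails: dropping only the first byte of each field element is a simplification of the Ecotone encoding, which additionally packs six data bits into that first byte and reassembles them into three extra output bytes per four field elements, so the naive byte stream is misaligned. Below is a sketch of a spec-faithful decoder, ported from my reading of the reference `ToData` routine in `op-service/eth/blob.go`; treat the exact constants and bit layout as assumptions to verify against the spec:

```python
# Sketch of an Ecotone blob decoder (assumed port of op-service's ToData).
BLOB_SIZE = 4096 * 32                         # 131,072 bytes
MAX_BLOB_DATA_SIZE = (4 * 31 + 3) * 1024 - 4  # 130,044 usable bytes
ENCODING_VERSION = 0
ROUNDS = 1024                                 # 4 field elements decoded per round


def decode_blob(blob: bytes) -> bytes:
    """Decode one EIP-4844 blob into the batcher-transaction byte stream."""
    if len(blob) != BLOB_SIZE:
        raise ValueError(f"expected {BLOB_SIZE} bytes, got {len(blob)}")
    if blob[1] != ENCODING_VERSION:
        raise ValueError(f"unsupported encoding version {blob[1]}")
    # bytes 2..4 of the first field element hold a 24-bit big-endian length
    output_len = int.from_bytes(blob[2:5], "big")
    if output_len > MAX_BLOB_DATA_SIZE:
        raise ValueError("declared length too large")
    output = bytearray(MAX_BLOB_DATA_SIZE)
    output[0:27] = blob[5:32]     # round 0: remaining 27 bytes of field element 0
    opos, ipos = 28, 32           # output / input cursors
    encoded = [blob[0], 0, 0, 0]  # the 6-bit chunk in each FE's first byte

    def decode_fe(opos: int, ipos: int) -> tuple[int, int, int]:
        b0 = blob[ipos]
        if b0 & 0b1100_0000:
            raise ValueError("invalid field element: high-order bits set")
        output[opos : opos + 31] = blob[ipos + 1 : ipos + 32]
        return b0, opos + 32, ipos + 32  # leave a 1-byte gap for reassembly

    def reassemble(opos: int) -> int:
        opos -= 1  # each round emits 127 bytes, not 128
        x = (encoded[0] & 0b0011_1111) | ((encoded[1] & 0b0011_0000) << 2)
        y = (encoded[1] & 0b0000_1111) | ((encoded[3] & 0b0000_1111) << 4)
        z = (encoded[2] & 0b0011_1111) | ((encoded[3] & 0b0011_0000) << 2)
        output[opos - 32] = z  # fill the gaps left by decode_fe
        output[opos - 64] = y
        output[opos - 96] = x
        return opos

    for i in range(1, 4):  # finish round 0
        encoded[i], opos, ipos = decode_fe(opos, ipos)
    opos = reassemble(opos)
    for _ in range(1, ROUNDS):
        if opos >= output_len:
            break
        for i in range(4):
            encoded[i], opos, ipos = decode_fe(opos, ipos)
        opos = reassemble(opos)
    return bytes(output[:output_len])
```

Even after this step the result is not a bare compressed payload: it is batcher transaction data (a version byte followed by channel frames), so the frames still need to be split out and their channel reassembled before decompressing and RLP-decoding the batches.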
-
Hi, you can try this tool: https://github.com/ethereum-optimism/optimism/tree/develop/op-node/cmd/batch_decoder

To decode blob data and extract L2 transaction hashes, use it in two steps:

1. `batch_decoder fetch` — downloads blobs from the L1 beacon API and decodes them into frames
2. `batch_decoder reassemble` — assembles frames into channels, decompresses the data, and decodes batches to extract the L2 transactions

The final JSON output contains all the L2 transaction RLP data, which you can then hash with keccak256() to get the transaction hashes.

The example you posted above from Base includes 3 batches:

- Batch 0: 4 L2 blocks, 1,326 transactions (an example transaction hash: 0xb9db5ec95934becd8566105d831f43888bb00444a7ba8521a1f443bbafcafab9)
- Batch 1: 4 L2 blocks, 1,173 transactions
- Batch 2: 1 L2 block, 388 transactions
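To see what the reassemble step works with, here is a minimal sketch of the frame layer. Per the derivation spec, batcher transaction data is a version byte (0) followed by concatenated frames, each laid out as `channel_id (16 bytes) ++ frame_number (u16) ++ frame_data_length (u32) ++ frame_data ++ is_last (1 byte)`, integers big-endian. The field names follow the spec; the dict layout is my own:

```python
import io
import struct

MAX_FRAME_DATA_LEN = 1_000_000  # cap from the derivation spec


def parse_frames(tx_data: bytes) -> list[dict]:
    """Split batcher transaction data (version byte ++ frames) into frames."""
    if not tx_data or tx_data[0] != 0:
        raise ValueError("unsupported derivation version")
    buf = io.BytesIO(tx_data[1:])
    frames = []
    while True:
        channel_id = buf.read(16)
        if not channel_id:
            break  # clean end of the frame stream
        if len(channel_id) != 16:
            raise ValueError("truncated channel_id")
        (frame_number,) = struct.unpack(">H", buf.read(2))
        (length,) = struct.unpack(">I", buf.read(4))
        if length > MAX_FRAME_DATA_LEN:
            raise ValueError("frame_data_length exceeds spec maximum")
        frame_data = buf.read(length)
        if len(frame_data) != length:
            raise ValueError("truncated frame_data")
        is_last = buf.read(1) == b"\x01"
        frames.append({
            "channel_id": channel_id.hex(),
            "frame_number": frame_number,
            "data": frame_data,
            "is_last": is_last,
        })
    return frames
```

Concatenating the `frame_data` of all frames of a channel (ordered by `frame_number`, through the `is_last` frame) yields the channel data, which is then decompressed and RLP-decoded into the batches whose transactions you can hash.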