This is a brief introduction to the polynomial commitment scheme KZG, not a strictly mathematical or cryptographic description.
Aggregated KZG, when used in Data Audit, is a cryptographic technique designed to make blockchain networks more efficient and scalable. To make this easier to follow, let's break it down into simpler concepts:
KZG stands for Kate-Zaverucha-Goldberg, named after the researchers who developed it. It's a method for creating a compact cryptographic "commitment" to a set of data (like a list of transactions in a blockchain block) without revealing the data itself. Think of it as a way to securely lock data in a box and then prove what's inside the box without opening it.
"Aggregated" refers to the process of combining multiple things into a single one. In the context of KZG, it means taking multiple cryptographic commitments and combining them into one. This is like taking several locked boxes and merging them into a single box that still securely contains all the original items.
Data Audit is a technique used in blockchain networks to ensure that extremely large data is stored completely on a server and has not been lost or tampered with. It's as if you upload a library of books to the cloud, and you can check the integrity of all the books without downloading them to your laptop.
- Efficiency: By aggregating KZG commitments, the process of verifying data availability becomes much more efficient. Instead of checking each piece of data individually, validators in the network can check a single aggregated commitment. This is akin to verifying the contents of a single box rather than opening and checking multiple boxes one by one.
- Scalability: This efficiency directly contributes to scalability. Blockchain networks can handle more transactions and operate faster because the data availability verification process is streamlined. It's like being able to quickly check that all the books in the library are in place without having to look at each book individually.
- Security: Aggregated KZG commitments maintain the security properties of the individual commitments. Even though the commitments are combined, the data is still securely "locked" and can be verified without being exposed. This ensures that the network remains secure and tamper-proof.
In summary, using aggregated KZG in Data Audit is like having a super-efficient, secure library system. It ensures that all the books (data) are available when needed, without everyone having to hold every book. This makes the blockchain network faster, more scalable, and more secure, enabling it to support more users and transactions.
KZG polynomial commitment, KZG for short, is a scheme that allows a prover to prove the value of a polynomial at any point without revealing the polynomial itself.
It goes roughly like this:
Prover : I have a polynomial f(x). Here is its KZG commitment C, a single short group element.
Verifier : Thanks, I have the commitment C. Please tell me the value of f at a point z of my choosing.
Prover : Yes, let me compute y = f(z) together with a short proof π.
Verifier : Got it! I have checked your proof π against the commitment C, so I am convinced that f(z) = y.
If you have a set of numbers (a0, a1, ..., an), you can treat them as the coefficients of a polynomial f(x) = a0 + a1*x + ... + an*x^n, and the degree of f(x) is n.
For example, if we have (3, 2), then we get f(x) = 3 + 2x;
it is a line.
If we have (1, 2, 1), then we get f(x) = 1 + 2x + x^2;
it is a parabola.
We can draw any polynomial on the plane as a curve.
As you may have heard, two points determine a line, and three points determine a parabola. Yes, if we have n + 1 points, they determine a unique polynomial of degree at most n, and we can recover it by Lagrange interpolation.
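The interpolation step can be sketched in a few lines of Python. This is only an illustrative sketch over a small prime field (the modulus P and the function names are made up for this example); real KZG works over the scalar field of a pairing-friendly elliptic curve.

```python
# Recover a polynomial from points by Lagrange interpolation, mod a small prime.
P = 7919  # illustrative modulus; real schemes use a large curve order

def poly_mul_linear(poly, c):
    """Multiply a coefficient list (lowest degree first) by (x - c), mod P."""
    res = [0] * (len(poly) + 1)
    for i, a in enumerate(poly):
        res[i + 1] = (res[i + 1] + a) % P
        res[i] = (res[i] - c * a) % P
    return res

def interpolate(points):
    """Return coefficients of the unique polynomial of degree < len(points)
    passing through all (x, y) pairs, mod P."""
    coeffs = [0] * len(points)
    for i, (xi, yi) in enumerate(points):
        basis, denom = [1], 1
        for j, (xj, _) in enumerate(points):
            if j != i:
                basis = poly_mul_linear(basis, xj)  # multiply by (x - xj)
                denom = denom * (xi - xj) % P
        scale = yi * pow(denom, P - 2, P) % P  # Fermat inverse of the denominator
        for k, b in enumerate(basis):
            coeffs[k] = (coeffs[k] + b * scale) % P
    return coeffs

# Two points determine a line, three points determine a parabola:
print(interpolate([(0, 2), (1, 5)]))          # the line 2 + 3x
print(interpolate([(0, 1), (1, 4), (2, 9)]))  # the parabola 1 + 2x + x^2
```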
If we have two polynomials f(x) and g(x), then for any numbers a and b, the combination a*f(x) + b*g(x) is also a polynomial.
Yes, we can perform any linear combination of polynomials, and the result is still a polynomial.
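As a quick sanity check that linear combinations behave as claimed, here is a small sketch in plain Python (the helper names are invented for this example):

```python
def lincomb(polys, scalars):
    """Coefficient list of sum_i scalars[i] * polys[i] (lowest degree first)."""
    out = [0] * max(len(p) for p in polys)
    for s, p in zip(scalars, polys):
        for k, a in enumerate(p):
            out[k] += s * a
    return out

def evaluate(coeffs, x):
    """Evaluate the polynomial given by coeffs at the point x."""
    return sum(a * x ** k for k, a in enumerate(coeffs))

f = [3, 2]      # f(x) = 3 + 2x
g = [1, 2, 1]   # g(x) = 1 + 2x + x^2
h = lincomb([f, g], [5, 7])  # h = 5*f + 7*g, still a polynomial
# Evaluating the combination equals combining the evaluations:
assert evaluate(h, 4) == 5 * evaluate(f, 4) + 7 * evaluate(g, 4)
```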
If you store a piece of data on a server, is there any way to ensure that the data is properly saved and not lost? The simplest way is to ask the server to send all the data back so you can check it yourself. But this method is too inefficient. Is there an easier way to accomplish this task? In fact, KZG achieves it very well.
First, you can encode your data into a polynomial f(x), for example by splitting it into chunks and using the chunks as coefficients, then keep only the small KZG commitment C yourself while the server stores the full data.
Next, you are the Verifier, and the server is the Prover :
Verifier : Can you help to compute the value of the polynomial f at a random point z that I choose?
Prover : Yes, let me compute y = f(z) and a proof π for it.
Verifier : Got it! I have checked your proof against my commitment C, so the server must still hold the data that defines f.
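Under the hood, the proof relies on a simple algebraic fact: f(z) = y exactly when (x - z) divides f(x) - y. In real KZG the prover sends a commitment to the quotient (f(x) - y) / (x - z) and the verifier checks one pairing equation; the divisibility fact itself can be sketched in plain Python (the modulus and function names are illustrative):

```python
P = 7919  # small prime for illustration; real KZG uses a curve's scalar field

def eval_poly(f, x):
    """Evaluate f (coefficient list, lowest degree first) at x, mod P."""
    return sum(a * pow(x, k, P) for k, a in enumerate(f)) % P

def quotient_proof(f, z, y):
    """Synthetic division of f(x) - y by (x - z), mod P.
    Returns (quotient coefficients, remainder); the remainder is 0
    exactly when f(z) == y, so the quotient witnesses the evaluation."""
    g = list(f)
    g[0] = (g[0] - y) % P
    n = len(g) - 1
    q = [0] * n
    q[n - 1] = g[n]
    for k in range(n - 1, 0, -1):
        q[k - 1] = (g[k] + z * q[k]) % P
    rem = (g[0] + z * q[0]) % P
    return q, rem

f = [1, 2, 1]                    # f(x) = 1 + 2x + x^2
z = 3
y = eval_poly(f, z)              # y = f(3) = 16
q, rem = quotient_proof(f, z, y)
assert rem == 0                  # correct value: (x - z) divides f(x) - y
_, bad_rem = quotient_proof(f, z, y + 1)
assert bad_rem != 0              # wrong value: the division leaves a remainder
```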
How about if we store several pieces of data, so that we have a set of polynomials f1(x), f2(x), ..., fm(x)?
At the same time, we have the KZG commitment set C1, C2, ..., Cm.
Verifier : I have the commitment set C1, ..., Cm. Here is a random number r; please open all of the polynomials at the point z at once.
Prover : Yes, let me first compute the aggregated polynomial f(x) = f1(x) + r*f2(x) + ... + r^(m-1)*fm(x), then compute y = f(z) and a single proof π.
Verifier : Got it! I have checked your proof against the aggregated commitment C = C1 + r*C2 + ... + r^(m-1)*Cm, which I can compute myself, because KZG commitments can be linearly combined just like the polynomials they commit to.
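The aggregation trick works because evaluating the combined polynomial gives the same answer as combining the individual evaluations. A small sketch in plain Python (the modulus and names are illustrative; in the real scheme the verifier combines the commitments in the same way):

```python
import random

P = 7919  # illustrative small prime

def eval_poly(f, x):
    """Evaluate f (coefficient list, lowest degree first) at x, mod P."""
    return sum(a * pow(x, k, P) for k, a in enumerate(f)) % P

# Three pieces of data, each encoded as a polynomial (coefficient lists).
fs = [[3, 2], [1, 2, 1], [5, 0, 0, 4]]

r = random.randrange(1, P)  # the verifier's random challenge
# Aggregate coefficient-wise with powers of r: f = f1 + r*f2 + r^2*f3.
agg = [0] * max(len(f) for f in fs)
for i, f in enumerate(fs):
    s = pow(r, i, P)
    for k, a in enumerate(f):
        agg[k] = (agg[k] + s * a) % P

# One evaluation of the aggregate matches the combination of the
# individual evaluations, so a single proof can cover all the polynomials.
z = random.randrange(P)
lhs = eval_poly(agg, z)
rhs = sum(pow(r, i, P) * eval_poly(f, z) for i, f in enumerate(fs)) % P
assert lhs == rhs
```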
If the data is so large that transmitting all of it over the network would fail, then we need to sample the data.
For example, suppose the server stores the data as a polynomial f(x) of very high degree. Instead of asking for all of it, you pick a few random points z1, z2, ... and ask the server for the values f(z1), f(z2), ...
Don't forget to ask for a KZG proof for each value at the same time, to make sure that the server is honest.
If we split the whole data into 10 shardings, each encoded as its own polynomial, and sample 10 points from each, then we have 100 points to check in total; with aggregation, all of them can be verified with a single proof.
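Random sampling works because two different polynomials of degree less than d can agree on at most d points, so a tampered copy is exposed by a random sample with overwhelming probability. A tiny sketch of this idea (the modulus and the setup are illustrative):

```python
import random

P = 7919  # illustrative small prime

def eval_poly(f, x):
    """Evaluate f (coefficient list, lowest degree first) at x, mod P."""
    return sum(a * pow(x, k, P) for k, a in enumerate(f)) % P

rng = random.Random(0)
f = [rng.randrange(P) for _ in range(64)]  # "large" data as 64 coefficients

tampered = list(f)
tampered[1] = (tampered[1] + 1) % P  # the server corrupts one coefficient

# The difference polynomial here is exactly x, which is nonzero at every
# z != 0, so any nonzero sample point exposes the tampering. In general,
# two distinct polynomials of degree < 64 agree on fewer than 64 of the
# P possible points, so a random sample catches tampering almost surely.
z = rng.randrange(1, P)
assert eval_poly(f, z) != eval_poly(tampered, z)
```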

