Proof of Validator: A key piece of the security puzzle on the road to Ethereum scaling

Today, a new concept quietly emerged on the Ethereum research forum: Proof of Validator.

This protocol mechanism allows network nodes to prove that they are Ethereum validators without revealing their specific identities.

What does this have to do with us?

Normally, the market pays more attention to the surface-level narratives spawned by Ethereum's technical innovations and rarely studies the technology itself in depth ahead of time. Take the Shanghai upgrade, the Merge, the transition from PoW to PoS, and Ethereum scaling: what the market remembers are the LSD, LSDFi, and restaking narratives.

But don't forget that performance and security are both top priorities for Ethereum: the former sets the ceiling, while the latter sets the floor.

On the one hand, Ethereum has been actively pushing various scaling solutions to improve performance; on the other hand, scaling is not only about strengthening the protocol internally, it also requires guarding against external attacks.

For example, if validator nodes are attacked and data becomes unavailable, every narrative and scaling plan built on Ethereum's staking logic could be affected. The impact and risks simply stay hidden in the background, where end users and speculators rarely notice them, and sometimes do not even care.

The Proof of Validator discussed in this article may be a key piece of the security puzzle on Ethereum's path to scaling.

Since scaling is inevitable, how to reduce the risks it introduces along the way is an unavoidable security question, and one that is closely relevant to everyone in the space.

It is therefore worth laying out the full picture of the newly proposed Proof of Validator. Because the original forum post is fragmented and highly technical, and touches on many scaling plans and concepts, Shenchao Research Institute has consolidated the original material, organized the essential background, and examined the motivation, necessity, and possible impact of Proof of Validator for easier reading.

Data Availability Sampling: A Breakthrough for Scaling

Before formally introducing Proof of Validator, it is necessary to understand the logic behind Ethereum's current scaling approach and the risks it involves.

The Ethereum community is actively advancing several scaling plans. Among them, Data Availability Sampling (DAS) is regarded as the most critical technology.

The principle is to split complete block data into many "samples"; nodes in the network only need to fetch the few samples relevant to them in order to verify the whole block.

This greatly reduces the amount of storage and computation per node. It is similar to a sample survey: by interviewing a subset of people, we can infer the overall situation of the entire population.

Specifically, the implementation of DAS is briefly described as follows:

  • The block producer splits the block data into multiple samples.
  • Each network node fetches only the few samples it cares about, rather than the full block data.
  • By randomly fetching different samples, network nodes can verify whether the complete block data is available.

Through this sampling, even though each node processes only a small amount of data, together the nodes can verify the data availability of the entire chain. This allows block sizes to grow substantially and enables rapid scaling.
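To make the intuition concrete, here is a minimal Python sketch of the sampling logic under simplified assumptions (no erasure coding or KZG commitments, which the real DAS design relies on): a node accepts a block only if every randomly chosen sample it requests can be served, so an attacker who withholds even a modest fraction of the data is caught with high probability.

```python
import random

# Toy sketch of data availability sampling (assumptions: no erasure coding,
# no KZG commitments, one attacker simply withholding raw samples).
TOTAL_SAMPLES = 512        # the block is split into this many samples
WITHHELD_FRACTION = 0.25   # share of samples the attacker secretly withholds
QUERIES_PER_NODE = 30      # random samples each light node requests

withheld = set(random.sample(range(TOTAL_SAMPLES),
                             int(TOTAL_SAMPLES * WITHHELD_FRACTION)))

def node_accepts_block() -> bool:
    """A node accepts the block only if every sample it asks for is served."""
    queries = random.sample(range(TOTAL_SAMPLES), QUERIES_PER_NODE)
    return all(q not in withheld for q in queries)

trials = 10_000
fooled = sum(node_accepts_block() for _ in range(trials))
print(f"Share of nodes fooled by the withholding attacker: {fooled / trials:.4%}")
# With 25% withheld and 30 queries, acceptance requires 30 misses in a row:
# roughly 0.75**30, about 0.02%, so a handful of queries already gives strong odds.
```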

However, this sampling scheme has a key problem: where are all these samples stored? That requires an entire decentralized network to support it.

Distributed Hash Table: Home of the Samples

This is where the distributed hash table (DHT) gets to shine.

A DHT can be regarded as a huge distributed database: a hash function maps data into an address space, and different nodes are responsible for the data in different address ranges. **It can be used to quickly store and locate samples across a massive number of nodes.**

Specifically, after DAS splits block data into multiple samples, those samples need to be distributed to different nodes in the network for storage. A DHT provides a decentralized way to store and retrieve them. The basic idea is:

  • A consistent hash function maps each sample into a huge address space.
  • Each node in the network is responsible for storing and serving the samples that fall within a given address range.
  • When a sample is needed, the same hash yields its address, and the node responsible for that address range is located in the network to retrieve it.

For example, each sample can be hashed to an address, with node A responsible for addresses 0-1000 and node B responsible for addresses 1001-2000.

A sample hashing to address 599 would then be stored on node A. Whenever that sample is needed, the same hash yields address 599, the network lookup points to node A as the node responsible for that range, and the sample is fetched from it.
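A rough sketch of this address-range placement, assuming a simplified ring layout (real DHTs such as Kademlia use XOR distance and iterative routing; the node names and address-space size here are made up for illustration):

```python
import hashlib
from bisect import bisect_left

# Toy sketch of DHT-style sample placement on a hash ring. Node names and the
# address-space size are illustrative; a real DHT (e.g. Kademlia) uses XOR
# distance and iterative lookups, but the core mapping idea is the same.
ADDRESS_SPACE = 2 ** 16

def address_of(key: str) -> int:
    """Hash a sample or node identifier into the shared address space."""
    digest = hashlib.sha256(key.encode()).digest()
    return int.from_bytes(digest[:8], "big") % ADDRESS_SPACE

# Sort nodes by their own address; each sample goes to the first node whose
# address is at or after the sample's address (wrapping around the ring).
nodes = sorted(["nodeA", "nodeB", "nodeC", "nodeD"], key=address_of)
node_addresses = [address_of(n) for n in nodes]

def responsible_node(sample_id: str) -> str:
    addr = address_of(sample_id)
    idx = bisect_left(node_addresses, addr) % len(nodes)
    return nodes[idx]

# Storing and later retrieving a sample both resolve to the same node.
print(responsible_node("block123:sample599"))
```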

This method breaks the limitations of centralized storage and greatly improves fault tolerance and scalability. This is exactly the network infrastructure needed for DAS sample storage.

Compared with centralized storage and retrieval, a DHT improves fault tolerance, avoids single points of failure, and enhances network scalability. It can also help defend against attacks such as the "sample hiding" problem mentioned in the context of DAS.

The pain point of DHT: Sybil attacks

However, the DHT also has an Achilles' heel: Sybil attacks. An attacker can spin up a large number of fake nodes in the network, and the real nodes around them get drowned out by the fakes.

By analogy, it is as if an honest vendor were surrounded by rows of counterfeit stalls, making it hard for buyers to find the genuine goods. In this way, the attacker can control the DHT and make samples unavailable.

For example, to fetch the sample at address 1000, you need to find the node responsible for that address. But once that region of the address space is surrounded by tens of thousands of attacker-created fake nodes, the lookup keeps getting routed to fakes and never reaches the node actually responsible for the address. The result: the sample cannot be retrieved, and both storage and verification fail.
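A toy simulation of this crowding-out effect, under the simplifying assumption that a lookup only consults the k nodes closest to the target address (real DHT routing is iterative, but the outcome is similar; all numbers are illustrative):

```python
import random

# Toy model of a Sybil attack on DHT lookups. Assumption: a lookup only
# consults the k nodes whose addresses are closest to the target.
ADDRESS_SPACE = 2 ** 16
HONEST_NODES = 100
SYBIL_NODES = 10_000      # cheap fake identities flooding the address space
LOOKUP_FANOUT = 8         # how many closest nodes a lookup contacts

random.seed(0)
honest = [random.randrange(ADDRESS_SPACE) for _ in range(HONEST_NODES)]
sybil = [random.randrange(ADDRESS_SPACE) for _ in range(SYBIL_NODES)]
all_nodes = [(a, True) for a in honest] + [(a, False) for a in sybil]

def lookup_reaches_honest(target: int) -> bool:
    """True if at least one of the k closest nodes to the target is honest."""
    closest = sorted(all_nodes, key=lambda n: abs(n[0] - target))[:LOOKUP_FANOUT]
    return any(is_honest for _, is_honest in closest)

trials = 1_000
hits = sum(lookup_reaches_honest(random.randrange(ADDRESS_SPACE)) for _ in range(trials))
print(f"Lookups that still reach an honest node: {hits / trials:.1%}")
# With 100x more Sybil identities than honest nodes, the closest neighbours of
# almost every address are fakes, so most lookups never reach the real holder.
```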

To solve this problem, a high-trust network layer needs to be built on top of the DHT, one in which only validator nodes participate. But the DHT itself has no way to tell whether a node is a validator.

This seriously hinders DAS and, with it, Ethereum's scaling. Is there a way to resist this threat and keep the network trustworthy?

Proof of Validator: A ZK solution to safeguard scaling security

Now, let's get back to the main point of this article: Proof of Validator.

On the Ethereum research forum, George Kadianakis, Mary Maller, Andrija Novakovic, and Suphanat Chunhapanya jointly published this proposal today.

The general idea: if, in the DHT-based scheme from the previous section, only validators are allowed to join the DHT, then anyone who wants to mount a Sybil attack must also **stake a large amount of ETH, which significantly raises the economic cost of misbehaving**.

Put differently, the idea is a familiar one: I want to confirm that you are legitimate, and screen out bad actors, without ever learning who you are.

In this kind of limited-information proof scenario, zero-knowledge proofs are an obvious fit.

Proof of Validator (hereafter PoV) can therefore be used to build a high-trust DHT network composed only of validator nodes, effectively resisting Sybil attacks.

The basic idea is for each validator node to register a public key on the blockchain, and then use a zero-knowledge proof to show that it knows the private key corresponding to a registered public key. This is like presenting an identity credential to prove you are a validator node.

In addition, PoV aims to hide the validator's identity at the network layer, protecting validator nodes against DoS (denial-of-service) attacks. That is, the protocol is designed so that an attacker cannot tell which DHT node corresponds to which validator.

So how is this done? The original post relies on a fair amount of mathematical formulas and derivations, which we will not reproduce here. Here is a simplified version:

Concretely, the implementation uses either a Merkle tree or a lookup table. With the Merkle-tree approach, for example, the node proves that its registered public key is included in the Merkle tree built over the validator public-key list, and then proves that the networking key it uses for communication is correctly derived from that public key. The whole process is wrapped in a zero-knowledge proof, so the actual identity is never revealed.
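For intuition, here is a plain Merkle-membership sketch of the registration check, written in Python with hypothetical placeholder keys. In the actual PoV proposal this check is performed inside a zero-knowledge proof, so neither the leaf nor the path is revealed; here everything is shown in the clear purely to illustrate what is being proven.

```python
import hashlib

def h(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def build_tree(leaves: list[bytes]) -> list[list[bytes]]:
    """Build all levels of a Merkle tree, leaf hashes first, root last."""
    levels = [[h(leaf) for leaf in leaves]]
    while len(levels[-1]) > 1:
        lvl = levels[-1]
        if len(lvl) % 2:                 # duplicate the last node on odd levels
            lvl = lvl + [lvl[-1]]
        levels.append([h(lvl[i] + lvl[i + 1]) for i in range(0, len(lvl), 2)])
    return levels

def merkle_proof(levels, index):
    """Collect the sibling hashes needed to recompute the root from one leaf."""
    proof = []
    for lvl in levels[:-1]:
        if len(lvl) % 2:
            lvl = lvl + [lvl[-1]]
        proof.append((lvl[index ^ 1], index % 2))  # (sibling, am-I-right-child)
        index //= 2
    return proof

def verify(leaf, proof, root):
    node = h(leaf)
    for sibling, is_right in proof:
        node = h(sibling + node) if is_right else h(node + sibling)
    return node == root

# Hypothetical registered validator keys (placeholders, not real BLS keys).
registered = [f"validator_pubkey_{i}".encode() for i in range(8)]
levels = build_tree(registered)
root = levels[-1][0]

proof = merkle_proof(levels, 5)
print(verify(registered[5], proof, root))   # True: key 5 is in the registry
```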

Skipping these technical details, the final effect of PoV is:

**Only authenticated validator nodes can join the DHT network, which greatly increases its security, effectively resists Sybil attacks, and prevents samples from being deliberately hidden or tampered with. PoV gives DAS a reliable base network, which in turn helps Ethereum scale quickly.**

That said, PoV is still at the stage of theoretical research, and whether it can be put into practice remains uncertain.

Encouragingly, the researchers behind the post have already run small-scale experiments, and the results show that both the prover's time to generate the ZK proof and the verifier's time to check it are reasonable. Notably, their test machine was just a laptop with a roughly five-year-old Intel i7 processor.

In any case, PoV represents an important step toward greater scalability for blockchains. As a key component of the Ethereum scaling roadmap, it deserves the continued attention of the entire industry.
