This is part of our series on a year of Bittensor experience, leading up to our anniversary on the 13th of July. After discussing miner archetypes and incentive landscapes, we now turn to validator design. Please let us know what you think in our Discord channel!
Most validator code we have studied produces a lot of logging, but does a poor job of showing what is actually going on behind the scenes. Finding, scanning, and analyzing the logfiles is often more complicated than it needs to be. This is what we call the opaque validator problem. More than once we have improved validator code for our own internal use (as miners!) by structuring the logging and adding output suitable for statistical analysis.
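One way to make validator logging analyzable, sketched below under our own assumptions (the field names and helper are hypothetical, not from any particular subnet's code): emit one JSON object per interaction, so a whole run can later be loaded into a dataframe for statistical analysis instead of being grepped by hand.

```python
import json
import logging
import time

# Hypothetical sketch: one JSON object per line ("JSON Lines"), so a run
# can later be loaded with e.g. pandas.read_json(path, lines=True).
logger = logging.getLogger("validator.events")

def log_miner_event(uid: int, score: float, response_time: float, success: bool) -> dict:
    """Record one miner interaction as a structured, machine-readable event."""
    record = {
        "ts": time.time(),               # unix timestamp of the event
        "uid": uid,                      # miner UID on the subnet
        "score": score,                  # score the validator assigned
        "response_time": response_time,  # seconds until the response arrived
        "success": success,              # whether the response was valid
    }
    logger.info(json.dumps(record))
    return record
```

The point of the structure is that questions like "how did miner 42's scores drift over the last epoch?" become one-liners over the log, rather than a custom parsing exercise.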
We think subnets have to be transparent: ideally, anyone can see every interaction between miners and validators. This prevents miners from pulling secret tricks, which often leads to finger-pointing and accusations of unfair practices. We have seen miners perform countless tricks to earn high incentive, only for the subnet owners to be blamed and wrongly accused. As fully transparent, over-analyzing miners, we have been wrongly accused ourselves. But that's a different story.
In some cases, such as LLM training subnets, submissions are public by design. In other subnets, for example those where axons/dendrites are used, the requests and responses are private. Ideally, these private request/response interactions are either published after some delay, or their processing is logged publicly and in significant detail. That way, their (sensitive) content can be kept secret, while their metadata and other properties remain easy to inspect and analyze.
This setup allows miners to “debug” the validator code from a distance, which should be in their best interest: it keeps a level playing field between miners. The first miner to expose an exploit is also the first to adapt to the new situation once the exploit is eliminated. The miner that reports the issue is often also the one to offer a solution, which they might steer in a direction they favor.
The next article will discuss the big data validator.