On incentive (part 3)

In this post, we revisit model comparison metrics. Should we compare sample-by-sample, or group several samples together? Or does it make more sense to “pack” samples (i.e. join multiple samples with an EOS token in between)? Does length matter? And what’s up with these “pages” in the dataset? TL;DR A Read more…
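For readers new to the term, here is a minimal sketch of what packing looks like (the token ids, EOS id, and block size are made up for illustration; this is not the subnet's actual pipeline):

```python
# Sketch of sample packing: concatenate tokenized samples with an EOS
# token in between, then slice the stream into fixed-length blocks.
# (Illustrative only: token ids, EOS id, and block size are invented.)

EOS_ID = 2      # assumed EOS token id; depends on the tokenizer
BLOCK_SIZE = 8  # toy context length

def pack_samples(tokenized_samples, eos_id=EOS_ID, block_size=BLOCK_SIZE):
    """Join samples with EOS separators, then cut into equal-length blocks."""
    stream = []
    for sample in tokenized_samples:
        stream.extend(sample)
        stream.append(eos_id)
    n_blocks = len(stream) // block_size  # drop the incomplete tail
    return [stream[i * block_size:(i + 1) * block_size] for i in range(n_blocks)]

samples = [[5, 6, 7], [8, 9], [10, 11, 12, 13]]
print(pack_samples(samples))  # [[5, 6, 7, 2, 8, 9, 2, 10]] -- one block mixes three samples
```

Note how a single block can contain pieces of several samples; that is exactly what the comparisons in the post are about.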

Growing models

In this blog, and in our Discord channel, we discuss training in detail. A topic that is often overlooked is how to grow a model. Especially in incentivized, collaborative, and distributed training, this is a key ingredient. This post explores the concept of model growth with concrete Python code examples Read more…
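For a taste of what "growing" can mean, here is a toy, function-preserving depth-growth sketch in PyTorch (the nn.Sequential model, sizes, and insertion point are invented for illustration; the post itself goes much deeper):

```python
import torch
import torch.nn as nn

def grow_depth(model: nn.Sequential, index: int, width: int) -> nn.Sequential:
    """Insert an identity-initialized linear layer so the grown model
    computes exactly the same function as before (function-preserving growth)."""
    new_layer = nn.Linear(width, width)
    with torch.no_grad():
        new_layer.weight.copy_(torch.eye(width))  # identity weights
        new_layer.bias.zero_()                    # zero bias
    layers = list(model)
    layers.insert(index, new_layer)
    return nn.Sequential(*layers)

# Toy usage: grow a 2-layer MLP to 3 layers without changing its outputs.
torch.manual_seed(0)
mlp = nn.Sequential(nn.Linear(4, 4), nn.ReLU(), nn.Linear(4, 4))
grown = grow_depth(mlp, index=2, width=4)
x = torch.randn(1, 4)
assert torch.allclose(mlp(x), grown(x))
```

Because the new layer starts as the identity, the grown model produces exactly the same outputs as the old one, and training can simply continue from there.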

On training (part 1)

One of the first objectives for subnet 29 was to see whether sample packing during training makes a difference for the achievable loss. TL;DR Sample packing makes a significant difference. The per-token loss of a trained model is generally higher when training on packed samples and evaluating on single samples. Or, Read more…
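For reference, "per-token loss" here is the usual average cross-entropy over tokens; a minimal PyTorch sketch (shapes and vocabulary size are illustrative):

```python
import torch
import torch.nn.functional as F

def per_token_loss(logits: torch.Tensor, target_ids: torch.Tensor) -> torch.Tensor:
    """Mean negative log-likelihood per token.
    logits: (num_tokens, vocab_size); target_ids: (num_tokens,)."""
    return F.cross_entropy(logits, target_ids, reduction="mean")

# Toy usage with random numbers, just to show the shapes.
logits = torch.randn(10, 32000)            # 10 tokens, 32k-entry vocab (illustrative)
targets = torch.randint(0, 32000, (10,))
print(per_token_loss(logits, targets).item())
```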

On incentive (part 1)

Disclaimer: The text below is written for a wide audience. This means that the tech vocabulary is deliberately kept simple and, to some readers, the explanations may appear a bit long-winded. Week zero: The biggest changes we implemented in our first week of existence are the modifications to the incentive Read more…

Building the subnet

So we registered a subnet. Time to turn all our ideas into actual plans and start building right away. Have a working validator up and running within a week. Build a website; register at GitHub, W&B, and HuggingFace; open a Discord channel. With so many ideas and so little time, we Read more…

On an average Saturday morning…

Saturday morning, 8:12 am. Awake. Kids screaming. Weekend. Coffee. What’s happening in the world… Laptop. Checking some messages. Chain fully online again? Nice. Wait, what?! But that means… [digging through subtensor code] https://github.com/opentensor/subtensor/blob/main/pallets/subtensor/src/root.rs#L1229 less than 200 Tao to register the next network! 🤯 This is my chance. All those ideas, Read more…