ChainThink reports that, according to official announcements, the Bittensor subnet Templar (SN3) completed Covenant-72B, the largest decentralized LLM pre-training run to date, on March 10.
Covenant-72B is a 72-billion-parameter language model pre-trained by the Templar team on Bittensor Subnet 3, trained entirely on general internet data without relying on centralized data centers. The model scored 67.1 on the MMLU (zero-shot) benchmark, outperforming centralized baselines such as LLaMA-2-70B and LLM360 K2 under identical evaluation conditions. It is the largest fully permissionless, collaborative language model to date, with over 70 distinct nodes contributing compute throughout the training run. The team has released all weights and checkpoints under the Apache License.
Following the news, Bittensor (TAO) and its subnet tokens rallied broadly: TAO is up 54.8% over the past two weeks, while the subnet token τemplar has surged 194% in the last seven days and is now trading at $19.30.