Vitalik Buterin Praises Anthropic for Its Ethical Stance Against Mass Surveillance

ChainCatcher report: Vitalik Buterin has posted in support of AI company Anthropic's commitment to two ethical principles: "not developing fully autonomous weapons" and "not engaging in mass surveillance in the United States," praising its resolve in the face of government pressure. Vitalik believes that, in an ideal world, such high-risk applications should be limited to the level of open-source LLM access, meaning equal and public access for all. Even 10% progress in this direction, he argued, could reduce the risks of autonomous weapons and privacy violations and promote safer AI development. It was previously reported that the Pentagon had threatened to terminate its partnership with Anthropic, putting a $200 million contract at risk, after Anthropic refused to provide AI technology for military use without human oversight.
Vitalik Buterin praised AI company Anthropic for its ethical stance against mass surveillance and autonomous weapons. He suggested that limiting high-risk AI applications to open-source model access could reduce those risks. The Pentagon reportedly threatened to terminate a $200 million contract after Anthropic refused to provide AI for military use without human oversight. The episode underscores ongoing tensions between AI developers and government agencies over ethical limits on the technology's deployment.
Disclaimer: The information on this page may have been obtained from third parties and does not necessarily reflect the views or opinions of KuCoin. This content is provided for general informational purposes only, without any representation or warranty of any kind, nor shall it be construed as financial or investment advice. KuCoin shall not be liable for any errors or omissions, or for any outcomes resulting from the use of this information.
Investments in digital assets can be risky. Please carefully evaluate the risks of a product and your risk tolerance based on your own financial circumstances. For more information, please refer to our Terms of Use and Risk Disclosure.