Tether Data Launches QVAC Fabric LLM to Enable LLM Training on Consumer Devices

KuCoin Flash

As reported by HashNews, Tether Data has announced the launch of QVAC Fabric LLM, a new large language model inference runtime and fine-tuning framework. The technology allows users to run, train, and customize large language models directly on consumer-grade GPUs, laptops, and smartphones, removing the dependency on high-end cloud servers or NVIDIA-only hardware. QVAC Fabric LLM supports models such as Llama3, Qwen3, and Gemma3, and extends the llama.cpp ecosystem. Training is supported on GPUs from AMD, Intel, NVIDIA, and Apple, as well as mobile chips. The framework is open-sourced under the Apache 2.0 license, with pre-built binaries and ready-to-use adapters available on Hugging Face, enabling developers to fine-tune models with just a few commands.

Disclaimer: The information on this page may have been obtained from third parties and does not necessarily reflect the views or opinions of KuCoin. This content is provided for general informational purposes only, without any representation or warranty of any kind, nor shall it be construed as financial or investment advice. KuCoin shall not be liable for any errors or omissions, or for any outcomes resulting from the use of this information. Investments in digital assets can be risky. Please carefully evaluate the risks of a product and your risk tolerance based on your own financial circumstances. For more information, please refer to our Terms of Use and Risk Disclosure.