Tether releases QVAC cross-platform BitNet LoRA framework: supports training AI models with billions of parameters on consumer-grade devices
Odaily reported that, according to an official announcement, Tether has launched a cross-platform BitNet LoRA fine-tuning framework within QVAC Fabric, enabling training and inference optimization for Microsoft's BitNet (a 1-bit LLM). The framework sharply reduces compute and memory requirements, allowing billion-parameter models to be trained and fine-tuned on laptops, consumer-grade GPUs, and smartphones.
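To see why LoRA fine-tuning fits on consumer hardware, consider what it actually trains. The sketch below is illustrative only; the variable names, shapes, and rank are assumptions of ours, not the QVAC Fabric API. LoRA freezes the base weight matrix and learns a small low-rank update, so the trainable parameter count drops from d_out × d_in to r × (d_in + d_out):

```python
import numpy as np

rng = np.random.default_rng(0)
d_in, d_out, r = 512, 512, 8  # hypothetical layer sizes and LoRA rank

W = rng.standard_normal((d_out, d_in))     # frozen base weight (not trained)
A = rng.standard_normal((r, d_in)) * 0.01  # trainable low-rank factor
B = np.zeros((d_out, r))                   # zero-initialized, so the update starts at 0

def lora_forward(x):
    # Effective weight is W + B @ A, computed without materializing it.
    return W @ x + B @ (A @ x)

x = rng.standard_normal(d_in)
full_params = W.size            # 512 * 512 = 262144
lora_params = A.size + B.size   # 8 * (512 + 512) = 8192
print(f"trainable params: {lora_params} vs {full_params} "
      f"({100 * lora_params / full_params:.1f}%)")
```

Here only about 3% of the layer's parameters receive gradients, which is the kind of saving that makes fine-tuning feasible on laptops and phones.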
The solution is the first to achieve BitNet model fine-tuning on mobile GPUs, including Adreno, Mali, and Apple Bionic. In tests, a 125M-parameter model was fine-tuned in about 10 minutes and a 1B-parameter model in roughly an hour, and on mobile devices the framework scales up to 13B-parameter models.
The framework also supports heterogeneous hardware, including Intel, AMD, and Apple Silicon, enabling 1-bit LLM LoRA fine-tuning on non-NVIDIA devices for the first time. In terms of performance, BitNet models run inference 2 to 11 times faster on mobile GPUs than on CPUs, while cutting memory usage by up to 77.8% compared with traditional 16-bit models.
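A quick back-of-envelope calculation shows what the 77.8% figure means in practice. The accounting below is our own sketch, using only the numbers reported above; it assumes the comparison is against weights stored at 16 bits each:

```python
# Memory footprint of model weights, before and after the reported reduction.
params = 1_000_000_000                   # a 1B-parameter model
fp16_bytes = params * 16 / 8             # 16-bit weights: 2 bytes per parameter
bitnet_bytes = fp16_bytes * (1 - 0.778)  # after the reported 77.8% reduction
print(f"16-bit: {fp16_bytes / 1e9:.2f} GB -> BitNet: {bitnet_bytes / 1e9:.2f} GB")
```

Weights that would occupy about 2 GB at 16-bit precision shrink to roughly 0.44 GB, which is comfortably within the RAM budget of a modern smartphone.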
Tether stated that the technology is expected to break the dependence on high-end compute and cloud infrastructure, advance decentralized, on-device AI training, and lay the groundwork for new application scenarios such as federated learning.