Tether has launched the first cross-platform LoRA fine-tuning framework for Microsoft's BitNet models through its QVAC Fabric platform. The goal is to enable training and inference of language models with up to one billion parameters on consumer hardware, including laptops, conventional GPUs, and modern smartphones.
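For context, LoRA (Low-Rank Adaptation) keeps the base model's weights frozen and trains only a small low-rank update, which is what makes fine-tuning feasible on consumer hardware. The following NumPy sketch is purely illustrative of that idea; the dimensions, `alpha`, and function names are hypothetical and do not reflect QVAC Fabric's actual API.

```python
# Conceptual sketch of a LoRA update (illustrative only, not QVAC Fabric code).
# Instead of updating the full weight matrix W (d x k), LoRA trains two small
# matrices A (r x k) and B (d x r) with rank r << min(d, k), applying
# W_eff = W + (alpha / r) * B @ A.
import numpy as np

rng = np.random.default_rng(0)
d, k, r, alpha = 64, 64, 4, 8  # hypothetical sizes for illustration

W = rng.standard_normal((d, k))        # frozen base weights
A = rng.standard_normal((r, k)) * 0.01 # small random init
B = np.zeros((d, r))                   # B starts at zero, so W_eff == W initially

def effective_weight(W, A, B, alpha, r):
    """Merge the low-rank adapter into the frozen base weights."""
    return W + (alpha / r) * B @ A

W_eff = effective_weight(W, A, B, alpha, r)
# Trainable parameters: r*(d+k) for the adapter vs. d*k for full fine-tuning.
print((d * k, r * (d + k)))
```

The payoff is the parameter count: here the adapter trains 512 values instead of 4,096, and the gap widens dramatically at billion-parameter scale.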

