💨 Abstract

Microsoft researchers have released BitNet b1.58 2B4T, the largest-scale 1-bit AI model to date. Because its weights are quantized to just three values (-1, 0, and 1), it is more memory- and compute-efficient than most models of comparable size. Openly available under an MIT license, the model runs on CPUs such as Apple's M2, and it outperforms traditional models of similar size while running faster and using less memory.
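The quantization idea can be illustrated with a minimal sketch: scale each weight by the mean absolute value of the tensor, then round and clip so every weight lands in {-1, 0, 1}. This follows the "absmean" scheme described for BitNet b1.58; the function below is a hypothetical illustration, not Microsoft's actual implementation.

```python
def ternary_quantize(weights, eps=1e-6):
    """Absmean ternary quantization (illustrative sketch, not the real BitNet code).

    Scale by the mean absolute weight, round to the nearest integer,
    and clip so every value is -1, 0, or 1.
    """
    scale = sum(abs(w) for w in weights) / len(weights) + eps
    quantized = [max(-1, min(1, round(w / scale))) for w in weights]
    return quantized, scale

# Example: a small weight vector collapses to three possible values.
q, s = ternary_quantize([0.9, -0.05, -1.4, 0.3])
# q == [1, 0, -1, 0]
```

Storing only three possible values per weight (about 1.58 bits of information, hence "b1.58") is what lets such a model fit in far less memory and run on ordinary CPUs.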

Courtesy: techcrunch.com
