💨 Abstract
Microsoft researchers have released BitNet b1.58 2B4T, the largest-scale 1-bit AI model to date. Because its weights are quantized to just three values (-1, 0, and 1), it is markedly more memory- and compute-efficient than conventional models. Openly available under an MIT license and able to run on CPUs such as Apple's M2, the model matches or outperforms traditional models of similar size while running faster and using less memory.
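To illustrate the idea of quantizing weights to -1, 0, and 1, here is a minimal NumPy sketch of ternary ("absmean"-style) quantization. This is a hypothetical illustration of the general technique, not BitNet b1.58 2B4T's actual implementation; the function name and scaling choice are assumptions.

```python
import numpy as np

def ternary_quantize(w: np.ndarray):
    """Quantize a weight matrix to {-1, 0, 1} with one per-tensor scale.

    Illustrative sketch of absmean-style ternary quantization;
    not the model's actual code.
    """
    # Scale by the mean absolute value of the weights (epsilon avoids /0).
    scale = np.mean(np.abs(w)) + 1e-8
    # Round the scaled weights, then clip into the ternary set {-1, 0, 1}.
    q = np.clip(np.round(w / scale), -1, 1)
    return q.astype(np.int8), scale

w = np.array([[0.9, -0.05, -1.2], [0.3, 1.1, -0.7]])
q, s = ternary_quantize(w)
print(q)      # every entry is -1, 0, or 1
print(q * s)  # coarse dequantized approximation of w
```

Storing only a ternary code plus one floating-point scale per tensor is what drives the memory savings the article describes, and multiplying by weights in {-1, 0, 1} reduces to additions and subtractions, which is why such models can run efficiently on CPUs.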
Courtesy: techcrunch.com
Summarized by Einstein Beta 🤖