Microsoft says its newest compact “small language model,” Phi-2, is bigger and better.
As part of its Phi series, the company has been training AI models on much smaller datasets composed only of “textbook-quality” data.
Microsoft says in a research blog post that Phi-2, which is about twice as big as its predecessor, Phi-1.5, performs on par with or better than certain larger open-source Llama 2 models, including one with 13 billion parameters.