
Microsoft’s 1.3 Billion Model Outperforms Llama 2


Microsoft Research has done it again. After outperforming Meta's LLaMA with phi-1 in July, the researchers have now introduced phi-1.5, a cutting-edge language model with 1.3 billion parameters that outperforms Meta's 7-billion-parameter Llama 2 model on several benchmarks. Microsoft has decided to open-source the model. The phi-1.5 model, comprising a staggering 1.3 […]
