Microsoft’s 1.3 Billion Parameter Model Outperforms Llama 2


Microsoft Research has done it again. After outperforming Meta’s LLaMA with phi-1 in July, the researchers have now introduced phi-1.5, a cutting-edge language model with 1.3 billion parameters that outperforms Llama 2’s 7-billion-parameter model on several benchmarks. Microsoft has decided to open-source the model. The phi-1.5 model, comprising a staggering 1.3 […]
