News

Microsoft launched the next version of its lightweight AI model, Phi-3 Mini, the first of three small models the company plans to release. Phi-3 Mini has 3.8 billion parameters and is trained ...
Last week, Meta said that when its largest Llama 3 model launches later in 2024, it will have more than 400 billion parameters. Also: Microsoft unveils Phi-2, a small language model that packs ...
Microsoft announced that its Phi-3.5-Mini-Instruct model, the latest update to its Phi-3 model family, is now available. The Phi family is Microsoft's line of compact models that can run on ...
Microsoft has unveiled a series ... Phi-3.5-MoE, a 42-billion-parameter Mixture of Experts ...
Microsoft just released Phi-3.5 Mini, MoE, and Vision with 128K context, multilingual support, and an MIT license. The MoE model beats Gemini Flash, and the Vision model is competitive with GPT-4o. The Mini model, with 3.8B parameters, beats Llama 3 ...
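For readers who want to try the Phi-3.5-Mini-Instruct release mentioned above, a minimal sketch using Hugging Face transformers might look like the following. The repo id "microsoft/Phi-3.5-mini-instruct", the dtype, and the prompt are illustrative assumptions, not details taken from the articles here.

    # Minimal sketch: loading and prompting Phi-3.5-Mini-Instruct locally.
    # The repo id below is an assumption; adjust if Microsoft hosts it elsewhere.
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    model_id = "microsoft/Phi-3.5-mini-instruct"  # assumed Hugging Face repo id
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(
        model_id,
        torch_dtype=torch.bfloat16,  # at 3.8B parameters, fits on a single consumer GPU in bf16
        device_map="auto",
    )

    # Build a chat-formatted prompt and generate a short completion.
    messages = [{"role": "user", "content": "Explain mixture-of-experts models in two sentences."}]
    inputs = tokenizer.apply_chat_template(
        messages, add_generation_prompt=True, return_tensors="pt"
    ).to(model.device)
    outputs = model.generate(inputs, max_new_tokens=128)
    # Decode only the newly generated tokens, skipping the prompt.
    print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))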
Microsoft has revealed the newest addition to its Phi family of generative AI models. Called Phi-4, the model improves in several areas over its predecessors, Microsoft claims, particularly in ...
Microsoft has announced the launch ... Additionally, the Phi-4 Mini Reasoning model contains approximately 3.8 billion parameters, yet it demonstrates strong potential for educational ...
Microsoft has launched a series of AI ... science and coding applications. Meanwhile, the Phi-4 Mini Reasoning model has 3.8 billion parameters and was trained on around a million synthetic ...
Microsoft Corp. has developed a small language model that can solve certain math problems better than models several times its size. The company revealed the model, Phi-4, on Thursday.
... and Claude 3.5 Haiku. These smaller AI models are often faster and cheaper to run, and their performance has gradually increased over the last several years. In this case, Microsoft attributes Phi-4's ...