PHI-3-MINI - SCI & TECH
News: Microsoft unveils Phi-3-mini, its smallest AI model yet: How it compares to bigger models
What's in the news?
● Recently, Microsoft unveiled Phi-3-mini, the latest version of its ‘lightweight’ AI model.
Phi-3-mini:
● It is the smallest AI model developed by Microsoft.
● It is believed to be the first in a series of three smaller models planned by Microsoft.
Features:
● It performed well in various benchmarks, such as language, reasoning, coding, and mathematics, outperforming other models of similar and larger sizes.
● It supports a context window of up to 128K tokens, allowing it to handle extensive conversation data with minimal impact on quality.
● It is a 3.8B-parameter language model, accessible on platforms like Microsoft Azure AI Studio, Hugging Face, and Ollama.
● It comes in two variants - one with a 4K context length and another with a 128K token context window.
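The two variants above differ only in how much conversation history fits into a single prompt. A minimal sketch of what a context budget means in practice, assuming a crude whitespace-based token estimate (a real deployment would use the model's own tokenizer, which this example does not):

```python
# Sketch: trim conversation history to fit a model's context window.
# ASSUMPTION: token counts are approximated by whitespace splitting;
# Phi-3's actual tokenizer would give different counts.

def approx_tokens(text: str) -> int:
    """Very rough token estimate: one token per whitespace-separated word."""
    return len(text.split())

def trim_history(turns: list[str], budget: int) -> list[str]:
    """Drop the oldest turns until the estimated total fits the budget."""
    kept = list(turns)
    while kept and sum(approx_tokens(t) for t in kept) > budget:
        kept.pop(0)  # discard the oldest turn first
    return kept

turns = ["hello there", "how are you today", "fine thanks"]
print(trim_history(turns, budget=6))  # keeps only the most recent turns
```

A 128K-token budget simply means far fewer turns need to be discarded than with a 4K budget, which is why the larger variant handles extensive conversations better.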
Difference between Phi-3-mini and LLMs:
● Compared to large language models (LLMs), Phi-3-mini represents a smaller, more streamlined version.
● Smaller AI models like this offer cost-effective development and operation, particularly on devices like laptops and smartphones.
● They are well-suited for resource-constrained environments, such as on-device and offline inference scenarios.
● They are also ideal for tasks requiring fast response times, such as chatbots or virtual assistants.
● Phi-3-mini can be tailored for specific tasks, achieving high accuracy and efficiency.
● Small language models (SLMs) typically undergo targeted training, requiring less computational power and energy than LLMs. Their compact size also gives them an edge in inference speed and latency, making them appealing to smaller organizations and research groups.