Happy Friday from the Building Startups Newsletter!
While the giants continue to build and release AI models, Meta's recent splurge on AI development triggered a significant drop in its market value.
Before we dive in…
We keep hearing the word "parameters" used to size up AI language models. In case you don't know what it means, here's a quick primer.
Parameters are the internal variables (weights) a model learns from its training data; they encode what the model "knows" and drive its predictions. Roughly speaking, more parameters means more capacity, but also more memory and compute to run.
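To make the idea concrete, here's a minimal sketch (a toy feed-forward network, not any real model) showing how parameter counts add up: each layer contributes a weight matrix plus one bias per output, and the totals grow quickly with layer width.

```python
def count_parameters(layer_sizes):
    """Count learnable parameters in a toy fully connected network.

    Each layer connecting n_in inputs to n_out outputs has
    n_in * n_out weights plus n_out biases.
    """
    total = 0
    for n_in, n_out in zip(layer_sizes, layer_sizes[1:]):
        total += n_in * n_out + n_out  # weights + biases
    return total

# A small 784 -> 128 -> 10 network already has ~100K parameters:
print(count_parameters([784, 128, 10]))  # 101770
```

Scale that same bookkeeping up to billions of weights and you get the "8B" and "70B" figures quoted for today's models.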
Let's go!
Meta's AI Splurge (and Share Price Fall!)
Meta's CEO Mark Zuckerberg recently said the company will continue to invest heavily in AI and aims to make Meta "the leading AI company in the world."
Zuckerberg also cautioned that it would be a while before these AI investments translate into increased revenue.
Despite the tech giant's solid earnings numbers, its shares fell nearly 15% on Thursday following the announcement.
Only last week, Meta introduced new versions of its AI-powered smart assistant, which it has integrated into its apps.
While investors are concerned about Meta jumping into another expensive venture while its AR and VR businesses continue to lose billions each quarter, some experts do see the promise, reflected in the warm reception of Meta AI and Llama 3.
In case you're not aware, Llama 3 is the latest of Meta's open generative AI models and is considerably powerful. Its two current versions, with 8 billion and 70 billion parameters, are available for broad use.
Speaking of AI models…
Microsoft's New Phi-3 is Compact and Can Run on Smartphones
Earlier this week, Microsoft announced the Phi-3 Mini, a freely available, lightweight small language model that can run locally on a smartphone, no internet connection required!
It is intended to be easier and cheaper to run than heavyweight large language models such as OpenAI's GPT-4 Turbo.
While the Mini has 3.8 billion parameters, two larger models are also on the way: the Phi-3 Small (7B parameters) and the Phi-3 Medium (14B parameters).
Apple Drops AI Models, Hints AI Is Coming to the iPhone
And it isn't just Microsoft: on Wednesday, Apple released four very small language models, collectively called OpenELM (Open-source Efficient Language Models), on HuggingFace.
What sets OpenELM apart is its capacity to perform AI-powered tasks without relying on cloud servers, making it well suited for on-device use. This may be a hint of how generative AI will come to the iPhone this year…
It comes in four sizes: 270 million, 450 million, 1.1 billion, and 3 billion parameters.
Small models like these are cheaper to run than LLMs such as GPT-4 Turbo, and they are optimised for devices such as phones and laptops.
What are your thoughts on these new language models, and which one are you most excited to try?
Talk to me in the comments below! 👇🏼