Google’s Gemini, Meta’s Llama-3, Microsoft’s Indemnity
PLUS: Run LLMs as a torrent, rethinking LLMs in education, and more
Happy Monday!
This week we’ve got:
🔥Top 3 news: Google’s Gemini, Meta’s Llama-3, Microsoft’s Indemnity
🗞️Interesting reads - Running LLMs as a torrent, rethinking LLMs in education and more
🧑🎓Learning - All about RAGs and fine-tuning Llama-2
New - I am adding all the news stories I collected this week as a Notion document. It will be available for free for the next 2 editions. After that, it will be available only to subscribers with 1 referral. Please see the end of the newsletter.
Let’s get started.
🔥Top 3 AI news in the past week
1. Google’s Gemini
Google has provided select companies with early access to its new LLM called Gemini. This model will be competing with OpenAI's GPT-4.
This move signifies Google's increased investment in generative AI.
Two weeks ago, SemiAnalysis posted an article claiming that Gemini will beat GPT-4 by 5x. The claim rests on the fact that Google has the largest number of GPUs at its disposal.
Earlier, DeepMind's CEO had claimed that Gemini will merge techniques from AlphaGo into an LLM, a combination meant to enhance the system's problem-solving and planning abilities.
Additionally, Google has indexed over 100,000,000 gigabytes of data. All this put together means Gemini is going to be an interesting release.
Interestingly, someone claiming to have access to Gemini says its performance is equal to GPT-4, though there is no clarity on the tests used to reach that conclusion.
2. Meta’s Llama-3
SemiAnalysis also says that Meta will have the 2nd largest stock of H100 GPUs in the world, and that it is using this as a tactic to attract talent.
So, it wasn’t surprising when WSJ reported Meta is also working on an improved and powerful LLM.
Meta is currently building the infrastructure to house these H100 GPUs, so the plan is to train the model in-house, with training slated to start in 2024. That means the model will be released after Gemini, and very late compared to OpenAI's GPT-4.
The foreseeable advantage is that Meta is committed to open sourcing the model weights.
For consumers of these models, an open model has some good use cases. One example: fine-tuning Llama-2 to classify recipes using the HF dataset costs $19, versus roughly $23k with GPT-4 and $1k with GPT-3.
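For context, here is a rough sketch of what such a fine-tune can look like. This is not the setup from the linked article: the dataset id, label count, and hyperparameters below are illustrative placeholders, and it assumes a LoRA fine-tune of Llama-2-7B as a classifier using Hugging Face libraries.

```python
# Hypothetical sketch: LoRA fine-tune of Llama-2-7B for text classification.
# Dataset id, label count, and hyperparameters are placeholders, not the article's setup.
from datasets import load_dataset
from peft import LoraConfig, TaskType, get_peft_model
from transformers import (AutoModelForSequenceClassification, AutoTokenizer,
                          Trainer, TrainingArguments)

base = "meta-llama/Llama-2-7b-hf"            # gated; requires accepting Meta's license
dataset = load_dataset("some_recipe_dataset")  # placeholder; expects "text" and "label" columns

tokenizer = AutoTokenizer.from_pretrained(base)
tokenizer.pad_token = tokenizer.eos_token      # Llama-2 ships without a pad token

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=512)

tokenized = dataset.map(tokenize, batched=True)

model = AutoModelForSequenceClassification.from_pretrained(base, num_labels=10)
model.config.pad_token_id = tokenizer.pad_token_id

# LoRA keeps the trainable parameter count (and the GPU bill) small.
model = get_peft_model(
    model,
    LoraConfig(task_type=TaskType.SEQ_CLS, r=8, lora_alpha=16, lora_dropout=0.05),
)

trainer = Trainer(
    model=model,
    args=TrainingArguments(
        output_dir="llama2-recipe-clf",
        per_device_train_batch_size=4,
        num_train_epochs=1,
        learning_rate=2e-4,
        fp16=True,
    ),
    train_dataset=tokenized["train"],
    tokenizer=tokenizer,  # lets Trainer pad batches dynamically
)
trainer.train()
```

The point of the comparison stands regardless of the exact recipe: with an open model you pay once for a small adapter-style fine-tune instead of paying per-token for a much larger hosted model.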
3. Microsoft’s Indemnity
Microsoft has introduced a Copilot Copyright Commitment, which aims to address concerns around the use of its AI-powered Copilot services and the content they generate. It extends Microsoft's existing indemnity support to commercial Copilot services, giving customers legal protection against copyright infringement claims.
So, if a third party sues a customer for copyright infringement over Copilot output and the customer faces an adverse judgement or settlement, Microsoft will assume responsibility.
OpenAI, which is currently fighting lawsuits from Sarah Silverman and others, doesn't provide such assurances. Given that AI copyright is still a gray area, every gen AI company will need to offer similar protection or risk falling behind.
This commitment applies only to the paid versions of Microsoft's Copilot services.
🗞️10 AI news highlights and interesting reads
I don't usually talk about tools, but this idea is pretty cool: running LLMs like a torrent (see the sketch after this list).
LLMs are going to disrupt education. Teachers need to rethink traditional assignments and how they handle cheating. But LLMs can also enable great new assignments, like simulating history.
Another article on how GPT-4 isn’t getting worse. Remember though that GPT-4 is non-deterministic so these tests might not prove anything.
Combining product-domain knowledge and an understanding of human behavior with AI is going to be the winning combination. Insurance agents and wealth managers are one example: Ant Group has released an LLM for wealth managers and insurance agents.
A personalized AI to replace Siri, built by a developer who had zero web experience. The product is now earning $216k ARR.
A rather gloomy take on the news above: Google and Meta have tons of your data, they train AI on it, and you can't do anything about it.
After Sarah Silverman et al., Pulitzer Prize winners are now suing OpenAI.
Google has added a semantic layer on top of Data Commons. This is cool.
Japan is building its own ChatGPT. Building a non-English GPT is costly, so it will not be surprising if other countries follow suit.
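On the torrent-style idea mentioned at the top of this list: the linked tool isn't named here, but one project in that spirit is Petals, which serves large models across a BitTorrent-style swarm of volunteer GPUs. A minimal sketch, assuming Petals and an illustrative Llama-2 chat checkpoint:

```python
# Hedged sketch: distributed inference over a public swarm using Petals.
# The model id is illustrative; Llama-2 checkpoints are gated behind Meta's license.
from transformers import AutoTokenizer
from petals import AutoDistributedModelForCausalLM

model_name = "meta-llama/Llama-2-70b-chat-hf"
tokenizer = AutoTokenizer.from_pretrained(model_name)
# Full weights are never downloaded locally; transformer blocks are served by remote peers.
model = AutoDistributedModelForCausalLM.from_pretrained(model_name)

inputs = tokenizer("A quick recipe for miso soup:", return_tensors="pt")["input_ids"]
outputs = model.generate(inputs, max_new_tokens=40)
print(tokenizer.decode(outputs[0]))
```

The appeal is that no single machine needs the VRAM for a 70B model; the trade-off is that generation latency depends on the peers currently in the swarm.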
🧑🎓3 Learning Resources
That’s it folks. Thank you for reading and have a great week ahead.