Open Source Alternative To ChatGPT, Google Follows Microsoft Investing $400M In Anthropic, ChatGPT Is Available In Bing

Published 8 months ago · 4 min read

The Search Wars Have Begun

This week, machine learning advances at breakneck speed - as always!

The search engine Bing gets a generative AI makeover. Google and Microsoft are duking it out over dominance in the search market. And language models might become more accessible to everyone in the near future.

We have a lot to cover this week. Let’s go!

1) Google Invests $400M In Generative AI Company Anthropic

The deal apparently took place back in late 2022 but was only made public this week.

Google has acquired a minority stake in the startup and both companies announced that they will deepen their relationship. This mirrors Microsoft placing a large bet on generative AI by investing $10B in OpenAI.

To date, no plans are known for Google to integrate Anthropic’s models into its products.

Further, Anthropic has stated that the deal does not require them to use Google’s cloud platform. However, the team at Anthropic also said that they already favor the search giant’s cloud. Hence, Google might see a significant portion of the money flow back through Anthropic’s cloud spend.

What Does This Mean?

The investment was made by Google’s cloud division to ensure the company does not fall behind Microsoft.

Today, ChatGPT still uses too many resources to support the billions of daily search queries that Google receives. However, as we will see below, chat-based interfaces are likely the future of search engines.

2) Microsoft integrates ChatGPT into Bing

Microsoft makes a big move by integrating ChatGPT with Bing.

The search engine still has an interface for navigational queries. However, a user’s more complex information needs can now be answered via OpenAI’s models. An example could be: “Recipe for a vegetarian three-course meal with a chocolate dessert.”

What Does This Mean?

The original ChatGPT reached 100M users within a month.

The allure of OpenAI’s model will likely cause an uptick in the usage of Bing. For the foreseeable future, many queries will still be better answered by normal ranking-based searches. However, this application of GPT leads to more funding as well as pressure to make models smaller.

3) New Open-Source Version Of ChatGPT

GPT is getting competition from open-source.

A group of researchers around YouTuber Yannic Kilcher has announced that they are working on Open Assistant. The goal is to produce a chat-based language model that is much smaller than GPT-3 while maintaining similar performance.

If you want to support them, they are crowd-sourcing training data for the project.

What Does This Mean?

Current language models are too big.

They require millions of dollars of hardware to train and use. Hence, access to this technology is limited to big organizations. Smaller firms and universities are effectively shut out from the developments.

Shrinking and open-sourcing models will facilitate academic research and niche applications.

Projects such as Open Assistant will help to make language models a commodity. Lowering the barrier to entry will increase access and accelerate innovation.

4) Bard AI announcement from Google

Google announces Bard, a competitor to ChatGPT based on LaMDA (Language Model for Dialogue Applications).

In contrast to the publicly available version of ChatGPT, Google’s Bard can apparently draw on information from the internet to generate text. Exactly how this is done is not public. However, integrating information retrieval allows for models that are an order of magnitude smaller while maintaining performance.

If they are using information retrieval to enhance their models, the quality of the retrieved information matters for model performance. And when it comes to retrieving relevant information, Google has a clear edge over Bing.
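To make the retrieve-then-generate idea concrete, here is a minimal toy sketch in Python. The word-overlap retriever and the prompt-building step are illustrative stand-ins of my own (Bard’s actual retrieval and model are not public); a real system would rank documents with a proper search index and send the assembled prompt to a language model.

```python
# Toy sketch of retrieval-augmented generation:
# 1) retrieve documents relevant to the query,
# 2) pack them into a prompt for a language model to answer from.

def retrieve(query: str, corpus: list[str], k: int = 2) -> list[str]:
    """Rank documents by word overlap with the query (toy retriever)."""
    q_words = set(query.lower().split())
    scored = sorted(
        corpus,
        key=lambda doc: len(q_words & set(doc.lower().split())),
        reverse=True,
    )
    return scored[:k]

def build_prompt(query: str, corpus: list[str]) -> str:
    """Assemble retrieved context plus the question into one prompt.
    A real system would pass this prompt to a language model."""
    context = " ".join(retrieve(query, corpus))
    return f"Context: {context}\nQuestion: {query}\nAnswer:"

corpus = [
    "Bard is based on LaMDA, a dialogue model from Google.",
    "Bing integrates OpenAI models for complex queries.",
    "Anthropic received a $400M investment from Google.",
]
prompt = build_prompt("What model is Bard based on?", corpus)
```

Because the model only has to summarize the retrieved context rather than memorize facts in its weights, it can be much smaller - which is the efficiency argument made above.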

What Does This Mean?

Google wants to bring conversational AI to its search engine.

The recent advancements in generative models mark a tectonic shift in how we interact with information on the internet. This could be a chance for competitors to take a bite from Google’s search monopoly.

Google apparently has no intention to lose a single user to Bing.

It could turn out that LaMDA’s use of information retrieval becomes the norm for language models in search applications. If that is the case, language models will become fancy summarizers of search results.

Google’s superior information retrieval might allow them to maintain an edge over Bing.

Such exciting times to be alive!

Next week will be an essay again. Please tell me how you liked this week’s format. I am toying with the idea of sending a news update like this on Tuesdays.

As always, I really enjoyed making this for you and I sincerely hope you found it useful!

Thank you for reading!

