You may have noticed that Google has invested deeply in AI tools for many years. Lately, it’s brought more of those tools to the public, encouraging developers to rely on Google platforms for their complex AI creations. That often means chatbots and other uses of large language models (LLMs), which brings us to PaLM 2.

This carefully designed LLM is made to power advanced AIs with lots of cool tricks, and you can use it right from your smartphone. This major update to Google’s development platform includes models for many purposes and is one of the most significant AI tools developers can access. Here’s an overview.

Google Assistant using Bard on a phone.

What is PaLM 2?

PaLM 2 is Google’s enormous LLM that’s currently in charge of most predictive and generative AI tech Google uses. If you’ve ever had Gmail recommend a way to finish an email or recently tried out the chatbot Google Bard, you’ve tapped into the power of PaLM 2. Released in 2023, it’s an upgrade to the first version of the LLM, which was released in 2022.

As an LLM, or large language model, PaLM 2 is designed to train on and synthesize a vast amount of data, learning how things like language work and correcting itself when it makes mistakes. However, it’s a bit more focused than other powerful LLMs. For example, based on training parameters, PaLM 2 is around a tenth of the size of GPT-4, the model that powers ChatGPT. But it’s also made to do more specific things.

Google’s LLM offering explanations for a German translation.

What does this AI language model do?

With training sets smaller than the original PaLM’s, Google is focusing more on targeted, accurate AI training rather than cramming as much information into a model as possible. The theory goes that carefully crafted training parameters with human feedback can train an AI more efficiently with a lower footprint, which can make AIs more developer- and consumer-friendly at the same time.

So far, this approach appears to work well, at least in certain AI fields. We don’t know all the technology that’s gone into creating and training PaLM 2, but we know that it beats GPT-4 on the WinoGrande “commonsense” test (90 to 88), which measures whether a model can work out what human language is really referring to, such as resolving which noun an ambiguous pronoun like “it” points to. That’s impressive work for an LLM of this size while saving a ton of server space, making it an interesting choice for responding to human-based concepts like idioms and poetry.

Google’s AI model solving a logic problem.

What’s been updated in PaLM 2?

So, it’s good at human language and related concepts, but what has been updated in PaLM 2? A few standouts include:

Even more programming languages: PaLM 2 supports over 20 programming languages for developers, including C, C++, and Python. That also taps into its AI capabilities, allowing it to generate code in these languages or translate code from one language to another (see the sketch after this list), potentially saving everyday developers an immense amount of time.

Multiple versions: Google offers several versions of PaLM 2 based mostly on size. Smaller versions run faster and are less expensive to implement, making them ideal for use in Android apps and similar scenarios. An internet connection isn’t always required, either. The versions are named after animals: Gecko, Otter, Bison, and Unicorn. Gecko, the lightest version, can process 20 tokens a second without an internet connection.

Multilingual capabilities: PaLM 2 excels at human languages, including translation across 100 languages. It isn’t putting expert translators out of business, but it can do things past AI translators couldn’t, like understanding local idioms and offering an equivalent in another language. For example, it might translate, “That’s a load of crap,” to “That’s not good and I don’t believe you.” And with specific training, it could become better at certain languages.
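To make the code-translation idea concrete, here’s a minimal sketch using the PaLM API’s Python client (google-generativeai). The model name, client methods, and settings below are assumptions based on Google’s published examples at the time, so treat this as an illustration rather than a definitive implementation:

```python
# Sketch: asking a PaLM 2 text model to translate a C function into Python.
# The model name and parameters are assumptions; check Google's docs for
# what your account actually exposes.
import google.generativeai as palm

palm.configure(api_key="YOUR_API_KEY")  # hypothetical placeholder key

c_snippet = """
int sum_to(int n) {
    int total = 0;
    for (int i = 1; i <= n; i++) total += i;
    return total;
}
"""

prompt = (
    "Translate the following C function into idiomatic Python. "
    "Return only the Python code.\n" + c_snippet
)

response = palm.generate_text(
    model="models/text-bison-001",  # assumed PaLM 2 text model name
    prompt=prompt,
    temperature=0.2,  # low temperature keeps the translation predictable
)

print(response.result)  # the model's Python version of the function
```

If the output looks off, tightening the prompt (for example, asking for type hints or a docstring) usually helps more than raising the temperature.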

Does PaLM 2 only work with language and code?

Yes and no. The version of PaLM 2 available to developers is not multimodal and can’t easily be customized to different types of content or data. Its focus is human-based text, and that specialization is one reason it’s strong at reasoning and the subtleties of language.

Google has showcased PaLM 2 doing other things. Notably, it showed examples of the LLM analyzing X-rays and other medical scans with highly accurate results, as well as monitoring cybersecurity data and identifying incoming threats. But it’s unclear how much work it took to make PaLM 2 multimodal in these ways or what the associated time and cost would be for a business to do the same on its own. For now, unless Google releases a specific PaLM 2 tool of its own, it’s smart to think of PaLM 2 as a text-only model.

I’m not a developer. Can I use PaLM 2?

You probably are. Google has been inviting its users to join its AI testing program for over a year, bringing PaLM 2 technology to Gmail, Google Docs, and other Google software. That rollout continues, so Google users will likely encounter it soon, even if they haven’t already. It wouldn’t be a surprise if parts of PaLM 2 technology are used in other Google tools, working behind the scenes on analysis and interpretation. You can bring up Google Bard any time you want to see how PaLM 2 enables conversational AI, or tap its capabilities when you ask Google Assistant a question.

I am a developer. Can I use PaLM 2?

You can use the PaLM API, along with the broad Vertex AI platform for AI development. These tools at least partially use the PaLM 2 LLM.
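If you’re curious what that looks like in practice, here’s a minimal sketch of calling a PaLM 2 text model through the Vertex AI Python SDK. The project ID, region, and model version string are placeholders and assumptions; the exact names available depend on your Google Cloud setup:

```python
# Sketch: querying a PaLM 2 text model ("text-bison") via the Vertex AI
# Python SDK. Project, region, and model version are placeholders.
import vertexai
from vertexai.language_models import TextGenerationModel

vertexai.init(project="your-project-id", location="us-central1")

# "text-bison" was the PaLM 2 text model family exposed through Vertex AI;
# the version suffix may differ in your environment.
model = TextGenerationModel.from_pretrained("text-bison@001")

response = model.predict(
    "Explain the idiom 'that's a load of crap' in plain, polite English.",
    temperature=0.2,        # keep the answer focused
    max_output_tokens=128,  # cap the response length
)

print(response.text)
```

The same model family is reachable through the lighter-weight PaLM API if you’d rather prototype without setting up a full Google Cloud project.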

Like many AIs, PaLM 2 is running into big questions about how it’s been trained. While Google has a Responsible AI Practices manifesto, it’s light on details about how it trained PaLM 2. Did it use medical scans without patients’ permission? Did it train on authors’ works without their consent?

These major AI headaches apply to PaLM 2 just like they do to other LLMs, and we don’t have a lot of answers yet. With the human-centered design approach Google claims, there are hopefully more safeguards for problems like these.

Is PaLM 2 a better choice for my project than GPT-4?

This question is hard to answer without knowing more about a specific AI project. But currently, PaLM 2 is faster and more app-friendly than GPT-4 from a development standpoint. GPT-4 has a broader range but weaker reasoning on some benchmarks. However, its huge number of training parameters allows it to spot more distinctions or flaws, and it tends to beat PaLM 2 at tasks like fixing existing code.

Neither is an open-source project, but it’s worth noting that many ChatGPT plugins exist to narrow the AI’s focus to particular tasks.

PaLM 2 is still in the early stages of application, and its tools are limited, especially on the third-party front. Do your own research to find out which option is the best fit or the most cost-efficient for your needs.

PaLM 2’s journey to grasp human concepts

PaLM 2 shows the power of carefully applied, human-screened AI training for specific tasks. In some ways, it’s the opposite end of the spectrum from the models that power chatbots like ChatGPT, but it’s still powerful in its own way.

The next big question is what PaLM 2 can be trained to do in various industries. These are still early days. Google has given some examples, but we’ll have to wait and see how important PaLM 2 proves to be. It could become the backbone of translation and coding services around the world, for example. These are big topics! If you want to learn more than what we explore here, stop by our guide on how LLMs work, as well as what you should know about Google Bard.