
Introducing Meta Llama 3.2 on Databricks: faster language models and powerful multimodal models


We are excited to partner with Meta to launch the latest models in the Llama 3 series on the Databricks Data Intelligence Platform. The small text models in Llama 3.2 let customers build fast, real-time systems, and the larger multimodal models mark the first time Llama models gain visual understanding. Both are key building blocks for Databricks customers creating compound AI systems that enable data intelligence by connecting these models to their enterprise data.

As with the rest of the Llama series, the Llama 3.2 models are available today in Databricks Mosaic AI, where you can securely and efficiently customize them on your data and easily plug them into your GenAI applications with Mosaic AI Gateway and Agent Framework.

Get started using Llama 3.2 on Databricks today! Deploy the models and try them in the Mosaic AI Playground, and use Mosaic AI Model Training to customize the models on your data. Sign up for this webinar for a deep dive into Llama 3.2 with Meta and Databricks.

This year, Llama has achieved 10x growth, further supporting our belief that open source models drive innovation. Together with Databricks Mosaic AI, our new Llama 3.2 models will help organizations build data intelligence by working accurately and securely with their proprietary data. We are excited to continue working with Databricks to help enterprises customize their AI systems with their enterprise data. – Ahmad Al-Dahle, Head of GenAI, Meta

What’s new in Llama 3.2?

The Llama 3.2 series includes small models for use cases that require very low latency, and multimodal models that enable new visual understanding use cases.

  • Llama-3.2-1B-Instruct and Llama-3.2-3B-Instruct are designed for low-latency, low-cost enterprise use cases. They excel at "simpler" tasks such as entity extraction, multilingual translation, summarization, and RAG (see the sketch after this list). When fine-tuned on your data, these models are a fast and economical alternative for the specific tasks that matter to your business.
  • Llama-3.2-11B-Vision-Instruct and Llama-3.2-90B-Vision-Instruct enable businesses to use the powerful, open Llama series for visual understanding tasks such as document analysis and product description generation.
  • The multimodal models also come with a new Llama Guard safety model, Llama-Guard-3-11B-Vision, enabling responsible deployment of multimodal applications.
  • All models support the expanded 128K context length of the Llama 3.1 series, for handling very long documents. Long context simplifies and improves the quality of RAG and agent applications by reducing reliance on chunking and retrieval.
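As a concrete illustration of the "simpler" tasks above, here is a minimal sketch of calling a small Llama 3.2 model for entity extraction through Model Serving's OpenAI-compatible API. The endpoint name `llama-3-2-1b-instruct`, the workspace URL, and the prompt are illustrative placeholders, not official names.

```python
# Sketch: entity extraction with a small Llama 3.2 model via an
# OpenAI-compatible Databricks Model Serving endpoint.
# The endpoint name and workspace URL below are placeholders.
import os
from openai import OpenAI

client = OpenAI(
    api_key=os.environ["DATABRICKS_TOKEN"],
    base_url="https://<your-workspace>.cloud.databricks.com/serving-endpoints",
)

response = client.chat.completions.create(
    model="llama-3-2-1b-instruct",  # hypothetical endpoint name
    messages=[
        {
            "role": "system",
            "content": "Extract all company names and dates from the user's text. "
                       "Reply with a JSON object with keys 'companies' and 'dates'.",
        },
        {
            "role": "user",
            "content": "Databricks and Meta announced Llama 3.2 on September 25, 2024.",
        },
    ],
    temperature=0.0,
)
print(response.choices[0].message.content)
```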

Additionally, Meta is launching the Llama Stack, a software layer that makes it easier to build applications. Databricks looks forward to integrating its APIs into the Llama Stack.

Faster and cheaper

The new small models in the Llama 3.2 series provide an excellent option for latency- and cost-sensitive use cases. Many generative AI use cases do not require the full power of a general-purpose model, and, coupled with data intelligence on your data, smaller task-specific models open up use cases that demand low latency or low cost, such as code completion, real-time summarization, and high-volume entity extraction. Available in Unity Catalog, the new models can easily be swapped into your applications built on Databricks. To improve quality on your specific task, you can use a more powerful model, such as Meta Llama 3.1 405B, to generate synthetic training data from a small set of seed examples, and then fine-tune Llama 3.2 1B or 3B on that synthetic data to achieve high quality and low latency on your data. All of this is accessible in a unified experience on Databricks Mosaic AI.
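A minimal sketch of the synthetic-data recipe described above, assuming the pay-per-token Llama 3.1 405B endpoint and a Databricks notebook where `spark` is available; the seed prompts and table name are illustrative.

```python
# Sketch: use a larger Llama model to generate synthetic training data from a
# few seed prompts, then store prompt/response pairs in a Unity Catalog table
# that Mosaic AI Model Training can consume. Names below are illustrative.
import os
from openai import OpenAI
from pyspark.sql import Row

client = OpenAI(
    api_key=os.environ["DATABRICKS_TOKEN"],
    base_url="https://<your-workspace>.cloud.databricks.com/serving-endpoints",
)

seed_prompts = [
    "Summarize: 'Quarterly revenue grew 12%, driven by cloud subscriptions.'",
    "Summarize: 'The support ticket reports intermittent login failures since Monday.'",
]

records = []
for prompt in seed_prompts:
    completion = client.chat.completions.create(
        model="databricks-meta-llama-3-1-405b-instruct",  # larger "teacher" model
        messages=[{"role": "user", "content": prompt}],
    )
    records.append(Row(prompt=prompt, response=completion.choices[0].message.content))

# Inside a Databricks notebook, `spark` is the preconfigured SparkSession.
spark.createDataFrame(records).write.mode("overwrite").saveAsTable(
    "catalog.schema.summarization_train_data"
)
```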

Fine-tuning Llama 3.2 on your data in Databricks is just a simple command:

# First install the databricks-genai package:
# pip install databricks-genai --upgrade

from databricks.model_training import foundation_model as fm

finetuning_run = fm.create(
    # model to fine-tune
    model='meta-llama/Llama-3.2-1B-Instruct',
    # UC table containing your fine-tuning data
    train_data_path='',
    # Spark cluster used to prepare your UC table for training
    data_prep_cluster_id='',
    # where to register your fine-tuned model for deployment
    register_to='catalog.schema.table'
)

See the Mosaic AI Model Training documentation for more information and tutorials!

New open multimodal models

The Llama 3.2 series includes powerful, open multimodal models that accept both visual and textual input. Multimodal models open up many new use cases for enterprise data intelligence. In document processing, they can analyze scanned documents alongside text input to provide a more complete and accurate analysis. In e-commerce, they enable visual search, where users upload a photo of a product to find similar items based on generated descriptions. For marketing teams, these models simplify tasks like generating image-based social media captions. We are excited to bring these models to Databricks, so stay tuned for more information on this front!
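As a preview, here is a rough sketch of what such a request could look like through an OpenAI-compatible chat endpoint; the endpoint name, workspace URL, and image path are hypothetical placeholders.

```python
# Sketch: sending an image plus a text prompt to a Llama 3.2 vision model
# behind an OpenAI-compatible endpoint. Endpoint name and image path are
# hypothetical placeholders.
import base64
import os
from openai import OpenAI

client = OpenAI(
    api_key=os.environ["DATABRICKS_TOKEN"],
    base_url="https://<your-workspace>.cloud.databricks.com/serving-endpoints",
)

with open("raft_table_2.png", "rb") as f:
    image_b64 = base64.b64encode(f.read()).decode("utf-8")

response = client.chat.completions.create(
    model="llama-3-2-11b-vision-instruct",  # hypothetical endpoint name
    messages=[{
        "role": "user",
        "content": [
            {"type": "text", "text": "Parse the table into a JSON representation."},
            {
                "type": "image_url",
                "image_url": {"url": f"data:image/png;base64,{image_b64}"},
            },
        ],
    }],
)
print(response.choices[0].message.content)
```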

Here is an example of asking Llama 3.2 to parse a table into a JSON representation:

Image (Table 2 of the RAFT paper):

Prompt: Parse the table into a JSON representation.

Output:

> Assistant: **Step 1: Identify the methods and datasets in the table.**

The methods are RAFT w.o CoT and RAFT. The datasets are PubMed, HotpotQA, 
HuggingFace, Torch Hub, and TensorFlow.

**Step 2: Create a JSON object with the methods as keys.**

```json
{
  "RAFT w.o CoT": {},
  "RAFT": {}
}
```

**Step 3: Add the datasets as subkeys under each method.**

```json
{
  "RAFT w.o CoT": {
    "PubMed": 68.30,
    "HotpotQA": 25.62,
    "HuggingFace": 59.07,
    "Torch Hub": 86.56,
    "TensorFlow": 83.21
  },
  "RAFT": {
    "PubMed": 73.30,
    "HotpotQA": 35.28,
    "HuggingFace": 74.00,
    "Torch Hub": 84.95,
    "TensorFlow": 86.86
  }
}
```

**Step 4: The final JSON representation of the table is:**

```json
{
  "RAFT w.o CoT": {
    "PubMed": 68.30,
    "HotpotQA": 25.62,
    "HuggingFace": 59.07,
    "Torch Hub": 86.56,
    "TensorFlow": 83.21
  },
  "RAFT": {
    "PubMed": 73.30,
    "HotpotQA": 35.28,
    "HuggingFace": 74.00,
    "Torch Hub": 84.95,
    "TensorFlow": 86.86
  }
}
```

Customers innovate with Databricks and open models

Many Databricks customers are already leveraging Llama 3 models to drive their GenAI initiatives. We are all excited to see what they will do with Llama 3.2.

  • “Databricks’ scalable model management capabilities allow us to seamlessly integrate advanced open source LLMs like Meta Llama into our productivity engine, allowing us to bring new AI technologies to our customers quickly.” – Bryan McCann, Co-Founder/CTO, You.com
  • “Databricks Mosaic AI allows us to offer enhanced services to our clients that demonstrate the powerful relationship between advanced AI and effective data management, while facilitating the integration of cutting-edge GenAI technologies such as Meta Llama that future-proof our services.” – Colin Wenngatz, Vice President, Data Analytics, MNP
  • “The Databricks Data Intelligence platform allows us to securely deploy next-generation AI models like Meta Llama within our own environment without exposing sensitive data. This level of control is essential to maintain data privacy and meet healthcare standards.” – Navdeep Alam, CTO, Abacus Insights
  • “Thanks to Databricks Mosaic AI, we can orchestrate rapid optimization and fine-tuning of instructions for open source LLMs like Meta Llama that ingest domain-specific language from a proprietary corpus, improving the performance of behavioral simulation analysis and increasing our operational efficiency.” – Chris Coughlin, Senior Manager, Assessment Content Design and Development, Development Dimensions International

Getting started with Llama 3.2 on Databricks Mosaic AI

Follow the deployment instructions to try Llama 3.2 directly from your workspace. For more information, see the following resources:

Attend the next Databricks GenAI webinar on 10/8/24, The Shift to Data Intelligence, where Ash Jhaveri, VP at Meta, will discuss open source AI and the future of Meta Llama models.
