LLaMA-Adapter-V2 Multi-modal

News

  • [May 26, 2023] Initial release.

Setup

  • Set up a new conda environment and install the necessary packages.

    conda create -n llama_adapter_v2 python=3.8 -y
    conda activate llama_adapter_v2
    pip install -r requirements.txt
  • Obtain the LLaMA backbone weights using this form. Please note that checkpoints from unofficial sources (e.g., BitTorrent) may contain malicious code and should be used with care. Organize the downloaded files in the following structure (a quick sanity check is sketched after this list):

    /path/to/llama_model_weights
    ├── 7B
    │   ├── checklist.chk
    │   ├── consolidated.00.pth
    │   └── params.json
    └── tokenizer.model
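
As a quick sanity check, the short sketch below verifies that the files listed above are in place. This helper is not part of the repository, and the path is a placeholder you should replace with your own weights directory.

from pathlib import Path

# Placeholder path; point this at your own llama_model_weights directory.
llama_dir = Path("/path/to/llama_model_weights")

# Files expected by the directory structure shown above.
expected = [
    llama_dir / "tokenizer.model",
    llama_dir / "7B" / "checklist.chk",
    llama_dir / "7B" / "consolidated.00.pth",
    llama_dir / "7B" / "params.json",
]

missing = [p for p in expected if not p.exists()]
if missing:
    raise FileNotFoundError(f"Missing LLaMA weight files: {missing}")
print("LLaMA weight layout looks correct.")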
    

Usage

Here is a simple inference script for LLaMA-Adapter V2. The pre-trained adapter weights will be downloaded automatically from GitHub Releases.

import cv2
import llama
import torch
from PIL import Image

device = "cuda" if torch.cuda.is_available() else "cpu"

llama_dir = "/path/to/LLaMA/"

# Load the BIAS-7B adapter together with its image preprocessing pipeline.
model, preprocess = llama.load("BIAS-7B", llama_dir, device)

# Wrap the instruction in the prompt template expected by the model.
prompt = llama.format_prompt("Please introduce this painting.")

# Read the image with OpenCV, wrap it as a PIL image, and preprocess it
# into a batched tensor on the target device.
img = Image.fromarray(cv2.imread("../docs/logo_v1.png"))
img = preprocess(img).unsqueeze(0).to(device)

# Generate a response for the (image, prompt) pair.
result = model.generate(img, [prompt])[0]

print(result)

The output will look like the following:

The painting features a cute white lama, or llama, standing on a wooden floor. The llama is holding a variety of tools and accessories, such as a paintbrush, a pencil, a ruler, a pair of scissors, and a paint can. The llama is dressed in a suit, which adds a touch of sophistication to the scene. The painting is a creative and whimsical representation of a person or animal holding various tools and accessories, making it an interesting and unique piece of art.
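
As a small follow-up sketch, you can ask several questions about the same image by reusing model.generate and llama.format_prompt. This assumes the model, img, and device variables from the script above are still in scope; the questions are only illustrative.

# Reuses model and img from the inference script above.
questions = [
    "Please introduce this painting.",
    "What tools is the animal holding?",
]

for question in questions:
    prompt = llama.format_prompt(question)
    # One (image, prompt) pair per call, exactly as in the script above.
    answer = model.generate(img, [prompt])[0]
    print(f"Q: {question}\nA: {answer}\n")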

Online demo

We provide an online demo at OpenGVLab.

You can also start it locally with:

python gradio_app.py

Models

You can list the available pre-trained models by running:

import llama
print(llama.available_models())

Currently we provide BIAS-7B, which fine-tunes the bias and norm parameters of LLaMA. We will release more pre-trained models in the future, such as the LoRA fine-tuned LoRA-7B and the partially-tuned PARTIAL-7B.
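
For example, you can pick a checkpoint from that list and load it with the same llama.load call shown in the Usage section. The LLaMA path below is a placeholder.

import llama
import torch

device = "cuda" if torch.cuda.is_available() else "cpu"

# List the released checkpoints; only BIAS-7B is available at the moment.
print(llama.available_models())

# Load a checkpoint by name, as in the Usage section above.
model, preprocess = llama.load("BIAS-7B", "/path/to/LLaMA/", device)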