
Comparing models with LangChain (also trying Llama.cpp)

kun432

https://python.langchain.com/en/latest/additional_resources/model_laboratory.html

Most people probably just use LangChain with OpenAI, but when you want to evaluate the same prompt across several language models, ModelLaboratory makes the comparison easy to write.

Working in JupyterLab.

$ pip install jupyterlab ipywidgets pipenv
$ jupyter-lab --ip='0.0.0.0'

Install the packages:

!pip install openai langchain python-dotenv

Load .env. Set the OpenAI API key and the Hugging Face token in it beforehand.

.env
OPENAI_API_KEY=xxxxxxxxxxx
HUGGINGFACEHUB_API_TOKEN=xxxxxxxxxxx

from dotenv import load_dotenv
load_dotenv()
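
As an optional sanity check (my own addition, not part of the original steps), you can confirm both variables were actually picked up without printing the secrets themselves:

import os

# Print only whether each key from .env is visible to the process, not its value.
for key in ["OPENAI_API_KEY", "HUGGINGFACEHUB_API_TOKEN"]:
    print(key, "is set" if os.getenv(key) else "is NOT set")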

We'll try the following three models. (The official example uses Cohere, but that needs an API key, so I swapped in Llama.cpp.)

  • OpenAI
  • Llama.cpp
  • databricks/dolly-v2-3b (HuggingFaceHub経由)

First, set up Llama.cpp. Following the official LangChain docs as-is doesn't work, so note the following two points: the pinned llama-cpp-python version and the model file to use.

!pip install llama-cpp-python==0.1.48
!wget https://huggingface.co/Sosaka/Alpaca-native-4bit-ggml/resolve/main/ggml-alpaca-7b-q4.bin
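
Before going through LangChain, a quick smoke test against llama-cpp-python directly can confirm that the pinned package installed and that the downloaded GGML file actually loads (my own sketch, not part of the original steps):

import os
from llama_cpp import Llama  # llama-cpp-python binding used directly, without LangChain

# Confirm the downloaded GGML file is present (a 7B 4-bit model should be a few GB).
print("model size (MB):", os.path.getsize("ggml-alpaca-7b-q4.bin") // 2**20)

# Minimal generation to make sure the model loads and runs.
llm = Llama(model_path="ggml-alpaca-7b-q4.bin")
out = llm("Q: What is the capital of Japan? A:", max_tokens=16)
print(out["choices"][0]["text"])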

Now try running it from LangChain.

from langchain.llms import LlamaCpp
from langchain import PromptTemplate, LLMChain
from langchain.callbacks.manager import CallbackManager
from langchain.callbacks.streaming_stdout import StreamingStdOutCallbackHandler

template = """Question: {question}

Answer: Let's think step by step."""

prompt = PromptTemplate(template=template, input_variables=["question"])

callback_manager = CallbackManager([StreamingStdOutCallbackHandler()])

llm = LlamaCpp(
    model_path="ggml-alpaca-7b-q4.bin", callback_manager=callback_manager
)

llm_chain = LLMChain(prompt=prompt, llm=llm)

question = "What NFL team won the Super Bowl in the year Justin Bieber was born?"
llm_chain.run(question)

Looks like it's working.

' Justin was born in 1992, so he was born in 1992 or earlier. The Super Bowl was held in 1993, meaning that Justin was still born before then. So, the answer is that no NFL team won the Super Bowl in the year Justin Biebe was born.'
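
The run above uses LlamaCpp's defaults (temperature 0.8, max_tokens 256, as the ModelLaboratory output later shows). If you want different behavior, the constructor accepts the usual sampling parameters; a small sketch with values picked arbitrarily:

# Optional: override the default sampling settings on the LlamaCpp constructor.
llm = LlamaCpp(
    model_path="ggml-alpaca-7b-q4.bin",
    n_ctx=512,          # context window size
    temperature=0.1,    # lower = more deterministic
    max_tokens=128,     # cap on the generated length
    callback_manager=callback_manager,
)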

Next, HuggingFaceHub. If the API token is already set, this is all you need:

!pip install huggingface_hub
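
If you want to double-check the token before calling a model, huggingface_hub's whoami() is an easy test (my own addition; it raises an error if the token is missing or invalid):

import os
from huggingface_hub import whoami

# Verify the token from .env is accepted by the Hub before hitting the Inference API.
print(whoami(token=os.getenv("HUGGINGFACEHUB_API_TOKEN"))["name"])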

Give it a try:

from langchain import HuggingFaceHub
from langchain import PromptTemplate, LLMChain

llm = HuggingFaceHub(repo_id="databricks/dolly-v2-3b", model_kwargs={"temperature":0, "max_length":64})

template = """Question: {question}

Answer: Let's think step by step."""
prompt = PromptTemplate(template=template, input_variables=["question"])

llm_chain = LLMChain(prompt=prompt, llm=llm)

question = "Who won the FIFA World Cup in the year 1994? "
print(llm_chain.run(question))

Result:

 First of all, the world cup was won by the Germany. Then the Argentina won the world cup in 2022. So, the Argentina won the world cup in 1994.


Question: Who

This works too.

Now for the main topic. To use ModelLaboratory, pass it a list of model objects like this:

from langchain import LLMChain, OpenAI, HuggingFaceHub, PromptTemplate
from langchain.llms import LlamaCpp
from langchain.callbacks.manager import CallbackManager
from langchain.callbacks.streaming_stdout import StreamingStdOutCallbackHandler
from langchain.model_laboratory import ModelLaboratory

callback_manager = CallbackManager([StreamingStdOutCallbackHandler()])

llms = [
    HuggingFaceHub(repo_id="databricks/dolly-v2-3b", model_kwargs={"temperature": 0.1, "max_length": 64}),
    OpenAI(temperature=0.1),
    LlamaCpp(model_path="ggml-alpaca-7b-q4.bin", callback_manager=callback_manager),
]

template = """Question: {question}

Answer: Let's think step by step."""
prompt = PromptTemplate(template=template, input_variables=["question"])

model_lab = ModelLaboratory.from_llms(llms, prompt=prompt)
model_lab.compare("What color is a flamingo?")

Result:

Input:
What color is a flamingo?

HuggingFaceHub
Params: {'repo_id': 'databricks/dolly-v2-3b', 'task': None, 'model_kwargs': {'temperature': 0.1, 'max_length': 64}}
First, we need to know what a flamingo looks like. Flamingos are large water birds with a pink beak, legs, and feet, and a red bill. They have a red comb and

OpenAI
Params: {'model_name': 'text-davinci-003', 'temperature': 0.1, 'max_tokens': 256, 'top_p': 1, 'frequency_penalty': 0, 'presence_penalty': 0, 'n': 1, 'request_timeout': None, 'logit_bias': {}}
A flamingo is a type of bird. Most birds have feathers, and the feathers of a flamingo are usually pink or orange. Therefore, the color of a flamingo is usually pink or orange.

LlamaCpp
Params: {'model_path': 'ggml-alpaca-7b-q4.bin', 'suffix': None, 'max_tokens': 256, 'temperature': 0.8, 'top_p': 0.95, 'logprobs': None, 'echo': False, 'stop_sequences': [], 'repeat_penalty': 1.1, 'top_k': 40}
First, what is a flamingo? A flamingo is a type of bird. Second, what color are birds usually? They are usually brown or white. Third, what color is the most common color of a flamingo? The most common color of a flamingo is pink. Therefore, a flamingo is pink. First, what is a flamingo? A flamingo is a type of bird. Second, what color are birds usually? They are usually brown or white. Third, what color is the most common color of a flamingo? The most common color of a flamingo is pink. Therefore, a flamingo is pink.

Hmm, for some reason it sometimes stalls at the HuggingFaceHub step. The output above is what I finally got after shuffling the order of the models around.
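
For reference, ModelLaboratory can also be constructed directly from chains (each with a single input and a single output) instead of bare LLMs, which lets you compare prompts as well as models. A rough sketch of my own, reusing the OpenAI setup above:

from langchain import LLMChain, OpenAI, PromptTemplate
from langchain.model_laboratory import ModelLaboratory

# Two prompt variants for the same model; each chain takes one input and returns one output.
prompt_short = PromptTemplate(template="Answer in one sentence: {question}", input_variables=["question"])
prompt_steps = PromptTemplate(template="Question: {question}\n\nAnswer: Let's think step by step.", input_variables=["question"])

chains = [
    LLMChain(llm=OpenAI(temperature=0.1), prompt=prompt_short),
    LLMChain(llm=OpenAI(temperature=0.1), prompt=prompt_steps),
]

# names= labels each chain in the printed comparison.
model_lab = ModelLaboratory(chains, names=["one-sentence", "step-by-step"])
model_lab.compare("What color is a flamingo?")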

This scrap was closed on 2023/05/21.