LLMs
info
Head to Integrations for documentation on built-in integrations with LLM providers.
Large Language Models (LLMs) are a core component of LangChain. LangChain does not serve its own LLMs, but rather provides a standard interface for interacting with many different LLMs.
Get started
There are lots of LLM providers (OpenAI, Cohere, Hugging Face, etc.) - the LLM class is designed to provide a standard interface for all of them.
In this walkthrough we'll work with an OpenAI LLM wrapper, although the functionalities highlighted are generic for all LLM types.
Setup
First we'll need to install the OpenAI Python package:
pip install openai
Accessing the API requires an API key, which you can get by creating an account and heading here. Once we have a key we'll want to set it as an environment variable by running:
export OPENAI_API_KEY="..."
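If you are working in a notebook or script rather than a shell, you can also set the variable from Python itself. A minimal sketch (the placeholder key is an assumption; substitute your real key):

```python
import os

# Set the key for the current process only. Libraries that read
# OPENAI_API_KEY from the environment will pick this up.
os.environ["OPENAI_API_KEY"] = "sk-..."
```

Note this only affects the current process, unlike `export`, which persists for the rest of the shell session.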
If you'd prefer not to set an environment variable, you can pass the key in directly via the openai_api_key
named parameter when initiating the OpenAI LLM class:
from langchain.llms import OpenAI
llm = OpenAI(openai_api_key="...")
Otherwise you can initialize without any params:
from langchain.llms import OpenAI
llm = OpenAI()
__call__
: string in -> string out
The simplest way to use an LLM is as a callable: pass in a string, get a string completion.
llm("Tell me a joke")
'Why did the chicken cross the road?\n\nTo get to the other side.'
generate
: batch calls, richer outputs
generate
lets you call the model with a list of strings, getting back a more complete response than just the text. This complete response can include things like multiple top responses and other LLM provider-specific information:
llm_result = llm.generate(["Tell me a joke", "Tell me a poem"]*15)
len(llm_result.generations)
30
llm_result.generations[0]
[Generation(text='\n\nWhy did the chicken cross the road?\n\nTo get to the other side!'),
Generation(text='\n\nWhy did the chicken cross the road?\n\nTo get to the other side.')]
llm_result.generations[-1]
[Generation(text="\n\nWhat if love neverspeech\n\nWhat if love never ended\n\nWhat if love was only a feeling\n\nI'll never know this love\n\nIt's not a feeling\n\nBut it's what we have for each other\n\nWe just know that love is something strong\n\nAnd we can't help but be happy\n\nWe just feel what love is for us\n\nAnd we love each other with all our heart\n\nWe just don't know how\n\nHow it will go\n\nBut we know that love is something strong\n\nAnd we'll always have each other\n\nIn our lives."),
Generation(text='\n\nOnce upon a time\n\nThere was a love so pure and true\n\nIt lasted for centuries\n\nAnd never became stale or dry\n\nIt was moving and alive\n\nAnd the heart of the love-ick\n\nIs still beating strong and true.')]
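The shape of the result above can be sketched with plain dataclasses standing in for LangChain's Generation objects (an assumption for illustration; the real objects carry more fields): `generations` is a list with one inner list per input prompt, and each inner list holds that prompt's completions.

```python
from dataclasses import dataclass

# Simplified stand-in for langchain's Generation (illustration only).
@dataclass
class Generation:
    text: str

# One inner list per prompt; each inner list holds that prompt's completions.
generations = [
    [Generation(text="Why did the chicken cross the road?")],
    [Generation(text="Once upon a time...")],
]

# Flatten to the raw completion strings, keeping the per-prompt grouping.
texts = [[g.text for g in gens] for gens in generations]
print(texts)
```

This is why `llm_result.generations[0]` above returns a list: it is all completions for the first prompt, not a single Generation.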
You can also access provider-specific information that is returned. This information is not standardized across providers.
llm_result.llm_output
{'token_usage': {'completion_tokens': 3903,
'total_tokens': 4023,
'prompt_tokens': 120}}
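Since `llm_output` is a plain dict, its fields can be read directly. A minimal sketch using the OpenAI-style `token_usage` keys shown above (other providers may use different keys, or omit this field entirely):

```python
# A dict of the same shape as the llm_output printed above.
llm_output = {
    "token_usage": {
        "completion_tokens": 3903,
        "total_tokens": 4023,
        "prompt_tokens": 120,
    }
}

usage = llm_output["token_usage"]
# Sanity check: prompt + completion tokens should equal the total.
assert usage["prompt_tokens"] + usage["completion_tokens"] == usage["total_tokens"]
print(usage["total_tokens"])  # 4023
```

Token counts like these are what you would feed into cost tracking, since providers typically bill per prompt and completion token.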