LangChain + Ollama: Calling a Local DeepSeek Model

1. Install the langchain-ollama package:

# install package
pip install -U langchain-ollama
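The pip package only provides the LangChain bindings; it assumes a local Ollama server is already installed and running, with the model pulled. If the model is not yet available locally, something along these lines fetches it (requires a running Ollama daemon):

```shell
# Pull the model tag used in the examples below
ollama pull deepseek-r1
# Confirm it now appears among the locally available models
ollama list
```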

2. Instantiate the model with ChatOllama and call it with invoke:

from langchain_ollama import ChatOllama

llm = ChatOllama(model="deepseek-r1")
messages = [
    ("system", "You are a helpful translation assistant. Translate the following sentence into English."),
    ("human", "我爱编程。")  # "I love programming."
]
message = llm.invoke(messages)
print(message.content)
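Note that deepseek-r1 is a reasoning model, and the content it returns through Ollama typically embeds the chain of thought in a `<think>…</think>` block before the final answer. A minimal sketch for keeping only the answer (the helper name `strip_think` is ours, not part of LangChain):

```python
import re

def strip_think(text: str) -> str:
    """Remove a <think>...</think> reasoning block from model output, if present."""
    return re.sub(r"<think>.*?</think>", "", text, flags=re.DOTALL).strip()

# A response shaped like typical deepseek-r1 output:
raw = "<think>The user wants a translation.</think>\nI love programming."
print(strip_think(raw))  # -> I love programming.
```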

3. Call the model in streaming mode:

from langchain_ollama import ChatOllama

msgs = [
    ("human", "What is an LLM?")
]
llm = ChatOllama(model="deepseek-r1")
for chunk in llm.stream(msgs):
    print(chunk.content, end='')

4. We can chain the prompt and the LLM together with the | operator and invoke the resulting chain:

from langchain_ollama.chat_models import ChatOllama
from langchain_core.prompts import ChatPromptTemplate

prompt = ChatPromptTemplate.from_messages(
    [
        (
            "system",
            "You are a helpful translation assistant. Translate the user's input from {input_language} to {output_language}."
        ),
        (
            "human",
            "{input}"
        )
    ]
)

llm = ChatOllama(model="deepseek-r1")
chain = prompt | llm
msg = chain.invoke(
    {
        "input_language": "Chinese",
        "output_language": "English",
        "input": "我爱编程。"  # "I love programming."
    }
)
print(msg.content)

5. Mark a function with the @tool decorator and attach it to the model with bind_tools to enable tool calling. Note the model change below: the deepseek-r1 tags served by Ollama do not advertise tool-calling support, so this example uses qwen3:0.6b instead.

from typing import List

from langchain_ollama import ChatOllama
from langchain_core.tools import tool

    # """校验用户的历史住址.

    # Args:
    #     user_id (int): 用户的id.
    #     addresses (List[str]): 以前居住的地址列表.
    # """

@tool
def validate_user(user_id: int, addresses: List[str]) -> bool:
    """Validate user using historical addresses.

    Args:
        user_id (int): the user ID.
        addresses (List[str]): Previous addresses as a list of strings.
    """

    # Stub implementation: always reports success.
    return True

llm = ChatOllama(model="qwen3:0.6b").bind_tools([validate_user])
result = llm.invoke(
    "Please validate user 123. "
    "He previously lived in Zhengzhou, Henan "
    "and Xicheng District, Beijing."
)
print(result.tool_calls)
