Spring AI supports the various AI language models from DeepSeek. You can interact with DeepSeek language models and create a multilingual conversational assistant based on them.
You need to create an API key with DeepSeek to access DeepSeek language models.
Create an account on the DeepSeek signup page and generate a token on the API Keys page.
The Spring AI project defines a configuration property named spring.ai.deepseek.api-key that you should set to the value of the API key obtained from the API Keys page.
You can set this configuration property in your application.properties file:
spring.ai.deepseek.api-key=<your-deepseek-api-key>
For enhanced security when handling sensitive information such as API keys, you can use Spring Expression Language (SpEL) to reference a custom environment variable:
# In application.yml
spring:
  ai:
    deepseek:
      api-key: ${DEEPSEEK_API_KEY}
# In your environment or .env file
export DEEPSEEK_API_KEY=<your-deepseek-api-key>
You can also set this configuration programmatically in your application code:
// Retrieve API key from a secure source or environment variable
String apiKey = System.getenv("DEEPSEEK_API_KEY");
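As a sketch of this approach (the helper class name and the system-property fallback are our own assumptions, not part of Spring AI), you might resolve the key from the environment with a fallback:

```java
// Hypothetical helper: resolve the DeepSeek API key from the
// DEEPSEEK_API_KEY environment variable, falling back to the
// deepseek.api-key JVM system property when the variable is unset.
public class ApiKeyResolver {

    public static String resolveApiKey() {
        String key = System.getenv("DEEPSEEK_API_KEY");
        if (key == null || key.isBlank()) {
            key = System.getProperty("deepseek.api-key", "");
        }
        return key;
    }
}
```

Keeping the key out of source control and injecting it at deploy time is the main point; the exact lookup order is a matter of taste.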
Spring AI artifacts are published in the Spring Milestone and Snapshot repositories. Refer to the Artifact Repositories section to add these repositories to your build system.
To help with dependency management, Spring AI provides a BOM (Bill of Materials) to ensure that a consistent version of Spring AI is used throughout the entire project. Refer to the Dependency Management section to add the Spring AI BOM to your build system.
Spring AI provides Spring Boot auto-configuration for the DeepSeek chat model. To enable it, add the following dependency to your project's Maven pom.xml file:
<dependency>
    <groupId>org.springframework.ai</groupId>
    <artifactId>spring-ai-starter-model-deepseek</artifactId>
</dependency>
or to your Gradle build.gradle file:
dependencies {
implementation 'org.springframework.ai:spring-ai-starter-model-deepseek'
}
Refer to the Dependency Management section to add the Spring AI BOM to your build file.
The prefix spring.ai.retry is used as the property prefix that lets you configure the retry mechanism for the DeepSeek chat model.
Property | Description | Default |
---|---|---|
spring.ai.retry.max-attempts | Maximum number of retry attempts. | 10 |
spring.ai.retry.backoff.initial-interval | Initial sleep duration for the exponential backoff policy. | 2 sec. |
spring.ai.retry.backoff.multiplier | Backoff interval multiplier. | 5 |
spring.ai.retry.backoff.max-interval | Maximum backoff duration. | 3 min. |
spring.ai.retry.on-client-errors | If false, throws a NonTransientAiException, and does not attempt a retry for any 4xx client error codes. | false |
spring.ai.retry.exclude-on-http-codes | List of HTTP status codes that should not trigger a retry (e.g. to throw NonTransientAiException). | empty |
spring.ai.retry.on-http-codes | List of HTTP status codes that should trigger a retry (e.g. to throw TransientAiException). | empty |
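To make the default backoff settings concrete, here is a small stand-alone sketch (our own illustration, not Spring AI code) that computes the wait times an exponential backoff policy with the defaults would produce: a 2-second initial interval, a multiplier of 5, and a 3-minute cap.

```java
import java.util.ArrayList;
import java.util.List;

// Stand-alone illustration of an exponential backoff policy: each retry
// waits `multiplier` times longer than the previous one, capped at `max`.
public class BackoffSchedule {

    public static List<Long> delaysMillis(long initial, double multiplier,
                                          long max, int attempts) {
        List<Long> delays = new ArrayList<>();
        double next = initial;
        for (int i = 0; i < attempts; i++) {
            delays.add((long) Math.min(next, max));
            next *= multiplier;
        }
        return delays;
    }

    public static void main(String[] args) {
        // Defaults: 2 sec initial, x5 multiplier, 3 min (180000 ms) cap.
        System.out.println(delaysMillis(2000, 5.0, 180_000, 5));
        // [2000, 10000, 50000, 180000, 180000]
    }
}
```

With the defaults, the fourth retry already hits the 3-minute cap, so tune max-attempts with that growth curve in mind.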
The prefix spring.ai.deepseek is used as the property prefix that lets you connect to DeepSeek.
Property | Description | Default |
---|---|---|
spring.ai.deepseek.base-url | The URL to connect to | api.deepseek.com |
spring.ai.deepseek.api-key | The API Key | - |
The prefix spring.ai.deepseek.chat is the property prefix that lets you configure the chat model implementation for DeepSeek.
Property | Description | Default |
---|---|---|
spring.ai.deepseek.chat.enabled | Enables the DeepSeek chat model. | true |
spring.ai.deepseek.chat.base-url | Optionally overrides the spring.ai.deepseek.base-url to provide a chat-specific URL | api.deepseek.com |
spring.ai.deepseek.chat.api-key | Optionally overrides the spring.ai.deepseek.api-key to provide a chat-specific API key | - |
spring.ai.deepseek.chat.completions-path | The path to the chat completions endpoint | /chat/completions |
spring.ai.deepseek.chat.beta-prefix-path | The prefix path to the beta feature endpoint | /beta/chat/completions |
spring.ai.deepseek.chat.options.model | ID of the model to use. You can use either deepseek-coder or deepseek-chat. | deepseek-chat |
spring.ai.deepseek.chat.options.frequencyPenalty | Number between -2.0 and 2.0. Positive values penalize new tokens based on their existing frequency in the text so far, decreasing the model's likelihood to repeat the same line verbatim. | 0.0f |
spring.ai.deepseek.chat.options.maxTokens | The maximum number of tokens to generate in the chat completion. The total length of input tokens and generated tokens is limited by the model's context length. | - |
spring.ai.deepseek.chat.options.presencePenalty | Number between -2.0 and 2.0. Positive values penalize new tokens based on whether they appear in the text so far, increasing the model's likelihood to talk about new topics. | 0.0f |
spring.ai.deepseek.chat.options.stop | Up to 4 sequences where the API will stop generating further tokens. | - |
spring.ai.deepseek.chat.options.temperature | What sampling temperature to use, between 0 and 2. Higher values like 0.8 will make the output more random, while lower values like 0.2 will make it more focused and deterministic. We generally recommend altering this or top_p, but not both. | 1.0F |
spring.ai.deepseek.chat.options.topP | An alternative to sampling with temperature, called nucleus sampling, where the model considers the results of the tokens with top_p probability mass. So 0.1 means only the tokens comprising the top 10% probability mass are considered. We generally recommend altering this or temperature, but not both. | 1.0F |
spring.ai.deepseek.chat.options.logprobs | Whether to return log probabilities of the output tokens or not. If true, returns the log probabilities of each output token returned in the content of the message. | - |
spring.ai.deepseek.chat.options.topLogprobs | An integer between 0 and 20 specifying the number of most likely tokens to return at each token position, each with an associated log probability. logprobs must be set to true if this parameter is used. | - |
You can override the common spring.ai.deepseek.base-url and spring.ai.deepseek.api-key for the ChatModel implementation. The spring.ai.deepseek.chat.base-url and spring.ai.deepseek.chat.api-key properties, if set, take precedence over the common properties. This is useful if you want to use different DeepSeek accounts for different models and different model endpoints.
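For example, this precedence lets you route chat requests to a separate account; the environment-variable names below are placeholders of our choosing:

```properties
# Shared defaults for all DeepSeek models
spring.ai.deepseek.base-url=https://api.deepseek.com
spring.ai.deepseek.api-key=${DEEPSEEK_API_KEY}

# Chat-specific override: takes precedence for the chat model only
spring.ai.deepseek.chat.api-key=${DEEPSEEK_CHAT_API_KEY}
```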
All properties prefixed with spring.ai.deepseek.chat.options can be overridden at runtime by adding request-specific options to the Prompt call.
DeepSeekChatOptions.java provides model configurations, such as the model to use, the temperature, the frequency penalty, etc.
On start-up, the default options can be configured with the DeepSeekChatModel(api, options) constructor or the spring.ai.deepseek.chat.options.* properties.
At runtime, you can override the default options by adding new, request-specific options to the Prompt call. For example, to override the default model and temperature for a specific request:
ChatResponse response = chatModel.call(
new Prompt(
"Generate the names of 5 famous pirates. Please provide the JSON response without any code block markers such as ```json```.",
DeepSeekChatOptions.builder()
.withModel(DeepSeekApi.ChatModel.DEEPSEEK_CHAT.getValue())
.withTemperature(0.8f)
.build()
));
In addition to the model-specific DeepSeekChatOptions, you can use a portable ChatOptions instance, created with the ChatOptionsBuilder#builder().
Create a new Spring Boot project and add the spring-ai-starter-model-deepseek to your pom (or gradle) dependencies.
Add an application.properties file under the src/main/resources directory to enable and configure the DeepSeek chat model:
spring.ai.deepseek.api-key=YOUR_API_KEY
spring.ai.deepseek.chat.options.model=deepseek-chat
spring.ai.deepseek.chat.options.temperature=0.8
Replace the api-key with your DeepSeek credentials.
This will create a DeepSeekChatModel implementation that you can inject into your classes. Here is an example of a simple @Controller class that uses the chat model for text generation.
@RestController
public class ChatController {
private final DeepSeekChatModel chatModel;
@Autowired
public ChatController(DeepSeekChatModel chatModel) {
this.chatModel = chatModel;
}
@GetMapping("/ai/generate")
public Map<String, String> generate(@RequestParam(value = "message", defaultValue = "Tell me a joke") String message) {
return Map.of("generation", chatModel.call(message));
}
@GetMapping("/ai/generateStream")
public Flux<ChatResponse> generateStream(@RequestParam(value = "message", defaultValue = "Tell me a joke") String message) {
var prompt = new Prompt(new UserMessage(message));
return chatModel.stream(prompt);
}
}
The chat prefix completion follows the Chat Completion API, where the user provides an assistant prefix message for the model to complete the rest of the message.
When using prefix completion, the user must ensure that the last message in the message list is a DeepSeekAssistantMessage.
Below is a complete Java code example of chat prefix completion. In this example, we set the assistant prefix message to "```python\n" to force the model to output Python code, and set the stop parameter to ["```"] to prevent the model from adding extra explanations.
@RestController
public class CodeGenerateController {
private final DeepSeekChatModel chatModel;
@Autowired
public CodeGenerateController(DeepSeekChatModel chatModel) {
this.chatModel = chatModel;
}
@GetMapping("/ai/generatePythonCode")
public String generate(@RequestParam(value = "message", defaultValue = "Please write quick sort code") String message) {
UserMessage userMessage = new UserMessage(message);
Message assistantMessage = DeepSeekAssistantMessage.prefixAssistantMessage("```python\n");
Prompt prompt = new Prompt(List.of(userMessage, assistantMessage), ChatOptions.builder().stopSequences(List.of("```")).build());
ChatResponse response = chatModel.call(prompt);
return response.getResult().getOutput().getText();
}
}
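As a side note on post-processing (this helper is our own illustration, not part of Spring AI): because the model's completion continues the supplied prefix, the caller typically re-attaches the prefix and, defensively, trims anything after the stop sequence:

```java
// Illustrative helper: re-attach the assistant prefix to the model's
// completion and trim at the stop sequence if it is present.
public class PrefixAssembler {

    public static String assemble(String prefix, String completion, String stop) {
        int cut = completion.indexOf(stop);
        String body = (cut >= 0) ? completion.substring(0, cut) : completion;
        return prefix + body;
    }
}
```

The API normally stops before emitting the stop sequence, so the indexOf check is only a safeguard.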
The deepseek-reasoner is a reasoning model developed by DeepSeek. Before delivering the final answer, the model first generates a Chain of Thought (CoT) to enhance the accuracy of its responses. The API gives users access to the CoT content generated by deepseek-reasoner, allowing them to view, display, and distill it.
You can use DeepSeekAssistantMessage to get the CoT content generated by deepseek-reasoner.
public void deepSeekReasonerExample() {
DeepSeekChatOptions promptOptions = DeepSeekChatOptions.builder()
.model(DeepSeekApi.ChatModel.DEEPSEEK_REASONER.getValue())
.build();
Prompt prompt = new Prompt("9.11 and 9.8, which is greater?", promptOptions);
ChatResponse response = chatModel.call(prompt);
// Get the CoT content generated by deepseek-reasoner, only available when using deepseek-reasoner model
DeepSeekAssistantMessage deepSeekAssistantMessage = (DeepSeekAssistantMessage) response.getResult().getOutput();
String reasoningContent = deepSeekAssistantMessage.getReasoningContent();
String text = deepSeekAssistantMessage.getText();
}
In each round of the conversation, the model outputs the CoT (reasoning content) and the final answer (content). In the next round of the conversation, the CoT from previous rounds is not concatenated into the context.
Please note that if the reasoning_content field is included in the sequence of input messages, the API will return a 400 error. Therefore, you should strip the reasoning_content field from the API response before making the next API request, as the following example shows.
public String deepSeekReasonerMultiRoundExample() {
List<Message> messages = new ArrayList<>();
messages.add(new UserMessage("9.11 and 9.8, which is greater?"));
DeepSeekChatOptions promptOptions = DeepSeekChatOptions.builder()
.model(DeepSeekApi.ChatModel.DEEPSEEK_REASONER.getValue())
.build();
Prompt prompt = new Prompt(messages, promptOptions);
ChatResponse response = chatModel.call(prompt);
DeepSeekAssistantMessage deepSeekAssistantMessage = (DeepSeekAssistantMessage) response.getResult().getOutput();
String reasoningContent = deepSeekAssistantMessage.getReasoningContent();
String text = deepSeekAssistantMessage.getText();
messages.add(new AssistantMessage(Objects.requireNonNull(text)));
messages.add(new UserMessage("How many Rs are there in the word 'strawberry'?"));
Prompt prompt2 = new Prompt(messages, promptOptions);
ChatResponse response2 = chatModel.call(prompt2);
DeepSeekAssistantMessage deepSeekAssistantMessage2 = (DeepSeekAssistantMessage) response2.getResult().getOutput();
String reasoningContent2 = deepSeekAssistantMessage2.getReasoningContent();
return deepSeekAssistantMessage2.getText();
}
The DeepSeekChatModel implements ChatModel and StreamingChatModel and uses the low-level DeepSeekApi client to connect to the DeepSeek service.
Add the spring-ai-deepseek dependency to your project's Maven pom.xml file:
<dependency>
    <groupId>org.springframework.ai</groupId>
    <artifactId>spring-ai-deepseek</artifactId>
</dependency>
or to your Gradle build.gradle file:
dependencies {
implementation 'org.springframework.ai:spring-ai-deepseek'
}
Refer to the Dependency Management section to add the Spring AI BOM to your build file.
Next, create a DeepSeekChatModel and use it for text generation:
var deepSeekApi = new DeepSeekApi(System.getenv("DEEPSEEK_API_KEY"));
var chatModel = new DeepSeekChatModel(deepSeekApi, DeepSeekChatOptions.builder()
.withModel(DeepSeekApi.ChatModel.DEEPSEEK_CHAT.getValue())
.withTemperature(0.4f)
.withMaxTokens(200)
.build());
ChatResponse response = chatModel.call(
new Prompt("Generate the names of 5 famous pirates."));
// Or with streaming responses
Flux<ChatResponse> streamResponse = chatModel.stream(
new Prompt("Generate the names of 5 famous pirates."));
DeepSeekChatOptions provides the configuration information for the chat requests. DeepSeekChatOptions.Builder is a fluent options builder.
DeepSeekApi is a lightweight Java client for the DeepSeek API.
Here is a simple snippet showing how to use the API programmatically:
DeepSeekApi deepSeekApi =
new DeepSeekApi(System.getenv("DEEPSEEK_API_KEY"));
ChatCompletionMessage chatCompletionMessage =
new ChatCompletionMessage("Hello world", Role.USER);
// Sync request
ResponseEntity<ChatCompletion> response = deepSeekApi.chatCompletionEntity(
new ChatCompletionRequest(List.of(chatCompletionMessage), DeepSeekApi.ChatModel.DEEPSEEK_CHAT.getValue(), 0.7f, false));
// Streaming request
Flux<ChatCompletionChunk> streamResponse = deepSeekApi.chatCompletionStream(
new ChatCompletionRequest(List.of(chatCompletionMessage), DeepSeekApi.ChatModel.DEEPSEEK_CHAT.getValue(), 0.7f, true));
Refer to the JavaDoc of DeepSeekApi.java for more information.
The DeepSeekApiIT.java tests provide some general examples of how to use the lightweight library.