Class SpringAIOllamaService
java.lang.Object
com.bytedesk.ai.springai.service.BaseSpringAIService
com.bytedesk.ai.springai.providers.ollama.SpringAIOllamaService
- All Implemented Interfaces:
SpringAIService
@Service
@ConditionalOnProperty(name="spring.ai.ollama.chat.enabled",
havingValue="true",
matchIfMissing=false)
public class SpringAIOllamaService
extends BaseSpringAIService
Field Summary
Fields
  private org.springframework.ai.ollama.OllamaChatModel bytedeskOllamaChatModel
  private org.springframework.ai.ollama.api.OllamaApi ollamaApi
Fields inherited from class com.bytedesk.ai.springai.service.BaseSpringAIService
faqService, messagePersistCache, messageSendService, robotMessageCache, robotRestService, springAIVectorService, threadRestService, uidUtils
Constructor Summary
Constructors
  SpringAIOllamaService()
Method Summary
  private org.springframework.ai.ollama.OllamaChatModel configureModelWithTimeout(org.springframework.ai.ollama.OllamaChatModel model, long timeoutMillis)
  private org.springframework.ai.ollama.OllamaChatModel createDynamicChatModel
      Creates a dynamic OllamaChatModel based on the robot configuration.
  private org.springframework.ai.ollama.api.OllamaOptions createDynamicOptions
      Creates dynamic OllamaOptions based on the robot configuration.
  protected String generateFaqPairs(String prompt)
  org.springframework.ai.ollama.OllamaChatModel getChatModel()
  boolean isServiceHealthy()
  protected void processPrompt(org.springframework.ai.chat.prompt.Prompt prompt, RobotProtobuf robot, MessageProtobuf messageProtobufQuery, MessageProtobuf messageProtobufReply)
  protected void processPromptSSE(org.springframework.ai.chat.prompt.Prompt prompt, RobotProtobuf robot, MessageProtobuf messageProtobufQuery, MessageProtobuf messageProtobufReply, org.springframework.web.servlet.mvc.method.annotation.SseEmitter emitter)
  protected String processPromptSync(String message)
Methods inherited from class com.bytedesk.ai.springai.service.BaseSpringAIService
buildKbPrompt, createDynamicOptions, generateFaqPairsAsync, generateFaqPairsSync, handleSseError, isEmitterCompleted, persistMessage, sendMessage, sendSseMessage, sendStreamEndMessage, sendStreamMessage, sendStreamStartMessage, sendWebsocketMessage
Field Details
bytedeskOllamaChatModel
@Autowired(required=false) @Qualifier("bytedeskOllamaChatModel") private org.springframework.ai.ollama.OllamaChatModel bytedeskOllamaChatModel
ollamaApi
@Autowired @Qualifier("bytedeskOllamaApi") private org.springframework.ai.ollama.api.OllamaApi ollamaApi
Constructor Details
SpringAIOllamaService
public SpringAIOllamaService()
Method Details
createDynamicOptions
Creates dynamic OllamaOptions based on the robot configuration.
Parameters:
llm - the robot's LLM configuration
Returns:
OllamaOptions created from the robot configuration
createDynamicChatModel
Creates a dynamic OllamaChatModel based on the robot configuration.
Parameters:
llm - the robot's LLM configuration
Returns:
an OllamaChatModel configured with the specified model
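The two factory methods above are documented without full signatures here. As a hedged sketch only, per-robot construction might look like the following with the Spring AI Ollama builders. The helper name, the model name, and the temperature value are illustrative assumptions, and the builder method names follow Spring AI 1.0 (earlier milestones used withModel(...) style accessors); this is not the service's actual implementation.

```java
import org.springframework.ai.ollama.OllamaChatModel;
import org.springframework.ai.ollama.api.OllamaApi;
import org.springframework.ai.ollama.api.OllamaOptions;

class DynamicModelSketch {
    // Hypothetical: build per-robot options, then a chat model that uses them
    // by default. ollamaApi would be the injected bytedeskOllamaApi bean.
    OllamaChatModel createDynamicChatModel(OllamaApi ollamaApi, String model, Double temperature) {
        OllamaOptions options = OllamaOptions.builder()
                .model(model)             // robot-configured model name (assumption)
                .temperature(temperature) // robot-configured sampling temperature (assumption)
                .build();
        return OllamaChatModel.builder()
                .ollamaApi(ollamaApi)
                .defaultOptions(options)
                .build();
    }
}
```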
processPrompt
protected void processPrompt(org.springframework.ai.chat.prompt.Prompt prompt, RobotProtobuf robot, MessageProtobuf messageProtobufQuery, MessageProtobuf messageProtobufReply)
Specified by:
processPrompt in class BaseSpringAIService
generateFaqPairs
protected String generateFaqPairs(String prompt)
Specified by:
generateFaqPairs in class BaseSpringAIService
processPromptSync
protected String processPromptSync(String message)
Specified by:
processPromptSync in class BaseSpringAIService
processPromptSSE
protected void processPromptSSE(org.springframework.ai.chat.prompt.Prompt prompt, RobotProtobuf robot, MessageProtobuf messageProtobufQuery, MessageProtobuf messageProtobufReply, org.springframework.web.servlet.mvc.method.annotation.SseEmitter emitter)
Specified by:
processPromptSSE in class BaseSpringAIService
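processPromptSSE presumably streams the model reply to the client through the given SseEmitter. A hedged sketch of that relay pattern follows; chatModel.stream(prompt) returning a reactive Flux of ChatResponse is standard Spring AI, while the class and method names here are assumptions, not the service's actual code (getText() is the Spring AI 1.0 accessor; earlier milestones used getContent()).

```java
import java.io.IOException;
import org.springframework.ai.chat.model.ChatResponse;
import org.springframework.ai.chat.prompt.Prompt;
import org.springframework.ai.ollama.OllamaChatModel;
import org.springframework.web.servlet.mvc.method.annotation.SseEmitter;

class SseRelaySketch {
    // Hypothetical: forward each streamed chunk as one SSE event.
    void relay(OllamaChatModel chatModel, Prompt prompt, SseEmitter emitter) {
        chatModel.stream(prompt).subscribe(
                (ChatResponse response) -> {
                    try {
                        emitter.send(SseEmitter.event()
                                .data(response.getResult().getOutput().getText()));
                    } catch (IOException e) {
                        emitter.completeWithError(e);
                    }
                },
                emitter::completeWithError,  // propagate stream errors to the client
                emitter::complete);          // close the SSE stream when done
    }
}
```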
configureModelWithTimeout
private org.springframework.ai.ollama.OllamaChatModel configureModelWithTimeout(org.springframework.ai.ollama.OllamaChatModel model, long timeoutMillis)
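The signature suggests this method bounds model calls with a per-request deadline. A library-agnostic sketch of that pattern using only the JDK (the helper name and shape are assumptions, not the service's code):

```java
import java.util.concurrent.CompletableFuture;
import java.util.concurrent.TimeUnit;
import java.util.function.Supplier;

class TimeoutSketch {
    // Run a blocking call with an upper bound. On timeout, orTimeout completes
    // the future exceptionally, so get() throws an ExecutionException whose
    // cause is a TimeoutException.
    static <T> T callWithTimeout(Supplier<T> call, long timeoutMillis) throws Exception {
        return CompletableFuture.supplyAsync(call)
                .orTimeout(timeoutMillis, TimeUnit.MILLISECONDS)
                .get();
    }
}
```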
isServiceHealthy
public boolean isServiceHealthy()
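isServiceHealthy presumably probes the Ollama endpoint and reports whether it responds. A generic JDK-only sketch of such a liveness probe (the class is hypothetical; /api/tags is Ollama's model-list endpoint, used here as an assumed health check):

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.time.Duration;

class HealthCheckSketch {
    // Returns true only if the server answers HTTP 200 within the deadline.
    static boolean isServiceHealthy(String baseUrl) {
        try {
            HttpClient client = HttpClient.newBuilder()
                    .connectTimeout(Duration.ofSeconds(2))
                    .build();
            HttpRequest request = HttpRequest.newBuilder()
                    .uri(URI.create(baseUrl + "/api/tags"))  // Ollama model-list endpoint
                    .timeout(Duration.ofSeconds(2))
                    .build();
            return client.send(request, HttpResponse.BodyHandlers.discarding())
                    .statusCode() == 200;
        } catch (Exception e) {
            return false;  // unreachable, timed out, or malformed response
        }
    }
}
```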
getChatModel
public org.springframework.ai.ollama.OllamaChatModel getChatModel()