
Koog-sauce is the missing ingredient that connects Koog with other frameworks: it integrates with Spring AI through a specialized client built on Spring AI's ChatClient, and with LangChain4j, supporting prompt execution and response handling through Koog's DSL.

## Features

- `SpringAiLLMClient`, which uses Spring AI's `ChatClient` under the hood.
- `Langchain4jLLMClient`, for seamless integration with LangChain4j, supporting both standard and streaming interactions.

## Installation

Add the dependencies to your `build.gradle.kts` file:
```kotlin
dependencies {
    // Core library
    implementation("me.kpavlov:koog-sauce:[LATEST]")

    // For Spring AI integration
    implementation("me.kpavlov:koog-sauce-spring-ai:[LATEST]")
    implementation("org.springframework.ai:spring-ai-openai:1.0.0")

    // For LangChain4j integration
    implementation("me.kpavlov:koog-sauce-langchain4j:[LATEST]")
    implementation("dev.langchain4j:langchain4j:0.24.0")
    implementation("dev.langchain4j:langchain4j-open-ai:0.24.0")

    // Koog library
    implementation("ai.koog:koog:0.4.0") // or newer
}
```

## Building

```shell
./gradlew build
```

Or using the Makefile:
```shell
make build
```

## Usage

### Spring AI Integration

```kotlin
// Create a Spring AI ChatClient
val chatClient = org.springframework.ai.chat.client.ChatClient.builder(
    org.springframework.ai.openai.OpenAiChatModel
        .builder()
        .openAiApi(
            org.springframework.ai.openai.api.OpenAiApi
                .builder()
                .apiKey("your-api-key")
                .build(),
        ).build(),
).build()

// Create a SpringAiLLMClient
val llmClient = me.kpavlov.koog.sauce.spring.ai.chat.SpringAiLLMClient(chatClient)

// Build a prompt using the Koog DSL
val prompt = ai.koog.prompt.dsl.Prompt.build("myPrompt") {
    system("You are a helpful assistant")
    user("Tell me about Kotlin Multiplatform")
}

// Define the model to use
val model = ai.koog.prompt.llm.LLModel(
    ai.koog.prompt.llm.LLMProvider.OpenAI,
    "gpt-4.1-nano",
    listOf(ai.koog.prompt.llm.LLMCapability.Completion),
    100500,
)

// Execute the prompt
suspend fun executePrompt() {
    val responses = llmClient.execute(prompt, model)

    // Process the response
    val response = responses.first()
    println("Response: ${response.content}")
}
```

See the complete example.
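Both `SpringAiLLMClient` and `Langchain4jLLMClient` plug into Koog through the same execute-a-prompt-against-a-model pattern. The sketch below is hypothetical shorthand for that contract — it is not Koog's real interface, just a plain-Kotlin illustration of how a client maps messages and a model identifier to responses:

```kotlin
// Hypothetical shorthand for the shared client contract — NOT Koog's actual
// API, only an illustration of the execute(prompt, model) pattern.
interface LLMClientSketch {
    fun execute(messages: List<String>, model: String): List<String>
}

// A trivial fake backend standing in for Spring AI / LangChain4j.
class EchoClient : LLMClientSketch {
    override fun execute(messages: List<String>, model: String): List<String> =
        messages.map { "[$model] $it" }
}

fun main() {
    val client: LLMClientSketch = EchoClient()
    val responses = client.execute(listOf("Tell me about Kotlin"), "demo-model")
    // Prints the first response produced by the fake backend
    println(responses.first())
}
```

Because both integrations satisfy the same contract, the rest of your Koog code stays identical regardless of which framework sits underneath.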
### LangChain4j Integration

```kotlin
// Create a LangChain4j ChatModel
val chatModel = dev.langchain4j.model.openai.OpenAiChatModel.builder()
    .apiKey("your-api-key")
    .modelName("gpt-4")
    .build()

// Create a Langchain4jLLMClient
val llmClient = me.kpavlov.koog.sauce.langchain4j.Langchain4jLLMClient(chatModel = chatModel)

// Build a prompt using the Koog DSL
val prompt = ai.koog.prompt.dsl.Prompt.build("myPrompt") {
    system("You are a helpful assistant")
    user("Tell me about LangChain4j")
}

// Define the model to use
val model = ai.koog.prompt.llm.LLModel(
    ai.koog.prompt.llm.LLMProvider.OpenAI,
    "gpt-4",
    listOf(ai.koog.prompt.llm.LLMCapability.Completion),
    100500,
)

// Execute the prompt
suspend fun executePrompt() {
    val responses = llmClient.execute(prompt, model)
    println("Response: ${responses.first().content}")
}
```

See the complete example.
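The features above note that `Langchain4jLLMClient` also supports streaming interactions. The exact Koog streaming API is not shown in this document, so the sketch below only illustrates the consumption pattern, with a plain Kotlin `Sequence` standing in for a stream of response chunks:

```kotlin
// A stand-in for a streaming response: each element is one chunk of the
// model's answer. The real client would emit these asynchronously.
fun streamChunks(): Sequence<String> = sequenceOf("Lang", "Chain", "4j")

fun main() {
    val full = StringBuilder()
    for (chunk in streamChunks()) {
        print(chunk)       // render each chunk as it arrives
        full.append(chunk) // accumulate the complete response
    }
    println()
    println("Full response: $full")
}
```

The same two concerns — incremental rendering and accumulating the final text — apply however the real stream is delivered.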
### Using with a Koog AI Agent

```kotlin
// Create a prompt executor
val promptExecutor = ai.koog.prompt.executor.model.PromptExecutor(
    llmClient = llmClient,
    defaultModel = model,
)

// Create an AI agent using the builder
val agent = me.kpavlov.koog.sauce.agents.core.agent.AIAgent<String, String> {
    this.promptExecutor = promptExecutor
    this.strategy = YourCustomStrategy()   // Implement AIAgentStrategy
    this.agentConfig = YourAgentConfig()   // Implement AIAgentConfigBase
    this.toolRegistry = ToolRegistry.builder()
        .registerTool(YourCustomTool())    // Add your tools
        .build()
}

// Use the agent
suspend fun executeAgent() {
    val result = agent.execute("Tell me about Kotlin")
    println("Agent result: $result")
}
```

These examples demonstrate how to create LLM clients from Spring AI and LangChain4j models, build prompts with the Koog DSL, and assemble a Koog AI agent on top of them.
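The `ToolRegistry` in the agent example dispatches tool invocations by name. The class below is a hypothetical hand-rolled sketch of that idea, not Koog's real `ToolRegistry`, to make the dispatch pattern concrete:

```kotlin
// Hypothetical, minimal tool registry: maps tool names to functions and
// dispatches calls by name. Koog's real ToolRegistry is richer than this.
class ToolRegistrySketch {
    private val tools = mutableMapOf<String, (String) -> String>()

    fun register(name: String, tool: (String) -> String) = apply { tools[name] = tool }

    fun call(name: String, arg: String): String =
        tools[name]?.invoke(arg) ?: error("Unknown tool: $name")
}

fun main() {
    val registry = ToolRegistrySketch()
        .register("uppercase") { it.uppercase() }

    // The agent would pick the tool name from the model's output
    println(registry.call("uppercase", "kotlin"))
}
```

A name-keyed map is the essential mechanism: the model emits a tool name plus arguments, and the registry resolves the name to executable code.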
## Makefile Targets

- `make build`: Build the project
- `make test`: Run tests
- `make clean`: Clean the project
- `make publish`: Publish to Maven Local
- `make doc`: Generate KDoc documentation
- `make help`: Show help message

## Testing

```shell
./gradlew test
```

Or using the Makefile:

```shell
make test
```

The project includes comprehensive tests for both Spring AI and LangChain4j integrations:
- `SpringOpenAiTest` demonstrates how to use and test the Spring AI integration with OpenAI models.
- `Langchain4jLLMClientTest` tests the standard LangChain4j LLM client functionality.
- `StreamingLangchain4jLLMClientTest` tests the streaming capabilities of the LangChain4j LLM client.

These tests serve as additional examples of how to use the integrations in your own projects.
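When writing your own tests against these integrations, a common pattern is to substitute a hand-rolled fake for the live client so tests need no API key. The interface below is hypothetical shorthand, not Koog's actual client type — it only shows the shape of the technique:

```kotlin
// Hypothetical client abstraction for illustration — not Koog's real API.
interface ClientSketch {
    fun execute(userMessage: String): String
}

// A fake that returns a canned answer instead of calling a live model.
class FakeClient(private val canned: String) : ClientSketch {
    override fun execute(userMessage: String) = canned
}

// Production code under test depends only on the abstraction.
fun answerLength(client: ClientSketch, question: String): Int =
    client.execute(question).length

fun main() {
    val fake = FakeClient("four")
    check(answerLength(fake, "How long is the answer?") == 4)
    println("ok")
}
```

The bundled tests follow the same principle at a larger scale: code that depends on the client abstraction can be exercised deterministically.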
## Documentation

```shell
./gradlew dokkaGeneratePublicationHtml
```

Or using the Makefile:

```shell
make doc
```

The documentation is automatically generated and published to GitHub Pages when changes are pushed to the main branch. You can access the latest documentation at:

https://kpavlov.github.io/koog-sauce/
## License

This project is licensed under the MIT License - see the LICENSE file for details.