
An unofficial Kotlin client SDK that lets you easily interact with the Ollama API, enabling chat and other functionality through OpenAPI-defined requests and responses, with customizable components for specific needs.
The supported API follows the Ollama OpenAPI specification.
Add the dependency to your Gradle configuration:

```kotlin
implementation("org.nirmato.ollama:nirmato-ollama-client-ktor:0.2.0")

// example using the Ktor CIO engine
implementation("io.ktor:ktor-client-cio:3.1.3")
```

or to your Maven pom:
```xml
<dependency>
    <groupId>org.nirmato.ollama</groupId>
    <artifactId>nirmato-ollama-client-ktor</artifactId>
    <version>0.2.0</version>
</dependency>

<!-- example using the Ktor CIO engine -->
<dependency>
    <groupId>io.ktor</groupId>
    <artifactId>ktor-client-cio</artifactId>
    <version>3.1.3</version>
</dependency>
```

The OllamaClient class contains all the methods needed to interact with the Ollama API. An example of calling Ollama:
```kotlin
val ollamaClient = OllamaClient(CIO) {
    httpClient {
        // Ktor HttpClient configuration
        defaultRequest {
            url("http://localhost:11434/api/")
        }
    }
}

val request = chatRequest {
    model("tinyllama")
    messages(listOf(Message(role = USER, content = "Why is the sky blue?")))
}

val response = ollamaClient.chat(request)
```

See the samples directory for complete examples.
The OllamaApi interface provides the following methods:

- `chat(chatRequest)`: Send a chat request and get a response.
- `chatStream(chatRequest)`: Stream the chat response.
- `generateEmbed(generateEmbedRequest)`: Generate embeddings for a given input.
- `createModel(createModelRequest)`: Create a new model.
- `pullModel(pullModelRequest)`: Pull a model from the registry.
- `pushModel(pushModelRequest)`: Push a model to the registry.
- `listModels()`: List all available models.

For a complete list of methods, please refer to the OllamaApi interface.
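As a rough illustration of how the streaming variant might be used, here is a hypothetical sketch. The exact shape of `chatStream`'s return value (assumed here to be a kotlinx.coroutines `Flow` of partial responses) and the `message`/`content` fields on each chunk are assumptions, not confirmed by this README; consult the OllamaApi interface for the actual signatures.

```kotlin
import kotlinx.coroutines.runBlocking

fun main() = runBlocking {
    val ollamaClient = OllamaClient(CIO) {
        httpClient {
            defaultRequest { url("http://localhost:11434/api/") }
        }
    }

    val request = chatRequest {
        model("tinyllama")
        messages(listOf(Message(role = USER, content = "Why is the sky blue?")))
    }

    // Assumed: chatStream returns a Flow of incremental chat responses,
    // so each collected chunk carries a partial message to print as it arrives.
    ollamaClient.chatStream(request).collect { chunk ->
        print(chunk.message?.content ?: "")
    }
}
```

Streaming is preferable for interactive use, where printing tokens as they arrive avoids waiting for the full completion that `chat` returns.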
Contributions are welcome! Please feel free to submit a Pull Request.
The source code is distributed under the Apache License 2.0.