```kotlin
implementation("community.flock.aigentic:ollama:0.8.0")
```
Integrates Ollama's locally hosted large language models into applications, so AI functionality runs entirely on your own machine, without any cloud dependency.
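Under the hood, requests to a local model go to the Ollama daemon's REST API, which by default listens on `localhost:11434`. As a rough illustration of what "no cloud dependency" means, here is a minimal sketch that talks to that API directly with Java's built-in HTTP client — note this bypasses the Aigentic abstraction entirely, and the model name `llama3` is just a placeholder for whichever model you have pulled locally:

```kotlin
import java.net.URI
import java.net.http.HttpClient
import java.net.http.HttpRequest
import java.net.http.HttpResponse

// Sketch only, not the Aigentic API: builds a request against Ollama's
// local /api/generate endpoint (default daemon address localhost:11434).
fun buildGenerateRequest(model: String, prompt: String): HttpRequest {
    // "stream": false asks Ollama for a single JSON response instead of chunks.
    val body = """{"model":"$model","prompt":"$prompt","stream":false}"""
    return HttpRequest.newBuilder()
        .uri(URI.create("http://localhost:11434/api/generate"))
        .header("Content-Type", "application/json")
        .POST(HttpRequest.BodyPublishers.ofString(body))
        .build()
}

fun main() {
    // "llama3" is a placeholder; use any model pulled via `ollama pull`.
    val request = buildGenerateRequest("llama3", "Why is the sky blue?")
    println(request.uri())
    // With a running Ollama daemon, send it like this:
    // val response = HttpClient.newHttpClient()
    //     .send(request, HttpResponse.BodyHandlers.ofString())
    // println(response.body())
}
```

In practice the Aigentic module handles this transport for you; the point of the sketch is only that every call stays on localhost.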