```kotlin
implementation("community.flock.aigentic:ollama:0.5.0")
```
Integrates with Ollama, enabling local execution of large language models with no cloud dependency.
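Using this module presupposes a locally running Ollama server (Ollama's HTTP API listens on `http://localhost:11434` by default). A minimal setup sketch, assuming the Ollama CLI is installed; the model name `llama3` is only an example, any model from the Ollama library works:

```shell
# Download an example model from the Ollama library
ollama pull llama3

# Start the local Ollama server (exposes the HTTP API on http://localhost:11434
# by default; on desktop installs the server may already be running)
ollama serve
```

With the server up, the `ollama` module can route requests to the locally hosted model instead of a cloud provider.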