implementation("community.flock.aigentic:ollama:0.9.1")
Runs large language models locally through Ollama: data stays on your machine (privacy), inference works without a network connection, and any model Ollama can serve is supported.
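A minimal sketch of where this dependency sits in a Gradle Kotlin DSL build. The core-module artifact name is an assumption; check the Aigentic documentation for the current coordinates and version.

```kotlin
// build.gradle.kts — sketch only; artifact coordinates other than the
// Ollama module shown above are assumptions.
repositories {
    mavenCentral()
}

dependencies {
    // Core agent DSL (artifact name assumed)
    implementation("community.flock.aigentic:core:0.9.1")
    // Ollama provider for local model execution
    implementation("community.flock.aigentic:ollama:0.9.1")
}
```

The Ollama module talks to a locally running Ollama server, so models must be pulled (e.g. `ollama pull`) before the agent can use them.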