Presentation at the 2024 SouthEast LinuxFest about using Ollama to run large language models locally, discussing the benefits and challenges of cloud-based LLM access.