Has anyone gotten this to work with LM Studio, Ollama, or any of the other production-quality local LLM servers? If you have it working and it is available as a Free and Open Source Software (FOSS) package, please reply to this thread. Thank you.

Rich Lysakowski, Ph.D.
AI-Local Activator for Off-Cloud Computing
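
P.S. For anyone experimenting, here is a minimal sketch of what "working with" a local server typically looks like. Both LM Studio and Ollama expose OpenAI-compatible HTTP endpoints (LM Studio on http://localhost:1234/v1 by default, Ollama on http://localhost:11434/v1), so a package that lets you override the API base URL can usually be pointed at either one. The base URL, the placeholder API key, the model name ("llama3.1"), and the use of the openai Python client are my assumptions for illustration; I cannot confirm that the specific package discussed in this thread supports a custom base URL.

```python
# Minimal sketch: point an OpenAI-compatible client at a local LLM server.
# Assumptions: default ports (Ollama: 11434, LM Studio: 1234) and a model
# ("llama3.1") that has already been pulled/loaded on the local server.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:11434/v1",  # Ollama's OpenAI-compatible endpoint
    # For LM Studio, use base_url="http://localhost:1234/v1" instead.
    api_key="not-needed-locally",  # local servers ignore the key, but the client requires a value
)

response = client.chat.completions.create(
    model="llama3.1",  # replace with whatever model your local server is actually serving
    messages=[{"role": "user", "content": "Say hello from a locally hosted model."}],
)
print(response.choices[0].message.content)
```

If that round-trip works from the command line, the remaining question is only whether the package in question lets you configure the base URL and model name rather than hard-coding a cloud endpoint.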