Nvidia on NixOS WSL and Ollama Performance

The discussion centers on running Ollama (a tool for managing and serving local AI models) with Nvidia GPU support on NixOS inside the Windows Subsystem for Linux (WSL). Some users question the practicality of this setup given that Ollama runs well natively on Windows, while others share their experiences and preferences. One user notes that they prefer NixOS at work for consistency with their laptop, despite the poor performance of accessing the Windows file system from WSL. They point out the advantages of Ollama's Windows installer, particularly for users with Nvidia GPUs. The conversation also touches on the need for better AMD GPU support on Windows and the challenges of using older hardware for machine learning tasks. Overall, the thread illustrates the trade-offs of running AI workloads across different operating systems, with varied perspectives on productivity and performance.
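For readers who want to try the setup described in the thread, a minimal sketch of a NixOS-on-WSL configuration enabling Ollama with CUDA acceleration might look like the following. This assumes the NixOS-WSL module is imported and that the `services.ollama` module from nixpkgs is available; option names and values should be checked against your nixpkgs revision.

```nix
# configuration.nix (sketch, assuming the NixOS-WSL module is imported)
{ config, pkgs, ... }:

{
  # Enable the WSL integration provided by the NixOS-WSL project.
  wsl.enable = true;

  # Run Ollama as a system service with Nvidia CUDA acceleration.
  # "cuda" tells the module to use the CUDA-enabled ollama package.
  services.ollama = {
    enable = true;
    acceleration = "cuda";
  };
}
```

One practical note from the thread: keeping models and working data on the Linux file system (e.g. under the WSL home directory) rather than on a mounted Windows drive avoids the cross-filesystem performance penalty the user mentions.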