Learn how to run open large language models such as Gemma, Llama, or DeepSeek locally to perform AI inference on consumer hardware.