Local LLMs via Ollama & LM Studio - The Practical Guide
Introduction
Welcome To The Course! (2:06)
What Exactly Are "Open LLMs"? (6:27)
Why Would You Want To Run Open LLMs Locally? (6:52)
Popular Open LLMs - Some Examples (3:43)
Where To Find Open LLMs? (4:47)
Running LLMs Locally - Available Options (7:17)
Check The Model Licenses! (4:04)
Understanding Hardware Requirements & Quantization
Module Introduction (1:20)
LLM Hardware Requirements - First Steps (4:21)
Deriving Hardware Requirements From Model Parameters (5:34)
Quantization To The Rescue! (6:50)
Does It Run On Your Machine? (5:50)
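The lectures above derive memory needs from parameter counts and show how quantization shrinks them. A minimal sketch of that back-of-the-envelope math (weights only; the real footprint is higher because of the KV cache and runtime overhead):

```python
# Rough VRAM/RAM estimate for an LLM's weights at a given precision.
# Illustrative sketch only: actual memory use also includes the KV cache
# and runtime overhead, so treat these figures as lower bounds.

def model_memory_gb(params_billion: float, bits_per_weight: float) -> float:
    """Approximate memory needed just to hold the model weights."""
    bytes_total = params_billion * 1e9 * bits_per_weight / 8
    return bytes_total / 1e9  # decimal gigabytes

# A 7B-parameter model at different precisions:
for label, bits in [("FP16", 16), ("8-bit (Q8)", 8), ("4-bit (Q4)", 4)]:
    print(f"{label}: ~{model_memory_gb(7, bits):.1f} GB")
```

This is why a 7B model that needs roughly 14 GB in FP16 can fit on consumer hardware once quantized to 4 bits (about 3.5 GB for the weights).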
LM Studio Deep Dive
Module Introduction (2:03)
Running Locally vs Remotely (1:08)
Installing & Using LM Studio (3:09)
Finding, Downloading & Activating Open LLMs (9:04)
Using the LM Studio Chat Interface (4:53)
Working with System Prompts & Presets (3:26)
Managing Chats (2:32)
Power User Features For Managing Models & Chats (6:28)
Leveraging Multimodal Models & Extracting Content From Images (OCR) (2:48)
Analyzing & Summarizing PDF Documents (3:27)
Onwards To More Advanced Settings (1:52)
Understanding Temperature, top_k & top_p (6:32)
Controlling Temperature, top_k & top_p in LM Studio (4:45)
Managing the Underlying Runtime & Hardware Configuration (4:17)
Managing Context Length (5:21)
Using Flash Attention (5:08)
Working With Structured Outputs (5:29)
Using Local LLMs For Code Generation (2:35)
Content Generation & Few Shot Prompting (Prompt Engineering) (5:21)
Onwards To Programmatic Use (2:25)
LM Studio & Its OpenAI Compatibility (6:00)
More Code Examples! (5:04)
Diving Deeper Into The LM Studio APIs (2:10)
Using the Python / JavaScript SDKs
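LM Studio's local server exposes an OpenAI-compatible REST API (by default at http://localhost:1234/v1), which the programmatic-use lectures build on. A minimal stdlib-only sketch of a chat completion request; the model name is an assumption, so substitute whatever model you have loaded:

```python
# Sketch: calling LM Studio's local server via its OpenAI-compatible API.
# Default server address is http://localhost:1234/v1; the model name
# below is a placeholder -- use the model you actually have loaded.
import json
import urllib.request

BASE_URL = "http://localhost:1234/v1"

def build_chat_request(model: str, user_message: str) -> dict:
    """Assemble an OpenAI-style chat completion payload."""
    return {
        "model": model,
        "messages": [
            {"role": "system", "content": "You are a helpful assistant."},
            {"role": "user", "content": user_message},
        ],
        "temperature": 0.7,
    }

def chat(model: str, user_message: str) -> str:
    """POST the payload and return the assistant's reply text."""
    payload = build_chat_request(model, user_message)
    req = urllib.request.Request(
        f"{BASE_URL}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]

# Example (requires LM Studio's server running with a model loaded):
# print(chat("llama-3.2-1b-instruct", "Say hello in one sentence."))
```

Because the endpoint mirrors OpenAI's schema, the official OpenAI SDKs also work against it by pointing their base URL at the local server.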
Ollama Deep Dive
Module Introduction (1:41)
Installing & Starting Ollama (2:08)
Finding Usable Open Models (2:56)
Running Open LLMs Locally via Ollama (7:43)
Adding a GUI with Open WebUI (2:12)
Dealing with Multiline Messages & Image Input (Multimodality) (2:38)
Inspecting Models & Extracting Model Information (3:31)
Editing System Messages & Model Parameters (6:01)
Saving & Loading Sessions and Models (3:35)
Managing Models (5:42)
Creating Model Blueprints via Modelfiles (6:22)
Creating Models From Modelfiles (3:26)
Making Sense of Model Templates (6:39)
Building a Model From Scratch From a GGUF File (6:37)
Getting Started with the Ollama Server (API) (2:12)
Exploring the Ollama API & Programmatic Model Access (5:18)
Getting Structured Output (2:56)
More Code Examples! (4:53)
Using the Python / JavaScript SDKs
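The Ollama server (by default at http://localhost:11434) accepts a `"format": "json"` option to constrain replies to valid JSON, which is the basis of the structured-output lecture. A stdlib-only sketch; the model name is an assumption, so substitute one you have pulled:

```python
# Sketch: requesting structured (JSON) output from the Ollama server.
# Default address is http://localhost:11434; setting "format": "json"
# constrains the model's reply to valid JSON. The model name is a
# placeholder -- substitute a model you have pulled with `ollama pull`.
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/chat"

def build_structured_request(model: str, prompt: str) -> dict:
    """Assemble an Ollama chat payload that asks for JSON output."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "format": "json",   # ask Ollama for valid JSON output
        "stream": False,    # return one complete response object
    }

def ask_json(model: str, prompt: str) -> dict:
    """POST the payload and parse the model's JSON reply."""
    payload = build_structured_request(model, prompt)
    req = urllib.request.Request(
        OLLAMA_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    # The model's JSON text lives in message.content; parse it to a dict.
    return json.loads(body["message"]["content"])

# Example (requires `ollama serve` running and the model pulled):
# info = ask_json("llama3.2", "Return any country's name and capital as JSON.")
```

It usually helps to also state the desired JSON shape in the prompt itself, since the format flag guarantees valid JSON but not a particular schema.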
Course Roundup
Roundup (1:44)