LLM Workshop - SWIB 2025

Learn to run powerful AI models on your own hardware

Welcome to the Local LLM Workshop! This guide teaches you how to set up and use large language models (LLMs) locally on your personal computer, so you can run AI models without relying on cloud services, keep your data private, and avoid subscription costs.

Why Local LLMs?

  • Privacy First - Your data never leaves your computer
  • No Ongoing Costs - Free after initial setup
  • Always Available - Works offline once models are downloaded
  • Full Control - Customize everything to your needs

Workshop Overview

This workshop is designed for both newcomers and those with prior exposure to LLMs. Through hands-on activities, you’ll learn:

Part 1: Fundamentals (2 hours)

  • Understanding LLM architecture and concepts
  • Setting up Ollama with its native GUI
  • Choosing and running open-source models
  • Writing effective prompts (see the sketch just after this list)
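
Part 1 itself works through Ollama's graphical app and command line rather than code, but the short sketch below previews the kind of request that happens behind the scenes. It is only an illustration, assuming Ollama is running on its default port (11434) and that a small model such as llama3.2 has already been pulled; swap in whichever model you actually downloaded.

```python
import requests

# Ask a locally running Ollama server to complete a prompt.
# Assumes Ollama listens on its default port and that the model named
# below has already been pulled (for example with `ollama pull llama3.2`).
OLLAMA_URL = "http://localhost:11434/api/generate"
MODEL = "llama3.2"  # assumption: replace with a model you have installed

response = requests.post(
    OLLAMA_URL,
    json={
        "model": MODEL,
        "prompt": "Explain in two sentences what a large language model is.",
        "stream": False,  # return one JSON object instead of a token stream
    },
    timeout=120,
)
response.raise_for_status()
print(response.json()["response"])
```

The same prompt could just as well be typed into the GUI or into `ollama run`; the point is that the model answers from your own machine over a plain HTTP endpoint.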

Part 2: Intermediate (2 hours)

  • Automating LLM interactions with Python
  • Building semantic search with vector embeddings
  • Creating RAG (Retrieval-Augmented Generation) systems (sketched in the example after this list)
  • Integrating external data sources
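
To give a feel for where Part 2 is headed, here is a compressed sketch of semantic search feeding a RAG answer. It is not the workshop notebook: it assumes Ollama's HTTP API on the default port, an embedding model called nomic-embed-text and llama3.2 for generation (the model names are assumptions; use whatever you have pulled), and it keeps the "vector store" as a plain Python list.

```python
import requests

OLLAMA = "http://localhost:11434"
EMBED_MODEL = "nomic-embed-text"  # assumption: any Ollama embedding model works here
CHAT_MODEL = "llama3.2"           # assumption: replace with your generation model


def embed(text):
    """Turn a piece of text into a vector via Ollama's embeddings endpoint."""
    r = requests.post(f"{OLLAMA}/api/embeddings",
                      json={"model": EMBED_MODEL, "prompt": text}, timeout=60)
    r.raise_for_status()
    return r.json()["embedding"]


def cosine(a, b):
    """Cosine similarity between two vectors, without extra libraries."""
    dot = sum(x * y for x, y in zip(a, b))
    norms = (sum(x * x for x in a) ** 0.5) * (sum(y * y for y in b) ** 0.5)
    return dot / norms


# A toy document store; in the workshop this comes from your own data sources.
documents = [
    "SWIB is a conference on semantic web technologies in libraries.",
    "Ollama runs open-source language models on local hardware.",
    "RAG combines retrieval over your own documents with text generation.",
]
doc_vectors = [embed(d) for d in documents]

# Retrieve the most similar document, then answer the question with it as context.
question = "What does RAG stand for and what does it do?"
q_vec = embed(question)
best_doc = max(zip(documents, doc_vectors), key=lambda dv: cosine(q_vec, dv[1]))[0]

prompt = (
    "Answer the question using only this context.\n\n"
    f"Context: {best_doc}\n\nQuestion: {question}"
)
answer = requests.post(f"{OLLAMA}/api/generate",
                       json={"model": CHAT_MODEL, "prompt": prompt, "stream": False},
                       timeout=120)
answer.raise_for_status()
print(answer.json()["response"])
```

In the workshop, the toy list is replaced by real data and a proper vector store, but the retrieve-then-generate loop stays the same.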

Prerequisites

Before attending, ensure you have:

  • Hardware: 8 GB RAM minimum, 20 GB free storage
  • OS: Windows 11, macOS 12+, or Linux
  • Internet: For downloading models
  • Python 3.8+: For Part 2 activities (a quick self-check script follows this list)
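
If you want to confirm your machine is ready before attending, a short self-check along these lines can help. It is a suggestion rather than part of the official setup: it only verifies the Python version and whether an Ollama server answers on its default port.

```python
import sys
import urllib.request

# Part 2 needs Python 3.8 or newer.
assert sys.version_info >= (3, 8), f"Python 3.8+ required, found {sys.version.split()[0]}"
print("Python version OK:", sys.version.split()[0])

# Check whether a local Ollama server responds on its default port.
try:
    with urllib.request.urlopen("http://localhost:11434/api/tags", timeout=5) as resp:
        listing = resp.read().decode()
    print("Ollama is running. Installed models (raw listing):", listing[:200])
except Exception as exc:
    print("Ollama does not seem to be running yet:", exc)
```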

View detailed setup instructions →

Ready to begin? Start with the current workshop →