Zero to Hero in Ollama: Create Local LLM Applications

Instructor:
Start-Tech Academy

Are you looking to build and run customized large language models (LLMs) right on your own system, without depending on cloud solutions? Do you want to maintain privacy while leveraging powerful models similar to ChatGPT? If you’re a developer, data scientist, or an AI enthusiast wanting to create local LLM applications, this course is for you!

This hands-on course will take you from beginner to expert in using Ollama, a platform designed for running local LLM models. You’ll learn how to set up and customize models, create a ChatGPT-like interface, and build private applications using Python—all from the comfort of your own system.


What you’ll learn
  • Install and configure Ollama on your local system to run large language models privately.
  • Customize LLM models to suit specific needs using Ollama’s options and command-line tools.
  • Execute all terminal commands necessary to control, monitor, and troubleshoot Ollama models.
  • Set up and manage a ChatGPT-like interface using Open WebUI, allowing you to interact with models locally.
  • Deploy Docker and Open WebUI for running, customizing, and sharing LLM models in a private environment.
  • Utilize different model types, including text, vision, and code-generating models, for various applications.
  • Create custom LLM models from a gguf file and integrate them into your applications.
  • Build Python applications that interface with Ollama models using its native library and OpenAI API compatibility.
  • Develop a RAG (Retrieval-Augmented Generation) application by integrating Ollama models with LangChain.
  • Implement tools and agents to enhance model interactions in both Open WebUI and LangChain environments for advanced workflows.
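As a taste of the Python integration covered above, here is a minimal sketch of talking to a locally running Ollama model through its OpenAI-compatible HTTP endpoint (served at `http://localhost:11434/v1` by default). The model name `llama3` and the prompt are placeholder assumptions for illustration; any model you have pulled locally would work.

```python
import json
import urllib.request

# Default OpenAI-compatible endpoint exposed by a local `ollama serve` instance.
OLLAMA_URL = "http://localhost:11434/v1/chat/completions"


def build_chat_request(model: str, prompt: str) -> dict:
    """Assemble the JSON body the OpenAI-compatible chat API expects."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }


def chat(model: str, prompt: str) -> str:
    """Send a single-turn chat request to the local Ollama server
    and return the assistant's reply text."""
    body = json.dumps(build_chat_request(model, prompt)).encode()
    req = urllib.request.Request(
        OLLAMA_URL,
        data=body,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        data = json.load(resp)
    return data["choices"][0]["message"]["content"]


if __name__ == "__main__":
    # Requires `ollama serve` running and the model pulled,
    # e.g. `ollama pull llama3` (model name is an assumption).
    print(chat("llama3", "In one sentence, what is Ollama?"))
```

Because the endpoint mirrors OpenAI's API shape, the same request works with the official `openai` Python client by pointing its `base_url` at the local server, which is the compatibility path the course builds on.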
Course content

8 sections • 28 lectures • Total duration 3 hours 9 minutes

Introduction
Open WebUI - a ChatGPT-like interface for Ollama models
Types of Ollama Models and their capabilities
Using Ollama with Python
Using Ollama with LangChain in Python
Creating a RAG application using Ollama and LangChain
Using Tools and Agents with Ollama models
Conclusion