Experiment with an AI application locally using Podman AI Lab

Prerequisites

This lab requires Podman Desktop with the Podman AI Lab extension enabled.

Introduction

In this lab, you will experiment with AI applications using Podman AI Lab. You will explore its tools and integrations, including Llama Stack.

Scenario

You are part of the Python AI Development team. Your responsibility is to experiment with AI applications using Podman AI Lab before building an AI agent in an enterprise environment.

Explore Podman

  • Open Podman Desktop.

  • Click Podman AI Lab. Your screen should look similar to this:

    podman screen

Set up and explore AI tools

Select an AI Model

  • Click Models → Catalog

podman model catalog

  • Select and download the model: ibm-granite/granite-3.3-8b-instruct-GGUF

podman download model

  • Once finished, check the Downloads tab.

model downloaded

Now you can create a service to allow applications to consume the model easily.

Create a model service

Podman AI Lab allows you to create model services and playgrounds to build AI applications. The model service is used for inference, allowing AI applications to consume it via HTTP.

podman serving playgrounds

  • Click Models → Services

  • Click New Model Service

    new model service

  • Review the information, then click Create Service

    create model service

  • When the model service is ready, click the start icon.

    model service start

  • The service is now started and ready to be consumed:

    model service started
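
Now that the service is running, any HTTP client can consume it. The following is a minimal sketch in Python, assuming the service exposes an OpenAI-compatible chat completions endpoint on localhost and that its port is 46000; replace the port and model name with the values shown on your service details page.

    # Minimal sketch: query the local model service over HTTP.
    # Assumption: the service listens on localhost:46000 (use the port shown
    # on the service details page) and exposes an OpenAI-compatible
    # /v1/chat/completions endpoint.
    import requests

    PORT = 46000  # placeholder: replace with the port from the service details

    response = requests.post(
        f"http://localhost:{PORT}/v1/chat/completions",
        json={
            "model": "ibm-granite/granite-3.3-8b-instruct-GGUF",
            "messages": [{"role": "user", "content": "What is an AI agent?"}],
        },
        timeout=120,
    )
    response.raise_for_status()
    print(response.json()["choices"][0]["message"]["content"])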

Explore Llama Stack

  • Select Llama Stack from Podman AI Lab

    podman llamastack

  • Select Start Llama Stack container:

    llama stack start

  • Llama Stack will begin building the container. Once finished, all steps will appear in green.

    llama stack running container

  • Click Explore Llama-Stack Environment

    Podman LLama Stack Explore

  • Explore the Llama Stack UI and enter the question "What is an AI agent?" in the chat box (a programmatic equivalent is sketched after this step):

LLama Stack UI
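
You can also query the Llama Stack server from code instead of the UI. The sketch below uses the llama-stack-client Python package and assumes the container exposes the API on localhost:8321 (Llama Stack's default port; check the container details for the actual value). Depending on your client version, the inference call may instead be exposed through an OpenAI-compatible interface.

    # Minimal sketch: chat through the Llama Stack API from Python.
    # Assumptions: `pip install llama-stack-client`, and the Llama Stack
    # container started by Podman AI Lab listens on localhost:8321.
    from llama_stack_client import LlamaStackClient

    client = LlamaStackClient(base_url="http://localhost:8321")

    # Pick the first registered LLM (typically the Granite model downloaded earlier).
    model_id = next(
        m.identifier for m in client.models.list() if m.model_type == "llm"
    )

    response = client.inference.chat_completion(
        model_id=model_id,
        messages=[{"role": "user", "content": "What is an AI agent?"}],
    )
    print(response.completion_message.content)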

Use the Podman AI Lab recipe to build a chatbot

Podman AI Lab provides many recipes you can use as a starting point to build your own applications, explore AI tools, or learn about AI Lab.

anatomy recipe

  • Click Recipe Catalog.

recipe click

  • Explore the different recipes available in the Recipe Catalog, then select ChatBot using Llama Stack by clicking More Details

podman recipe list

Take time to explore the recipe.

  • Click the start icon.

chatbot start

  • Click Start chatbot recipe to build the chatbot.

start recipe

  • The process will take a few seconds:

podman recipe starting

  • Once the chatbot is ready, click Open Details

chatbot ready

  • To explore the chatbot, click the Open AI App icon.

open chatbot

  • Next, explore and test the chatbot.

chatbot running

  • Congratulations, you have built an AI chatbot integrated with Llama Stack using Podman AI Lab.

  • Next, review the source code (a simplified sketch of how such a chatbot is wired together follows these steps).

    • In Podman AI Lab, click AI Apps → Running

ailab running

  • Then, click Open Recipe

chatbot recipe

  • Review the Summary section.

ailab chatbot summary

  • In the Repository section, click containers/ai-lab-recipe

open repository

  • When prompted, confirm that you want to open the external website.

open external website

  • The GitHub repository includes all the recipes displayed in Podman AI Lab: ai-lab-recipes.

ailab recipes
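
To connect the recipe source with the app you just ran, here is a highly simplified sketch of how such a chatbot can be wired together: a Streamlit front end that forwards each user message to the Llama Stack server and renders the reply. It is illustrative only and is not the recipe's actual code; the real implementation lives in the ai-lab-recipes repository. It reuses the assumptions from the earlier sketches (llama-stack-client installed, server on localhost:8321) and additionally requires streamlit.

    # chatbot_sketch.py -- illustrative only, not the recipe's actual code.
    # Run with: streamlit run chatbot_sketch.py
    import streamlit as st
    from llama_stack_client import LlamaStackClient

    client = LlamaStackClient(base_url="http://localhost:8321")
    model_id = next(
        m.identifier for m in client.models.list() if m.model_type == "llm"
    )

    st.title("Chatbot (Llama Stack sketch)")

    # Keep the conversation in the session so it survives Streamlit reruns.
    if "messages" not in st.session_state:
        st.session_state.messages = []

    # Replay the conversation so far.
    for message in st.session_state.messages:
        with st.chat_message(message["role"]):
            st.markdown(message["content"])

    # Handle a new user message (single-turn request for simplicity;
    # a fuller implementation would pass the whole conversation history).
    if prompt := st.chat_input("Ask something"):
        st.session_state.messages.append({"role": "user", "content": prompt})
        with st.chat_message("user"):
            st.markdown(prompt)

        response = client.inference.chat_completion(
            model_id=model_id,
            messages=[{"role": "user", "content": prompt}],
        )
        reply = response.completion_message.content
        st.session_state.messages.append({"role": "assistant", "content": reply})
        with st.chat_message("assistant"):
            st.markdown(reply)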

Conclusion

Podman AI Lab is a great resource for experimenting with AI applications, from learning from recipes to testing locally and trying out different models.

Next

Next, you will learn how to build AI applications, such as an AI agent, in an enterprise environment. That environment must follow best practices, starting with security, and adhere to organizational guidelines on OpenShift.