Agentic AI Hands-On Lab

Agentic AI Hands-On Lab for the AI xpress Techtalk on June 25th 2025 in Böblingen.

For the older version from Summit Connect Darmstadt 2024, see here.

This Agentic AI Hands-On Lab offers a practical introduction to building autonomous AI agents using the Agno framework and Ollama. Agno simplifies the creation, deployment, and monitoring of AI agents by integrating memory, knowledge, and tools into large language models (LLMs). Ollama complements this by allowing everyone to run LLMs locally, ensuring data privacy and reducing latency. Together, they provide a robust environment for quickly developing and experimenting with Agentic AI applications using local models.
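To make the idea concrete, here is a minimal sketch of an Agno agent backed by a local Ollama model. It assumes Agno and Ollama are already installed and that a model (here llama3.2, an arbitrary choice for illustration) has been pulled locally; the notebooks in this repository define the actual agents used in the lab.

```python
from agno.agent import Agent
from agno.models.ollama import Ollama

# A minimal agent backed by a locally served Ollama model.
# "llama3.2" is just an example id; pull whichever model the notebooks use.
agent = Agent(
    model=Ollama(id="llama3.2"),
    markdown=True,
)

agent.print_response("Explain in one sentence what an agentic AI system is.")
```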

Aim of this hands-on lab

You'll learn the basic concepts and capabilities of using local models for agentic AI. Agentic AI represents a significant advancement, combining the versatility of LLMs with the precision of traditional programming. This fusion allows AI systems to autonomously perform tasks, make decisions, and interact with external environments, thereby enhancing their utility across various applications. By participating in this hands-on experience, you will gain valuable insights into the practical aspects of Agentic AI, positioning yourself at the forefront of this transformative technology.

Things to know

Most of the examples are adapted from the Agno cookbook with a few modifications (e.g., context size and temperature) to make them run better with local models and Ollama; a sketch of what such a tweak looks like follows below. For information on how to deploy Ollama on OpenShift/Kubernetes, see our previous LLM app dev hands-on lab from 2023.
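Tweaks of that kind typically end up as runtime options on the Ollama model object, roughly as in the sketch below. The option names follow Ollama's num_ctx and temperature runtime options, but the specific values and the model id are assumptions for illustration; the notebooks contain the settings actually used.

```python
from agno.agent import Agent
from agno.models.ollama import Ollama

# Sketch: pass Ollama runtime options such as a larger context window and a
# lower temperature so the cookbook examples behave better on local models.
# The exact values here are illustrative, not the lab's prescribed settings.
agent = Agent(
    model=Ollama(
        id="llama3.2",
        options={"num_ctx": 8192, "temperature": 0.2},
    ),
)

agent.print_response("Summarize why a larger context window helps tool-using agents.")
```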

Installation

Install uv:

curl -LsSf https://astral.sh/uv/install.sh | sh

Install required packages:

uv sync

Run jupyter lab with:

uv run --with jupyter jupyter lab

Examples

To run the examples, you will need to install Ollama and download the appropriate models.

The Gemini examples can be run by creating a free API key and setting the GOOGLE_API_KEY environment variable:

export GOOGLE_API_KEY=AI...
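Once the variable is set, a Gemini-backed agent can be built the same way as the Ollama ones. The sketch below is illustrative; the model id gemini-2.0-flash is an assumption, so check the notebooks for the id they actually use.

```python
import os

from agno.agent import Agent
from agno.models.google import Gemini

# The Gemini model picks up GOOGLE_API_KEY from the environment.
assert os.environ.get("GOOGLE_API_KEY"), "Set GOOGLE_API_KEY before running."

# "gemini-2.0-flash" is an example id, not necessarily the one the notebooks use.
agent = Agent(model=Gemini(id="gemini-2.0-flash"), markdown=True)

agent.print_response("Say hello from Gemini.")
```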
