TL;DR Learn how to build your own AI agents and interact with them via easy-to-use web UIs.
This documentation site contains various tutorials and resources for learning how to build AI agents with LangChain and LangGraph, as well as Gradio web UIs to facilitate interactions. It also covers how to build local servers in Docker to power agents and their tools: an Ollama server for LMs, a SearXNG server for a metasearch engine tool, and a Milvus server for a vectorstore.
For more details about building agents, along with other easily digestible modules, check it out here.
There are various directions you can take in navigating these tutorials, as each one can be its own standalone lesson.
- Learn how to build local servers to power both the tools for your agents and their decision-making and response-generating processes:
    - An Ollama server to host the LMs your agents need to make decisions and generate information
    - A SearXNG server to host a metasearch engine your agents can use as a tool to search the web
    - A Milvus server to host a vectorstore your agents can use as a tool to retrieve information from your personal docs
    - A multi-server stack that combines all the servers you need in one place
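Once the servers are running, a quick reachability check can confirm everything is up before wiring agents to them. The sketch below uses only the Python standard library; the ports are common defaults (Ollama 11434, SearXNG 8080, Milvus's health endpoint 9091) and are assumptions — adjust them to match your own Docker setup.

```python
"""Quick reachability check for the local servers the tutorials rely on.

The URLs below assume common default ports; change them to match your
own Docker configuration.
"""
from urllib.request import urlopen
from urllib.error import URLError


def server_up(url: str, timeout: float = 2.0) -> bool:
    """Return True if an HTTP request to `url` gets any response."""
    try:
        with urlopen(url, timeout=timeout):
            return True
    except (URLError, OSError, ValueError):
        # Connection refused, timeout, or malformed URL: treat as down.
        return False


if __name__ == "__main__":
    servers = {
        "Ollama": "http://localhost:11434",         # assumed default port
        "SearXNG": "http://localhost:8080",         # assumed default port
        "Milvus": "http://localhost:9091/healthz",  # assumed health endpoint
    }
    for name, url in servers.items():
        print(f"{name}: {'up' if server_up(url) else 'down'}")
```

A check like this is handy at the top of a tutorial notebook, so a missing server fails fast with a clear message instead of a confusing traceback later on.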
- Learn how to build chatbots and specialized agents, as well as easy-to-use web UIs that make interactions far more intuitive:
    - A simple chatbot without any memory or tools
    - An agent that can remember past conversation history
    - A document agent that can retrieve information from Markdown files
    - A code agent that can retrieve information from both Markdown and Python files and is specialized for coding tasks
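The memory-enabled chatbots build on a simple idea: keep a rolling window of recent messages and hand it to the model on every turn. The sketch below illustrates that pattern in plain Python, deliberately avoiding the LangGraph API; `fake_llm` is a hypothetical stand-in for a real chat model call.

```python
"""Conceptual sketch of the conversation-memory pattern.

Not the LangGraph API — just the core idea of keeping a rolling window
of (role, content) messages so the model sees recent context.
"""
from collections import deque


def fake_llm(messages):
    # Hypothetical stand-in for a chat model: reports how much context it saw.
    return f"(reply with {len(messages)} messages of context)"


class WindowedChat:
    """Keep only the most recent `max_messages` messages as context."""

    def __init__(self, max_messages: int = 6):
        # deque with maxlen silently drops the oldest message when full.
        self.history = deque(maxlen=max_messages)

    def send(self, user_text: str) -> str:
        self.history.append(("user", user_text))
        reply = fake_llm(list(self.history))
        self.history.append(("assistant", reply))
        return reply
```

The real tutorials persist and trim history through LangGraph's state machinery rather than a deque, but the windowing idea is the same: bounded context keeps prompts within the model's limits.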
- Learn advanced RAG techniques to improve the information retrieval of your agents.
- Learn about various aspects of AI, including deep learning, natural language processing, MCPs, and generative models.
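Whichever RAG techniques you pick up, they all build on the same retrieval step: embed the query, score it against stored documents, and return the best matches. As a rough sketch of the mechanics only, here is a toy version using bag-of-words vectors and cosine similarity in place of a real vectorstore like Milvus and learned embeddings.

```python
"""Minimal sketch of the retrieval step at the heart of RAG.

A real setup would use Milvus and an embedding model; the toy
bag-of-words vectors here just show the mechanics.
"""
import math
from collections import Counter


def embed(text: str) -> Counter:
    """Toy 'embedding': lowercase word counts."""
    return Counter(text.lower().split())


def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two sparse count vectors."""
    dot = sum(a[w] * b[w] for w in a)
    norm = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(
        sum(v * v for v in b.values())
    )
    return dot / norm if norm else 0.0


def retrieve(query: str, docs: list[str], k: int = 1) -> list[str]:
    """Return the k documents most similar to the query."""
    q = embed(query)
    return sorted(docs, key=lambda d: cosine(q, embed(d)), reverse=True)[:k]
```

The advanced RAG tutorials refine each stage of this loop — better embeddings, smarter chunking, reranking — but the embed-score-return shape stays the same.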
Take the code and use it, dive into the code and try to understand it, or just learn about AI; these tutorials were made for you (and me 😊)!
```
├── docs/            # All main documentation files
├── includes/        # Abbreviation definitions for acronyms, etc.
├── overrides/       # Overrides of default pages
├── third-party/     # Third-party licenses and code for attribution
├── mkdocs.yml       # Main documentation configurations
└── requirements.txt # Required Python libraries
```
This site was made using Material for MkDocs. The tutorials use various third-party software and libraries, including:
- Caddy: Reverse proxy for SearXNG server
- Docker: Building and running local servers
- Gradio: Building web UIs
- LangChain: Used throughout for creating tools, chatbots, and agents
- LangGraph: Used throughout for creating agents
- Milvus: Local vectorstore set up and run in Docker
- Ollama: Local LM server set up and run in Docker
- Ollama Python library: Interacting with the Ollama server via a local Python environment
- PyMilvus: Interacting with the Milvus server via a local Python environment
- Requests: Interacting with the SearXNG server via a local Python environment
- SearXNG: Metasearch engine source code
- searxng-docker: Local metasearch engine set up and run in Docker
- Valkey (acting through the Redis API): Data storage for SearXNG server
This documentation site is a work in progress. If you'd like to suggest or add improvements, ask questions, help others understand, or share your own relevant projects, feel free to contribute through discussions. Check out the contributing guidelines to get started.
This site is licensed under MIT. However, some of the third-party libraries are licensed differently; check out the notice for more details.
