Learn how to build a local AI assistant using llama-cpp-python. This guide covers installing the model, adding conversation memory, and integrating external tools for automation, web scraping, and real-time data retrieval.
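The core loop is small: load a GGUF model with llama-cpp-python and keep appending each exchange to the message list so the model sees the conversation history. Below is a minimal sketch, assuming a locally downloaded GGUF chat model (the path shown is a placeholder):

```python
# Minimal local assistant loop with conversation memory.
# The model path is illustrative; any GGUF chat model works.
from llama_cpp import Llama

llm = Llama(
    model_path="./models/llama-3-8b-instruct.Q4_K_M.gguf",
    n_ctx=4096,
    verbose=False,
)

messages = [{"role": "system", "content": "You are a helpful local assistant."}]

while True:
    user_input = input("You: ")
    if user_input.lower() in {"exit", "quit"}:
        break
    messages.append({"role": "user", "content": user_input})
    # Passing the full message history back each turn acts as simple memory.
    response = llm.create_chat_completion(messages=messages, max_tokens=256)
    reply = response["choices"][0]["message"]["content"]
    messages.append({"role": "assistant", "content": reply})
    print("Assistant:", reply)
```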
Learn how to run Large Language Models (LLMs) locally using Ollama and integrate them into Python with langchain-ollama. A step-by-step guide for setting up and generating AI-powered responses.
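As a quick taste of what the integration looks like, here is a minimal sketch assuming the Ollama server is running locally and the `llama3` model has already been pulled (`ollama pull llama3`):

```python
# Send a single prompt to a locally running Ollama model via langchain-ollama.
from langchain_ollama import ChatOllama

llm = ChatOllama(model="llama3", temperature=0.2)

# invoke() calls the local Ollama server and returns a message object.
response = llm.invoke("Explain what a vector database is in two sentences.")
print(response.content)
```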
In this tutorial we will demonstrate how to expire old DynamoDB items using the AWS DynamoDB Time to Live (TTL) feature, which automatically deletes items once they are older than a configurable threshold.
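The mechanism has two parts: enable TTL on the table and point it at an attribute, then write each item with that attribute set to an epoch timestamp (in seconds) after which DynamoDB may delete it. A minimal sketch with boto3 follows; the table name `events` and the attribute name `expires_at` are placeholders:

```python
# Enable TTL on a table and write an item that expires in roughly 7 days.
import time
import boto3

dynamodb = boto3.client("dynamodb")

# Tell DynamoDB which attribute holds the expiry timestamp.
dynamodb.update_time_to_live(
    TableName="events",
    TimeToLiveSpecification={"Enabled": True, "AttributeName": "expires_at"},
)

# The TTL attribute must be a Number containing an epoch timestamp in seconds.
expires_at = int(time.time()) + 7 * 24 * 60 * 60
dynamodb.put_item(
    TableName="events",
    Item={
        "pk": {"S": "event#123"},
        "payload": {"S": "example data"},
        "expires_at": {"N": str(expires_at)},
    },
)
```

Note that TTL deletion is a background process, so expired items are typically removed within a short window after the timestamp passes rather than at the exact second.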