Usage
Last updated
Welcome to Chatformers, a Python library that leverages advanced language models for chatbot creation and interaction, with built-in memory management using vector stores. Chatformers simplifies integrating large language models (LLMs) into your applications, supporting configurable memory through providers such as Chroma, Qdrant, Pgvector, OpenAI, Groq, Ollama, and Mem0AI.
Installation
Getting Started
Configuration
Usage Example
Optional Features
Adding Memories
Retrieving Memories
Documentation
License
Before using the library, ensure you have the necessary dependencies installed.
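For example, assuming the library is published on PyPI under the name `chatformers`, it can be installed with pip along with an OpenAI-compatible client:

```shell
# Install chatformers (package name assumed to match PyPI) plus the
# openai client used to talk to OpenAI-compatible endpoints such as Groq.
pip install chatformers openai
```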
Additionally, ensure that you have an API key from OpenAI, Groq, or any other OpenAI-compatible provider.
To begin, import the necessary modules and set up the API key.
Set your GROQ_API_KEY environment variable:
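On Linux or macOS, the key can be exported in the shell before running your script (the value below is a placeholder):

```shell
# Replace the placeholder with your actual Groq API key.
export GROQ_API_KEY="your-groq-api-key"
```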
Configure the OpenAI client to communicate with the GROQ LLM service.
You can configure the chatbot character with specific attributes.
Initialize the chatbot with the OpenAI client, model name, character configuration, and memory configuration.
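As an illustrative sketch (the dictionary structure, class name, and parameter names below are assumptions, not Chatformers' confirmed API), the character and memory settings might look like this:

```python
# Illustrative character attributes (field names are assumptions).
character = {
    "name": "Julia",
    "description": "a friendly, concise assistant",
}

# Memory backed by a Chroma vector store (structure is an assumption).
memory_config = {
    "vector_store": {
        "provider": "chroma",
        "config": {"collection_name": "chat_history"},
    },
}

# Hypothetical constructor call -- consult the Chatformers docs for the
# actual class name and signature:
# chatbot = Chatbot(llm_client=client, model="llama-3.1-8b-instant",
#                   character=character, memory_config=memory_config)
```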
Below is an example of a chatbot conversation where the user asks the bot a question and receives a response based on previous chats.
The chatbot responds based on previous conversations stored in memory and any additional context provided by the user.
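A hedged sketch of that conversational flow (the `chat` method name and its parameters are assumptions about the library's API):

```python
# The user's question; memory lookup is keyed by a per-user identifier.
query = "What did I tell you about my favorite food?"
user_id = "user-123"

# Hypothetical call -- the actual method name and signature may differ:
# response = chatbot.chat(query=query, user_id=user_id)
# print(response)
```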
Chatformers lets you embed memories directly into the vector database, making future interactions more contextual.
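For instance, a list of prior conversation turns could be stored as memories; the message format below is an assumption modeled on the common OpenAI chat-message schema, and `add_memories` is a hypothetical method name:

```python
# Example conversation turns to store (format is an assumption).
memory_messages = [
    {"role": "user", "content": "I'm travelling to Mumbai next week."},
    {"role": "assistant", "content": "Great! Mumbai is lively this time of year."},
]

# Hypothetical call; check the Chatformers docs for the real method:
# chatbot.add_memories(memory_messages, user_id="user-123")
```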
You can retrieve the memories associated with a specific user to understand the context better.
You can also query for related memories based on specific prompts.
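The two retrieval modes above might be sketched as follows; the library method names are assumptions, and the keyword filter at the end is only a trivial stand-in for the actual vector-similarity search:

```python
# Stored memories retrieved for a user (illustrative data).
memories = [
    "User is travelling to Mumbai next week.",
    "User's favorite food is sushi.",
]

# Hypothetical library calls -- names are assumptions:
# all_memories = chatbot.get_memories(user_id="user-123")
# related = chatbot.related_memory(user_id="user-123", query="travel plans")

# Conceptually, a related-memory query returns the entries most relevant
# to the prompt; a keyword match stands in for vector search here:
related = [m for m in memories if "travelling" in m]
```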
For further details on the configuration and advanced features of Chatformers, refer to the following resources:
Chatformers is open-source software licensed under the MIT License.
Chatformers uses Mem0AI for memory management. Refer to the Mem0AI documentation for more details.