Example (v >= 1.0.4)

Getting Started

Step 1: Importing the Library

To begin, import the necessary modules and set up the API key.

import os

from chatformers.chatbot import Chatbot
from openai import OpenAI

Set your GROQ_API_KEY environment variable:

os.environ["GROQ_API_KEY"] = "<API_KEY>"
GROQ_API_KEY = os.getenv("GROQ_API_KEY")

Step 2: Initializing the OpenAI Client

Configure the OpenAI client to communicate with Groq's OpenAI-compatible endpoint.

groq_base_url = "https://api.groq.com/openai/v1"
client = OpenAI(base_url=groq_base_url, api_key=GROQ_API_KEY)

Step 3: Setting Up the Chatbot Character

You can configure the chatbot character with specific attributes.

character_data = {
    "name": "Julia",
    "description": "You are on an online chatting website, chatting with strangers."
}

Step 4: Configuring Chatformers

Chatformers uses mem0 for memory management. Refer to the mem0 documentation for more details.
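A minimal memory configuration might look like the sketch below, assuming mem0's Chroma vector-store provider; the provider name, collection name, and path are illustrative assumptions, so verify the available options against the mem0 documentation for your installed version.

```python
# Sketch of a mem0-style configuration dict (assumed structure —
# verify provider names and options against the mem0 documentation).
config = {
    "vector_store": {
        "provider": "chroma",  # local Chroma vector store
        "config": {
            "collection_name": "chat_memories",
            "path": "db",  # on-disk persistence directory
        },
    },
}
```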

Step 5: Creating the Chatbot Instance

Initialize the chatbot with the OpenAI client, model name, character configuration, and memory configuration.
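A hedged sketch of the instantiation, assuming the `client` and `character_data` objects from the earlier steps and a mem0 configuration dict named `config`; the parameter names below are assumptions and should be checked against the chatformers API reference.

```python
# Parameter names are assumptions — verify against the chatformers reference.
chatbot = Chatbot(
    llm_client=client,  # the OpenAI client configured for Groq above
    model_name="llama-3.1-8b-instant",  # any model your provider serves
    character_data=character_data,
    config=config,  # the mem0 memory configuration
)
```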

Usage Example

Basic Chat Interaction

Below is an example of a chatbot conversation where the user asks the bot a question and receives a response based on previous chats.
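A sketch of what such an interaction could look like, assuming a `chat` method that takes the query, recent message history, and a user id; the method and parameter names are assumptions, so check them against the chatformers API.

```python
user_id = "sam"
# Recent turns of the current conversation, in OpenAI message format.
message_history = [
    {"role": "user", "content": "Hi, I'm Sam. I live in Berlin."},
    {"role": "assistant", "content": "Hello Sam! How can I help you today?"},
]
query = "Do you remember which city I live in?"

# Assumed call shape — verify against the chatformers reference:
# response = chatbot.chat(query=query, message_history=message_history,
#                         user_id=user_id)
# print(response)
```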

Output

The chatbot responds based on previous conversations stored in memory and any additional context provided by the user.

Optional Features

Adding Memories

Chatformers allows you to embed memories directly into the vector database, making future interactions more contextual.
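A sketch of seeding memories with mem0-style message dicts; the `add_memories` method name and its parameters are assumptions to be verified against the chatformers reference.

```python
# Conversation fragments to persist for later retrieval.
memory_messages = [
    {"role": "user", "content": "I am allergic to peanuts."},
    {"role": "assistant", "content": "Got it — I'll remember that."},
]

# Assumed method name — verify against the chatformers reference:
# chatbot.add_memories(messages=memory_messages, user_id="sam")
```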

Retrieving Memories

You can retrieve the memories associated with a specific user to understand the context better.

You can also query for related memories based on specific prompts.
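Under the hood, a related-memory query ranks stored entries by their vector similarity to the prompt. The chatformers call itself is not reproduced here; as a self-contained illustration of the idea (not the library's API), here is a toy bag-of-words ranker:

```python
from collections import Counter
from math import sqrt


def cosine(a: Counter, b: Counter) -> float:
    # Dot product over the terms of `a`, normalised by both vector lengths.
    dot = sum(a[t] * b[t] for t in a)
    na = sqrt(sum(v * v for v in a.values()))
    nb = sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0


memories = [
    "User loves hiking in the mountains",
    "User works as a data engineer",
    "User has a dog named Bruno",
]


def related_memories(query: str, memories: list, top_k: int = 2) -> list:
    # Score every stored memory against the query, highest similarity first.
    qv = Counter(query.lower().split())
    scored = [(cosine(qv, Counter(m.lower().split())), m) for m in memories]
    scored.sort(key=lambda s: s[0], reverse=True)
    return [m for score, m in scored[:top_k] if score > 0]
```

A real implementation would use learned embeddings rather than word counts, but the retrieval step — embed the prompt, rank stored memories by similarity, return the top matches — is the same.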

Complete Code
