# Building a Conversational Chatbot with LangChain4j
LangChain4j makes building stateful, conversational applications in Java incredibly simple, especially with its high-level `AiServices` API. It allows you to create a chatbot that remembers previous interactions with minimal boilerplate code.
This tutorial will guide you through creating a conversational chatbot that uses the Voidon API and manages conversation history automatically.
## The Power of AiServices and ChatMemory
In LangChain4j, you define a chatbot's capabilities with a simple Java interface. By binding a `ChatMemory` instance to this service, the framework automatically handles the loading and saving of the conversation history for every call, making your chatbot stateful.
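Here is the whole pattern at a glance. This is only a condensed sketch under the same assumptions used throughout this tutorial (the Voidon base URL, a placeholder API key, and an illustrative `Assistant` interface and `StatefulChatSketch` class name); the full version is built step by step below.

```java
import dev.langchain4j.memory.chat.MessageWindowChatMemory;
import dev.langchain4j.model.chat.ChatLanguageModel;
import dev.langchain4j.model.openai.OpenAiChatModel;
import dev.langchain4j.service.AiServices;

public class StatefulChatSketch {

    // The service is just an interface; LangChain4j generates the implementation at runtime.
    interface Assistant {
        String chat(String userMessage);
    }

    public static void main(String[] args) {
        // The Voidon URL and API key below are placeholders for this tutorial.
        ChatLanguageModel model = OpenAiChatModel.builder()
                .baseUrl("https://api.voidon.astramind.ai/v1")
                .apiKey("your-voidon-api-key")
                .modelName("auto")
                .build();

        // Binding a ChatMemory makes every chat() call load and save the history automatically.
        Assistant assistant = AiServices.builder(Assistant.class)
                .chatLanguageModel(model)
                .chatMemory(MessageWindowChatMemory.withMaxMessages(10))
                .build();

        System.out.println(assistant.chat("Remember that my favorite color is blue."));
        System.out.println(assistant.chat("What is my favorite color?")); // answered from memory
    }
}
```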
First, add the LangChain4j dependencies to your `pom.xml`:

```xml
<dependencies>
    <dependency>
        <groupId>dev.langchain4j</groupId>
        <artifactId>langchain4j</artifactId>
        <version>0.30.0</version> <!-- Use the latest version -->
    </dependency>
    <dependency>
        <groupId>dev.langchain4j</groupId>
        <artifactId>langchain4j-open-ai</artifactId>
        <version>0.30.0</version> <!-- Use the same version -->
    </dependency>
    <!-- Add a logger to see LangChain4j's output -->
    <dependency>
        <groupId>org.slf4j</groupId>
        <artifactId>slf4j-simple</artifactId>
        <version>2.0.12</version>
    </dependency>
</dependencies>
```
### Step 1: Define the Chatbot Interface

We start by creating a Java interface that defines our chatbot's behavior. We can use the `@SystemMessage` annotation to give it a personality and instructions.
```java
package com.voidon.chatbot;

import dev.langchain4j.service.SystemMessage;

public interface Chatbot {

    @SystemMessage("You are a helpful and friendly AI assistant. Your name is Voidy.")
    String chat(String userMessage);
}
```

### Step 2: Configure the Language Model and Memory

In our main class, we configure the `OpenAiChatModel` to point to the Voidon API. Crucially, we also create a `ChatMemory` instance. `MessageWindowChatMemory` is a great choice as it keeps a sliding window of the most recent messages.

`src/main/java/com/voidon/chatbot/ChatbotExample.java`

```java
package com.voidon.chatbot;

import dev.langchain4j.memory.ChatMemory;
import dev.langchain4j.memory.chat.MessageWindowChatMemory;
import dev.langchain4j.model.chat.ChatLanguageModel;
import dev.langchain4j.model.openai.OpenAiChatModel;

public class ChatbotExample {

    public static void main(String[] args) {

        // 1. Configure the model to use the Voidon API
        ChatLanguageModel model = OpenAiChatModel.builder()
                .baseUrl("https://api.voidon.astramind.ai/v1")
                .apiKey("your-voidon-api-key")
                .modelName("auto")
                .build();

        // 2. Create a memory component to store the conversation history
        // This will keep the last 10 messages of the conversation
        ChatMemory chatMemory = MessageWindowChatMemory.withMaxMessages(10);
    }
}
```
### Step 3: Build the AiService and Chat

Finally, we build the `AiServices` proxy, bind the memory to it, and interact with the chatbot in a loop. Here is the complete `ChatbotExample` class:

```java
package com.voidon.chatbot;

import dev.langchain4j.memory.ChatMemory;
import dev.langchain4j.memory.chat.MessageWindowChatMemory;
import dev.langchain4j.model.chat.ChatLanguageModel;
import dev.langchain4j.model.openai.OpenAiChatModel;
import dev.langchain4j.service.AiServices;

import java.util.Scanner;

public class ChatbotExample {

    public static void main(String[] args) {

        // 1. Configure the model to use the Voidon API
        ChatLanguageModel model = OpenAiChatModel.builder()
                .baseUrl("https://api.voidon.astramind.ai/v1")
                .apiKey("your-voidon-api-key")
                .modelName("auto")
                .build();

        // 2. Create a memory component
        ChatMemory chatMemory = MessageWindowChatMemory.withMaxMessages(10);

        // 3. Build the AiService (our chatbot)
        Chatbot chatbot = AiServices.builder(Chatbot.class)
                .chatLanguageModel(model)
                .chatMemory(chatMemory)
                .build();

        // 4. Interact with the chatbot in a loop
        Scanner scanner = new Scanner(System.in);
        System.out.println("Chatbot is ready! Type 'exit' to end.");

        while (true) {
            System.out.print("You: ");
            String userInput = scanner.nextLine();

            if ("exit".equalsIgnoreCase(userInput)) {
                System.out.println("Chatbot: Goodbye!");
                break;
            }

            String response = chatbot.chat(userInput);
            System.out.println("Chatbot: " + response);
        }

        scanner.close();
    }
}

// Example Conversation:
// You: Hi, my name is Jane.
// Chatbot: Hello Jane! It's a pleasure to meet you. How can I assist you today?
// You: what is my name?
// Chatbot: Your name is Jane.
```
## How It Works

1. You call the `chatbot.chat()` method with your message.
2. The `AiServices` proxy intercepts this call. It first retrieves all previous messages from the `ChatMemory` instance.
3. It constructs a complete prompt including the `@SystemMessage`, the entire chat history, and your new message.
4. This complete prompt is sent to the Voidon LLM.
5. The LLM generates a response based on the full context.
6. The `AiServices` proxy receives the response. Before returning it to you, it saves both your original message and the AI's new response into the `ChatMemory` for future turns.
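You can observe this flow directly by inspecting the memory after a couple of turns. The sketch below is only an illustration: it reuses the `Chatbot` interface and the Voidon configuration from this tutorial, and the `MemoryInspectionExample` class name is a made-up example.

```java
package com.voidon.chatbot;

import dev.langchain4j.data.message.ChatMessage;
import dev.langchain4j.memory.ChatMemory;
import dev.langchain4j.memory.chat.MessageWindowChatMemory;
import dev.langchain4j.model.chat.ChatLanguageModel;
import dev.langchain4j.model.openai.OpenAiChatModel;
import dev.langchain4j.service.AiServices;

public class MemoryInspectionExample {

    public static void main(String[] args) {
        // Same configuration as in the tutorial (placeholder API key)
        ChatLanguageModel model = OpenAiChatModel.builder()
                .baseUrl("https://api.voidon.astramind.ai/v1")
                .apiKey("your-voidon-api-key")
                .modelName("auto")
                .build();

        ChatMemory chatMemory = MessageWindowChatMemory.withMaxMessages(10);

        Chatbot chatbot = AiServices.builder(Chatbot.class)
                .chatLanguageModel(model)
                .chatMemory(chatMemory)
                .build();

        chatbot.chat("Hi, my name is Jane.");
        chatbot.chat("What is my name?");

        // Each turn stores both your message and the AI's response in the memory,
        // which is exactly the history sent to the model on the next call.
        for (ChatMessage message : chatMemory.messages()) {
            System.out.println(message.type() + ": " + message.text());
        }
    }
}
```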