LangChain4j is a powerful and idiomatic Java library for building applications with Large Language Models. It provides high-level APIs that make creating complex agents with tools simple and elegant. It fully supports any OpenAI-compatible API, including Voidon.
This tutorial demonstrates how to build a weather agent using the AiServices feature of LangChain4j, which automatically creates an agent implementation from a simple Java interface.
Why use a framework like LangChain4j?
LangChain4j handles the entire tool-use lifecycle automatically. You simply annotate your Java methods with @Tool, and the AiService will make them available to the language model, execute them when requested, and use the results to formulate a final answer.
<dependencies>
    <dependency>
        <groupId>dev.langchain4j</groupId>
        <artifactId>langchain4j</artifactId>
        <version>0.30.0</version> <!-- Use the latest version -->
    </dependency>
    <dependency>
        <groupId>dev.langchain4j</groupId>
        <artifactId>langchain4j-open-ai</artifactId>
        <version>0.30.0</version> <!-- Use the same version -->
    </dependency>
    <!-- Add a logger to see LangChain4j's output -->
    <dependency>
        <groupId>org.slf4j</groupId>
        <artifactId>slf4j-simple</artifactId>
        <version>2.0.12</version>
    </dependency>
</dependencies>
A tool is just a method annotated with @Tool. LangChain4j automatically inspects the method signature, along with the descriptions you provide in the @Tool annotation (and any @P parameter annotations), to build a tool specification for the language model. Note that Javadoc is not used: it is discarded at compile time, so the descriptions must live in the annotations.
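Here is a minimal sketch of the WeatherTools class used in the rest of this tutorial. The canned return value is a stand-in for a real weather API call, and the exact descriptions are only an example:

```java
package com.voidon.agent;

import dev.langchain4j.agent.tool.P;
import dev.langchain4j.agent.tool.Tool;

public class WeatherTools {

    // The description in @Tool (and in each @P) is sent to the model
    // so it knows when and how to call this tool.
    @Tool("Returns the current weather for a given location")
    public String getCurrentWeather(
            @P("The city, e.g. Tokyo") String location,
            @P("Temperature unit: celsius or fahrenheit") String unit) {
        // In a real application you would call a weather service here.
        // A fixed value keeps this example self-contained.
        return "The temperature in " + location + " is 10 degrees " + unit + ".";
    }
}
```

The method's return value is what LangChain4j will later pass back to the model as the tool result, so returning a short, descriptive string works well.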
package com.voidon.agent;

import dev.langchain4j.model.chat.ChatLanguageModel;
import dev.langchain4j.model.openai.OpenAiChatModel;

public class AgentExample {

    public static void main(String[] args) {
        // Configure the model to use the Voidon API
        ChatLanguageModel model = OpenAiChatModel.builder()
                .baseUrl("https://api.voidon.astramind.ai/v1")
                .apiKey("your-voidon-api-key")
                .modelName("auto")
                .build();
    }
}
With AiServices, you don't write the agent logic yourself. Instead, you define an interface that represents the agent. LangChain4j will create the implementation for you.
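The interface can be as simple as a single method. Below is a sketch of the WeatherAssistant interface used in this tutorial (the method name chat and the single-String signature are one convention AiServices supports; the package declaration is omitted here for brevity):

```java
public interface WeatherAssistant {

    // AiServices generates the implementation of this method: it sends
    // the message (plus chat memory and tool specifications) to the
    // model and returns the model's final answer as a String.
    String chat(String userMessage);
}
```

Because the interface has a single abstract method, you never write any orchestration code yourself; the generated implementation handles the whole tool-use loop.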
package com.voidon.agent;

import dev.langchain4j.memory.ChatMemory;
import dev.langchain4j.memory.chat.MessageWindowChatMemory;
import dev.langchain4j.model.chat.ChatLanguageModel;
import dev.langchain4j.model.openai.OpenAiChatModel;
import dev.langchain4j.service.AiServices;

public class AgentExample {

    public static void main(String[] args) {
        // 1. Configure the model to use the Voidon API
        ChatLanguageModel model = OpenAiChatModel.builder()
                .baseUrl("https://api.voidon.astramind.ai/v1")
                .apiKey("your-voidon-api-key")
                .modelName("auto")
                .build();

        // 2. Create an instance of the class containing the tools
        WeatherTools weatherTools = new WeatherTools();

        // 3. Define chat memory to hold the conversation history
        ChatMemory chatMemory = MessageWindowChatMemory.withMaxMessages(10);

        // 4. Build the AiService (our agent)
        WeatherAssistant assistant = AiServices.builder(WeatherAssistant.class)
                .chatLanguageModel(model)
                .tools(weatherTools)
                .chatMemory(chatMemory)
                .build();

        // 5. Interact with the agent
        String question = "What is the weather in Tokyo in celsius?";
        System.out.println("User: " + question);
        String answer = assistant.chat(question);
        System.out.println("Agent: " + answer);
    }
}
Here is what happens under the hood when you call assistant.chat(question):

1. LangChain4j sends the user's message, along with the specifications of all @Tool-annotated methods, to the Voidon API.
2. The language model analyzes the request and determines that it needs to call the getCurrentWeather function with location="Tokyo" and unit="celsius".
3. The model responds with a special tool_calls message.
4. LangChain4j intercepts this response, parses it, and executes the getCurrentWeather method in your WeatherTools class with the specified arguments.
5. LangChain4j takes the return value of your Java method and sends it back to the model as a "tool result".
6. The model receives the weather data ("temperature is 10") and formulates a final, natural-language response (e.g., "The current weather in Tokyo is 10°C.").
7. This final response is returned from the assistant.chat() method call.
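For reference, the tool_calls message mentioned above follows the OpenAI chat completions format and looks roughly like this (an illustrative payload; the id and argument values are made up for this example):

```json
{
  "role": "assistant",
  "content": null,
  "tool_calls": [
    {
      "id": "call_abc123",
      "type": "function",
      "function": {
        "name": "getCurrentWeather",
        "arguments": "{\"location\": \"Tokyo\", \"unit\": \"celsius\"}"
      }
    }
  ]
}
```

You never have to parse this yourself; LangChain4j handles it, but seeing the raw shape is useful when reading debug logs from the slf4j-simple logger added earlier.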