The recent 0.2.0 release of Google's Agent Development Kit (ADK) for Java introduces an integration with the LangChain4j LLM framework. This integration gives developers access to the wide selection of Large Language Models (LLMs) supported by LangChain4j for building AI agents.
In addition to ADK's built-in Google Gemini and Anthropic Claude integrations, developers can now use LangChain4j to access other models from third-party providers (like OpenAI, Anthropic, GitHub, Mistral…) or local open-weight models, for example via Ollama or Docker Model Runner.
LangChain4j integration for a wide selection of models
The LangChain4j LLM framework supports a wide variety of models; you can check the list of supported models in the LangChain4j documentation. Let's look at a couple of concrete examples, using Gemma with Docker Model Runner, and Qwen with Ollama.
When declaring your ADK agent with the LlmAgent builder, you specify the LLM via the model() builder method. You usually pass a string with the name of the model, like "gemini-2.5-flash".
It's also possible to pass an instance of a class extending the BaseLlm abstract class. That's exactly what the LangChain4j integration does to bridge both frameworks: it provides a new LangChain4j class that extends BaseLlm, and you wrap your LangChain4j chat model with it, as sketched below.
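Here is a minimal sketch contrasting the two approaches; the agent names are arbitrary, and anyChatModel stands for whichever LangChain4j chat model you build, such as the ones shown in the next sections:

// Usual approach: refer to a built-in model by its name.
LlmAgent geminiAgent = LlmAgent.builder()
    .name("assistant")
    .model("gemini-2.5-flash")
    .build();

// LangChain4j bridge: wrap any LangChain4j chat model instance
// (anyChatModel is a placeholder) in the LangChain4j class, which extends BaseLlm.
LlmAgent langchain4jAgent = LlmAgent.builder()
    .name("assistant")
    .model(new LangChain4j(anyChatModel))
    .build();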
Running Gemma 3 with Docker Model Runner
After installing and enabling Docker Model Runner on your machine, you can pull the Gemma 3 model easily with this command:
docker model pull ai/gemma3
Since Docker Model Runner exposes an OpenAI-compatible API surface, you can use the LangChain4j module for OpenAI-compatible models, by specifying the following dependencies in your Maven pom.xml:
<dependency>
    <groupId>com.google.adk</groupId>
    <artifactId>google-adk-contrib-langchain4j</artifactId>
    <version>0.2.0</version>
</dependency>
<dependency>
    <groupId>dev.langchain4j</groupId>
    <artifactId>langchain4j-open-ai</artifactId>
    <version>1.4.0</version>
</dependency>
Then, create a LangChain4j chat model, specifying the model you want to use, and the local URL and port:
OpenAiChatModel dmrChatModel = OpenAiChatModel.builder()
    .baseUrl("http://localhost:12434/engines/llama.cpp/v1")
    .modelName("ai/gemma3")
    .build();
Now, configure a chess coach agent using that model:
LlmAgent chessCoachAgent = LlmAgent.builder()
    .name("chess-coach")
    .description("Chess coach agent")
    .model(new LangChain4j(dmrChatModel))
    .instruction("""
        You are a knowledgeable chess coach
        who helps chess players train and sharpen their chess skills.
        """)
    .build();
Notice how the bridge between the two frameworks is made with the model(new LangChain4j(dmrChatModel)) call. And there you go: your AI agent is powered by a local model!
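To try the agent out, you run it like any other ADK agent. Here is a minimal sketch based on the InMemoryRunner and session API shown in the ADK for Java quickstart; the user ID and the prompt are arbitrary placeholders:

// Run a single prompt through the chess coach agent with ADK's in-memory runner.
InMemoryRunner runner = new InMemoryRunner(chessCoachAgent);
Session session = runner
    .sessionService()
    .createSession(runner.appName(), "user-123")
    .blockingGet();

Content userMessage = Content.fromParts(
    Part.fromText("How should I open as White against the Sicilian Defense?"));

// Stream the agent's events and print the generated content.
Flowable<Event> events = runner.runAsync("user-123", session.id(), userMessage);
events.blockingForEach(event -> System.out.println(event.stringifyContent()));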
Running Qwen 3 with Ollama
If instead you want to build a nice science teacher agent with the Qwen 3 model running locally on your machine via Ollama, first define the dependencies in your Maven pom.xml build file:
<dependency>
    <groupId>com.google.adk</groupId>
    <artifactId>google-adk</artifactId>
    <version>0.2.0</version>
</dependency>
<dependency>
    <groupId>com.google.adk</groupId>
    <artifactId>google-adk-contrib-langchain4j</artifactId>
    <version>0.2.0</version>
</dependency>
<dependency>
    <groupId>dev.langchain4j</groupId>
    <artifactId>langchain4j-ollama</artifactId>
    <version>1.4.0</version>
</dependency>
Let's assume you've already installed Ollama on your machine, pulled the Qwen 3 model, and have it running on port 11434. With LangChain4j, in Java, you instantiate the Ollama model provider as follows:
OllamaChatModel ollamaChatModel = OllamaChatModel.builder()
    .modelName("qwen3:1.7b")
    .baseUrl("http://127.0.0.1:11434")
    .build();
Now let's wire this model into a simple science teacher agent:
LlmAgent scienceTeacherAgent = LlmAgent.builder()
    .name("science-app")
    .description("Science teacher agent")
    .model(new LangChain4j(ollamaChatModel))
    .instruction("""
        You are a helpful science teacher
        who explains science concepts to kids and teenagers.
        """)
    .build();
If the model supports function calling, you can give your agent access to tools as well. For example, give it access to MCP servers, or to your own code-driven functions. You can explore the various tools at your disposal in this article diving into ADK tools, or by looking at the ADK documentation.
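As an illustration, here is a minimal sketch of a custom function tool wired into the science teacher agent. It follows the static-method FunctionTool pattern from the ADK for Java documentation; the ScienceTools class, the getPlanetFact method, and its canned answer are made up for this example:

public class ScienceTools {

  // Hypothetical tool method: returns a canned fact about a planet.
  public static Map<String, String> getPlanetFact(
      @Schema(description = "Name of the planet") String planet) {
    return Map.of("fact", "A day on " + planet + " does not last exactly 24 hours.");
  }
}

LlmAgent scienceTeacherWithTools = LlmAgent.builder()
    .name("science-app")
    .description("Science teacher agent with tools")
    .model(new LangChain4j(ollamaChatModel))
    .instruction("Use the getPlanetFact tool when asked about planets.")
    .tools(FunctionTool.create(ScienceTools.class, "getPlanetFact"))
    .build();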
New Features in this Release
Beyond the headline LangChain4j integration, version 0.2.0 brings several other powerful enhancements to the agent development workflow:
- Expanded Tooling Capabilities: We have significantly improved how you create and manage tools.
  - Instance-based FunctionTools: You can now create FunctionTools from object instances, not just static methods, offering greater flexibility in your agent's architecture (see the sketch after this list).
  - Improved Async Support: FunctionTools now support methods that return a Single. This improves asynchronous operation support and makes agents more responsive.
  - Better Loop Control: The new endInvocation field in Event Actions allows programmatic interruption or stopping of the agent loop after a tool call. This gives finer control over agent execution.
- Advanced Agent Logic and Memory:
  - Chained Callbacks: We have added support for chained callbacks for before/after events on model, agent, and tool execution. This enables more complex and fine-grained logic within your agent's lifecycle.
  - New Memory and Retrieval: This version introduces an InMemoryMemoryService for simple, fast memory management and implements VertexAiRagRetrieval using AI Platform APIs for more advanced RAG patterns.
- Other key improvements include a parent POM and the Maven Wrapper (./mvnw), ensuring a consistent and easy build process for all contributors.
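For example, an instance-based tool could look roughly like the following. This is a hypothetical sketch of the new capability: the WeatherService class is made up, and the exact signature of the instance-based FunctionTool factory method should be checked against the 0.2.0 release notes.

public class WeatherService {

  private final String apiKey;

  public WeatherService(String apiKey) {
    this.apiKey = apiKey;
  }

  // Instance method backing a tool: it can use per-instance state such as apiKey.
  public Map<String, String> getWeather(
      @Schema(description = "Name of the city") String city) {
    // Placeholder result instead of a real API call.
    return Map.of("forecast", "Sunny in " + city);
  }
}

// Hypothetical factory call: check the release notes for the exact
// instance-based FunctionTool.create(...) overload.
WeatherService weatherService = new WeatherService("my-api-key");
FunctionTool weatherTool = FunctionTool.create(weatherService, "getWeather");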
Let's put these AI agents to work
We're thrilled to get this new version into your hands. The integration with LangChain4j marks a major step forward in making ADK for Java a more open and versatile framework for building powerful AI agents.
To learn more about this new version of ADK for Java, read the GitHub release notes. New to developing agents in Java with ADK? Check out the ADK for Java documentation, this getting started guide (and video), or fork this GitHub template project to get started quickly.
My colleague Michael Vorburger and I were happy to work on this LangChain4j integration, in collaboration with Dmytro Liubarskyi, who created LangChain4j. So if you're building AI agents in Java with ADK, don't hesitate to drop us a message at @glaforge on Twitter/X or @glaforge.dev on Bluesky. We're looking forward to hearing about your great use cases.







