LangChain4j

An AiSupport implementation backed by LangChain4j's StreamingChatLanguageModel. When the atmosphere-langchain4j JAR is on the classpath, @AiEndpoint automatically streams through LangChain4j.

<dependency>
    <groupId>org.atmosphere</groupId>
    <artifactId>atmosphere-langchain4j</artifactId>
    <version>LATEST</version> <!-- check Maven Central for the latest release -->
</dependency>

Drop the dependency alongside atmosphere-ai and the framework auto-detects it via ServiceLoader:

@AiEndpoint(path = "/ai/chat", systemPrompt = "You are a helpful assistant")
public class MyChat {

    @Prompt
    public void onPrompt(String message, StreamingSession session) {
        session.stream(message); // uses LangChain4j automatically
    }
}

The LangChain4jAiSupport implementation registers at priority 100, so it takes precedence over the built-in client (priority 0).
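The highest-priority-wins selection can be sketched in plain Java. This is a simplified stand-in, not the actual framework code: the AiSupport interface here is hypothetical (the real SPI has more methods), and a plain list replaces ServiceLoader discovery.

```java
import java.util.Comparator;
import java.util.List;
import java.util.Optional;

// Hypothetical, simplified shape of the AiSupport SPI; only priority()
// matters for implementation selection.
interface AiSupport {
    String name();
    int priority();
}

class AiSupportResolver {

    // The framework discovers implementations via ServiceLoader; a plain
    // list stands in for that discovery here. Highest priority wins.
    static Optional<AiSupport> resolve(List<AiSupport> discovered) {
        return discovered.stream()
                .max(Comparator.comparingInt(AiSupport::priority));
    }

    public static void main(String[] args) {
        AiSupport builtIn = new AiSupport() {
            public String name() { return "built-in"; }
            public int priority() { return 0; }
        };
        AiSupport langChain4j = new AiSupport() {
            public String name() { return "langchain4j"; }
            public int priority() { return 100; }
        };
        // priority 100 shadows the built-in client at priority 0
        System.out.println(resolve(List.of(builtIn, langChain4j)).orElseThrow().name());
    }
}
```

Because selection is priority-based rather than order-based, adding the JAR is enough; no configuration switch is needed to prefer LangChain4j.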

For full control, use LangChain4jStreamingAdapter directly:

var session = StreamingSessions.start(resource);
model.chat(ChatMessage.userMessage(prompt),
        new AtmosphereStreamingResponseHandler(session));

AtmosphereStreamingResponseHandler bridges LangChain4j's StreamingChatResponseHandler callbacks to Atmosphere's StreamingSession:

LangChain4j callback            StreamingSession action
onPartialResponse(text)         session.send(text)
onCompleteResponse(response)    session.complete()
onError(throwable)              session.error(message)
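The mapping above can be sketched as an adapter class. The interfaces below are simplified stand-ins for the real Atmosphere and LangChain4j types (the actual interfaces carry richer argument types), but the forwarding logic mirrors the table.

```java
// Simplified stand-in for Atmosphere's streaming session.
interface StreamingSession {
    void send(String text);
    void complete();
    void error(String message);
}

// Simplified stand-in for LangChain4j's streaming callback interface.
interface StreamingChatResponseHandler {
    void onPartialResponse(String partialResponse);
    void onCompleteResponse(String completeResponse);
    void onError(Throwable error);
}

// Sketch of the adapter: each LangChain4j callback forwards to the
// corresponding StreamingSession method.
class BridgeHandler implements StreamingChatResponseHandler {

    private final StreamingSession session;

    BridgeHandler(StreamingSession session) {
        this.session = session;
    }

    @Override
    public void onPartialResponse(String partialResponse) {
        session.send(partialResponse); // push each chunk to the client
    }

    @Override
    public void onCompleteResponse(String completeResponse) {
        session.complete(); // signal end-of-stream
    }

    @Override
    public void onError(Throwable error) {
        session.error(error.getMessage()); // surface the failure message
    }
}
```

Note that the complete response text is not re-sent on onCompleteResponse: the client has already received it chunk by chunk, so only the end-of-stream signal is forwarded.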

In Spring Boot applications, AtmosphereLangChain4jAutoConfiguration automatically bridges a Spring-managed StreamingChatLanguageModel bean to the LangChain4jAiSupport SPI.