
Spring AI Integration Now Available

SessionCast now officially supports Spring AI — connect any LLM to your terminal sessions using familiar Spring abstractions.

What's New

  • sessioncast-core 1.1.0 — Java SDK for session management, tmux control, and real-time streaming
  • sessioncast-spring-boot-starter — Auto-configuration for Spring Boot apps
  • sessioncast-spring-ai 1.0.1 — Spring AI ChatModel implementation that routes prompts through the SessionCast relay to your CLI agent's local LLM

How It Works

Spring AI Prompt → SessionCastChatModel → Relay (WebSocket) → CLI Agent → Local LLM → Response

Your AI model runs on the customer's machine. Zero AI token cost for you as a service developer.
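
To make the pipeline above concrete, here is a minimal sketch of the adapter pattern behind it. All names besides SessionCastChatModel are illustrative, the Spring AI ChatModel contract is simplified to a single method, and the relay hop (WebSocket → CLI agent → local LLM) is stubbed with a plain function so the example stays self-contained:

```java
import java.util.function.Function;

public class ChatModelSketch {
    // Simplified stand-in for Spring AI's ChatModel contract.
    interface ChatModel {
        String call(String prompt);
    }

    // Routes each prompt through a "relay" (here just a Function) to
    // whatever LLM the CLI agent exposes on the end user's machine.
    static class SessionCastChatModel implements ChatModel {
        private final Function<String, String> relay;

        SessionCastChatModel(Function<String, String> relay) {
            this.relay = relay;
        }

        @Override
        public String call(String prompt) {
            // In production this would be a WebSocket round trip
            // through the relay; here it is a local stub.
            return relay.apply(prompt);
        }
    }

    public static void main(String[] args) {
        // Stub relay standing in for: Relay -> CLI agent -> local LLM.
        ChatModel model = new SessionCastChatModel(p -> "echo: " + p);
        System.out.println(model.call("Explain this error log"));
        // prints "echo: Explain this error log"
    }
}
```

The point of the shape: your service code only ever sees the ChatModel abstraction, while the actual inference happens on the customer's machine.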

Quick Start

repositories {
    maven { url = uri("https://jitpack.io") }
}

dependencies {
    implementation("com.github.sessioncast.sessioncast-java:sessioncast-core:1.1.0")
    implementation("com.github.sessioncast:sessioncast-spring-ai:1.0.1")
}

Then, in your application code:
@Autowired
private ChatClient chatClient;

String answer = chatClient.prompt("Explain this error log")
    .call()
    .content();

BYOAI — Bring Your Own AI

SessionCast's Spring AI adapter supports any LLM:

  • OpenAI (GPT-4, GPT-4o)
  • Anthropic (Claude)
  • Local models (Ollama, llama.cpp)

The model choice is up to the end user — your service code stays the same.
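
A short sketch of what "your service code stays the same" means in practice. The interface and the two stubbed backends below are hypothetical, not SDK API; they only illustrate that service logic written against the abstraction is untouched by the end user's model choice:

```java
public class ByoaiSketch {
    // Simplified stand-in for the chat abstraction the service codes against.
    interface ChatModel {
        String call(String prompt);
    }

    // Service code is written once, against the interface...
    static String summarize(ChatModel model, String log) {
        return model.call("Summarize: " + log);
    }

    public static void main(String[] args) {
        // ...and runs unchanged whether the user's agent fronts Claude,
        // GPT-4o, or a local Ollama model (all stubbed here as lambdas).
        ChatModel claude = p -> "[claude] " + p;
        ChatModel ollama = p -> "[ollama] " + p;
        System.out.println(summarize(claude, "disk full"));
        System.out.println(summarize(ollama, "disk full"));
    }
}
```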

Documentation

Full documentation is available at developer.sessioncast.io/docs#java-sdk-spring-ai.
