
Model Context Protocol Server and Netty‑based HTTP transport in 25.0.0.11-beta

Navaneeth S Nair on Oct 21, 2025

In this beta release, Open Liberty introduces the new Model Context Protocol (MCP) Server 1.0, which enables AI applications to seamlessly interact with business logic and access real-time data through discoverable tools. It also features a Netty-based HTTP transport as a beta preview, offering an internal performance upgrade for HTTP, WebSocket, JMS, and SIP communications with zero migration impact.

The Open Liberty 25.0.0.11-beta includes the following beta features (along with all GA features):

Model Context Protocol Server

Model Context Protocol (MCP) is an open standard that enables AI applications to interact with and use external systems.

The new MCP Server 1.0 feature (mcpServer-1.0) allows developers of applications on Liberty to bridge the gap between Large Language Models (LLMs) and their business code by exposing functionality for AI applications to discover, understand, and use.

Why use MCP Server

Imagine your company provides weather forecasting services that require constant updates from multiple data sources. Since the weather forecast changes frequently, an LLM trained on last week's data cannot know anything about this week's weather.

A better approach is to allow the LLM to access this weather data via a tool that your Liberty application exposes. The LLM can then call the tool whenever it needs forecasts and instantly have the up-to-date weather information.

Example usage of the MCP Server 1.0 feature

To expose your business logic to authorized AI applications, you’ll need to declare it as an MCP tool. In this context, a tool is a function or operation that the AI can invoke to perform a specific task.

The Liberty mcpServer-1.0 feature makes this process simple: you declare a tool by using the @Tool annotation on a method of a CDI managed bean with a scope annotation. To tell the AI application which arguments to pass when calling the tool, use the @ToolArg annotation on the corresponding parameters of your method.

The following is an example of a tool for getting the weather forecast based on the location supplied as an argument by an AI application:

@ApplicationScoped
public class WeatherTools {

    @Inject
    private WeatherToolsClient weatherClient;

    @Tool(name = "getForecast", title = "Weather Forecast Provider", description = "Get weather forecast for a location.")
    public String getForecast(@ToolArg(name = "latitude", description = "Latitude of the location") String latitude,
                              @ToolArg(name = "longitude", description = "Longitude of the location") String longitude) {
        return weatherClient.getForecast(
                Double.parseDouble(latitude),
                Double.parseDouble(longitude),
                4,
                "temperature_2m,snowfall,rain,precipitation,precipitation_probability");
    }
}
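The example above injects a WeatherToolsClient that is not shown. A minimal, hypothetical stub for it might look like the following; in a real application this would be a CDI bean (for example, annotated with @ApplicationScoped) that calls an external forecast service, but the stub simply echoes its arguments so the sketch stands alone:

```java
// Hypothetical implementation of the injected WeatherToolsClient from the
// tool example above. A real implementation would issue an HTTP request to
// a weather service; this stub just echoes the query for illustration.
public class WeatherToolsClient {

    // forecastDays and the comma-separated metrics list mirror the arguments
    // passed from WeatherTools.getForecast
    public String getForecast(double latitude, double longitude,
                              int forecastDays, String metrics) {
        return "forecast(lat=" + latitude + ", lon=" + longitude
                + ", days=" + forecastDays + ", metrics=" + metrics + ")";
    }
}
```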

Supported Tool Features

Much of the MCP specification's functionality is optional to implement. In this beta, we've implemented the tools feature of the MCP Server specification, along with support for cancellation and tool metadata hints (annotations).

How to Set Up an MCP Server Project

Here are the simple steps to follow to try out the new MCP Server 1.0 feature for yourself:

  1. Download the Open Liberty All Beta Features package.

  2. Unzip the archive. You should now have a wlp folder.

  3. In your pom.xml file, add the path to your wlp folder as a property named <wlp-dir-path> within the <properties> block:

    <properties>
      <wlp-dir-path>replace-with-path-to-wlp-dir</wlp-dir-path>
    </properties>
  4. In your pom.xml file, add the MCP jar as a dependency by adding the following within the <dependencies> block:

    <!-- Liberty MCP API -->
    <dependency>
        <groupId>io.openliberty.mcp</groupId>
        <artifactId>mcp-core</artifactId>
        <version>1.0.106</version>
        <scope>system</scope>
        <systemPath>${wlp-dir-path}/lib/io.openliberty.mcp_1.0.106.jar</systemPath>
    </dependency>
  5. In your pom.xml file, add the following plugin within the <plugins> block:

    <plugin>
        <groupId>io.openliberty.tools</groupId>
        <artifactId>liberty-maven-plugin</artifactId>
        <configuration>
            <installDirectory>${wlp-dir-path}</installDirectory>
        </configuration>
    </plugin>
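With the build configured, the feature itself is enabled in server.xml like any other Liberty feature. The following is a minimal sketch; the server description and ports are placeholders, so adjust them to your own configuration:

```xml
<server description="MCP server example">
    <featureManager>
        <!-- Enables the MCP Server 1.0 beta feature -->
        <feature>mcpServer-1.0</feature>
    </featureManager>
    <!-- 9080/9443 are the Liberty defaults; change as needed -->
    <httpEndpoint id="defaultHttpEndpoint" host="*"
                  httpPort="9080" httpsPort="9443"/>
</server>
```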

We are actively developing support for additional MCP server features and would greatly appreciate your feedback on which functionality would be most valuable for your use case; you can raise a feature request by opening an issue. To learn more about the currently supported tool features, stay tuned for an upcoming blog post that will go into further detail about how to make the most of the MCP Server feature in Liberty.

Netty‑based HTTP transport on Open Liberty

Open Liberty now provides a Netty‑based HTTP transport as a beta preview. This is an internal implementation swap for the underlying transport used for HTTP/1.1, HTTP/2, WebSocket, JMS, and SIP communications. It is designed for zero migration: your applications and server.xml should continue to behave the same. We look forward to your feedback before the GA release!

Netty’s event‑driven I/O gives us a modern foundation for long‑term scalability, easier maintenance, and future performance work. To help us validate parity and performance for your real-world scenarios, here are a few things you could try:

  • HTTP/1.1 and HTTP/2: large uploads/downloads, chunked transfers, compression-enabled content, keep-alive behavior.

  • WebSocket: long-lived communications, backpressure scenarios.

  • Timeouts: read/write/keep-alive timeouts under load.

  • Access logging: verify formatting and log results compared to previous builds.

  • JMS communications: message send/receive throughput, durable subscriptions.
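As one example of the checks above, a small JDK HttpClient sketch can exercise HTTP/2 negotiation and timeout settings against a running beta server. Here, localhost:9080 is Liberty's default HTTP port, the endpoint path is hypothetical, and the actual send is left commented out so the sketch compiles and runs without a live server:

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.time.Duration;

public class NettyTransportProbe {

    // Builds an HTTP/2-capable client with an explicit connect timeout, the
    // kind of client you might use to exercise keep-alive and timeout
    // behavior against the beta transport.
    public static HttpClient buildClient() {
        return HttpClient.newBuilder()
                .version(HttpClient.Version.HTTP_2)
                .connectTimeout(Duration.ofSeconds(5))
                .build();
    }

    public static void main(String[] args) {
        HttpClient client = buildClient();
        // The path below is a placeholder -- point it at one of your own
        // applications deployed on the beta server.
        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("http://localhost:9080/myapp/resource"))
                .timeout(Duration.ofSeconds(10))
                .GET()
                .build();
        // client.send(request, HttpResponse.BodyHandlers.ofString()) would
        // issue the request against a running server.
        System.out.println(client.version() + " " + request.uri());
    }
}
```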

Limitations of the Beta release:

  • HTTP

    • HTTP requests with content length greater than the maximum integer value will fail due to internal limitations on request size with Netty.

    • If the http option maxKeepAliveRequests is set to unlimited, a maximum of 50 pipelined requests still applies on HTTP/1.1.

    • The http option resetFramesWindow is reduced from millisecond to second precision due to limitations in the Netty library.

    • Due to internal limitations of the Netty library, the http option MessageSizeLimit is capped at the maximum integer value for HTTP/2.0 connections.

    • Due to internal differences with Netty, the http option ThrowIOEForInboundConnections could behave differently from the Channel Framework implementation.

    • Due to internal limitations of Netty, the acceptThread and waitToAccept TCP options are not implemented and are ignored in this beta if set.

    • At the moment, clear text upgrade requests for HTTP/2.0 with request data are rejected with a 'Request Entity Too Large' status code. A fix is in progress.

  • WebSocket

    • WebSocket inbound requests can generate FFDC RuntimeExceptions on connection cleanup when a connection is closed from the client side.

  • SIP

    • SIP Resolver transport does not use a Netty transport implementation at the moment.

  • ALPN

    • Currently, our Netty implementation supports only the native JDK ALPN implementation. Additional information about the ALPN implementations supported by the Channel Framework but not by our Netty beta can be found in the documentation.

Try it now

To try out these features, update your build tools to pull the Open Liberty All Beta Features package instead of the main release. The beta works with Java SE 25, Java SE 21, Java SE 17, Java SE 11, and Java SE 8.

If you’re using Maven, you can install the All Beta Features package using:

<plugin>
    <groupId>io.openliberty.tools</groupId>
    <artifactId>liberty-maven-plugin</artifactId>
    <version>3.11.5</version>
    <configuration>
        <runtimeArtifact>
          <groupId>io.openliberty.beta</groupId>
          <artifactId>openliberty-runtime</artifactId>
          <version>25.0.0.11-beta</version>
          <type>zip</type>
        </runtimeArtifact>
    </configuration>
</plugin>

You must also add dependencies to your pom.xml file for the beta version of the APIs that are associated with the beta features that you want to try. For example, the following block adds dependencies for two example beta APIs:

<dependency>
    <groupId>org.example.spec</groupId>
    <artifactId>exampleApi</artifactId>
    <version>7.0</version>
    <type>pom</type>
    <scope>provided</scope>
</dependency>
<dependency>
    <groupId>example.platform</groupId>
    <artifactId>example.example-api</artifactId>
    <version>11.0.0</version>
    <scope>provided</scope>
</dependency>

Or for Gradle:

buildscript {
    repositories {
        mavenCentral()
    }
    dependencies {
        classpath 'io.openliberty.tools:liberty-gradle-plugin:3.9.5'
    }
}
apply plugin: 'liberty'
dependencies {
    libertyRuntime group: 'io.openliberty.beta', name: 'openliberty-runtime', version: '[25.0.0.11-beta,)'
}

Or if you’re using container images:

FROM icr.io/appcafe/open-liberty:beta

Or take a look at our Downloads page.

If you’re using IntelliJ IDEA, Visual Studio Code, or Eclipse IDE, you can also take advantage of our open source Liberty developer tools to enable effective development, testing, debugging, and application management, all from within your IDE.

For more information on using a beta release, refer to the Installing Open Liberty beta releases documentation.

We welcome your feedback

Let us know what you think on our mailing list. If you hit a problem, post a question on StackOverflow. If you hit a bug, please raise an issue.