
Integration with LangChain & LangGraph

The Crynux Bridge provides an OpenAI-compatible API, making it seamless to integrate with LangChain and LangGraph. You can use the Crynux Bridge API as a drop-in replacement for the OpenAI API in your AI applications.

There are two ways to use Crynux with LangChain:

  1. Using langchain-crynux: A dedicated package optimized for Crynux.

  2. Using langchain-openai: The standard OpenAI integration package.

Method 1: Using langchain-crynux

The langchain-crynux package is a drop-in replacement for ChatOpenAI that is specifically tuned for the Crynux Network. It provides first-class support for Crynux-specific parameters such as vram_limit.

Installation

pip install langchain-crynux

Usage

import os
from langchain_crynux import ChatCrynux

# You can set the API key in the environment variable
# os.environ["OPENAI_API_KEY"] = "your-api-key"

chat = ChatCrynux(
    base_url="https://bridge.crynux.io/v1/llm",
    model="Qwen/Qwen2.5-7B-Instruct",
    vram_limit=24,  # Specify the required VRAM in GB
    # api_key="your-api-key", # Or pass it directly
)

response = chat.invoke("Hello, introduce yourself.")
print(response.content)

The vram_limit parameter is essential for the Crynux Network to route your task to a node with sufficient GPU memory. The default is 24GB.

Method 2: Using langchain-openai

Since the Crynux Bridge is fully compatible with the OpenAI API, you can also use the standard langchain-openai library. This is useful if you already have an existing project using LangChain's OpenAI integration.

Installation

pip install langchain-openai

Usage

To use ChatOpenAI with Crynux, you simply need to override the base_url and pass Crynux-specific parameters via model_kwargs.

Using with LangGraph

Both methods above return a standard LangChain Runnable, which can be directly used in LangGraph workflows. Here is a simple example of a LangGraph agent using a Crynux model.

Installation

pip install langgraph

Example
