How to Run an LLM Using the Crynux Network
from openai import OpenAI

client = OpenAI(
    base_url="https://bridge.crynux.io/v1/llm",
    # For public demonstration only; a strict rate limit is applied.
    api_key="q3hXHA_8O0LuGJ1_tou4_KamMlQqAo-aYwyAIDttdmI=",
    timeout=60,
    max_retries=1,
)

res = client.chat.completions.create(
    model="Qwen/Qwen2.5-7B-Instruct",
    messages=[
        {
            "role": "user",
            "content": "What is the capital of France?",
        },
    ],
    stream=False,
    extra_body={
        # GPU VRAM requirement (in GB) for the node that runs the task.
        "vram_limit": 24,
    },
)
print(res)
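Since the bridge exposes an OpenAI-compatible API, the call above is equivalent to POSTing a JSON body to the chat completions endpoint. The sketch below shows how the request is assembled; note that fields passed via `extra_body` (such as `vram_limit`) are merged into the top level of the JSON body. The exact endpoint path shown in the comment is an assumption based on the OpenAI API convention.

```python
import json

# A sketch of the JSON request body the OpenAI client sends, assuming the
# standard chat completions route (base_url + "/chat/completions").
payload = {
    "model": "Qwen/Qwen2.5-7B-Instruct",
    "messages": [
        {"role": "user", "content": "What is the capital of France?"},
    ],
    "stream": False,
    # extra_body fields are merged into the top-level request body:
    "vram_limit": 24,
}

print(json.dumps(payload, indent=2))
```

This makes it straightforward to call the bridge from any HTTP client, not just the official OpenAI SDK.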