
Documentation Index

Fetch the complete documentation index at: https://api-docs.ollang.com/llms.txt

Use this file to discover all available pages before exploring further.

How It All Connects

Ollang exposes the same underlying platform through four integration surfaces. They are complementary, not exclusive — most teams use two or three of them together.
┌──────────────────────┐
│  Olabs Dashboard     │ ← humans: create orders, configure workflows, review
│  (lab.ollang.com)    │
└──────────────────────┘

┌──────────────────────────────────────────────────────────────┐
│       Ollang Integration API (api-integration.ollang.com)    │
│  REST endpoints for uploads, orders, projects, QC, revisions,│
│  human review, custom instructions, folders, callbacks.      │
└──────────────────────────────────────────────────────────────┘
       ▲                ▲                ▲                ▲
       │                │                │                │
┌──────┴──────┐  ┌──────┴──────┐  ┌──────┴──────┐  ┌──────┴──────┐
│  Your code  │  │  Ollang MCP │  │  Ollang SDK │  │ Ollang      │
│  (curl,     │  │ (Cloudflare,│  │  (npm,      │  │ Skills      │
│   HTTP libs)│  │  OAuth 2)   │  │ TypeScript) │  │ (file-based)│
└─────────────┘  └─────────────┘  └─────────────┘  └─────────────┘

The Four Integration Surfaces

REST API

Programmatic access. Works in any language with an HTTP client. API-key authenticated.
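For instance, creating an order from TypeScript needs nothing beyond fetch. This is a sketch, not an official client: the endpoint, header, and body fields mirror the curl example later on this page, while the function names and error handling are our own.

```typescript
// Build the body for POST /integration/orders/create.
// Field names mirror the curl example on this page.
function orderBody(projectId: string, languages: string[]) {
  return {
    orderType: "subtitle",
    level: 0,
    projectId,
    targetLanguageConfigs: languages.map((language) => ({ language })),
    autoQc: true,
  };
}

// Thin wrapper around fetch; any HTTP client in any language works the same way.
async function createOrders(apiKey: string, projectId: string, languages: string[]) {
  const res = await fetch("https://api-integration.ollang.com/integration/orders/create", {
    method: "POST",
    headers: { "X-Api-Key": apiKey, "Content-Type": "application/json" },
    body: JSON.stringify(orderBody(projectId, languages)),
  });
  if (!res.ok) throw new Error(`Ollang API error: HTTP ${res.status}`);
  return res.json();
}
```

The same two-call pattern (build a JSON body, POST it with the X-Api-Key header) applies to every endpoint listed in the diagram above.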

MCP Server

Hosted Model Context Protocol endpoint. Plug into Claude Desktop, Cursor, Claude Code, Devin, Replit, Windsurf, and other MCP clients. OAuth 2.0 + PKCE.

SDK

TypeScript/Node.js library (@ollang-dev/sdk) for asset scanning, i18n workflows, CMS capture, and a typed REST client.

Skills

File-based Agent Skills bundle that teaches your AI coding agent how to call Ollang. No proxy server, no MCP needed.

When to Use Which

| Scenario | Best fit |
| --- | --- |
| Build a translation pipeline from a backend or CI job | REST API (any language) or SDK (if Node.js) |
| Let translators / PMs trigger work from their AI assistant (Claude, Cursor, Devin) without writing code | MCP |
| Localize an internal Next.js / React / Vue app’s i18n files | SDK Asset Management |
| Localize CMS content (Strapi, custom) captured live in the browser | SDK Browser SDK |
| Add translation to an existing AI coding workflow without running a server | Skills |
| Trigger orders from no-code/low-code tools (Airtable, Notion) | MCP with the matching integration |
| Run Ollang as part of a workflow editor / agent runtime | MCP (preferred) or REST API if you can’t run MCP |

Authentication at a Glance

| Surface | Auth |
| --- | --- |
| REST API | API key in the X-Api-Key header. Generate one in the Olabs Dashboard. |
| MCP Server | OAuth 2.0 + PKCE. Sign in with email OTP, Google, or GitHub the first time you connect. |
| SDK | API key (via the OLLANG_API_KEY env var or a constructor argument). |
| Skills | API key. The agent reads OLLANG_API_KEY from your environment or asks you to provide one in chat. |
All MCP traffic still flows through the same Integration API behind the scenes — MCP is a protocol layer that translates conversational tool calls into Ollang REST calls.
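Since every API-key surface (REST, SDK, Skills) reads the same OLLANG_API_KEY variable, a small helper keeps the header consistent across your own code. The header name and env var come from the table above; the helper itself is just an illustrative convention, not part of any Ollang library.

```typescript
// Build the auth headers for Ollang REST calls.
// X-Api-Key and OLLANG_API_KEY are documented above; this helper is illustrative.
function ollangHeaders(apiKey: string = process.env.OLLANG_API_KEY ?? ""): Record<string, string> {
  if (!apiKey) {
    throw new Error("Missing API key: set OLLANG_API_KEY or pass a key explicitly");
  }
  return { "X-Api-Key": apiKey, "Content-Type": "application/json" };
}
```

Failing fast on a missing key turns a confusing 401 from the API into an immediate, local error.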

A Concrete End-to-End Example

A typical localization flow, shown below as raw curl against the REST API; the MCP, SDK, and Skills surfaces drive exactly the same two endpoints:
# 1. Upload
curl -X POST "https://api-integration.ollang.com/integration/upload/direct" \
  -H "X-Api-Key: <your-api-key>" \
  -F "file=@sample.mp4" \
  -F "name=sample" \
  -F "sourceLanguage=en"
# → { "projectId": "<project-id>" }

# 2. Create subtitle order for French + Spanish
curl -X POST "https://api-integration.ollang.com/integration/orders/create" \
  -H "X-Api-Key: <your-api-key>" \
  -H "Content-Type: application/json" \
  -d '{
    "orderType": "subtitle",
    "level": 0,
    "projectId": "<project-id>",
    "targetLanguageConfigs": [{"language":"fr"},{"language":"es"}],
    "autoQc": true
  }'
# → [{ "orderId": "<id-fr>" }, { "orderId": "<id-es>" }]

In all four cases:
  1. The same upload endpoint runs (POST /integration/upload/direct).
  2. The same order endpoint runs (POST /integration/orders/create).
  3. The same project / order records are created in your Ollang account.
  4. The same callback fires (if you set one).
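A minimal Node receiver for that callback might look like the sketch below. It assumes the callback arrives as an HTTP POST with a JSON body, and the payload fields shown are hypothetical; check the callback documentation for the real schema.

```typescript
import { createServer } from "node:http";

// Parse and react to a callback body. The payload shape is hypothetical;
// consult the callback docs for the real fields.
function handleCallback(raw: string): { ok: boolean } {
  const payload = JSON.parse(raw);
  // e.g. mark the order as done in your own system
  console.log("Ollang callback:", payload);
  return { ok: true };
}

const server = createServer((req, res) => {
  let body = "";
  req.on("data", (chunk) => (body += chunk));
  req.on("end", () => {
    const result = handleCallback(body);
    res.writeHead(200, { "Content-Type": "application/json" });
    res.end(JSON.stringify(result));
  });
});

// server.listen(3000); // expose this URL as your callback target
```

Responding 200 quickly and doing heavy work asynchronously is the usual pattern for webhook-style callbacks.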

Workflows Apply Equally Everywhere

The workflow layer (which AI providers run, which review gates apply, which custom instructions are used) is configured in the Olabs Dashboard at the Global or Folder level. It applies the same way no matter which surface created the order. See Workflows and Provider Architecture. This means:
  • A developer creating an order via the REST API gets the same provider routing and review-gate behavior as a PM creating one in the dashboard, or an AI agent creating one via MCP.
  • Changing a folder workflow affects all future orders into that folder, regardless of where they originate.

Picking a Path — Quick Recommendations

  • Just exploring? Run through Getting Started with curl.
  • Shipping production code? Use the REST API or, on Node.js, the SDK.
  • AI-driven team workflows? Connect MCP once and any compatible client can use Ollang.
  • AI-driven developer workflows in your editor? Skills — one command, no server.