
OpenClaw - Self-Hosted AI Assistant Platform

Project Introduction

OpenClaw is an open-source, self-hosted personal AI assistant platform that bridges messaging apps to AI agents running on your own hardware. It is designed for developers and power users who want autonomous AI assistants without surrendering control of their data.

🌟 Core Features

Multi-Channel Integration

  • Full Platform Coverage: Supports Lark (Feishu), Discord, Slack, Microsoft Teams, and more
  • Single Gateway: Manage all channels through a single Gateway process
  • Voice Support: Voice interaction on macOS/iOS/Android
  • Canvas Interface: Render interactive Canvas interfaces

Self-Hosted & Data Security

  • Fully Self-Hosted: Runs on your own machine or server
  • Open Source: MIT licensed, fully transparent code
  • Local Data Storage: Context and skills stored on your local computer, not in the cloud

Intelligent Agent Capabilities

  • Always-On: Runs persistently in the background and retains memory across sessions
  • Scheduled Tasks: Supports cron scheduled tasks
  • Session Isolation: Sessions isolated per agent/workspace/sender
  • Multi-Agent Routing: Supports multi-agent collaborative work
  • Tool Calling: Native support for tool calling and code execution
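The session-isolation model above can be illustrated with a small conceptual sketch: each (agent, workspace, sender) triple gets its own independent conversation history, so messages from different senders never mix. The class and method names here are illustrative only, not OpenClaw's internal API:

```python
# Conceptual sketch of per-(agent, workspace, sender) session isolation.
# These names are hypothetical; they are not OpenClaw's actual internals.
from collections import defaultdict


class SessionStore:
    def __init__(self):
        # One independent message history per (agent, workspace, sender) key.
        self._sessions = defaultdict(list)

    def append(self, agent: str, workspace: str, sender: str, message: str) -> None:
        self._sessions[(agent, workspace, sender)].append(message)

    def history(self, agent: str, workspace: str, sender: str) -> list:
        return self._sessions[(agent, workspace, sender)]


store = SessionStore()
store.append("main", "ws1", "alice", "hello")
store.append("main", "ws1", "bob", "hi there")
# Alice's session is isolated from Bob's:
print(store.history("main", "ws1", "alice"))  # ['hello']
```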

📦 Installation

Requirements

  • Node.js 22 or higher
  • An AI model API key

Install the CLI globally:

npm install -g openclaw@latest

After installation, run the onboarding wizard:

openclaw onboard

🚀 Configuration

Configuration File Location

The OpenClaw configuration file is located at ~/.openclaw/config.json. It can be auto-generated through the onboarding wizard or manually edited.
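Hand edits to this file are easy to break, so a quick parse check before restarting catches malformed JSON early. A minimal sketch, assuming the default path above; this helper is not part of the OpenClaw CLI:

```python
import json
from pathlib import Path


def load_config(path: Path) -> dict:
    """Parse the OpenClaw config file, failing loudly on malformed JSON."""
    try:
        return json.loads(path.read_text(encoding="utf-8"))
    except json.JSONDecodeError as exc:
        raise SystemExit(f"Invalid JSON in {path}: line {exc.lineno}: {exc.msg}")


# Default location from the docs above:
# config = load_config(Path.home() / ".openclaw" / "config.json")
```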

Configuration Example

Here is a complete configuration example using New API as the model provider:

{
  "meta": {
    "lastTouchedVersion": "2026.2.1",
    "lastTouchedAt": "2026-02-03T12:17:41.559Z"
  },
  "wizard": {
    "lastRunAt": "2026-02-02T21:17:16.011Z",
    "lastRunVersion": "2026.2.1",
    "lastRunCommand": "onboard",
    "lastRunMode": "local"
  },
  "auth": {
    "cooldowns": {
      "billingBackoffHoursByProvider": {}
    }
  },
  "models": {
    "providers": {
      "newapi": {
        "baseUrl": "https://your-newapi-domain.com/v1",
        "apiKey": "sk-your-api-key",
        "auth": "api-key",
        "api": "openai-completions",
        "models": [
          {
            "id": "gemini-3-flash-preview",
            "name": "gemini-3-flash-preview",
            "api": "openai-completions",
            "reasoning": true,
            "input": [
              "text",
              "image"
            ],
            "cost": {
              "input": 0,
              "output": 0,
              "cacheRead": 0,
              "cacheWrite": 0
            },
            "contextWindow": 128000,
            "maxTokens": 64000
          },
          {
            "id": "kimi-k2.5",
            "name": "kimi-k2.5",
            "api": "openai-completions",
            "reasoning": true,
            "input": [
              "text",
              "image"
            ],
            "cost": {
              "input": 0,
              "output": 0,
              "cacheRead": 0,
              "cacheWrite": 0
            },
            "contextWindow": 128000,
            "maxTokens": 64000
          }
        ]
      }
    },
    "bedrockDiscovery": {
      "providerFilter": []
    }
  },
  "agents": {
    "defaults": {
      "model": {
        "primary": "newapi/gemini-3-flash-preview",
        "fallbacks": [
          "newapi/kimi-k2.5"
        ]
      },
      "models": {
        "newapi/gemini-3-flash-preview": {
          "alias": "gemini-3-flash-preview"
        },
        "newapi/kimi-k2.5": {
          "alias": "kimi-k2.5"
        }
      },
      "workspace": "/home/your-username/.openclaw/workspace",
      "maxConcurrent": 4,
      "subagents": {
        "maxConcurrent": 8
      }
    }
  },
  "messages": {
    "ackReactionScope": "group-mentions"
  },
  "commands": {
    "native": "auto",
    "nativeSkills": "auto"
  },
  "channels": {
    "lark": {
      "enabled": true,
      "dmPolicy": "pairing",
      "appId": "your-lark-app-id",
      "appSecret": "your-lark-app-secret",
      "groupPolicy": "allowlist",
      "streamMode": "partial"
    }
  },
  "gateway": {
    "port": 18789,
    "mode": "local",
    "bind": "loopback",
    "auth": {
      "mode": "token",
      "token": "your-secure-token"
    },
    "tailscale": {
      "mode": "off",
      "resetOnExit": false
    }
  },
  "skills": {
    "install": {
      "nodeManager": "npm"
    }
  }
}
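One easy mistake when editing this file by hand is pointing agents.defaults.model at a provider/model-id that is never declared under models.providers. A small cross-check sketch (my own helper, not an OpenClaw command) that flags any such dangling references:

```python
def undeclared_models(config: dict) -> list:
    """Return agent model refs (provider/model-id) with no matching provider entry."""
    declared = {
        f"{provider}/{m['id']}"
        for provider, pconf in config.get("models", {}).get("providers", {}).items()
        for m in pconf.get("models", [])
    }
    model_cfg = config.get("agents", {}).get("defaults", {}).get("model", {})
    refs = [model_cfg.get("primary"), *model_cfg.get("fallbacks", [])]
    return [r for r in refs if r and r not in declared]
```

With the example config above, undeclared_models returns an empty list, since both gemini-3-flash-preview and kimi-k2.5 are declared under the newapi provider.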

Key Configuration Details

| Configuration Item | Description |
| --- | --- |
| models.providers.newapi.baseUrl | New API deployment address; must include /v1 |
| models.providers.newapi.apiKey | New API token key |
| models.providers.newapi.models | Model list; add multiple models as needed |
| agents.defaults.model.primary | Default primary model, in provider/model-id format |
| agents.defaults.model.fallbacks | Fallback model list; auto-switches when the primary is unavailable |
| channels.lark.appId | Lark (Feishu) App ID, obtained from the Lark Open Platform |
| channels.lark.appSecret | Lark (Feishu) App Secret |
| gateway.port | Gateway listening port |
| gateway.auth.token | Gateway access security token |
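Since the provider entry uses the openai-completions API, the baseUrl and apiKey pair can be smoke-tested against the OpenAI-compatible /v1/models listing endpoint before starting OpenClaw. A standard-library sketch; the domain and key below are the placeholders from the example config, not real values:

```python
import json
import urllib.request


def models_request(base_url: str, api_key: str) -> urllib.request.Request:
    """Build a GET request for the OpenAI-compatible model listing endpoint."""
    return urllib.request.Request(
        f"{base_url.rstrip('/')}/models",
        headers={"Authorization": f"Bearer {api_key}"},
    )


def list_models(base_url: str, api_key: str) -> list:
    """Fetch and return the model ids advertised by the endpoint."""
    with urllib.request.urlopen(models_request(base_url, api_key), timeout=10) as resp:
        return [m["id"] for m in json.load(resp).get("data", [])]


# Example with the placeholders from the config above:
# list_models("https://your-newapi-domain.com/v1", "sk-your-api-key")
```

If the call fails with an authentication error, re-check apiKey; if it fails to connect, re-check that baseUrl includes the /v1 suffix.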

Start the Service

After configuration is complete, start OpenClaw:

openclaw start

Once started, you can interact with the AI assistant through the configured channels.
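To confirm the Gateway actually came up on its configured port (18789, bound to loopback in the example config), a plain TCP connect check is enough. This is a generic reachability probe, not an OpenClaw health API:

```python
import socket


def gateway_listening(host: str = "127.0.0.1", port: int = 18789) -> bool:
    """Return True if something accepts TCP connections on the gateway port."""
    try:
        with socket.create_connection((host, port), timeout=2):
            return True
    except OSError:
        return False
```

A False result after openclaw start usually means the process exited early or the gateway.port / gateway.bind values differ from what you are probing.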
