Private OpenCode

The first open-source AI coding agent with secure inference. Built on OpenCode, extended with attested TLS so your code never leaves your device unencrypted.

[Screenshot: Private OpenCode terminal interface showing the coding agent with secure inference]

How is this different from OpenCode?

OpenCode is an open-source AI coding agent — like Claude Code, but provider-agnostic and 100% open source. It connects to any LLM API (Claude, OpenAI, local models) and gives you a powerful terminal-based coding assistant.

Private OpenCode adds a critical security layer: attested TLS (aTLS). When you connect to a model running inside a Trusted Execution Environment (TEE), the TLS handshake verifies the remote server's TEE attestation before any data is exchanged. This means:

  • The model is provably running inside an enclave that matches the expected configuration
  • Your code is encrypted in transit and only decrypted inside the TEE
  • Not even the infrastructure provider can read your prompts or responses
  • Verification is cryptographic, not based on trust

Regular OpenCode trusts the API provider to handle your data responsibly. Private OpenCode removes that trust requirement entirely — the security is enforced by hardware attestation.
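Conceptually, the gate that hardware attestation enforces can be sketched as a measurement comparison. This is a minimal illustration, not the actual Private OpenCode implementation — the field names are assumed, and the real check runs inside the TLS handshake before any request data is sent:

```python
import hmac

def enclave_matches_policy(policy: dict, attestation: dict) -> bool:
    """Accept the connection only if the attested enclave measurement
    equals the measurement pinned in the local policy file.
    (Illustrative sketch; real aTLS verifies a signed attestation
    report during the handshake.)"""
    return hmac.compare_digest(
        policy["measurement"].encode(),
        attestation["measurement"].encode(),
    )

expected = {"measurement": "aaaa1111"}   # pinned in the local policy file
genuine  = {"measurement": "aaaa1111"}   # reported by the verified TEE
tampered = {"measurement": "bbbb2222"}   # wrong enclave or configuration

assert enclave_matches_policy(expected, genuine)        # handshake proceeds
assert not enclave_matches_policy(expected, tampered)   # connection refused
```

Because the comparison is against a cryptographically signed attestation rather than a provider's promise, a mismatched enclave fails closed: no prompt or code is ever transmitted to an unverified endpoint.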

Installation

Install Private OpenCode
curl -fsSL https://raw.githubusercontent.com/concrete-security/private-opencode/dev/private-install.sh | bash

Then navigate to your project and run:

private-opencode

Configuration

Create .opencode/opencode.jsonc at the root of your project. The policyFile points to the TEE measurement and configuration generated by your provider — this is what aTLS verifies during the handshake.

// .opencode/opencode.jsonc
{
  "$schema": "https://opencode.ai/config.json",
  "model": "concrete-security/secure-gpt-oss-120b",
  "provider": {
    "opencode": { "options": {} },
    "concrete-security": {
      "npm": "@concrete-security/private-ai-sdk",
      "api": "https://vllm.concrete-security.com/v1",
      "options": {
        "sdk": "@ai-sdk/openai-compatible",
        "policyFile": "/absolute/path/to/cvm_policy.json"
      },
      "models": {
        "secure-gpt-oss-120b": {
          "id": "openai/gpt-oss-120b",
          "name": "Secure gpt-oss-120b"
        }
      }
    }
  }
}
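The cvm_policy.json itself is generated by your provider and its exact schema is provider-specific — use the file you are given, unmodified. Purely for orientation, a policy of this kind pins the expected enclave measurement and configuration, along the lines of:

```jsonc
// Illustrative only — field names here are hypothetical;
// the real schema comes from your provider's generated policy file.
{
  "measurement": "9f2a…",   // expected TEE measurement to verify against
  "config": {}              // expected enclave configuration
}
```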

Open source

Fork of OpenCode with full source available. Audit the code, verify the behavior, contribute improvements.

Attested TLS (aTLS)

Before any code is sent, the TLS handshake verifies the remote TEE attestation. If the enclave doesn't match the expected configuration, the connection is refused.

Local-first

Runs entirely on your machine. Your codebase never leaves your device unencrypted — plaintext only exists inside the verified TEE.

Provider-agnostic

Works with any provider running a model inside a TEE. Point it at any Umbra-deployed CVM (confidential VM) and the aTLS policy file handles verification.
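For example, switching to a different TEE deployment only means swapping the endpoint and policy file inside the provider block of the configuration shown above — the URL below is a placeholder for your own deployment, not a real endpoint:

```jsonc
// Hypothetical endpoint — substitute your own TEE deployment
// and the policy file it generated.
"concrete-security": {
  "npm": "@concrete-security/private-ai-sdk",
  "api": "https://your-tee-endpoint.example.com/v1",
  "options": {
    "sdk": "@ai-sdk/openai-compatible",
    "policyFile": "/absolute/path/to/your_cvm_policy.json"
  }
}
```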