# Bee — The Progressive Intelligence Engine

> Bee — The Progressive Intelligence Engine — is a tiered LLM platform from CUI Labs (Pte.) Ltd., a Singapore-incorporated company (UEN 202532790K). Seven customer-selectable tiers form the Intelligence Ladder (Cell → Brood → Comb → Buzz → Hive → Swarm → Enclave), covering public access, professional reasoning, builder workflows, team operations, enterprise specialist intelligence, high-assurance reasoning, and sovereign deployment. Bee Cell is live in production; higher tiers ship through a governed-release pipeline under our audited release policy. Bee Ignite is internal R&D and is not customer-selectable. Post-quantum cryptography (FIPS 203/204/205) is wired through QNSP for Bee Enclave Sovereign deployments.

## What is live today (verifiable in code)

- **Bee Cell production inference** — Modal app `bee-cell-prod` (`infra/modal/bee_app.py`). Cold start ~45 s, warm ~1 s. Bee Cell ships on a curated open-weight production base under our governed release policy; no LoRA adapter is merged yet — `registry/adapter_releases.json` is the source of truth. Customers requiring base-level disclosure (regulated/sovereign procurements) receive it under NDA via the Enclave deployment manifest.
- **Bee Security Eval Harness** — 52 cases across 10 categories (`evals/yaml_harness/cybersecurity/cases/`). Latest baseline: 12.5/100 against the live Modal endpoint. The release gate for Stage 1 (APK distribution) is failures dropping below the blocking threshold.
- **Runtime safety wrapper** — Tier-1 CII detection plus DDoS-shape detection in `bee/safety_wrapper.py`, wired in front of `/chat/completions`.
- **CVE / KEV ingestion** — `/api/cron/cve-ingest` (NVD, daily) and `/api/cron/kev-ingest` (CISA KEV, daily). Distillation via a Mistral teacher into a 731-row cybersec corpus on `cuilabs/bee-interactions`.
- **Training pipelines (4, all auto-post to `/api/training/runs`)** — Vertex L4 / A100 / A100-80GB SPOT (Comb production adapters plus Hive / Swarm / Enclave / Ignite tiers); Kaggle T4 ×3 concurrent via the multi-slug dispatcher (Cell / Brood / Comb); Colab T4 (manual fallback); HuggingFace Spaces (background). Approved GCP quotas in us-central1: A100 = 8, A100-80GB = 4, L4 = 4. Concurrent training: ~15 jobs as of 2026-05-05.
- **Vertex tier training** — Comb / Buzz / Hive / Swarm cybersec adapters in training across Vertex L4 / A100 / A100-80GB. Underlying base assignments are governed in `bee/tiers.py` and audited in `docs/governance/current.md`; they are not disclosed in marketing copy.

## What is roadmap (clearly labelled)

- **Bee Comb / Buzz / Hive / Swarm production tiers** — adapters in training; none merged into production routing yet. Enclave is queued behind H100 80GB quota approval. Per-tier base disclosures live in the Enclave customer deployment manifest under NDA.
- **Post-quantum cryptography on every API call** — roadmap. Today, FIPS 203/204/205 (ML-KEM / ML-DSA / SLH-DSA) is wired through QNSP for Bee Enclave Sovereign customer transport only. Public Bee Cell on Modal uses standard TLS 1.3.
- **Quantum hardware integration (IBM Heron r2 / Quantum Inspire / D-Wave)** — code wired in `bee/quantum_ibm.py` and `bee/quantum_reasoning.py`. The production Modal image excludes qiskit; quantum is opt-in / dev / Bee Ignite research only. Customer API calls do NOT route through quantum today.
- **Self-improving / autonomous module invention** — architecture present in `bee/evolution.py` and `bee/invention_engine.py`; not running in production.
- **Multi-agent swarm runs** — architecture present; not running in production.
- **Bee Ignite** — internal R&D substrate (MoE + SSM + neural compression + distillation + quantum-assisted modules). Not customer-selectable, not sold. Findings flow back to Cell / Brood / Comb / Hive over time.
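The live `/chat/completions` route described above accepts OpenAI-shaped requests. A minimal sketch of assembling such a request in Python — the `bee-cell` model id and the `BEE_API_KEY` environment variable are illustrative assumptions, not confirmed identifiers; see /docs for the real values:

```python
import json
import os

API_BASE = "https://api.bee.cuilabs.io"  # drop-in replacement for OpenAI

def build_chat_request(prompt: str, model: str = "bee-cell"):
    """Return (url, headers, body) for an OpenAI-shaped chat completion call."""
    url = f"{API_BASE}/chat/completions"
    headers = {
        "Content-Type": "application/json",
        # Auth scheme is an assumption; check /docs for the actual header.
        "Authorization": f"Bearer {os.environ.get('BEE_API_KEY', '')}",
    }
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }).encode("utf-8")
    return url, headers, body

# The returned triple can be sent with urllib.request, httpx, or any HTTP client.
```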
## Verified sources (every marketing claim is linked from /trust)

- Parent company: CUI Labs (Pte.) Ltd. — Singapore — https://www.cuilabs.io
- HuggingFace org: https://huggingface.co/cuilabs
- Bee Cell ships on a curated open-weight Apache-2.0 base, governed via `docs/governance/current.md` and audited in `registry/adapter_releases.json`. Specific base disclosure is contractual (Enclave deployment manifest) and is kept off public marketing to insulate customers from upstream-vendor noise.
- NIST FIPS 203 (ML-KEM): https://csrc.nist.gov/pubs/fips/203/final · effective Aug 14, 2024
- NIST FIPS 204 (ML-DSA): https://csrc.nist.gov/pubs/fips/204/final · effective Aug 14, 2024
- NIST FIPS 205 (SLH-DSA): https://csrc.nist.gov/pubs/fips/205/final · effective Aug 14, 2024
- IBM Heron r2 (156 qubits): https://en.wikipedia.org/wiki/IBM_Heron
- IBM Quantum Open Plan: https://www.ibm.com/quantum/blog/open-plan-updates
- Compliance posture: CSA STAR Level 1 (parent) · ISO 27001 certification in progress

## Stack snapshot

- Inference backend: Modal serverless (`infra/modal/bee_app.py`).
- Web app: Next.js 16 / React 19 at `apps/web` (deployed to Vercel).
- Mobile: bare React Native CLI at `apps/mobile` (NOT Expo / NOT EAS).
- Database: Supabase Postgres (direct pg pooler, port 6543).
- API base URL: `https://api.bee.cuilabs.io` — drop-in replacement for OpenAI `/chat/completions` (route lives at `apps/web/src/app/api/chat/completions/route.ts`).
- Bee Cell is the single live tier today; Brood / Comb / Buzz / Hive / Swarm / Enclave are roadmap.

## Pricing summary

Source of truth: `apps/web/src/lib/billing/catalog.ts`.

- Bee Cell — Free. Local AI on your hardware. 500K demo tokens.
- Bee Brood — $19/mo. 5M tokens. Cell plus limited Comb routing.
- Bee Comb — $79/mo. 3 seats. 10M tokens. Full Comb access (Bee Comb production base; cybersec adapter trained on Vertex L4, not yet merged into routing).
- Bee Buzz — $149/mo. 6 seats. 25M tokens. Shared RAG.
- Bee Hive — $299/mo. 10 seats. 50M tokens. Bee Hive multi-base ensemble — first cybersec adapter in training on Vertex A100 SPOT (roadmap, ~10h ETA).
- Bee Swarm — $1,499/mo. 25 seats. 250M tokens. Bee Swarm routed fabric (frontier-class) — cybersec adapter in training on Vertex A100-80GB (roadmap, ~14h ETA).
- API Free — $0. 500K tokens / month. Hard-limited.
- API Build — pay as you go. $0.50/1M input · $1.50/1M output.
- API Scale — $500/mo minimum (applied as credit). Dedicated CSM.
- API Enterprise — custom rate card.
- Bee Enclave Private Cloud — from $7,500/mo. Single-tenant VPC.
- Bee Enclave Regulated — from $15,000/mo. Customer-managed keys.
- Bee Enclave Sovereign — from $50,000/mo. Air-gapped, ITAR/IL5/FedRAMP-aligned. PQC-by-default transport via QNSP. Base disclosure under customer deployment manifest plus NDA.

Annual pricing is monthly × 12.

## Pages

- [Homepage](https://bee.cuilabs.io/): Hero, pillars, model lineup, comparison, pricing.
- [Models](https://bee.cuilabs.io/models): Seven customer tiers — Cell live; Brood / Comb / Buzz / Hive / Swarm / Enclave on the governed-release roadmap. Ignite is internal R&D.
- [Platform](https://bee.cuilabs.io/platform): Multi-cloud training pipeline, editor integrations, capture pipeline.
- [Quantum](https://bee.cuilabs.io/quantum): IBM Heron r2 integration scope; local simulator quickstart. Production quantum routing is roadmap.
- [Pricing](https://bee.cuilabs.io/pricing): Workspace · API · Enterprise Enclave.
- [Security](https://bee.cuilabs.io/security): FIPS 203/204/205 scope (Sovereign deployments today; broader rollout on the roadmap), audit posture, vulnerability disclosure.
- [Docs](https://bee.cuilabs.io/docs): API quickstart, authentication, chat completions, RAG, streaming, errors.
- [Changelog](https://bee.cuilabs.io/changelog): Public release notes (Stage 0 safety + audit, Modal cutover, eval harness, CVE/KEV pipeline).
- [About](https://bee.cuilabs.io/about): Company manifesto, CUI Labs facts.
- [Trust & Evidence](https://bee.cuilabs.io/trust): Every marketing claim with its citation.
- [Contact](https://bee.cuilabs.io/contact): bee-*@cuilabs.io inboxes with response-time SLAs.
- [Legal](https://bee.cuilabs.io/legal/terms): Terms · Privacy · DPA · AUP · Security · Cookies.

## How Bee compares

| Capability | Bee | OpenAI | Anthropic |
|---|---|---|---|
| Post-quantum cryptography (Sovereign deployments via QNSP) | Yes | No | No |
| Real quantum hardware integration code (IBM Heron r2) | Yes (research) | No | No |
| Self-improving / autonomous module invention | Architecture only | No | No |
| Air-gapped / sovereign deployment (Bee Enclave Sovereign) | Yes | Limited | Limited |
| OpenAI-compatible API | Yes | Yes | No |
| Free hosted token tier on the API | Yes | No | No |
| Open model weights (Bee Cell base) | Yes | No | No |

## Editor integrations

- VS Code — Bee Intelligence Engine extension (`cuilabs.bee-intelligence`).
- Cursor — set the OpenAI base URL to `https://api.bee.cuilabs.io/v1`.
- Windsurf — custom provider; paste the base URL and API key.
- Continue.dev — add Bee as a provider in `~/.continue/config.json`.
- Zed — OpenAI-compatible LLM provider.
- JetBrains AI Assistant — connect via an OpenAI-compatible custom endpoint.
- Aider — `aider --openai-api-base ... --openai-api-key ...`.

## Contacts

- Sales: bee-sales@cuilabs.io
- Support: bee-support@cuilabs.io
- Security: bee-security@cuilabs.io
- Press: bee-press@engic.ltd

## Key URLs

- Site: https://bee.cuilabs.io
- Workspace: https://bee.cuilabs.io/app/chat
- API: https://api.bee.cuilabs.io
- HuggingFace: https://huggingface.co/cuilabs
- Sitemap: https://bee.cuilabs.io/sitemap.xml
- robots.txt: https://bee.cuilabs.io/robots.txt
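For the Continue.dev integration listed under editor integrations, a sketch of the shape a `~/.continue/config.json` model entry might take — the `bee-cell` model id and the placeholder key are assumptions, and the field names should be checked against Continue's current configuration docs:

```json
{
  "models": [
    {
      "title": "Bee",
      "provider": "openai",
      "model": "bee-cell",
      "apiBase": "https://api.bee.cuilabs.io/v1",
      "apiKey": "YOUR_BEE_API_KEY"
    }
  ]
}
```

The same base URL and key pair should work for any of the OpenAI-compatible editors above.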