
Privacy & Security

This page is designed to be shared with your security and compliance team. It documents exactly what data Spark sends, what it never sends, and how authentication and transport are handled.

Explicit sharing model

Spark operates on an explicit sharing model: data leaves your machine only when you run a Spark command, and spark share is the only command that adds data to the knowledge network. There is no automatic, silent, or background data collection.

Nothing is sent to the Spark API unless you explicitly run a command (spark query, spark share, spark feedback).

What IS sent

When you use Spark, only the following data is transmitted:

Command          Data sent
spark query      The query text you provide (e.g., an error message or problem description)
spark share      The solution text you explicitly write, a title, and semantic tags
spark feedback   A rating (thumbs up/down) and an optional comment on a recommendation

All data sent is text you explicitly typed or chose to share. Spark never reads or transmits anything from your filesystem automatically.

What is NEVER sent

Spark does not access, read, or transmit any of the following:

  • Source code files
  • Configuration files (.env, docker-compose.yml, webpack.config.js, etc.)
  • Environment variables
  • API credentials, secrets, or tokens
  • Database contents or connection strings
  • File system paths or directory structures
  • Log files (unless you explicitly copy log text into a query or share)
  • Git history or diffs
  • Package lock files or dependency trees
⚠️ Be mindful of what you paste into spark query and spark share. If you copy-paste a stack trace that contains file paths or environment details, that text will be sent. Spark does not scrub or redact your input: you control what you share.
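Because Spark sends your input verbatim, a quick client-side check before sharing can catch obvious secrets. A rough sketch of such a pre-share check (the patterns below are illustrative examples, not an exhaustive rule set, and not part of Spark itself):

```python
import re

# A few illustrative secret patterns -- real scanners use much larger rule sets.
SECRET_PATTERNS = [
    re.compile(r"sk_[A-Za-z0-9]{8,}"),             # API-key-style tokens
    re.compile(r"(?i)aws_secret_access_key\s*="),  # env-style credentials
    re.compile(r"-----BEGIN [A-Z ]*PRIVATE KEY-----"),
]

def looks_sensitive(text: str) -> bool:
    """Return True if the text matches any obvious secret pattern."""
    return any(p.search(text) for p in SECRET_PATTERNS)
```

Running pasted text through a check like this before spark share is cheap insurance; it does not replace careful review of what you paste.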

Data flow

┌──────────────┐                         ┌───────────┐               ┌────────────┐
│  Developer   │  spark query "error..." │  Spark    │  semantic     │  Knowledge │
│  (CLI)       │ ──────────────────────► │  API      │  search       │  Network   │
│              │                         │  (HTTPS)  │ ────────────► │            │
│              │ ◄────────────────────── │           │ ◄──────────── │            │
│              │  recommendations        │           │  matches      │            │
└──────────────┘                         └───────────┘               └────────────┘

┌──────────────┐                         ┌───────────┐               ┌────────────┐
│  Developer   │  spark share ...        │  Spark    │  store        │  Knowledge │
│  (CLI)       │ ──────────────────────► │  API      │ ────────────► │  Network   │
│              │                         │  (HTTPS)  │               │            │
└──────────────┘                         └───────────┘               └────────────┘

Authentication

OAuth 2.0 PKCE flow

Interactive authentication uses the OAuth 2.0 Authorization Code flow with PKCE (Proof Key for Code Exchange):

  • No long-lived tokens are stored
  • Token refresh happens automatically with a 5-minute expiration buffer — tokens are refreshed before they expire, not after
  • The PKCE flow prevents authorization code interception attacks
  • No client secret is required on the developer's machine
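The two mechanisms above can be sketched in a few lines: the S256 verifier/challenge pair from RFC 7636, and an expiry check with a 5-minute buffer. The function names are illustrative, not Spark's actual internals:

```python
import base64
import hashlib
import secrets
import time

def make_pkce_pair() -> tuple[str, str]:
    """Generate a PKCE code_verifier and its S256 code_challenge (RFC 7636)."""
    # 32 random bytes -> 43-char URL-safe verifier, padding stripped
    verifier = base64.urlsafe_b64encode(secrets.token_bytes(32)).rstrip(b"=").decode()
    digest = hashlib.sha256(verifier.encode("ascii")).digest()
    challenge = base64.urlsafe_b64encode(digest).rstrip(b"=").decode()
    return verifier, challenge

def needs_refresh(expires_at: float, buffer_s: float = 300.0) -> bool:
    """Refresh when within 5 minutes of expiry, so tokens never go stale mid-request."""
    return time.time() >= expires_at - buffer_s
```

The challenge travels with the authorization request; the verifier stays local and is sent only when exchanging the authorization code for tokens, so an intercepted code is useless on its own.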

API key authentication (CI/CD)

For non-interactive environments (CI/CD pipelines, automated scripts), use an API key:

export SPARK_API_KEY=sk_...

  • API keys are set as environment variables and never written to disk by Spark
  • Keys can be created and revoked from the dashboard at spark.memco.ai/dashboard
  • Use scoped keys with minimal permissions for CI/CD
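In a script, the key is read from the environment at request time and never touches disk. A minimal sketch, assuming a bearer-token header (the header shape is an assumption, not Spark's documented wire format):

```python
import os

def api_key_headers() -> dict[str, str]:
    """Build auth headers from SPARK_API_KEY; fail loudly if it is unset."""
    key = os.environ.get("SPARK_API_KEY")
    if not key:
        raise RuntimeError("SPARK_API_KEY is not set; create a scoped key in the dashboard")
    return {"Authorization": f"Bearer {key}"}
```

Failing loudly on a missing key keeps CI/CD runs from silently falling back to unauthenticated requests.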

Transport security

  • All communication uses HTTPS (TLS 1.2+)
  • There is no HTTP fallback — the CLI will not send data over an unencrypted connection
  • Certificate validation is enforced; self-signed certificates are not accepted by default
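A no-plaintext rule like this is typically enforced with a scheme check before any request leaves the process. A sketch of that guard (the function name is illustrative):

```python
from urllib.parse import urlsplit

def assert_https(url: str) -> str:
    """Refuse to send anything over an unencrypted connection -- no HTTP fallback."""
    scheme = urlsplit(url).scheme.lower()
    if scheme != "https":
        raise ValueError(f"refusing non-HTTPS endpoint: {url}")
    return url
```

Raising instead of rewriting the URL means a misconfigured endpoint fails visibly rather than being silently "upgraded".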

Local credential storage

Item                  Location                 Permissions
Global credentials    ~/.spark/settings.json   0o600 (owner read/write only)
Global directory      ~/.spark/                0o700 (owner access only)
Project credentials   ./.spark/settings.json   0o600 (owner read/write only)
Project directory     ./.spark/                0o700 (owner access only)

File permissions are set at creation time and verified on each access. If permissions have been loosened (e.g., by a file sync tool), Spark will warn before proceeding.
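The create-with-tight-permissions-then-verify pattern can be sketched like this, assuming POSIX permission bits (the helper names are illustrative, not Spark's internals):

```python
import os
import stat
from pathlib import Path

def write_settings(path: Path, data: str) -> None:
    """Create the credentials file with 0o600 from the start, not chmod-after-write."""
    path.parent.mkdir(mode=0o700, parents=True, exist_ok=True)
    fd = os.open(path, os.O_WRONLY | os.O_CREAT | os.O_TRUNC, 0o600)
    with os.fdopen(fd, "w") as f:
        f.write(data)

def permissions_ok(path: Path) -> bool:
    """Verify on each access that no group/other bits have been loosened."""
    mode = stat.S_IMODE(path.stat().st_mode)
    return mode & 0o077 == 0
```

Passing the mode to os.open avoids the window where a chmod-after-write approach would briefly leave the file world-readable.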

Summary for security review

Concern                     Status
Automatic data collection   None — all sharing is explicit
Source code transmission    Never
Credential storage          File permissions 0o600, directory 0o700
Authentication              OAuth 2.0 PKCE (interactive), API key (CI/CD)
Transport                   HTTPS only, no HTTP fallback
Token management            Auto-refresh with 5-min buffer, no long-lived tokens
Data sent                   Only text explicitly provided by the user