Onboarding Guide
A step-by-step playbook for rolling out Spark CLI to your engineering team.
Rolling out Spark
Admin: Create a workspace
Go to spark.memco.ai/dashboard and create a workspace for your team. This gives you:
- A private knowledge layer on top of the public network
- An admin dashboard for managing members and API keys
- Usage analytics and adoption metrics
Admin: Configure authentication
Choose your authentication method:
- OAuth (recommended for developer machines): Configure your identity provider in the dashboard. Developers authenticate via `spark login`, which opens a browser-based OAuth flow.
- API keys (for CI/CD and automation): Generate keys from the dashboard. Each key can be scoped to specific permissions (read-only, read-write, admin).
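The two methods can coexist: a wrapper script can check for a key before falling back to the interactive flow. A minimal sketch, assuming the convention above (the `spark_auth_mode` helper is hypothetical, not part of the CLI):

```bash
#!/bin/sh
# Hypothetical helper: decide which Spark auth method applies in the current
# environment. CI pipelines export SPARK_API_KEY; developer machines usually
# do not, so they fall back to the browser-based `spark login` OAuth flow.
spark_auth_mode() {
  if [ -n "${SPARK_API_KEY:-}" ]; then
    echo "api-key"   # non-interactive: dashboard-issued key, scoped per pipeline
  else
    echo "oauth"     # interactive: `spark login` opens the identity provider
  fi
}
```

A CI job would branch on the result instead of ever prompting for a browser login.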
Each developer: Install Spark CLI
```bash
npm install -g @memco/spark
```
Or use the install script for environments without Node.js:
```bash
curl -fsSL https://raw.githubusercontent.com/memcoai/spark-cli/main/install.sh | bash
```
Each developer: Authenticate
For interactive use on developer machines:
```bash
spark login
```
For CI/CD environments, set the API key as an environment variable:
```bash
export SPARK_API_KEY=sk_your_team_key_here
```
Each developer: Initialize in their project
```bash
cd your-project
spark init
```
This creates a `.spark/settings.json` file with project-level configuration. The `spark init` command detects your project's language and framework to optimize query relevance.
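The exact schema of `.spark/settings.json` may vary by version; the sketch below is shown purely for orientation, and every field name in it is an assumption rather than a documented setting:

```json
{
  "language": "typescript",
  "framework": "react",
  "workspace": "your-team",
  "share": { "requireConfirmation": true }
}
```

Check the file `spark init` actually generates in your project for the real keys.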
Establish a sharing culture
This is the most important step. Spark's value comes directly from what your team shares. Set expectations:
- After solving a non-trivial problem, run `spark share`
- During code review, ask: "Did you share this solution to Spark?"
- In standups, celebrate high-value shares that saved teammates time
- Lead by example — tech leads and senior devs sharing first sets the norm
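One lightweight way to reinforce the habit is a local git hook that nudges authors after bug-fix commits. This is a sketch of an optional, purely local convention, not a Spark feature, and the commit-message matching is deliberately crude:

```bash
#!/bin/sh
# Sketch of a .git/hooks/post-commit nudge (make it executable to enable).
# If the commit subject looks like a bug fix, print a reminder to share it.
share_reminder() {
  case "$1" in
    fix*|Fix*|hotfix*) echo "Non-trivial fix? Consider running: spark share" ;;
  esac
}
share_reminder "$(git log -1 --pretty=%s 2>/dev/null)"
```

Because it only prints a message, the hook never blocks a commit; delete it if the nudge gets noisy.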
Measure adoption and ROI
Track progress in the dashboard at spark.memco.ai/dashboard:
- Queries per developer per day — Are people using it?
- Shares per developer per week — Are people contributing?
- Recommendation hit rate — Are queries returning useful results?
- Feedback ratings — Are the recommendations actually helping?
Getting buy-in
For developers
Lead with the personal benefit: "You'll stop re-solving problems that someone else on the team already fixed." Demonstrate with a real example — solve a known team pain point, share it to Spark, then show a colleague querying for it and finding the answer immediately.
For engineering managers
Lead with the metrics: 40% cost reduction, 34% faster execution, predictable sprint budgets. Frame Spark as a force multiplier — it makes your existing team more productive without adding headcount. The $80,000+ annual savings for a 10-person team is a concrete number to put in a budget proposal.
For security teams
Share the Privacy & Security page. Key points: no source code is transmitted, all sharing is explicit, OAuth 2.0 PKCE authentication, HTTPS-only transport, and credentials stored with restrictive file permissions.
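Security reviewers can spot-check the last point locally. A minimal audit sketch, assuming the credentials live at `~/.spark/credentials` (that path is a guess; check your install for the real location):

```bash
#!/bin/sh
# Sketch: verify a credentials file has owner-only (600) permissions.
# Tries GNU stat first, then falls back to the BSD/macOS flag.
check_perms() {
  mode=$(stat -c '%a' "$1" 2>/dev/null || stat -f '%Lp' "$1")
  if [ "$mode" = "600" ]; then
    echo "ok: $1 is owner-only"
  else
    echo "warn: $1 has mode $mode, expected 600"
  fi
}
[ -f "$HOME/.spark/credentials" ] && check_perms "$HOME/.spark/credentials" || true
```

The same check works for any secret file a pipeline writes to disk.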
Establishing sharing culture
The biggest risk to a Spark rollout isn't technical — it's adoption. Here's what works:
- Start small. Pick 2-3 developers who hit a lot of shared infrastructure problems. Let them build the initial knowledge base.
- Seed the network. Before the wider rollout, have the pilot group share 20-30 solutions covering common team pain points. New users should find useful results on their first query.
- Make sharing part of the definition of done. If a PR fixes a non-trivial bug, the fix should be shared to Spark as part of closing the ticket.
- Recognize contributions. Use the dashboard to identify top contributors. Mention them in team meetings. Knowledge sharing should be visible and valued.
- Don't force it. Mandating shares leads to low-quality noise. Focus on making sharing easy and rewarding, not mandatory.
The single best predictor of a successful Spark rollout is whether the tech lead or a senior developer actively uses it in the first two weeks. Teams follow the example set by their most respected engineers.
Measuring ROI
After 30 days, pull these metrics from the dashboard:
| Metric | What it tells you |
|---|---|
| Total queries | Adoption — are developers using Spark? |
| Recommendation acceptance rate | Quality — is the knowledge base useful? |
| Shares per week | Contribution — is the knowledge base growing? |
| Average query-to-resolution time | Speed — are developers finding answers faster? |
| AI agent token costs (before/after) | Cost — are you saving money on agent usage? |
Compare AI agent costs from the month before Spark adoption to the month after. The 40% cost reduction benchmark gives you a target, but your actual savings depend on how actively your team shares knowledge.
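As a worked example with made-up numbers (substitute your own dashboard figures):

```bash
#!/bin/sh
# Illustrative ROI arithmetic; the dollar figures are placeholders, not data.
before=2000   # monthly AI-agent spend (USD) in the month before rollout
after=1200    # monthly spend (USD) in the month after
saved=$((before - after))
pct=$((100 * saved / before))
echo "Monthly savings: \$${saved} (${pct}% reduction)"
# → Monthly savings: $800 (40% reduction)
```

Multiply the monthly figure by 12 when comparing against an annual budget proposal.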