POST /analyze
Example request:

curl -X POST http://localhost:3001/analyze \
  -H "Content-Type: application/json" \
  -d '{
    "githubUrl": "https://github.com/ripple/rippled",
    "twitterHandle": "@ripple"
  }'

Example response (201 Created):

{
  "success": true,
  "data": {
    "id": "a1b2c3d4-e5f6-7890-abcd-ef1234567890",
    "status": "pending"
  }
}

Request

githubUrl
string
required
Full GitHub URL of the repository to analyze (e.g., https://github.com/ripple/rippled).
twitterHandle
string
Optional Twitter handle for social signal analysis (e.g., @ripple).

Response

Returns immediately with a report ID. The analysis runs asynchronously in the background.
success
boolean
Always true on a 201 response.
data
object
Contains the report id (use it to poll for results) and its initial status ("pending").

What Happens Next

The analysis pipeline runs these steps in order:
  1. Scraping — GitHub data via Octokit (commits, contributors, languages, CI)
  2. Social scraping — Twitter data (currently mocked)
  3. Polymarket — Fetches market sentiment (non-critical)
  4. Claude scoring — AI evaluates code quality, team strength, traction, social presence
  5. Adversarial audit — Second Claude call challenges the scores (non-critical)
Poll GET /report/:id/score to track progress.
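The polling step can be sketched in Python as below. Only the "pending" status appears on this page, so the terminal statuses ("completed", "failed") and the score endpoint's response shape (assumed to mirror the /analyze envelope) are assumptions:

```python
import json
import time
import urllib.request

BASE_URL = "http://localhost:3001"  # assumed to match the curl examples above

# "pending" comes from the /analyze response; the terminal states are assumed.
TERMINAL_STATUSES = {"completed", "failed"}

def is_terminal(status: str) -> bool:
    """True once the background pipeline has stopped running."""
    return status in TERMINAL_STATUSES

def poll_score(report_id: str, interval_s: float = 5.0, timeout_s: float = 300.0) -> dict:
    """Poll GET /report/:id/score until the analysis finishes or we time out."""
    deadline = time.monotonic() + timeout_s
    while time.monotonic() < deadline:
        with urllib.request.urlopen(f"{BASE_URL}/report/{report_id}/score") as resp:
            body = json.load(resp)
        if is_terminal(body["data"]["status"]):
            return body["data"]
        time.sleep(interval_s)
    raise TimeoutError(f"report {report_id} still pending after {timeout_s}s")
```

A fixed interval keeps the sketch simple; since the later pipeline steps involve AI calls, a longer interval or backoff may be friendlier to the server.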