Hosted demo — paste a URL, upload a HAR, or browse the demo scenarios. HTTPS-only, SSRF-guarded, rate-limited.

api-medic · diagnostic · report
api-medic diagnostic report — POST returning 401 with timing breakdown and two critical findings: trailing whitespace in Authorization header and an expired bearer token
// cover · web report for a POST that returned 401 — two critical findings, with evidence and a suggested fix

"My API call is returning a 401 — can you take a look?" Anyone who has done technical support for an API product has had this conversation a thousand times. The actual answer is almost always one of a small handful of things — an expired token, a copy-paste that picked up a trailing newline, a clock skew, a CORS preflight, a content-length mismatch, a TLS chain that broke when the cert rotated. Finding it takes ten minutes of network-tab archaeology that the support engineer has to do over and over again, on every ticket, all day.

api-medic is the tool I wanted to have on my desk for that work. You hand it a request — as a URL, a curl command, a HAR file from the browser, or a captured request from its DevTools panel — and it runs the same twenty diagnostic checks every time, returns a structured report with evidence and a suggested fix, and renders it as terminal output, JSON, markdown, or a polished HTML report. Same engine, four surfaces, identical results.

01 // The Challenge

Two things were going on at once. First, the diagnosis loop for HTTP failures is repetitive and rote — the same ten or fifteen failure modes account for the vast majority of real tickets, and there's no good reason a human should be the one pattern-matching them every time. Second, the existing tools all live at the wrong layer. Curl gives you bytes. Postman gives you a request builder. Wireshark gives you packets. None of them tell you "the JWT in your Authorization header expired three hours ago and that's why you're seeing 401" in plain English with the evidence quoted underneath.

The brief I set for myself: build something that takes the request the user is already holding — in whatever form they have it — and produces a report a junior support engineer could hand to a customer without rewriting. The report has to be the same regardless of how you get there, the checks have to be stable enough to reference by ID in tickets and runbooks, and the tool has to be deployable as something a customer can actually use without installing anything.

02 // The Stack

Each piece exists to solve a specific problem in the diagnosis loop, not because it was the trendy choice.

03 // Architecture Overview

One engine, four surfaces, identical reports. Whichever way the request comes in — typed into a CLI, posted to the hosted Lambda, captured by the browser extension — it ends up as the same internal Request object, runs through the same twenty checks, and produces the same Report. The four renderers are interchangeable; pick the one that fits where the report is being read.

            ┌────────────────┐  ┌──────────────┐  ┌──────────────────┐  ┌────────────────────┐
            │      CLI       │  │ Local Web UI │  │   Hosted Demo    │  │ Browser Extension  │
            │  api-medic …   │  │ :8765 (local)│  │ AWS Lambda       │  │ DevTools panel     │
            └────────┬───────┘  └──────┬───────┘  └────────┬─────────┘  └────────┬───────────┘
                     │                 │                   │                     │
                     └────────┬────────┴────────┬──────────┴──────────┬──────────┘
                              ▼                 ▼                     ▼
                        ┌─────────────────────────────────────────────────────┐
                        │              api-medic core (Python)                │
                        │   parser  →  HTTP runner  →  20 checks  →  Report   │
                        └─────────────────────────────────────────────────────┘
                                              │
                              ┌───────────────┼────────────────┐
                              ▼               ▼                ▼
                          terminal          JSON          markdown / HTML

The Lambda surface gets one extra piece in front of the runner: an SSRF guard that resolves the target host, blocks private and link-local ranges, and forces HTTPS-only with per-IP throttling. The CLI and the local web UI skip that guard — they trust the local user — and the browser extension captures requests client-side, so the analysis never sees anything the user's own browser couldn't already make.
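The "one engine, four surfaces" pipeline can be sketched in a few lines. This is a minimal illustration, not api-medic's actual code — the real models are Pydantic and the names (`Request`, `Finding`, the check function) are assumptions — but it shows the shape: every surface builds the same request object, and the check list is the only diagnostic logic.

```python
from dataclasses import dataclass, field

# Hypothetical shapes -- api-medic's real models are Pydantic; this sketch
# uses stdlib dataclasses to show the single-engine pipeline.

@dataclass
class Request:
    method: str
    url: str
    headers: dict = field(default_factory=dict)
    body: str = ""

@dataclass
class Finding:
    check_id: str   # stable namespaced ID, e.g. "auth.header.whitespace"
    severity: str
    message: str

def check_auth_header_whitespace(req: Request) -> list[Finding]:
    # One example check: a copy-pasted token that picked up a newline.
    value = req.headers.get("Authorization", "")
    if value and value != value.strip():
        return [Finding("auth.header.whitespace", "critical",
                        "Authorization header has leading/trailing whitespace")]
    return []

CHECKS = [check_auth_header_whitespace]   # the real engine registers 20

def run(req: Request) -> list[Finding]:
    # Every surface funnels into this: same Request in, same findings out.
    return [finding for check in CHECKS for finding in check(req)]

findings = run(Request("POST", "https://api.example.com/v1/users",
                       headers={"Authorization": "Bearer abc \n"}))
```

Because the renderers consume the same finding objects, "identical input, identical report" falls out of the architecture rather than needing to be enforced per surface.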

04 // The 20 Diagnostic Checks

Every check has a stable, namespaced ID — auth.jwt.expired, network.tls.expired, http.cors.misconfigured, body.content_length_mismatch. The IDs are the contract: tickets and runbooks can reference a check by ID, filter on it in CI, suppress it for a known case — all without coupling to whatever the human-readable title happens to say this week. The five categories below carve the 20 checks into things a support engineer thinks about distinctly:

▸ 06
Network & Transport
DNS resolution and slow-DNS detection, TLS expiration with expiring-soon warnings, weak-protocol detection, certificate CN mismatch.
▸ 05
HTTP Semantics
CORS misconfiguration, duplicate headers, redirect loops, too-many-redirect chains, protocol downgrades.
▸ 04
Authentication
Missing Authorization header, expired or not-yet-valid JWT claims, whitespace contamination in header values.
▸ 03
Body / Content
Malformed JSON, content-length vs. body size mismatches, encoding declaration mismatches.
▸ 02
Rate Limiting
429 responses with retry-after surfacing, approaching-limit detection from rate-limit headers.
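Stable IDs are what make the "reference it in tickets, filter on it in CI, suppress it for a known case" workflow possible. A sketch of what that looks like downstream of the JSON renderer — the field names here are illustrative, not api-medic's actual report schema:

```python
import json

# Hypothetical report JSON -- field names are illustrative, not api-medic's
# actual schema. The point: stable IDs make suppression and gating trivial.
report_json = """
{"findings": [
  {"id": "auth.jwt.expired",            "severity": "critical"},
  {"id": "ratelimit.approaching_limit", "severity": "info"}
]}
"""

# Known/accepted findings for this service, referenced purely by ID --
# immune to any rewording of the human-readable titles.
SUPPRESS = {"ratelimit.approaching_limit"}

findings = [f for f in json.loads(report_json)["findings"]
            if f["id"] not in SUPPRESS]

# A CI gate might fail the build only on unsuppressed criticals:
blocking = [f for f in findings if f["severity"] == "critical"]
```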

One scoping note worth being explicit about: JWT signatures are not verified. The token is decoded for claims (exp, nbf, iss, aud) because those are what produce the kind of "expired three hours ago" finding that's actually useful. Full signature verification needs the issuer's secret or public key, which is out of scope for a tool that doesn't know which tenant it's looking at.

05 // The CLI

The CLI is the surface I use most. Four input modes, four output formats, designed to live inside a script or a CI step as easily as it lives in an interactive terminal session.

api-medic · run · --help
api-medic run --help output showing all options: --method, --header, --body, --body-file, --timeout, --output (terminal/json/markdown/html), --save, --no-color, --verbose
// api-medic run --help — the request-builder shape mirrors curl, the renderer flags select which surface gets the report

The terminal renderer runs first because it's the one that needs the most work. Color, ANSI box-drawing for the timing panel, severity glyphs, and a deliberately tight layout so a critical finding fits on one screen. The other renderers are mechanical translations of the same Report object — JSON for tooling, markdown for ticket pastes, HTML for the polished web view.
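"Mechanical translation" means each renderer is a pure function of the same finding data. A minimal sketch with an illustrative finding dict (not api-medic's actual Report schema):

```python
import json

# Illustrative finding -- fields are assumptions, not api-medic's schema.
finding = {
    "id": "network.tls.expired",
    "severity": "critical",
    "message": "TLS certificate expired 4037 days ago",
    "fix": "Renew the certificate (e.g. via certbot/ACME).",
}

def to_json(f: dict) -> str:
    # For tooling: the finding as-is, machine-readable.
    return json.dumps(f, indent=2)

def to_markdown(f: dict) -> str:
    # For ticket pastes: severity + stable ID headline, fix underneath.
    return (f"**{f['severity'].upper()}** `{f['id']}`\n"
            f"{f['message']}\n\n> Fix: {f['fix']}")
```

Because neither renderer computes anything, a finding can never say one thing in the terminal and another in the HTML report.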

api-medic · cli · expired tls cert
api-medic CLI terminal output for a request to expired.badssl.com — TLS certificate expired 4037 days ago, with not_after date and a suggested fix to renew via certbot/ACME
// CLI run against expired.badssl.com — TLS check fires, evidence quoted, fix suggested
# Simple URL — defaults to GET, terminal output
api-medic https://api.example.com/v1/users

# Full request with headers and body
api-medic run https://api.example.com/v1/users \
    --method POST \
    --header "Authorization: Bearer ..." \
    --body '{"name": "Alex Doe"}'

# Re-run a curl command the user pasted into the ticket
api-medic from-curl 'curl -X GET https://api.example.com/v1/users -H ...'

# Analyze a HAR exported from the browser network tab
api-medic from-har session.har

# Launch the local web UI
api-medic serve

06 // The Web UI

The web UI is the surface a customer or a ticket reporter uses. Everything is form-driven — no commands to remember, no flags to look up — and every input maps onto the same internal Request the CLI builds. The local mode (api-medic serve) and the hosted demo render the same UI; the only difference is that the hosted version refuses non-HTTPS targets and runs every request through the SSRF guard.

web ui · run · request builder
api-medic web UI Run tab — request builder with method dropdown, URL field, header rows for Authorization and Content-Type, and a JSON body
// Run tab — request builder. Method, URL, headers, body. The same fields the CLI accepts as flags.

The HAR tab is the one that earns its keep on real tickets. A user reports "a request failed in the browser an hour ago" and instead of describing it, they export a HAR from the DevTools Network panel, drop the file in, and pick the failing entry. api-medic re-runs the request as captured — same headers, same body, same cookies — and produces a report against the live response. The same machinery that re-runs a single entry can iterate over the whole HAR and surface anything that already had a finding embedded in the original capture.

web ui · har · upload
api-medic web UI HAR tab — uploaded www.markandrewmarquez.com.har with 23 entries, first request shown as https://www.markandrewmarquez.com/
// HAR tab — drop the file the user exported from DevTools, pick the failing entry, re-run with the same headers and body
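The HAR side of this is straightforward because HAR 1.2 is plain JSON — `log.entries[].request` carries the method, URL, name/value header pairs, and optional `postData`. A self-contained sketch of the extraction step (`entry_to_request` is an illustrative name, not api-medic's API):

```python
import json

# Minimal HAR fragment using the real HAR 1.2 shapes:
# log.entries[].request with headers as name/value pairs.
har = json.loads("""
{"log": {"entries": [{"request": {
  "method": "POST",
  "url": "https://api.example.com/v1/users",
  "headers": [{"name": "Authorization", "value": "Bearer abc"},
              {"name": "Content-Type",  "value": "application/json"}],
  "postData": {"text": "{\\"name\\": \\"Alex Doe\\"}"}
}}]}}
""")

def entry_to_request(entry: dict) -> dict:
    # Flatten one HAR entry into the same method/url/headers/body shape
    # the rest of the engine already consumes.
    req = entry["request"]
    return {
        "method": req["method"],
        "url": req["url"],
        "headers": {h["name"]: h["value"] for h in req["headers"]},
        "body": req.get("postData", {}).get("text", ""),
    }

request = entry_to_request(har["log"]["entries"][0])
```

Once an entry is in that shape, re-running it is no different from a request typed into the CLI.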

The third tab — Demos — ships eight bundled scenarios covering the failure modes the tool is built to catch (expired TLS, expired JWT, CORS misconfiguration, content-length mismatch, and a few others). It exists so a first-time visitor can see the report shape and the suggested-fix copy without needing a broken endpoint of their own.

07 // The Browser Extension

The browser extension closes the last gap — the case where the user is looking at the failing request right now in their own DevTools and shouldn't need to export a HAR file just to ask a question about it. It's a Chrome/Firefox extension that registers a DevTools panel, listens to the same chrome.devtools.network events that populate the Network tab, and lets you analyze any captured request with one click. Capture happens client-side; only the selected request is sent to the analysis backend.

devtools · panel · analyze
api-medic browser extension panel docked in DevTools next to Network tab — selected GET request to httpbin.org/favicon.ico, analyzed report showing 404 status, timing breakdown, and a Content-Length mismatch warning
// DevTools panel — selected request analyzed in place. Same report as the CLI and web UI; same check IDs.

The extension UI is intentionally small — list the captured requests, let the user click "Analyze with api-medic" on one, render the same report inline. Every other surface in the project gets the rich diagnostic experience; the extension's job is to be where the user already is, which means staying out of the way of the rest of DevTools.

08 // Hosting on Lambda Without FastAPI

The hosted demo is the one place where deployment shape mattered. A diagnostic tool is exactly the thing people open once, paste one request into, and close — bursty, low-volume, long idle gaps. That traffic shape is a perfect fit for Lambda's scale-to-zero model, and a perfect mismatch for anything that pays a multi-hundred-millisecond cold-start tax just to spin up an ASGI server.

So FastAPI/uvicorn are excluded from the hosted build on purpose. The Lambda handler is a thin function that maps the API Gateway event onto the same internal Request object the CLI uses, runs the SSRF guard, calls the runner, and returns a rendered report. Everything the core needs is already imported as part of the cold start; nothing extra has to wake up.

The SSRF guard is the load-bearing piece for letting this run as a public surface at all. Targets are resolved before the request fires; any address that resolves into 10.0.0.0/8, 172.16.0.0/12, 192.168.0.0/16, 127.0.0.0/8, link-local, multicast, or the AWS instance metadata service is rejected. HTTPS is enforced on the input. Per-IP throttling sits in front of the handler, and request timeouts are hard-capped server-side regardless of what the client asks for. The combination keeps the demo from becoming a free packet generator.

The other half of running this safely as a public surface is honesty about what the service does with what it sees. The hosted demo is stateless: nothing in the request, response, or generated report is persisted server-side — the report is rendered into the response and discarded with the Lambda invocation. The repository ships a PRIVACY.md that states this in plain language so anyone pasting a real-looking URL into the demo can verify what happens to it.

09 // Results

▸ ship
Shipped end-to-end
v1 published to PyPI as api-medic (1.1.1 as of 2026-05); hosted demo live at api-medic.markandrewmarquez.com on AWS Lambda + CloudFront/S3; Chrome/Firefox MV3 DevTools extension installable via load-unpacked, store submissions in review.
▸ 20
Coverage of the diagnosis surface
20 namespaced checks across 5 categories — DNS, TLS, HTTP semantics, auth/JWT claims, body, rate limiting — each with a stable ID designed to be quoted in tickets and grep-able in CI.
▸ one
One report shape, four surfaces
CLI, local web UI, Lambda demo, and browser extension all serialize the same Pydantic Report; identical input produces byte-identical output. TypeScript types are auto-generated from the Python models so the frontend cannot drift.
▸ safe
Public-internet-safe demo
SSRF guard rejects RFC1918, loopback, link-local, multicast, and the AWS IMDS endpoint after pre-resolution; HTTPS enforced on input; per-IP throttling at 2 req/s burst 5; Lambda reserved-concurrency cap; AWS budget alarm; no payload persisted (see PRIVACY.md).
▸ cold
Cold-start discipline
No FastAPI / uvicorn in the Lambda build. The handler dispatches API Gateway events inline against the same Python core, so the httpx / dnspython / cryptography import cost is the only cold-start tax.
▸ ci
CI quality gates
pytest, ruff check, ruff format --check, mypy, and a TypeScript type-generation diff all run on every PR. A merge fails if any gate fails or if the generated frontend types fall out of sync with the Python models.

10 // What I Took From It

11 // Try It

The hosted demo is live — paste a public HTTPS URL, upload a HAR, or browse the eight bundled demo scenarios on the Demos tab. Source is open at github.com/marky224/api-medic — Python core, CLI, web UI, browser extension, and Lambda deploy all in the same repo.

Local install (api-medic on PyPI, currently 1.1.1):

pip install api-medic
api-medic https://api.example.com/v1/users    # CLI
api-medic serve                                # local web UI on :8765