APIs are different from form-submit surfaces. You don’t want a sealed handoff per request — you want one Tripwire session that covers a user’s browsing session, reused across many API calls. And you want to distinguish LLM scrapers (ai-agent) from legitimate crawlers (verified-bot, crawler), because the right answer for those is different.

The threat

Two problems get lumped together as “API abuse,” and they want different responses:
  • LLM scraping. Cloud agents like browser-use and computer-use, plus self-hosted equivalents, scrape the web on behalf of model training, retrieval pipelines, or end-user automation. They look like browsers because they are browsers, just driven by an agent loop. Tripwire attributes these as ai-agent.
  • Scripted scraping. Classic headless Chrome or HTTP-level clients iterating over your listing endpoints, pricing APIs, or search results. Tripwire attributes these as automation.
Against both, you want something stronger than a per-user-account rate limit — attackers rotate accounts — and stronger than a per-IP rate limit — attackers rotate residential proxies. The durable visitor fingerprint is the axis that holds. At the same time, you almost certainly do want some non-human traffic:
  • Search crawlers (Googlebot, Bingbot) — you want them indexing your public content.
  • Web-Bot-Auth–authenticated agents — a growing set of AI products sign their requests with HTTP message signatures declaring themselves. Tripwire surfaces this as the verified-bot category and carries the domain/key identity through to your backend.
  • Your own internal tooling — health checks, monitoring, analytics pipelines.
The integration on this page separates “block” from “allow” based on the attribution category, not just the verdict.

The flow

1. Start Tripwire once at app boot

For authenticated SPAs, start the client when the shell mounts. For public APIs accessed directly (no browser), see the bottom of this page.

2. Acquire a session once and reuse it

Call getSession() at the start of an API-consuming flow — not on every request. Pass sessionId as a header on subsequent XHRs.

3. Verify on the first protected request

Either verify a fresh sealed token, or look up the durable session via GET /v1/sessions/:sessionId and cache the verdict.

4. Re-verify on high-value actions or periodically

For mutations or sensitive reads, request a fresh handoff. For long-lived sessions, refresh the cached verdict every few minutes.

5. Split policy by attribution category

Allow verified-bot and crawler; block automation and ai-agent; treat human normally.

Client integration

Request a sealed handoff at the start of a session, stash the session ID, and send it as a header on every API call.
<script type="module">
  const tripwirePromise = import("https://cdn.tripwirejs.com/t.js").then(
    (Tripwire) =>
      Tripwire.start({
        publishableKey: "pk_live_your_publishable_key",
      }),
  );

  let sessionHandoffPromise = null;

  async function getTripwireHandoff() {
    if (!sessionHandoffPromise) {
      sessionHandoffPromise = tripwirePromise.then((t) => t.getSession());
    }
    return sessionHandoffPromise;
  }

  async function apiFetch(path, init = {}) {
    const { sessionId, sealedToken } = await getTripwireHandoff();
    const headers = new Headers(init.headers);
    headers.set("X-Tripwire-Session", sessionId);
    headers.set("X-Tripwire-Token", sealedToken);
    return fetch(path, { ...init, headers });
  }

  // Refresh the handoff for high-value mutations
  async function refreshHandoff() {
    const tripwire = await tripwirePromise;
    sessionHandoffPromise = tripwire.getSession();
    return sessionHandoffPromise;
  }
</script>

Server verification: two patterns

For long-lived API sessions you have a choice. Either verify the sealed token on every request (cheap — it’s a local crypto operation) and cache the decoded result, or call GET /v1/sessions/:sessionId once and cache the durable verdict for a few minutes.

Pattern A: verify the sealed token per request

const { safeVerifyTripwireToken } = require("@abxy/tripwire-server");

async function tripwireGuard(req, res, next) {
  const sealedToken = req.get("X-Tripwire-Token");
  if (!sealedToken) {
    return res.status(401).json({ error: "Missing Tripwire token" });
  }

  const result = safeVerifyTripwireToken(sealedToken, process.env.TRIPWIRE_SECRET_KEY);
  if (!result.ok) {
    return res.status(401).json({ error: "Invalid Tripwire token" });
  }

  const { decision, attribution } = result.data;
  const category = attribution?.bot?.facets?.category?.value;

  // Allow verified bots and search crawlers on read endpoints
  if (req.method === "GET" && (category === "verified-bot" || category === "crawler")) {
    req.tripwireVerdict = { verdict: "allowed_bot", category };
    return next();
  }

  // Block automation and ai-agent regardless of method
  if (decision.verdict === "bot") {
    return res.status(403).json({ error: "Blocked" });
  }

  req.tripwireVerdict = { verdict: decision.verdict, category: category ?? null };
  next();
}

app.use("/api", tripwireGuard);

Pattern B: cache the durable verdict

For very hot endpoints, verify once per session and cache the result.
const { Tripwire } = require("@abxy/tripwire-server");
const client = new Tripwire({ secretKey: process.env.TRIPWIRE_SECRET_KEY });
const verdictCache = new Map(); // sessionId → { verdict, category, expiresAt }

async function getCachedVerdict(sessionId) {
  const cached = verdictCache.get(sessionId);
  if (cached && cached.expiresAt > Date.now()) return cached;

  const session = await client.sessions.get(sessionId);
  const category = session.automation?.category ?? null;
  const entry = {
    verdict: session.decision.automation_status, // "automated" | "human" | "uncertain"
    category,
    expiresAt: Date.now() + 60_000, // 1 minute
  };
  verdictCache.set(sessionId, entry);
  return entry;
}
The durable readback returns decision.automation_status ("automated" | "human" | "uncertain") rather than the sealed-token verdict field; see Server verification for the full shape.
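Because the two paths return different shapes, it can help to normalize them before applying policy. A minimal sketch, assuming the verdict strings shown on this page ("bot" from the sealed token, "automated" | "human" | "uncertain" from the durable readback); the helper name is illustrative, not part of the Tripwire API:

```javascript
// Normalize both verdict shapes into the sealed-token vocabulary,
// so downstream policy code only has to handle one set of strings.
function normalizeVerdict(source) {
  if (source.automation_status !== undefined) {
    // Durable readback shape: decision.automation_status
    const map = { automated: "bot", human: "human", uncertain: "uncertain" };
    return map[source.automation_status] ?? "uncertain";
  }
  // Sealed-token shape: decision.verdict
  return source.verdict;
}
```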

When to re-verify

Session reuse is great for throughput, but a long-lived session becomes a stale verdict. Three refresh triggers worth coding:
  • High-value mutations — a user changing their email, deleting data, exporting an archive. Always ask for a fresh sealed handoff from the client.
  • Periodic refresh — every 5–15 minutes for long sessions, re-fetch GET /v1/sessions/:sessionId. Tripwire’s server-side behavioral scoring can move a verdict between snapshot and behavioral phases as evidence accumulates, and you want the latest.
  • Suspicious pattern observed — your own application logic (burst of identical queries, geographic jump) can trigger a client-side tripwire.getSession() that produces a new, freshly-attested handoff.
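The three triggers can be collapsed into one predicate. A sketch under assumptions: `entry` mirrors the verdictCache shape from Pattern B, and the two flags come from your own application logic (all names here are illustrative):

```javascript
// Decide whether a cached verdict should be refreshed before serving
// this request. Covers the three triggers above: high-value mutation,
// app-detected anomaly, and periodic cache expiry.
function shouldReverify(entry, { highValueMutation = false, suspicious = false, now = Date.now() } = {}) {
  if (highValueMutation) return true; // always request a fresh handoff
  if (suspicious) return true;        // burst of identical queries, geo jump, etc.
  return now >= entry.expiresAt;      // periodic refresh on cache expiry
}
```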

Splitting policy by attribution

For read APIs, the policy matrix that works on most sites:
| Category                   | GET               | POST / PUT / DELETE                      |
|----------------------------|-------------------|------------------------------------------|
| human                      | Allow             | Allow                                    |
| verified-bot               | Allow             | Block (bots generally shouldn’t mutate)  |
| crawler                    | Allow             | Block                                    |
| automation                 | Block or throttle | Block                                    |
| ai-agent                   | Block or throttle | Block                                    |
| unknown (with bot verdict) | Throttle          | Block                                    |
“Allow” doesn’t mean unlimited. Keep a generous rate limit on verified bots — a badly-written crawler can still hurt you — but don’t return 403s. If you want LLM agents to read your content but not scrape it, set different caps for ai-agent: a low QPS limit that’s fine for an agent answering one user’s question and painful for a training-data crawler.
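The matrix above is small enough to express as a pure function. A sketch, assuming the attribution category strings used throughout this page; choosing "throttle" for automation and ai-agent reads is a policy knob, not a requirement:

```javascript
// Map (category, verdict, HTTP method) to an action from the policy matrix.
const READ_METHODS = new Set(["GET", "HEAD"]);

function policyFor(category, verdict, method) {
  const read = READ_METHODS.has(method);
  switch (category) {
    case "human":
      return "allow";
    case "verified-bot":
    case "crawler":
      return read ? "allow" : "block"; // bots generally shouldn't mutate
    case "automation":
    case "ai-agent":
      return read ? "throttle" : "block";
    default:
      // Unknown category: fall back on the verdict alone.
      if (verdict === "bot") return read ? "throttle" : "block";
      return "allow";
  }
}
```

Remember that "allow" and "throttle" both still belong behind a rate limiter; only the caps differ.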

APIs called directly (no browser)

Some API consumers don’t run a browser at all — a mobile app, a server-to-server integration, a CLI tool. Tripwire’s browser SDK doesn’t apply there. Two options:
  • Require pre-issued API keys for non-browser traffic. Route traffic without a Tripwire session header down the API-key path, and apply Tripwire only to browser-origin requests.
  • Rely on network-level attribution. Even without the browser bundle, Tripwire’s HTTP edge sees JA4 TLS fingerprints, HTTP/2 SETTINGS, and Web-Bot-Auth signatures, and the durable session API can surface these for direct requests. See Detection categories for what’s available without the client SDK.
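The first option reduces to a routing decision at the top of your middleware stack. A minimal sketch, taking a plain headers object (lowercased keys, as Node normalizes them); the function name and the "reject" fallback are illustrative:

```javascript
// Route a request to the right verification path based on which
// credentials it carries: Tripwire session header for browser-origin
// traffic, pre-issued API key for everything else.
function authPathFor(headers) {
  if (headers["x-tripwire-session"]) return "tripwire"; // browser-origin
  if (headers["authorization"]) return "api-key";       // mobile, server-to-server, CLI
  return "reject";                                      // no credentials at all
}
```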

What’s next

User-generated content

The write-side counterpart: stop LLM posts at the composer.

Server verification

Reference for both sealed-token and durable-readback paths.

Detection categories

What Tripwire detects, with and without the browser SDK.

Going to production

Rollout plan for API-wide enforcement.