

The Hasp AI API exposes two inference surfaces. Both enforce the same compliance checks — BAA validation, credit accounting, PHI scanning — but they differ in what they expose and who they’re designed for.

Surface comparison

| | Native (/v1/ai/chat) | Anthropic-compat (/v1/messages) |
|---|---|---|
| Primary use case | New integrations, regulated workflows | Drop-in for existing Anthropic SDK code |
| Migration effort | Minimal (Hasp-native SDK) | None — change baseURL only |
| Response envelope | Hasp envelope with meta.request_id, meta.phi | Anthropic wire format |
| Error format | Hasp error codes (INVALID_API_KEY, etc.) | Anthropic error types (authentication_error, etc.) |
| Streaming | SSE with Hasp event taxonomy | SSE matching Anthropic's stream protocol |
| PHI metadata | Exposed in response (meta.phi.*) | Not exposed |
| Audit access | Retrieve via GET /v1/ai/messages/{id} | Same — request_id maps to same record |
| Future features | First to get Hasp-native capabilities | Anthropic wire compatibility only |

When to use native (/v1/ai/chat)

Use the native surface when you:
  • Are building a new integration and want full Hasp capabilities from the start.
  • Need PHI metadata in responses (entity types detected, redaction count).
  • Want the Hasp error vocabulary (PHI_BLOCKED, BAA_REQUIRED, etc.) rather than the Anthropic mapping.
  • Are building tooling against the audit trail and want the richest response envelope.
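For audit tooling, the PHI metadata can be read straight off the native envelope. A minimal sketch in TypeScript — note that meta.request_id and meta.phi are the documented fields, but the inner names (entity_types, redaction_count) are assumptions for illustration, not confirmed field names:

```typescript
// Hypothetical shape of the native response envelope.
// meta.request_id and meta.phi come from the docs; the inner field
// names (entity_types, redaction_count) are assumed for illustration.
interface NativeChatResponse {
  meta: {
    request_id: string;
    phi?: { entity_types: string[]; redaction_count: number };
  };
}

// Produce a one-line PHI summary suitable for audit logs.
function phiSummary(res: NativeChatResponse): string {
  if (!res.meta.phi) {
    return `${res.meta.request_id}: no PHI detected`;
  }
  const { entity_types, redaction_count } = res.meta.phi;
  return `${res.meta.request_id}: ${redaction_count} redaction(s) [${entity_types.join(", ")}]`;
}
```

Because meta.phi is only present when redaction was applied, treating it as optional keeps the same code path working for clean requests.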

When to use Anthropic-compat (/v1/messages)

Use the compat surface when you:
  • Already have code using @anthropic-ai/sdk and want Hasp compliance with zero code changes.
  • Are evaluating Hasp as a drop-in replacement for direct Anthropic calls.
  • Are building a prototype and want to defer the native migration.

The migration from compat to native is a single endpoint change plus adopting the Hasp error vocabulary. No compliance or PHI-handling capabilities are lost by starting on compat.

Migrating from compat to native

```diff
- import Anthropic from '@anthropic-ai/sdk';
- const client = new Anthropic({ baseURL: 'https://api.usehasp.com', apiKey: process.env.HASP_API_KEY });
- const msg = await client.messages.create({ model: 'claude-sonnet-4-6', ... });
+ // Coming soon: Hasp-native SDK
+ // For now, use the native REST endpoint directly
+ const res = await fetch('https://api.usehasp.com/v1/ai/chat', {
+   method: 'POST',
+   headers: { Authorization: `Bearer ${process.env.HASP_API_KEY}`, 'Content-Type': 'application/json' },
+   body: JSON.stringify({ model: 'claude-sonnet-4-6', messages: [...] }),
+ });
```
The response shape differs: native responses include meta.request_id at top level and meta.phi if redaction was applied. Anthropic-compat responses follow the standard Anthropic shape with id as the request identifier.
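Tooling that spans both surfaces can normalize the identifier before touching the audit API. A sketch, with the response types reduced to just the identifier fields named above:

```typescript
// Minimal identifier-only views of the two response shapes.
type CompatResponse = { id: string };                    // Anthropic wire format
type NativeResponse = { meta: { request_id: string } };  // Hasp envelope

// Both values resolve to the same audit record, so normalize early.
function requestId(res: CompatResponse | NativeResponse): string {
  return "meta" in res ? res.meta.request_id : res.id;
}
```

Using the presence of meta as the discriminator means the same helper keeps working if either envelope later grows additional fields.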

What’s the same on both surfaces

  • All compliance checks run. PHI scanning, BAA validation, credit checks, and audit logging are identical on both surfaces. You cannot bypass compliance by using the compat endpoint.
  • Same API keys. The same wa_live_* key works on both surfaces with the ai:chat scope.
  • Same model identifiers. The Hasp model name (claude-sonnet-4-6) is the same on both.
  • Same request_id. The id field on compat responses and the meta.request_id on native responses resolve to the same record via GET /v1/ai/messages/{id}.
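Putting the last point to work, a small helper for the audit lookup. The endpoint path is documented above; assuming the audit API lives on the same api.usehasp.com host as inference:

```typescript
// Build the audit-retrieval URL for an identifier from either surface.
// GET /v1/ai/messages/{id} returns the same record for both.
function auditUrl(id: string, base = "https://api.usehasp.com"): string {
  return `${base}/v1/ai/messages/${encodeURIComponent(id)}`;
}

// Usage (sketch):
// const record = await fetch(auditUrl("req_123"), {
//   headers: { Authorization: `Bearer ${process.env.HASP_API_KEY}` },
// });
```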