Bulk methods allow you to create, update, or delete up to 100 records in a single request. Use them for imports, batch status updates, or any operation where issuing individual calls in a loop would be slow or would hit rate limits.
bulkCreateRecords(entityKey, records, signal?)
Creates up to 100 records in one request.
```ts
const result = await sdk.bulkCreateRecords('tasks', [
  { title: 'Task A', status: 'open' },
  { title: 'Task B', status: 'open' },
]);

if (result.error) {
  // Partial failure: some records were created, some failed
  console.log('created:', result.data); // RecordData[] of successes
  console.log('failed:', result.error.details); // { "1": { title: ['Required'] } }
} else {
  console.log('all created:', result.data);
}
```
Unlike single-record operations, bulk calls do not throw on partial failures; always check result.error after every bulk call. Whole-batch failures (such as an unknown entity or an exceeded quota) still throw HaspSDKError before any records are written.
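The two failure modes can be handled together with a small wrapper. This is an illustrative sketch: the `BulkResult` shape and the stand-in `HaspSDKError` class below are inferred from the examples in this section, not taken from the SDK's actual type definitions (the real class is exported by the SDK).

```ts
// Stand-in for the SDK's HaspSDKError; in real code, import it from the SDK.
class HaspSDKError extends Error {}

// Assumed result shape, based on the examples in this section.
type BulkResult<T> = {
  data: T[]; // records that succeeded
  error?: { details: Record<string, Record<string, string[]>> };
};

// Runs a bulk call and normalizes both failure modes into one report:
// - partial failure: result.error is set, some records went through
// - whole-batch failure: HaspSDKError thrown, nothing was written
async function runBulk<T>(
  call: () => Promise<BulkResult<T>>,
): Promise<{ ok: T[]; failedIndexes: number[]; batchError?: string }> {
  try {
    const result = await call();
    const failedIndexes = result.error
      ? Object.keys(result.error.details).map(Number)
      : [];
    return { ok: result.data, failedIndexes };
  } catch (err) {
    if (err instanceof HaspSDKError) {
      // Whole-batch failure: no records were written.
      return { ok: [], failedIndexes: [], batchError: err.message };
    }
    throw err; // unrelated errors propagate unchanged
  }
}
```

A caller can then treat `failedIndexes` as indexes into the original input array, and `batchError` as a signal that the entire request was rejected.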
bulkUpdateRecords(entityKey, updates, signal?)
Updates up to 100 records in one request. Each item must include id and data. Updates use PATCH semantics: only the submitted fields change, and sending null clears a field.
```ts
const result = await sdk.bulkUpdateRecords('tasks', [
  { id: '01JA...', data: { status: 'done' } },
  { id: '01JB...', data: { status: 'done' } },
]);

if (result.error) {
  console.log('updated:', result.data);
  // result.error.details: { "1": { _general: ['Record not found.'] } }
}
```
bulkDeleteRecords(entityKey, ids, signal?)
Deletes up to 100 records in one request.
```ts
const result = await sdk.bulkDeleteRecords('tasks', ['01JA...', '01JB...']);

if (result.error) {
  console.log('deleted:', result.data.deleted); // string[] of deleted IDs
  console.log('failed:', result.error.details);
}
```
Limits
| Operation | Max per request |
|---|---|
| bulkCreateRecords | 100 records |
| bulkUpdateRecords | 100 records |
| bulkDeleteRecords | 100 IDs |
For datasets larger than 100, loop in batches:
```ts
const batchSize = 100;
for (let i = 0; i < records.length; i += batchSize) {
  const batch = records.slice(i, i + batchSize);
  const result = await sdk.bulkCreateRecords('tasks', batch);
  if (result.error) {
    console.warn('Partial failure in batch', i / batchSize, result.error.details);
  }
}
```
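The slicing in the loop above can be factored into a generic helper. This is a convenience sketch, not part of the SDK:

```ts
// Splits an array into batches of at most `size` elements,
// preserving order; the last batch may be smaller.
function chunk<T>(items: T[], size: number): T[][] {
  const batches: T[][] = [];
  for (let i = 0; i < items.length; i += size) {
    batches.push(items.slice(i, i + size));
  }
  return batches;
}
```

With it, the loop becomes `for (const batch of chunk(records, 100)) { ... }`, which keeps the batching logic in one place when several entities need bulk imports.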