
# Syncing External Data

Import contacts from external systems safely and efficiently.

This guide covers patterns for importing data from external systems into nodestash — handling idempotency, pagination, and error recovery.

## Idempotent Imports

When importing data, use idempotency keys to ensure that retrying a failed import doesn't create duplicates:

```shell
curl -X POST https://api.nodestash.io/v1/contacts \
  -H "Authorization: Bearer $NODESTASH_API_KEY" \
  -H "Content-Type: application/json" \
  -H "Idempotency-Key: hubspot-import-contact-12345" \
  -d '{
    "email": "imported@example.com",
    "first_name": "Imported",
    "last_name": "User",
    "source": "hubspot"
  }'
```

```typescript
const contact = await client.contacts.create(
  {
    email: 'imported@example.com',
    first_name: 'Imported',
    last_name: 'User',
    source: 'hubspot',
  },
  'hubspot-import-contact-12345', // idempotency key
)
```

Use a predictable idempotency key format like `{source}-import-{entity}-{external_id}` so you can safely retry any failed import.
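
Building the key in one place keeps the format consistent across import jobs. A minimal sketch — the `importKey` helper is ours, not part of the `@nodestash/sdk` API:

```typescript
// Build a deterministic idempotency key in the
// {source}-import-{entity}-{external_id} format.
// Hypothetical helper — define it once and reuse it in every import job.
function importKey(source: string, entity: string, externalId: string): string {
  return `${source}-import-${entity}-${externalId}`
}

console.log(importKey('hubspot', 'contact', '12345')) // hubspot-import-contact-12345
```

Because the key is derived only from the external record's identity, re-running the same job produces the same keys, and retried requests are deduplicated server-side.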

## Batch Import Pattern

Import contacts in batches, handling errors per-record:

```typescript
import { NodeStash, ValidationError, RateLimitError } from '@nodestash/sdk'

const client = new NodeStash({
  apiKey: process.env.NODESTASH_API_KEY!,
  maxRetries: 5, // extra retries for imports
})

interface ExternalContact {
  id: string
  email: string
  name: string
  company: string
}

async function importContacts(externalContacts: ExternalContact[]) {
  const results = { success: 0, skipped: 0, failed: 0 }

  for (const ext of externalContacts) {
    try {
      const [firstName, ...rest] = ext.name.split(' ')
      const lastName = rest.join(' ')

      await client.contacts.create(
        {
          email: ext.email,
          first_name: firstName,
          last_name: lastName || undefined,
          source: 'external-crm',
          tags: ['imported'],
        },
        `import-external-${ext.id}`, // idempotency key
      )
      results.success++
    } catch (error) {
      if (error instanceof ValidationError) {
        console.warn(`Skipping invalid contact ${ext.id}: ${error.message}`)
        results.skipped++
      } else {
        console.error(`Failed to import ${ext.id}: ${error}`)
        results.failed++
      }
    }
  }

  return results
}
```

## Syncing with Pagination

When syncing from a source that is itself paginated, combine the two pagination flows: page through the contacts already in nodestash, then walk the external records and create or update each one:

```typescript
async function fullSync() {
  // 1. Export all existing contacts from nodestash
  const existing = new Map<string, string>() // email → id

  for await (const contact of client.contacts.listAll({ limit: 100 })) {
    if (contact.email) {
      existing.set(contact.email, contact.id)
    }
  }

  console.log(`Found ${existing.size} existing contacts`)

  // 2. Fetch from external source and sync
  const externalContacts = await fetchFromExternalSystem()

  for (const ext of externalContacts) {
    const existingId = existing.get(ext.email)

    if (existingId) {
      // Update existing contact
      await client.contacts.update(existingId, {
        first_name: ext.firstName,
        last_name: ext.lastName,
        phone: ext.phone,
      })
    } else {
      // Create new contact
      await client.contacts.create(
        {
          email: ext.email,
          first_name: ext.firstName,
          last_name: ext.lastName,
          phone: ext.phone,
          source: 'sync',
        },
        `sync-${ext.email}`,
      )
    }
  }
}
```
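
On repeated syncs, most existing contacts are usually unchanged, so it can pay to diff each record first and skip no-op `update` calls. A sketch, assuming the field names from the example above (the `changedFields` helper is ours):

```typescript
type ContactFields = { first_name?: string; last_name?: string; phone?: string }

// Return only the fields whose incoming value differs from the current one,
// so contacts with no real changes can be skipped entirely.
function changedFields(current: ContactFields, incoming: ContactFields): ContactFields {
  const diff: ContactFields = {}
  for (const key of Object.keys(incoming) as (keyof ContactFields)[]) {
    if (incoming[key] !== undefined && incoming[key] !== current[key]) {
      diff[key] = incoming[key]
    }
  }
  return diff
}
```

In the update branch you would then call `client.contacts.update` only when `Object.keys(diff).length > 0`, cutting request volume (and rate-limit pressure) on incremental syncs.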

## Rate Limit Considerations

When running bulk imports, be mindful of rate limits:

| Plan    | Daily Limit | Burst Limit |
| ------- | ----------- | ----------- |
| Free    | 1,000       | 5 req/s     |
| Starter | 25,000      | 25 req/s    |
| Pro     | 250,000     | 50 req/s    |
| Scale   | 2,000,000   | 200 req/s   |
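
The burst limit gives a quick lower bound on how long a bulk import will take. A back-of-the-envelope sketch (the helper is illustrative; real runs also pay request latency and retries):

```typescript
// Lower bound on wall-clock seconds for an import at a given burst limit,
// ignoring latency, retries, and the daily cap.
function estimateImportSeconds(records: number, burstPerSecond: number): number {
  return Math.ceil(records / burstPerSecond)
}

// 25,000 contacts at the Starter burst limit of 25 req/s:
console.log(estimateImportSeconds(25_000, 25)) // 1000 seconds, i.e. roughly 17 minutes
```

Note that the daily limit is the harder constraint for very large imports: a Free-plan job cannot exceed 1,000 requests per day no matter how slowly it runs.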

The SDK automatically handles 429 responses with retries, but you can add your own throttling:

```typescript
async function throttledImport(contacts: ExternalContact[], batchSize = 10) {
  for (let i = 0; i < contacts.length; i += batchSize) {
    const batch = contacts.slice(i, i + batchSize)

    // Process batch concurrently
    await Promise.allSettled(
      batch.map((c) =>
        client.contacts.create(
          {
            email: c.email,
            first_name: c.name.split(' ')[0],
            source: 'import',
          },
          `import-${c.id}`,
        ),
      ),
    )

    console.log(`Imported ${Math.min(i + batchSize, contacts.length)}/${contacts.length}`)
  }
}
```
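
The pattern above throttles only by batch boundaries; if you want an explicit pause between batches as well, a generic chunked runner works. A sketch under our own names (`sleep`, `runChunked`) — an import would pass something like `(c) => client.contacts.create(..., key)` as the worker:

```typescript
const sleep = (ms: number) => new Promise<void>((resolve) => setTimeout(resolve, ms))

// Run an async worker over items in fixed-size chunks, pausing between chunks.
// Promise.allSettled keeps one failed item from aborting its chunk.
async function runChunked<T, R>(
  items: T[],
  worker: (item: T) => Promise<R>,
  chunkSize = 10,
  pauseMs = 1000,
): Promise<PromiseSettledResult<R>[]> {
  const results: PromiseSettledResult<R>[] = []
  for (let i = 0; i < items.length; i += chunkSize) {
    const chunk = items.slice(i, i + chunkSize)
    results.push(...(await Promise.allSettled(chunk.map(worker))))
    if (i + chunkSize < items.length) await sleep(pauseMs) // no pause after the last chunk
  }
  return results
}
```

Tuning `chunkSize` and `pauseMs` together caps average throughput at roughly `chunkSize / (pauseMs / 1000)` requests per second, which you can set comfortably below your plan's burst limit.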

## Error Recovery

Track import progress so you can resume from where you left off:

```typescript
async function resumableImport(
  contacts: ExternalContact[],
  startIndex = 0,
) {
  for (let i = startIndex; i < contacts.length; i++) {
    try {
      await client.contacts.create(
        {
          email: contacts[i].email,
          first_name: contacts[i].name,
          source: 'import',
        },
        `import-${contacts[i].id}`,
      )
    } catch (error) {
      if (error instanceof RateLimitError) {
        console.log(`Rate limited at index ${i}. Resume from here.`)
        // Save progress and resume later
        return { lastIndex: i, completed: false }
      }
      // Other errors — log and continue
      console.error(`Error at index ${i}:`, error)
    }
  }

  return { lastIndex: contacts.length, completed: true }
}
```

## Best Practices

- **Always use idempotency keys** — they make imports safe to retry
- **Use the `source` field** — tag imported contacts with their origin for traceability
- **Handle errors per-record** — don't let one bad record stop the entire import
- **Respect rate limits** — use the SDK's built-in retry or add throttling for large imports
- **Track progress** — save your position so you can resume failed imports
- **Start with a test key** — run imports against test data first using `nds_test_` keys
