GTM Stack

How do I handle large Phantombuster datasets exceeding 200 KB in Clay?

When Phantombuster results exceed 200 KB (typically around 700+ comments), the API returns a CSV file instead of inline values. This commonly occurs when pulling LinkedIn post comments into Clay.
February 2026

1 Answer

Handling Large Phantombuster Datasets in Clay

Solutions

Option 1: Reduce Dataset Size

  • Limit the number of comments Phantombuster scrapes so the result stays under 200 KB and is written directly into your Clay table
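If you relaunch the Phantom with a lower comment limit, you can do so programmatically through Phantombuster's launch endpoint (`POST /api/v2/agents/launch`). A minimal sketch of building that request body follows; note that `numberOfCommentsPerPost` is an assumed argument name, so check your Phantom's setup screen for the exact fields it accepts:

```python
import json

def build_launch_payload(agent_id, max_comments):
    """Build the JSON body for Phantombuster's POST /api/v2/agents/launch.

    NOTE: "numberOfCommentsPerPost" is an assumed argument name; your
    Phantom's setup screen shows the exact arguments it accepts.
    """
    return json.dumps({
        "id": agent_id,
        "argument": {"numberOfCommentsPerPost": max_comments},
    })
```

You would send this payload with your `X-Phantombuster-Key` header; capping the comment count keeps the result under the 200 KB threshold so Clay receives inline values again.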

Option 2: Process Large Datasets with n8n

  1. Send an HTTP request from Clay to an n8n webhook
  2. Configure n8n to download and process all comments from the Phantombuster CSV
  3. Have n8n push the processed data back to your Clay table in batches
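The steps above can be sketched as a single script: download the result CSV, parse it, and push the rows back in batches. In n8n you would typically wire this up with an HTTP Request node, a Code node, and a Loop Over Items (Split in Batches) node; the sketch below shows the equivalent logic in plain Python. The Clay webhook URL and the `comments` payload key are assumptions, so adjust them to match your "Import data from webhook" source:

```python
import csv
import io
import json
import urllib.request

def parse_csv(text):
    """Parse Phantombuster's result CSV into a list of row dicts."""
    return list(csv.DictReader(io.StringIO(text)))

def batches(rows, size):
    """Yield fixed-size chunks so Clay receives the data in batches."""
    for start in range(0, len(rows), size):
        yield rows[start:start + size]

def download_csv(url):
    """Fetch the result CSV that Phantombuster returns for large runs."""
    with urllib.request.urlopen(url) as resp:
        return resp.read().decode("utf-8")

def send_to_clay(webhook_url, batch):
    """POST one batch of rows to a Clay 'Import from webhook' source.

    The {"comments": ...} payload shape is an assumption; shape it to
    whatever columns your Clay table expects.
    """
    req = urllib.request.Request(
        webhook_url,
        data=json.dumps({"comments": batch}).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        resp.read()
```

Typical usage would be `rows = parse_csv(download_csv(result_url))` followed by `send_to_clay(webhook_url, b) for b in batches(rows, 100)`; a batch size around 100 keeps each request well under the size limit.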
February 2026
