Data Ingestion

How to send data to Finic

Ingesting Data

Finic offers real-time monitoring by default. Any data sent to the /transaction, /account, or /activity endpoints is processed and available in the dashboard within seconds, and any real-time rules are evaluated as soon as a record is received.
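As an illustration, a real-time ingestion call is a plain JSON POST to one of the endpoints above. The base URL and bearer-token auth scheme below are assumptions for the sketch (check your Finic dashboard for real values); only the endpoint paths come from this page.

```python
import json

# Hypothetical base URL -- replace with the value from your Finic dashboard.
BASE_URL = "https://api.finic.ai"

def build_ingest_request(endpoint: str, record: dict, api_key: str) -> dict:
    """Assemble a POST request for one of the real-time ingestion endpoints.

    `endpoint` must be one of the documented paths: /transaction, /account,
    or /activity. The returned dict is what you would hand to an HTTP client,
    e.g. requests.post(req["url"], headers=req["headers"], data=req["body"]).
    """
    assert endpoint in ("/transaction", "/account", "/activity")
    return {
        "url": BASE_URL + endpoint,
        "headers": {
            # Bearer auth is an assumption, not confirmed by this page.
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        "body": json.dumps(record),
    }
```

Building the request separately from sending it keeps the example self-contained; in practice you would pass the result straight to your HTTP client of choice.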

Batch Ingestion

Batch ingestion is useful for use cases that aren't time-sensitive, or for backfilling historical records when onboarding onto Finic for the first time.

Use the Batch API to send a large number of records. Each request returns a job ID, which can be used to track the status of the batch processing by polling the /batch-status endpoint.
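The track-by-polling flow described above can be sketched as follows. The terminal status strings ("completed", "failed") are assumptions for the sketch; `fetch_status` stands in for whatever client call you use to GET /batch-status.

```python
import time

def poll_batch_status(job_id: str, fetch_status,
                      interval_s: float = 5.0, max_attempts: int = 60) -> str:
    """Poll the batch job until it reaches a terminal state.

    `fetch_status` is a callable that queries the /batch-status endpoint for
    `job_id` and returns the current status string. The terminal states used
    here are illustrative -- consult the API reference for the real values.
    """
    for attempt in range(max_attempts):
        status = fetch_status(job_id)
        if status in ("completed", "failed"):
            return status
        if attempt < max_attempts - 1:
            time.sleep(interval_s)  # back off between polls
    raise TimeoutError(f"batch job {job_id} did not finish after {max_attempts} polls")
```

A fixed polling interval keeps the sketch simple; for long-running backfills you may prefer exponential backoff.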

If you don’t have access to send records via API, you can send flat files to our SFTP server instead. Contact us in Slack or at support@finic.ai to learn more.

If you can't send records via API or SFTP, you can manually upload files to Finic through a secure link. Contact us in Slack or at support@finic.ai to learn more.

Schema Changes

By default, all POST endpoints return a 400 error if a payload's schema doesn't match the schema we have on file. You can view and update the schema for each object in the Finic Dashboard.
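A client-side pre-check can catch mismatches before the API rejects them. This helper is purely illustrative (the authoritative schema lives in the Finic Dashboard); it models a schema as a set of field names.

```python
def schema_mismatch(record: dict, schema: set) -> set:
    """Return the field names on which `record` differs from the schema on file.

    A non-empty result means the POST would be rejected with a 400 by default.
    The symmetric difference captures both new fields (in the record but not
    the schema) and missing fields (in the schema but not the record).
    """
    return set(record) ^ schema
```

For example, a record carrying an unexpected "memo" field and lacking "amount" would report both names as mismatched.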

Automatic Schema Updates

We understand that the data you collect is always evolving, which is why Finic is designed to handle schema changes gracefully without requiring you to manually update the schema each time.

If the update_schema flag is set to true in any POST request, we will automatically detect any schema changes and update the schema on file for that object.

  • Any fields that are missing will be set to null going forward, but historical records will not be updated.
  • Any fields that are new will be added to the schema, and historical records will have the value of this column set to null. You can then manually backfill values through the Batch APIs.
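The two rules above can be modeled in a few lines. This is a sketch of the documented behavior, not Finic's implementation; it represents a schema as a set of field names and uses None for the null values described above.

```python
def apply_schema_update(schema: set, record: dict) -> set:
    """Merge a record's fields into the schema on file, per update_schema=true.

    New fields are added to the schema; no fields are ever dropped, since
    missing fields are kept and simply set to null going forward.
    """
    return schema | set(record)

def normalize_record(record: dict, schema: set) -> dict:
    """Shape one incoming record against the schema: fields absent from the
    record are stored as None (null), matching the documented behavior."""
    return {field: record.get(field) for field in schema}
```

Historical backfills for a newly added column would then go through the Batch APIs, as noted above.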

Security & Compliance

Finic is SOC 2 and ISO 27001 certified. Visit our security center to learn more.