Example: Database Sync
Scenario: You extract data from contracts, invoices, or forms and need it in your database — not a spreadsheet.
Overview
Upload documents → Parse → Map data fields → Insert into database
Setup
Step 1: Connect your database
- Go to Clouds in the sidebar
- Click Add Connection
- Enter your database credentials (PostgreSQL, MySQL, etc.)
- Click Test Connection to verify
Step 2: Review your schema
Use the Schema Viewer to see your tables and columns. Note the table and column names you want to insert into.
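If you prefer to check the schema from code, the same column listing can be sketched in Python. This uses sqlite3 as a stand-in for your real database, and the invoices table here is hypothetical; with PostgreSQL you would query information_schema.columns instead:

```python
import sqlite3

# Stand-in database; a real setup would connect to your PostgreSQL/MySQL instance.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE invoices (id INTEGER PRIMARY KEY, inv_number TEXT, amount REAL)")

# List column names and types -- the same information the Schema Viewer shows.
columns = conn.execute("PRAGMA table_info(invoices)").fetchall()
for cid, name, col_type, notnull, default, pk in columns:
    print(f"{name}: {col_type}")
```

Note the exact column names and types you see here; the Map Data step must produce fields that match them.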
Step 3: Build the workflow
- Create a new workflow
- Add a trigger:
  - Webhook — for on-demand processing
  - Schedule — for periodic batch processing
  - Email Inbox — for email-triggered processing
- Add a Parse Document node with your extraction template
- Add a Map Data node to rename fields to match your database columns:
  - invoice_number → inv_number
  - vendor_name → vendor
  - total_amount → amount
  - invoice_date → inv_date
- Add a Target Database node:
- Select your database connection
- Select the target table
- Map the fields to columns
- Save and activate
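The Map Data step above amounts to a key rename. A minimal Python sketch, assuming a hypothetical extracted record and the column names from the PostgreSQL example below:

```python
# Hypothetical record, shaped like the output of the Parse Document node.
extracted = {
    "invoice_number": "INV-1042",
    "vendor_name": "Acme Corp",
    "total_amount": 1250.00,
    "invoice_date": "2024-03-15",
}

# The Map Data step: rename extracted fields to the database column names.
FIELD_MAP = {
    "invoice_number": "inv_number",
    "vendor_name": "vendor",
    "total_amount": "amount",
    "invoice_date": "inv_date",
}

row = {FIELD_MAP[key]: value for key, value in extracted.items()}
```

After the rename, every key in `row` matches a target column, so the Target Database node can map fields to columns one-to-one.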
Step 4: Test
Trigger the workflow with a sample document and verify the data appears in your database.
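With a webhook trigger, testing can be as simple as POSTing a document reference to the trigger URL. A standard-library sketch; the URL and payload shape are placeholders, so copy the real ones from your workflow's trigger node:

```python
import json
import urllib.request

# Placeholder webhook URL -- use the one shown on your workflow's trigger node.
WEBHOOK_URL = "https://example.com/hooks/abc123"

payload = json.dumps({"document_url": "https://example.com/sample-invoice.pdf"}).encode()
req = urllib.request.Request(
    WEBHOOK_URL,
    data=payload,
    headers={"Content-Type": "application/json"},
    method="POST",
)
# urllib.request.urlopen(req) would send the request; it is omitted here so
# the sketch runs without a live endpoint.
```

After triggering, run a SELECT against the target table to confirm the new row arrived.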
Example: Invoice to PostgreSQL
Database table: invoices
| Column | Type | Mapped from |
|---|---|---|
| id | SERIAL | Auto-generated |
| inv_number | VARCHAR | invoice_number |
| vendor | VARCHAR | vendor_name |
| amount | DECIMAL | total_amount |
| inv_date | DATE | invoice_date |
| created_at | TIMESTAMP | Auto-generated |
Workflow:
Webhook trigger → Parse Document → Map Data → Target Database (invoices table)
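The same pipeline can be sketched end to end in Python. sqlite3 stands in for PostgreSQL here (SERIAL becomes INTEGER PRIMARY KEY, and created_at gets a CURRENT_TIMESTAMP default), and parse_document is a stub for the Parse Document node:

```python
import sqlite3

# sqlite3 as a stand-in for the PostgreSQL table from the example above.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE invoices (
        id INTEGER PRIMARY KEY,
        inv_number TEXT,
        vendor TEXT,
        amount REAL,
        inv_date TEXT,
        created_at TEXT DEFAULT CURRENT_TIMESTAMP
    )
""")

def parse_document(doc: bytes) -> dict:
    """Stub for the Parse Document node; returns hypothetical extracted fields."""
    return {"invoice_number": "INV-1042", "vendor_name": "Acme Corp",
            "total_amount": 1250.00, "invoice_date": "2024-03-15"}

FIELD_MAP = {"invoice_number": "inv_number", "vendor_name": "vendor",
             "total_amount": "amount", "invoice_date": "inv_date"}

def sync(doc: bytes) -> None:
    record = parse_document(doc)                        # Parse Document
    row = {FIELD_MAP[k]: v for k, v in record.items()}  # Map Data
    conn.execute(                                       # Target Database
        "INSERT INTO invoices (inv_number, vendor, amount, inv_date)"
        " VALUES (?, ?, ?, ?)",
        (row["inv_number"], row["vendor"], row["amount"], row["inv_date"]),
    )
    conn.commit()

sync(b"%PDF- sample bytes")
```

In the hosted workflow the three comments correspond to the three nodes; id and created_at are filled in by the database, which is why they appear as "Auto-generated" in the table above.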
Tips
- Test with a SELECT first — run a read-only query (or use the Schema Viewer) to confirm table structure and column types before writing
- Map data types carefully — numbers should go to numeric columns, dates to date columns
- Handle duplicates — consider adding a condition to check if a record already exists
- Start with a test table — create a test table in your database first, verify the pipeline works, then switch to production
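One way to handle the duplicates tip, assuming your table has a unique business key such as inv_number, is to let the database enforce it with an upsert clause. A sketch using sqlite3; PostgreSQL supports the same ON CONFLICT syntax:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# A UNIQUE constraint on the business key lets the database detect duplicates.
conn.execute("CREATE TABLE invoices (inv_number TEXT UNIQUE, amount REAL)")

def insert_once(inv_number: str, amount: float) -> None:
    # ON CONFLICT ... DO NOTHING skips rows that already exist; DO UPDATE is
    # an alternative when you want re-processed documents to refresh fields.
    conn.execute(
        "INSERT INTO invoices (inv_number, amount) VALUES (?, ?) "
        "ON CONFLICT (inv_number) DO NOTHING",
        (inv_number, amount),
    )
    conn.commit()

insert_once("INV-1042", 1250.00)
insert_once("INV-1042", 1250.00)  # duplicate: silently skipped
count = conn.execute("SELECT COUNT(*) FROM invoices").fetchone()[0]
```

If you cannot add a unique constraint, a workflow-level condition that checks for an existing record before the Target Database node achieves the same effect.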