Internal Platform · Native AI

Run our n8n data transformation operations from a single hub.

The Native AI platform we use internally keeps n8n live on AWS while the Analyzer and Report modules are tested locally in a semi-automated flow. By entering file paths manually, we have completed 10k-row .sav transformations with a 97% success rate; once the webhook panel ships, Excel and JSON uploads will run through the UI.

End-to-end data transformation for the internal team

Built by Native AI, this platform unifies ingest, quality checks, report staging, and output storage in a single control surface.

  • 10k+ rows transformed successfully via the AWS-hosted n8n
  • 50 retries re-run manually after supplying file paths
  • 1 live module: n8n is live on AWS; the others remain local
  • Manual input: Excel/JSON paths still selected by hand

Module status check

Workflow, Analyzer, and Report layers are owned by the Native AI team. n8n runs on AWS today; Analyzer and Report are in local testing, with AWS migration scheduled.

n8n Orchestration

Currently ingests Excel and JSON payloads when file paths are supplied; results are stored in the n8n data table.

  • Webhook #1 (planned): Analyzer output ingest
  • Webhook #2 (planned): Kick off the report pipeline
  • n8n data table: transient storage and status tracking
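
For reference, a minimal sketch of how the Analyzer could post to the planned Webhook #1 once it ships; the endpoint path and payload fields below are placeholders rather than the final contract.

```typescript
// Hypothetical client call for the planned Webhook #1 (Analyzer output ingest).
// The URL and payload fields are assumptions, not the final contract.
interface AnalyzerOutputPayload {
  jobId: string;     // identifier the Analyzer assigns to the run
  excelPath: string; // path to the generated Excel output
  jsonPath: string;  // path to the generated JSON output
}

async function postAnalyzerOutput(payload: AnalyzerOutputPayload): Promise<void> {
  const res = await fetch("https://n8n.example.internal/webhook/analyzer-output", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(payload),
  });
  if (!res.ok) {
    throw new Error(`Webhook #1 rejected the payload: ${res.status}`);
  }
}
```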

SAV Analyzer

In local testing; performs SPSS dictionary parsing and quality checks to emit Excel/JSON outputs.

  • Variable type and label mapping
  • Multi-select and out-of-range validation
  • Next: Post outputs to the n8n webhook through the panel
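
As an illustration of the validation step, the sketch below shows an out-of-range check against a per-variable list of allowed codes; the dictionary shape is an assumption, not the Analyzer's actual schema.

```typescript
// Illustrative out-of-range check; the VariableSpec shape is an assumed,
// simplified view of the parsed SPSS dictionary.
interface VariableSpec {
  name: string;            // SPSS variable name
  label: string;           // human-readable label
  validValues?: number[];  // allowed codes for categorical variables
}

function findOutOfRangeRows(rows: Record<string, number>[], spec: VariableSpec): number[] {
  if (!spec.validValues) return []; // nothing to validate for open-ended variables
  const allowed = new Set(spec.validValues);
  // Collect the indices of rows whose value for this variable is not an allowed code.
  return rows.flatMap((row, i) => (allowed.has(row[spec.name]) ? [] : [i]));
}
```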

Data Transformation

Transform SPSS job data using n8n workflows. Select completed jobs and process them through the transformation pipeline.

  • Select completed SPSS jobs with CSV outputs
  • Process data in chunks through n8n workflows
  • Track transformation progress and download results
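
A rough sketch of the chunked submission loop, assuming a webhook endpoint and a fixed chunk size; both are placeholders for illustration.

```typescript
// Sketch of chunked submission to an n8n workflow; the endpoint, chunk size,
// and payload fields are placeholders chosen for illustration.
const CHUNK_SIZE = 500;

async function submitInChunks(jobId: string, rows: object[]): Promise<void> {
  const totalChunks = Math.ceil(rows.length / CHUNK_SIZE);
  for (let index = 0; index < totalChunks; index++) {
    const chunk = rows.slice(index * CHUNK_SIZE, (index + 1) * CHUNK_SIZE);
    const res = await fetch("https://n8n.example.internal/webhook/transform-chunk", {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({ jobId, chunkIndex: index, totalChunks, rows: chunk }),
    });
    if (!res.ok) throw new Error(`Chunk ${index + 1}/${totalChunks} failed: ${res.status}`);
    console.log(`Chunk ${index + 1}/${totalChunks} accepted`); // simple progress tracking
  }
}
```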

Our internal transformation flow

Data moves between Analyzer and Report through n8n workflows backed by a two-webhook pattern.

01 · Ingest the .sav

The team loads the file into Analyzer; the n8n ingest node validates checksums and opens the job.

02 · Parse the dictionary

Analyzer separates the dictionary, normalises labels, and runs quality checks.

03 · Produce Excel and JSON

Analyzer prepares Excel and JSON outputs; for now we pass file paths to the n8n job manually until the webhook panel launches.

04 · Transform & version

n8n keeps transformation outcomes versioned in the data table; webhook integration for the report pipeline is in progress.
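
For context, one possible shape for a versioned outcome record, assuming fields the data table might hold; the actual columns may differ.

```typescript
// Assumed shape of a versioned outcome record; the actual data table columns
// may differ, this only sketches the versioning idea.
interface TransformationRecord {
  jobId: string;
  version: number;                        // incremented on every re-run
  status: "pending" | "done" | "failed";
  outputPath: string;                     // where the transformed output lives
  createdAt: string;                      // ISO timestamp of the run
}

function nextVersion(history: TransformationRecord[], jobId: string): number {
  const versions = history.filter((r) => r.jobId === jobId).map((r) => r.version);
  return versions.length ? Math.max(...versions) + 1 : 1;
}
```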

05 · Report & archive

The report module is being readied for PDF/DOCX export; completed records will land in AWS S3 for permanent storage.
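
A minimal sketch of the planned S3 archive step using the AWS SDK for JavaScript v3; the bucket name, region, and key layout are placeholders.

```typescript
// Sketch of the planned archive step with the AWS SDK for JavaScript v3;
// the bucket name, region, and key layout are placeholders.
import { S3Client, PutObjectCommand } from "@aws-sdk/client-s3";
import { readFile } from "node:fs/promises";

const s3 = new S3Client({ region: "us-east-1" }); // region is an assumption

async function archiveReport(jobId: string, localPdfPath: string): Promise<void> {
  const body = await readFile(localPdfPath);
  await s3.send(
    new PutObjectCommand({
      Bucket: "native-ai-report-archive",  // placeholder bucket name
      Key: `reports/${jobId}/report.pdf`,  // placeholder key layout
      Body: body,
      ContentType: "application/pdf",
    }),
  );
}
```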

Open the tools

Buttons point to the active environments; underlying workflows and components evolve every sprint.

n8n Data Transformation

Live on AWS. Manage workflow automation, the data table, and the upcoming S3 writer from here.

/n8n/ Open n8n

SAV Analyzer

In local testing; handles SPSS .sav parsing and quality alerts, and will soon post Excel/JSON outputs to the webhook.

/spss/panel Open Analyzer

Data Transformation

Transform SPSS job data using n8n workflows. Select completed jobs and process them through the transformation pipeline.

/transformation/panel Open Data Transformation

Process components

The PDF roadmap breaks the semi-automated flow into four layers from data ingestion to narrative generation.

1 · Input files

The Excel dataset and `dictionary.json` enter the system. File paths are typed manually today; the web panel will automate this step.

2 · PrepareCleanJSON

The custom JS node inside n8n converts data into a human-readable format. Each project still needs developer updates while we build the dynamic version.
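
To show the mapping idea, here is a standalone sketch of what the node does conceptually; the real node is JavaScript running inside n8n, and the `dictionary.json` shape shown is assumed.

```typescript
// Standalone sketch of the label-mapping idea behind PrepareCleanJSON; the real
// node is JavaScript running inside n8n, and this dictionary.json shape is assumed.
interface DictionaryEntry {
  variable: string;                      // coded column name, e.g. "Q1"
  label: string;                         // human-readable question label
  valueLabels?: Record<string, string>;  // e.g. { "1": "Yes", "2": "No" }
}

function toReadableRow(
  row: Record<string, string | number>,
  dictionary: DictionaryEntry[],
): Record<string, string | number> {
  const readable: Record<string, string | number> = {};
  for (const entry of dictionary) {
    const raw = row[entry.variable];
    // Swap the coded value for its label when one exists; otherwise keep the raw value.
    readable[entry.label] = entry.valueLabels?.[String(raw)] ?? raw;
  }
  return readable;
}
```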

3 · GPT-5 Mini narrative generation

Prepared data is passed to OpenAI, which writes participant-level narratives. Outputs are captured in the n8n data table.
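
An illustrative call for this step using the OpenAI Node SDK; the prompt, model id, and participant shape are placeholders rather than the production configuration.

```typescript
// Illustrative narrative call with the OpenAI Node SDK; the prompt, model id,
// and participant shape are placeholders, not the production configuration.
import OpenAI from "openai";

const client = new OpenAI(); // reads OPENAI_API_KEY from the environment

async function writeNarrative(participant: Record<string, unknown>): Promise<string> {
  const completion = await client.chat.completions.create({
    model: "gpt-5-mini",
    messages: [
      { role: "system", content: "Summarise this survey participant in plain English." },
      { role: "user", content: JSON.stringify(participant) },
    ],
  });
  return completion.choices[0].message.content ?? "";
}
```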

4 · Operational benefit

Manual reporting time drops, though developers still re-run failed jobs. Full automation is the target once we complete the AWS migration.

Turn raw feedback into digital twin intelligence

Extend your workflows with Native AI’s always-on market intelligence platform. Transform survey data, launch digital twins of your audience, and orchestrate consumer analytics with advanced safeguards.

Explore Native AI Platform

Future plans

Highlights pulled from the Native Data Transformation Automation roadmap.

1 · Upload through the web panel

Standard `.sav` files will be uploaded via the in-progress panel, eliminating manual path entry.

2 · Automated Analyzer pipeline on AWS

The Analyzer module, now local, will move to AWS to generate Excel/JSON outputs and send them straight to the n8n webhook.

3 · Intelligent PrepareCleanJSON

The custom node will read the `dictionary.json` dynamically, removing project-specific code edits.

4 · Fully automated reporting

Analyzer outputs will feed the report pipeline; refreshed PDF/DOCX templates will publish to Amazon S3 for archival.