Planasonix is an enterprise ETL platform with 300+ connectors. A connection is a named configuration that tells Planasonix how to reach a system: which connector to use, non-secret settings (hosts, regions, default databases), and which credential supplies secrets at run time. You create connections once and attach them to source, destination, and HTTP nodes across pipelines, schedules, and environments without duplicating passwords or API keys in every job.

What connections are

A connection identifies one external system in your workspace or project. It stores:
  • Connector type — The engine or product family (PostgreSQL, S3, Salesforce, Kafka, and so on), which drives the fields you see and which pipeline nodes can use the connection.
  • Reachability settings — Hostnames, ports, regions, bucket names, base URLs, TLS options, and other values that are safe to show to teammates who edit connections.
  • Credential reference — A pointer to an encrypted credential record; the connection never embeds raw secrets in exportable pipeline metadata.
You usually create one connection per logical endpoint (for example prod-postgres-finance, eu-s3-landing). Use separate connections when authentication, region, or data residency must differ, even for the same product (for example two Salesforce orgs).
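The three stored parts above can be pictured as a small record. Here is a minimal sketch in Python; the class and field names are illustrative, not the Planasonix API:

```python
from dataclasses import dataclass


@dataclass
class Connection:
    """Hypothetical shape of a connection record (illustration only)."""
    name: str            # e.g. "prod-postgres-finance"
    connector_type: str  # engine or product family, e.g. "postgresql"
    settings: dict       # non-secret reachability settings, safe to show teammates
    credential_ref: str  # pointer to an encrypted credential record, never the secret


finance_db = Connection(
    name="prod-postgres-finance",
    connector_type="postgresql",
    settings={
        "host": "db.internal.example.com",
        "port": 5432,
        "database": "finance",
        "sslmode": "require",
    },
    credential_ref="cred/finance-db-login",  # the raw password lives elsewhere
)
```

The key property is that `settings` holds nothing sensitive and `credential_ref` is only a pointer, so exporting the connection (or the pipelines that use it) never exports a secret.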

How connections work

1. Choose a connector type

When you create a connection, pick the system family. That choice determines required fields, supported authentication modes, and which canvas nodes can reference the connection.

2. Enter non-secret configuration

Set hosts, ports, default databases or buckets, topic prefixes, warehouse names, and TLS behavior. Keep these fields free of embedded secrets; put passwords and keys in credentials instead.

3. Link a credential

Attach a credential that matches the connector (database login, cloud IAM, API key, OAuth client, and so on). See Credentials management for types and rotation.

4. Test and save

Run Test connection so the check uses the same network path as workers (including VPN, private link, or IP allowlists). After success, save and select this connection in pipeline nodes.
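The four steps can be sketched end to end against a stand-in client object. Everything here is hypothetical, including the method names; a real Planasonix SDK, if your tenant exposes one, will differ:

```python
class StubConnection:
    """Minimal in-memory stand-in so the workflow below runs (not the real API)."""

    def __init__(self, connector_type):
        self.connector_type = connector_type
        self.settings = {}
        self.credential_ref = None

    def configure(self, **settings):
        self.settings.update(settings)

    def link_credential(self, ref):
        self.credential_ref = ref

    def test(self):
        # A real Test connection reaches the endpoint over the workers'
        # network path; here we only check the record is complete.
        return self.credential_ref is not None and "host" in self.settings


def create_connection():
    conn = StubConnection(connector_type="postgresql")           # step 1: connector type
    conn.configure(host="db.internal.example.com", port=5432,    # step 2: non-secret config
                   database="finance", sslmode="require")
    conn.link_credential("cred/finance-db-login")                # step 3: credential pointer
    if not conn.test():                                          # step 4: test, then save
        raise RuntimeError("connection test failed")
    return conn
```

Note that the secret never appears in the workflow: step 3 only records a reference.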
Orchestration resolves the connection on each run: Planasonix loads your saved settings, decrypts the linked credential for that execution only, and passes both to the connector runtime. If you rotate a credential, every pipeline whose connections link to that credential picks up the new secret on the next run, with no edits to individual jobs.
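Because the connection holds only a pointer, swapping the secret behind that pointer changes what the next run sees. A sketch of that resolution step, with a plain dict standing in for the encrypted store and decrypt-on-demand:

```python
# Stand-in for the encrypted credential store, keyed by reference.
credential_store = {
    "cred/finance-db-login": {"user": "etl", "password": "v1-secret"},
}


def resolve(connection_settings, credential_ref):
    """Combine saved non-secret settings with the credential for this run only."""
    secret = dict(credential_store[credential_ref])  # stands in for decrypt-at-use
    return {**connection_settings, **secret}


settings = {"host": "db.internal.example.com", "port": 5432}
run1 = resolve(settings, "cred/finance-db-login")

# Rotate the credential once; no pipeline edits needed.
credential_store["cred/finance-db-login"]["password"] = "v2-secret"
run2 = resolve(settings, "cred/finance-db-login")  # next run sees the new secret
```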

Credential system

Credentials are stored encrypted at rest and injected only when a job needs them. Separating credentials from connection metadata gives you:
  • Rotation — Update one credential; all linked connections benefit.
  • Audit — Review who created credentials and which connections use them, without plaintext secrets in pipeline exports or Git.
  • Least privilege — Grant builders permission to use a connection while restricting who can create or replace underlying secrets.
Supported credential kinds include AWS, Azure, GCP, database logins, API keys, OAuth bundles (client credentials plus stored refresh material), and protocol-specific material (FTP, SAP, and others). The exact list in your tenant may vary by edition; your administrator can confirm which types appear under Credentials.
Some organizations connect Planasonix to an external secret manager. If that applies to you, credential records may proxy to vault paths instead of holding long-lived plaintext in Planasonix—follow your security team’s runbook for rotation.
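A credential record in proxy mode can be pictured as holding a vault path instead of material. This sketch uses invented field names and a fake vault client; check your tenant's actual schema with your administrator:

```python
def fetch_secret(credential_record, vault_client=None):
    """Sketch: a credential record either stores encrypted material or
    proxies to an external secret-manager path (field names are invented)."""
    if "vault_path" in credential_record:
        # Proxy mode: no long-lived plaintext held in Planasonix.
        return vault_client.read(credential_record["vault_path"])
    return credential_record["material"]  # decrypted at use in reality


class FakeVault:
    """In-memory stand-in for an external secret manager."""

    def read(self, path):
        return {"api_key": f"from-vault:{path}"}


local_record = {"material": {"api_key": "stored-locally"}}
proxied_record = {"vault_path": "kv/planasonix/salesforce"}
```

In proxy mode, rotation happens in the secret manager; Planasonix sees the new value on the next read, which is why the runbook for rotation belongs to your security team.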

Connection type categories

Use the cards below to open guides for supported systems, authentication options, and configuration patterns.

Databases

PostgreSQL, MySQL, SQL Server, Oracle, MongoDB, ClickHouse, Cassandra, DuckDB, Elasticsearch, and managed cloud variants.

Data warehouses

Snowflake, BigQuery, Databricks, Redshift, Synapse, Fabric, and Apache Iceberg–oriented lakehouse paths.

Cloud storage

S3, Azure Blob, GCS, R2, MinIO, Wasabi, Box, OneDrive, SharePoint, FTP/SFTP, and common file formats.

Streaming platforms

Kafka, Confluent Cloud, Redpanda, Kinesis, Pulsar, Upstash, with Confluent and Glue schema registry options.

APIs and webhooks

REST, AI REST, GraphQL, SAP OData, SAP RFC/BAPI, webhooks, OpenAPI import, and pagination tuning.

SaaS applications

CRM, ERP, commerce, ads, marketing, HR, analytics, collaboration, and 100+ more packaged connectors.

AI providers

OpenAI, Anthropic Claude, Google Gemini for LLM transforms and AI Copilot.

Credentials management

Types, sharing, rotation, OAuth tokens, and security practices for integration owners.

Choose a starting point

If data lives in a transactional or document database, start with Databases. Use read replicas or scoped database users for heavy extract workloads.
For Snowflake, BigQuery, Databricks, or similar, use Data warehouses. Pair with Cloud storage when you stage files before load.
For CSV, Parquet, JSON, or partner drops in buckets, use Cloud storage.
For topics and streams, use Streaming platforms.
Use APIs and webhooks for generic or custom HTTP. Use SaaS applications when Planasonix ships a first-class connector (OAuth, object catalog, incremental sync).
Name connections after environment and system (for example prod-salesforce, eu-s3-landing) so anyone selecting a connection on the canvas can choose correctly without opening the full configuration.
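One way to keep that naming habit consistent is a simple pattern check. The convention below (a known environment or region prefix, then lowercase hyphen-separated segments) is one possible choice, not a Planasonix rule:

```python
import re

# Hypothetical convention: <env-or-region>-<system>[-<qualifier>],
# e.g. "prod-salesforce" or "eu-s3-landing".
NAME_PATTERN = re.compile(r"^(prod|staging|dev|eu|us)-[a-z0-9]+(-[a-z0-9]+)*$")


def valid_connection_name(name: str) -> bool:
    """Return True if a connection name follows the assumed convention."""
    return bool(NAME_PATTERN.match(name))
```

A check like this could run in code review or CI over exported pipeline metadata, so mislabeled connections are caught before anyone picks the wrong one on the canvas.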