Supported Databases

Connect to your data warehouse or database for AI-powered analysis

Scoop connects directly to your database or data warehouse, enabling powerful AI-driven analysis on your live data. Whether you're using a cloud data warehouse like Snowflake or BigQuery, or a traditional relational database like PostgreSQL or MySQL, Scoop makes it easy to query and analyze your data.

Supported Databases

| Database | Type | Default Port | Best For |
|---|---|---|---|
| Snowflake | Cloud Data Warehouse | 443 | Enterprise analytics, large-scale data |
| PostgreSQL | Relational Database | 5432 | General purpose, web applications |
| MySQL | Relational Database | 3306 | Web applications, widely deployed |
| Amazon Redshift | Cloud Data Warehouse | 5439 | AWS ecosystem, petabyte-scale analytics |
| Google BigQuery | Cloud Data Warehouse | 443 | Google Cloud ecosystem, serverless analytics |
| Oracle | Enterprise Database | 1521 | Enterprise applications, legacy systems |
| SQL Server | Enterprise Database | 1433 | Microsoft ecosystem, enterprise apps |
| MariaDB | Relational Database | 3306 | MySQL-compatible, open source |
| ClickHouse | Analytical Database | 8123 | Real-time analytics, high-volume data |
| Greenplum | Data Warehouse | 5432 | Large-scale analytics, PostgreSQL-compatible |
| IBM DB2 | Enterprise Database | 50000 | Enterprise systems, mainframe integration |
| Teradata | Enterprise Data Warehouse | 1025 | Enterprise analytics, large organizations |
| Vertica | Analytical Database | 5433 | High-performance analytics |

Quick Selection Guide

| Your Situation | Recommended Database |
|---|---|
| AWS environment | Redshift, PostgreSQL (RDS) |
| Google Cloud | BigQuery, Cloud SQL |
| Azure environment | SQL Server, PostgreSQL |
| Large-scale analytics | Snowflake, BigQuery, Redshift |
| Existing PostgreSQL | PostgreSQL direct connection |
| MySQL application data | MySQL direct connection |
| Enterprise legacy | Oracle, DB2, Teradata |
| Real-time analytics | ClickHouse, Vertica |

Connection Methods

Scoop offers two ways to connect to your database:

1. Import Mode (Traditional)

Import your data into Scoop for analysis. Data is copied and stored in Scoop.

| Feature | Description |
|---|---|
| Scheduled sync | Daily, hourly, or custom schedules |
| Transformation | Clean and transform during import |
| Blending | Combine with other data sources |
| Snapshots | Track changes over time |

Best for:

  • Scheduled reports that run daily
  • Data that needs transformation before analysis
  • Combining data from multiple sources
  • Tracking historical changes

2. Live Query Mode

Query your database directly without importing. Data stays in your database.

| Feature | Description |
|---|---|
| Real-time | Always current data |
| No storage | Data not copied to Scoop |
| Star schema | Fact and dimension table joins |
| Large datasets | No import size limits |

Best for:

  • Real-time analysis on current data
  • Large datasets where importing isn't practical
  • Star schema queries with fact and dimension tables (see the example below)
  • Security-sensitive data that must stay in place
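
In Live Query Mode, Scoop runs queries directly against your fact and dimension tables. As a rough illustration of the star-schema shape this mode supports, such a query might look like the following (all table and column names here are hypothetical, not Scoop-specific):

```sql
-- Hypothetical star schema: one fact table joined to two dimension tables.
SELECT
    d.calendar_month,
    p.product_category,
    SUM(f.sale_amount) AS total_sales
FROM fact_sales f
JOIN dim_date    d ON f.date_key    = d.date_key
JOIN dim_product p ON f.product_key = p.product_key
WHERE d.calendar_year = 2024
GROUP BY d.calendar_month, p.product_category
ORDER BY d.calendar_month;
```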

Comparing Connection Modes

| Factor | Import Mode | Live Query Mode |
|---|---|---|
| Data freshness | Scheduled updates | Real-time |
| Query speed | Very fast (local data) | Depends on database |
| Data volume limits | Scoop storage limits | Database limits only |
| Transformation | Full support | Limited |
| Historical tracking | Snapshots available | Query historical tables |
| Network dependency | Only during sync | Every query |

Getting Started

Step 1: Choose Your Database

Select your database from the supported list above.

Step 2: Create a Read-Only User

Create a dedicated user with read-only permissions:

| Database | Permission Required |
|---|---|
| PostgreSQL | SELECT on tables/views |
| MySQL | SELECT privilege |
| Snowflake | USAGE on warehouse + SELECT on tables |
| BigQuery | BigQuery Data Viewer role |
| Redshift | SELECT on schema.tables |
| SQL Server | db_datareader role |
| Oracle | SELECT on tables |

Security Best Practice: Always use read-only credentials. Scoop only needs SELECT permissions and will never modify your data.
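
As a concrete sketch, on PostgreSQL a dedicated read-only user for Scoop could be created with statements along these lines (the password and schema are placeholders, and the exact syntax differs on other databases):

```sql
-- PostgreSQL sketch: a dedicated, read-only user for Scoop.
CREATE USER scoop_reader WITH PASSWORD 'replace-with-a-strong-password';
GRANT CONNECT ON DATABASE analytics TO scoop_reader;
GRANT USAGE ON SCHEMA public TO scoop_reader;
GRANT SELECT ON ALL TABLES IN SCHEMA public TO scoop_reader;
-- Cover tables created in this schema later as well.
ALTER DEFAULT PRIVILEGES IN SCHEMA public GRANT SELECT ON TABLES TO scoop_reader;
```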

Step 3: Whitelist Scoop's IP Address

Configure your firewall to allow Scoop connections:

| Environment | IP Address |
|---|---|
| Production | 44.231.97.118 |

All Scoop services (API, Live Query, scheduled imports) connect from this single IP address.

Cloud Database Whitelisting

| Cloud Provider | Where to Whitelist |
|---|---|
| AWS (RDS, Redshift) | Security Groups > Inbound Rules |
| Google Cloud (BigQuery, Cloud SQL) | VPC Network > Firewall Rules |
| Azure (SQL Database, Synapse) | Networking > Firewall Rules |
| Snowflake | Network Policies |
| Heroku | Trusted Sources in Data Clips |
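
In Snowflake, for example, whitelisting is done through a network policy; a minimal sketch (the policy name is a placeholder, and scoop_reader is the dedicated user from Step 2) looks like:

```sql
-- Snowflake sketch: allow connections from Scoop's IP address.
CREATE NETWORK POLICY scoop_access ALLOWED_IP_LIST = ('44.231.97.118');
-- Apply the policy to the Scoop user rather than the whole account.
ALTER USER scoop_reader SET NETWORK_POLICY = scoop_access;
```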

On-Premises Databases

Configure your firewall to allow incoming connections from 44.231.97.118 on your database's port:

| Database | Default Port |
|---|---|
| PostgreSQL | 5432 |
| MySQL/MariaDB | 3306 |
| SQL Server | 1433 |
| Oracle | 1521 |
| Redshift | 5439 |

Step 4: Enter Connection Details

In Scoop, provide your connection information:

| Field | Description | Example |
|---|---|---|
| Host | Database server address | mydb.abc123.us-west-2.rds.amazonaws.com |
| Port | Database port | 5432 |
| Database | Database name | analytics |
| Username | Read-only user | scoop_reader |
| Password | User password | (secure credential) |
| Schema | Schema to query (optional) | public |

Step 5: Select Tables or Write Queries

Choose how to access your data:

| Method | Use Case |
|---|---|
| Select tables | Quick access to existing tables |
| Select views | Pre-defined queries from your DBA |
| Custom SQL | Specific queries with joins, filters |
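
With Custom SQL, any read-only query your database accepts can serve as the data source. A simple illustration with a join, filter, and aggregation (table and column names are placeholders) might be:

```sql
-- Illustrative custom SQL: join, filter, and aggregate before analysis in Scoop.
SELECT
    c.region,
    COUNT(*)            AS order_count,
    SUM(o.total_amount) AS revenue
FROM orders o
JOIN customers c ON c.customer_id = o.customer_id
WHERE o.created_at >= DATE '2024-01-01'
GROUP BY c.region;
```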

Security Best Practices

Credentials

| Practice | Why |
|---|---|
| Read-only user | Scoop never needs write access |
| Dedicated user | Easy to audit and revoke |
| Strong password | Protect database access |
| Rotate regularly | Follow security policy |

Network

| Practice | Why |
|---|---|
| IP whitelisting | Only allow Scoop IP (44.231.97.118) |
| SSL/TLS | Encrypt data in transit |
| VPN/tunnel | Additional layer for sensitive data |
| Firewall rules | Restrict to database port only |

Data Access

| Practice | Why |
|---|---|
| Limit schemas | Only expose needed data |
| Row-level security | Restrict sensitive records |
| Column masking | Hide PII columns if needed |
| View-based access | Control exactly what's queryable |
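
A common way to combine the last two practices is to expose a view that omits PII columns and filters out sensitive rows, then grant the Scoop user access to the view only. A sketch (table, column, and view names are illustrative):

```sql
-- Expose a sanitized view instead of the raw table.
CREATE VIEW customers_analytics AS
SELECT customer_id, region, signup_date, plan_tier  -- PII columns such as email are omitted
FROM customers
WHERE is_internal_test = FALSE;                      -- hide rows that should not be analyzed

-- Grant access to the view only, not the underlying table.
GRANT SELECT ON customers_analytics TO scoop_reader;
```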

Common Connection Scenarios

Cloud Data Warehouse

| Scenario | Configuration |
|---|---|
| Snowflake | Account identifier, warehouse, database, schema |
| BigQuery | Project ID, dataset, service account JSON |
| Redshift | Cluster endpoint, database, schema |

Application Database

| Scenario | Configuration |
|---|---|
| Production DB | Read replica recommended for performance |
| Analytics DB | Direct connection typically fine |
| Multi-tenant | Schema-per-tenant or filtered views |

On-Premises Database

| Scenario | Configuration |
|---|---|
| Direct connection | Open firewall port to Scoop IP |
| SSH tunnel | Bastion host for secure access |
| VPN | Site-to-site or client VPN |

Troubleshooting

Connection Failed

| Error | Solution |
|---|---|
| Connection timeout | Check firewall rules, verify IP whitelisted |
| Authentication failed | Verify username/password, check user exists |
| Database not found | Confirm database name spelling |
| Permission denied | Check user has SELECT permissions |

Query Errors

| Error | Solution |
|---|---|
| Table not found | Verify schema and table name |
| Column not found | Check column exists in table |
| Syntax error | Review SQL syntax for database type |
| Timeout | Optimize query or increase timeout |
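
For "Table not found" and "Column not found" errors, a quick check is to list the names the connected user can actually see. On databases that expose information_schema (PostgreSQL, MySQL, SQL Server, Redshift, and others), queries like these work (the table name in the second query is a placeholder):

```sql
-- List tables visible to the connected user.
SELECT table_schema, table_name
FROM information_schema.tables
ORDER BY table_schema, table_name;

-- List the columns of one table.
SELECT column_name, data_type
FROM information_schema.columns
WHERE table_name = 'orders';
```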

Performance Issues

| Issue | Solution |
|---|---|
| Slow queries | Add indexes, optimize SQL |
| Connection drops | Check network stability |
| High database load | Use read replica, schedule off-peak |
| Large result sets | Add filters, paginate results |
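
When slow queries trace back to a column that is frequently filtered or joined on, adding an index on that column often helps on databases that support explicit indexes. A generic sketch (table and column names are placeholders):

```sql
-- Index a column used in WHERE clauses or JOIN conditions.
CREATE INDEX idx_orders_created_at ON orders (created_at);
```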

Database-Specific Guides

For detailed setup instructions, see the setup guide for your specific database.
