Scoop connects directly to your database or data warehouse, enabling powerful AI-driven analysis on your live data. Whether you're using a cloud data warehouse like Snowflake or BigQuery, or a traditional relational database like PostgreSQL or MySQL, Scoop makes it easy to query and analyze your data.
| Database | Type | Default Port | Best For |
|---|---|---|---|
| Snowflake | Cloud Data Warehouse | 443 | Enterprise analytics, large-scale data |
| PostgreSQL | Relational Database | 5432 | General purpose, web applications |
| MySQL | Relational Database | 3306 | Web applications, widely deployed |
| Amazon Redshift | Cloud Data Warehouse | 5439 | AWS ecosystem, petabyte-scale analytics |
| Google BigQuery | Cloud Data Warehouse | 443 | Google Cloud ecosystem, serverless analytics |
| Oracle | Enterprise Database | 1521 | Enterprise applications, legacy systems |
| SQL Server | Enterprise Database | 1433 | Microsoft ecosystem, enterprise apps |
| MariaDB | Relational Database | 3306 | MySQL-compatible, open source |
| ClickHouse | Analytical Database | 8123 | Real-time analytics, high-volume data |
| Greenplum | Data Warehouse | 5432 | Large-scale analytics, PostgreSQL-compatible |
| IBM DB2 | Enterprise Database | 50000 | Enterprise systems, mainframe integration |
| Teradata | Enterprise Data Warehouse | 1025 | Enterprise analytics, large organizations |
| Vertica | Analytical Database | 5433 | High-performance analytics |
Not sure which to choose? Common recommendations by environment:

| Your Situation | Recommended Database |
|---|---|
| AWS environment | Redshift, PostgreSQL (RDS) |
| Google Cloud | BigQuery, Cloud SQL |
| Azure environment | SQL Server, PostgreSQL |
| Large-scale analytics | Snowflake, BigQuery, Redshift |
| Existing PostgreSQL | PostgreSQL direct connection |
| MySQL application data | MySQL direct connection |
| Enterprise legacy | Oracle, DB2, Teradata |
| Real-time analytics | ClickHouse, Vertica |
Scoop offers two ways to connect to your database:
**Import Mode:** Import your data into Scoop for analysis. Data is copied and stored in Scoop.
| Feature | Description |
|---|---|
| Scheduled sync | Daily, hourly, or custom schedules |
| Transformation | Clean and transform during import |
| Blending | Combine with other data sources |
| Snapshots | Track changes over time |
Best for:
- Scheduled reports that run daily
- Data that needs transformation before analysis
- Combining data from multiple sources
- Tracking historical changes
**Live Query Mode:** Query your database directly without importing. Data stays in your database.
| Feature | Description |
|---|---|
| Real-time | Always current data |
| No storage | Data not copied to Scoop |
| Star schema | Fact and dimension table joins |
| Large datasets | No import size limits |
Best for:
- Real-time analysis on current data
- Large datasets where importing isn't practical
- Star schema queries with fact and dimension tables
- Security-sensitive data that must stay in place
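In Live Query Mode, star schema queries join a fact table to its dimension tables directly in your database. As a hedged sketch (the `fct_orders`, `dim_customer`, and `dim_date` tables and their columns are hypothetical, not part of Scoop), such a query typically looks like:

```sql
-- Hypothetical star schema: one fact table joined to two dimensions.
SELECT
    d.calendar_month,
    c.region,
    SUM(f.order_amount) AS total_revenue
FROM fct_orders AS f
JOIN dim_customer AS c ON c.customer_id = f.customer_id
JOIN dim_date     AS d ON d.date_id     = f.order_date_id
WHERE d.calendar_year = 2024
GROUP BY d.calendar_month, c.region
ORDER BY d.calendar_month;
```

Because the query runs in your warehouse, it benefits from your existing indexes and partitioning without any data leaving the database.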
| Factor | Import Mode | Live Query Mode |
|---|---|---|
| Data freshness | Scheduled updates | Real-time |
| Query speed | Very fast (local data) | Depends on database |
| Data volume limits | Scoop storage limits | Database limits only |
| Transformation | Full support | Limited |
| Historical tracking | Snapshots available | Query historical tables |
| Network dependency | Only during sync | Every query |
Select your database from the supported list above.
Create a dedicated user with read-only permissions:
| Database | Permission Required |
|---|---|
| PostgreSQL | SELECT on tables/views |
| MySQL | SELECT privilege |
| Snowflake | USAGE on warehouse + SELECT on tables |
| BigQuery | BigQuery Data Viewer role |
| Redshift | SELECT on schema.tables |
| SQL Server | db_datareader role |
| Oracle | SELECT on tables |
Security Best Practice: Always use read-only credentials. Scoop only needs SELECT permissions and will never modify your data.
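For PostgreSQL, for example, a dedicated read-only user can be created like this (the user name matches the `scoop_reader` example used later; the password, database, and schema names are placeholders to adapt to your environment):

```sql
-- PostgreSQL: dedicated read-only user for Scoop.
-- Password, database, and schema names are placeholders.
CREATE USER scoop_reader WITH PASSWORD 'use-a-strong-password';
GRANT CONNECT ON DATABASE analytics TO scoop_reader;
GRANT USAGE ON SCHEMA public TO scoop_reader;
GRANT SELECT ON ALL TABLES IN SCHEMA public TO scoop_reader;
-- Make tables created in the future readable as well:
ALTER DEFAULT PRIVILEGES IN SCHEMA public GRANT SELECT ON TABLES TO scoop_reader;
```

The equivalent on other databases uses the permissions listed in the table above (e.g., the `db_datareader` role on SQL Server).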
Configure your firewall to allow Scoop connections:
| Environment | IP Address |
|---|---|
| Production | 44.231.97.118 |
All Scoop services (API, Live Query, scheduled imports) connect from this single IP address.
| Cloud Provider | Where to Whitelist |
|---|---|
| AWS (RDS, Redshift) | Security Groups > Inbound Rules |
| Google Cloud (BigQuery, Cloud SQL) | VPC Network > Firewall Rules |
| Azure (SQL Database, Synapse) | Networking > Firewall Rules |
| Snowflake | Network Policies |
| Heroku | Trusted Sources in Data Clips |
Configure your firewall to allow incoming connections from 44.231.97.118 on your database's port:
| Database | Default Port |
|---|---|
| PostgreSQL | 5432 |
| MySQL/MariaDB | 3306 |
| SQL Server | 1433 |
| Oracle | 1521 |
| Redshift | 5439 |
In Scoop, provide your connection information:
| Field | Description | Example |
|---|---|---|
| Host | Database server address | mydb.abc123.us-west-2.rds.amazonaws.com |
| Port | Database port | 5432 |
| Database | Database name | analytics |
| Username | Read-only user | scoop_reader |
| Password | User password | (secure credential) |
| Schema | Schema to query (optional) | public |
Choose how to access your data:
| Method | Use Case |
|---|---|
| Select tables | Quick access to existing tables |
| Select views | Pre-defined queries from your DBA |
| Custom SQL | Specific queries with joins, filters |
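The Custom SQL method accepts any SELECT statement your database supports, so you can join and filter before the data reaches Scoop. A hedged illustration (the `orders` and `accounts` tables and their columns are hypothetical):

```sql
-- Hypothetical custom query: join, filter, and aggregate at the source.
SELECT
    a.account_name,
    o.status,
    COUNT(*)         AS order_count,
    SUM(o.total_usd) AS revenue_usd
FROM orders AS o
JOIN accounts AS a ON a.account_id = o.account_id
WHERE o.created_at >= DATE '2024-01-01'
GROUP BY a.account_name, o.status;
```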
| Practice | Why |
|---|---|
| Read-only user | Scoop never needs write access |
| Dedicated user | Easy to audit and revoke |
| Strong password | Protect database access |
| Rotate regularly | Follow security policy |
| Practice | Why |
|---|---|
| IP whitelisting | Only allow Scoop IP (44.231.97.118) |
| SSL/TLS | Encrypt data in transit |
| VPN/tunnel | Additional layer for sensitive data |
| Firewall rules | Restrict to database port only |
| Practice | Why |
|---|---|
| Limit schemas | Only expose needed data |
| Row-level security | Restrict sensitive records |
| Column masking | Hide PII columns if needed |
| View-based access | Control exactly what's queryable |
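View-based access and column masking can be combined: expose a view that omits PII and grant Scoop's user access to the view only. A sketch, assuming a hypothetical `customers` table:

```sql
-- Hypothetical view exposing only non-sensitive columns.
CREATE VIEW customer_analytics AS
SELECT
    customer_id,
    signup_date,
    region,
    plan_tier
    -- email, phone, and other PII columns are deliberately omitted
FROM customers;

-- Scoop's read-only user sees the view, never the base table.
GRANT SELECT ON customer_analytics TO scoop_reader;
REVOKE ALL ON customers FROM scoop_reader;
```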
| Scenario | Configuration |
|---|---|
| Snowflake | Account identifier, warehouse, database, schema |
| BigQuery | Project ID, dataset, service account JSON |
| Redshift | Cluster endpoint, database, schema |
| Scenario | Configuration |
|---|---|
| Production DB | Read replica recommended for performance |
| Analytics DB | Direct connection typically fine |
| Multi-tenant | Schema-per-tenant or filtered views |
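For the multi-tenant case, a filtered view scopes what a connection can query to a single tenant's rows. A minimal sketch with hypothetical table, column, and tenant names:

```sql
-- Hypothetical filtered view: exposes only one tenant's rows.
CREATE VIEW tenant_acme_orders AS
SELECT order_id, created_at, total_usd
FROM orders
WHERE tenant_id = 'acme';

GRANT SELECT ON tenant_acme_orders TO scoop_reader;
```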
| Scenario | Configuration |
|---|---|
| Direct connection | Open firewall port to Scoop IP |
| SSH tunnel | Bastion host for secure access |
| VPN | Site-to-site or client VPN |
| Error | Solution |
|---|---|
| Connection timeout | Check firewall rules, verify IP whitelisted |
| Authentication failed | Verify username/password, check user exists |
| Database not found | Confirm database name spelling |
| Permission denied | Check user has SELECT permissions |
| Error | Solution |
|---|---|
| Table not found | Verify schema and table name |
| Column not found | Check column exists in table |
| Syntax error | Review SQL syntax for database type |
| Timeout | Optimize query or increase timeout |
| Issue | Solution |
|---|---|
| Slow queries | Add indexes, optimize SQL |
| Connection drops | Check network stability |
| High database load | Use read replica, schedule off-peak |
| Large result sets | Add filters, paginate results |
For detailed setup instructions, select your database: