
Security and Compliance

We built our unified data platform with industry standards to protect your data and ensure compliance while delivering observability and governance.



Decube is built with enterprise-grade practices to ensure stringent security standards and compliance are met.

Architecture

Connection

  • Connections to your data sources use read-only access or dedicated service accounts that grant Decube only the specific permissions required to scan your database.

  • Credentials used for all connections are stored with double encryption on Decube's servers hosted in AWS and are not accessible internally by Decube's engineers.
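To make the read-only principle concrete, here is a hedged sketch (illustrative only, not Decube's actual collector) of a scanner opening a source database strictly read-only, so that any write attempt is rejected by the database itself rather than by convention. Python's stdlib `sqlite3` and a throwaway `orders` table stand in for a real data source.

```python
import os
import sqlite3
import tempfile

# Throwaway database standing in for a customer data source (hypothetical).
path = os.path.join(tempfile.mkdtemp(), "source.db")
conn = sqlite3.connect(path)
conn.execute("CREATE TABLE orders (id INTEGER, amount REAL)")
conn.execute("INSERT INTO orders VALUES (1, 9.99)")
conn.commit()
conn.close()

# A collector opens the source strictly read-only (SQLite's mode=ro URI flag),
# so write attempts fail at the database level, not just by convention.
ro = sqlite3.connect(f"file:{path}?mode=ro", uri=True)
row_count = ro.execute("SELECT COUNT(*) FROM orders").fetchone()[0]

write_error = None
try:
    ro.execute("INSERT INTO orders VALUES (2, 1.00)")
except sqlite3.OperationalError as exc:
    write_error = str(exc)  # "attempt to write a readonly database"
ro.close()

print(row_count, write_error)
```

In a real warehouse the same effect is achieved with a dedicated service account that has been granted only the scanning permissions it needs.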

Collection

  • Decube's data collectors only extract metadata, query logs, and aggregated statistics into its cloud service.

  • Data extracted from these scans is used solely to assess your data's reliability and to provide the statistics and incident alerts that you have opted into.

  • Decube uses encrypted connections (HTTPS and TLS) to protect the contents of data in transit.

  • Decube's architecture also supports an enterprise deployment in which you host the data collectors within your own cloud infrastructure, so you never have to expose your data sources to Decube's cloud service.
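To illustrate what "metadata only" means in practice, the following sketch (an assumption-laden example, not Decube's implementation) shows a scan that reads only catalog entries such as table and column names and types, never the row values themselves. SQLite's system catalog stands in for a warehouse's information schema, and the `users` table is hypothetical.

```python
import sqlite3

# In-memory database standing in for a customer data source (hypothetical).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER, email TEXT)")
conn.execute("INSERT INTO users VALUES (1, 'a@example.com')")

# A metadata scan touches only the catalog: table names and column
# definitions. It never selects the row values stored in those tables.
metadata = {}
tables = conn.execute(
    "SELECT name FROM sqlite_master WHERE type = 'table'"
).fetchall()
for (table,) in tables:
    cols = conn.execute(f"PRAGMA table_info({table})").fetchall()
    metadata[table] = [(name, ctype) for _, name, ctype, *_ in cols]
conn.close()

print(metadata)  # {'users': [('id', 'INTEGER'), ('email', 'TEXT')]}
```

The extracted dictionary contains schema information only; the email address inserted above never leaves the database.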

Compliance

  • Decube is currently SOC 2 certified. Reach out to us if you would like a copy of the SOC 2 report.

  • Decube will sign any NDAs and/or DPAs where it is appropriate.

  • While collecting metadata, query logs, and metrics, Decube acknowledges that personal data may be collected and processed. Any such data passed into Decube is used solely to run the monitoring, cataloging, and recon modules.

  • All SaaS applications used internally by Decube for operational purposes are vetted with due diligence so that confidential company and personnel data are protected.

Organizational Security and Privacy Practices

Decube's team follows industry best practices across the board to protect the security of the application and the data privacy of its customers.

  • Decube engages a third party to perform an annual penetration test over the application layers of the platform.

  • Processing of collected data is conducted on secure servers hosted on Amazon Web Services.

  • Decube employees complete privacy and security training during onboarding and are required to take an examination after the training. All Decube personnel must acknowledge, electronically, that they have attended the training and understand the security policy.

  • Access to all critical systems and production environments is protected using strong passwords and multi-factor authentication. SSO is also used to centralize access control for certain applications. Access rights are reviewed before being granted, and then periodically reviewed thereafter.

Data collected by Decube

The following information may be processed and stored by Decube on its cloud services:

| Collected Data | Details | Purpose of collection |
| --- | --- | --- |
| Metadata | Asset names such as tables and columns, field types, status of transformation jobs, and other such metadata. | To populate the data catalog with information about the assets available (tables, columns, jobs, etc.) within the data warehouses, databases, and other data sources. |
| Metrics | Row counts, last-updated timestamps, and other similar metrics. | To enable tracking of metrics such as freshness and volume. |
| Aggregated Statistics | Statistics measured on selected tables, collected only when the user opts in. These may include null percentages, distinctness, and other similar metrics. | To enable tracking of data health and let users set up field tests via presets or custom SQL. |
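The "Aggregated Statistics" row can be made concrete with a short sketch (illustrative only, under assumed names; not Decube's implementation): statistics such as null percentage and distinctness are computed as SQL aggregates inside the source database, so only summary numbers leave the database, never individual values. The `events` table and `user_id` column below are hypothetical.

```python
import sqlite3

# In-memory table standing in for an opted-in customer table (hypothetical).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (user_id INTEGER)")
conn.executemany(
    "INSERT INTO events VALUES (?)",
    [(1,), (1,), (2,), (None,)],
)

# Field statistics are computed as aggregates inside the database, so only
# summary numbers (never individual values) are extracted.
row_count, null_pct, distinctness = conn.execute(
    """
    SELECT COUNT(*),
           100.0 * SUM(CASE WHEN user_id IS NULL THEN 1 ELSE 0 END) / COUNT(*),
           1.0 * COUNT(DISTINCT user_id) / COUNT(*)
    FROM events
    """
).fetchone()
conn.close()

print(row_count, null_pct, distinctness)  # 4 25.0 0.5
```

Note that `COUNT(DISTINCT ...)` excludes NULLs, which is why distinctness here is 2 distinct values over 4 rows.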

To understand further how this data is handled, see the Data Policy page.

[Figure] Architecture Overview: data does not leave your VPC.