Use Cases
This guide provides practical examples of how to leverage the Decube Public API to automate common data management workflows and integrate with your existing systems.
User Management Automation
Automated User Onboarding
Scenario: Automatically create Decube users when new employees join your organization.
APIs Used: Control API - Users
POST /user - Create new users
GET /users - List existing users to avoid duplicates
Example Workflow:
1. HR system triggers when a new employee is added
2. Check whether the user already exists using GET /users
3. Create the new user account using POST /user
4. The user receives a welcome email with login instructions from Decube
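The sketch below walks through this workflow. It assumes a placeholder base URL, bearer-token authentication, and illustrative request/response fields (email, name, a users list in the response); check the Users endpoint reference for the exact schema.

```python
import requests

BASE_URL = "https://api.decube.example"  # placeholder: substitute your Decube API base URL
HEADERS = {"Authorization": "Bearer YOUR_API_TOKEN"}  # auth scheme assumed; see the authentication docs

def onboard_user(email: str, name: str) -> None:
    # Step 2: check whether the user already exists (GET /users)
    resp = requests.get(f"{BASE_URL}/users", headers=HEADERS)
    resp.raise_for_status()
    existing = resp.json()  # response assumed to contain a "users" list with an "email" field
    if any(u.get("email") == email for u in existing.get("users", [])):
        print(f"{email} already exists, skipping")
        return
    # Step 3: create the account (POST /user); payload field names are illustrative
    resp = requests.post(f"{BASE_URL}/user", headers=HEADERS, json={"email": email, "name": name})
    resp.raise_for_status()
    print(f"Created {email}; Decube sends the welcome email")

onboard_user("jane.doe@example.com", "Jane Doe")
```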
User Lifecycle Management
Scenario: Deactivate users when employees leave the organization.
APIs Used: Control API - Users
DELETE /user - Deactivate user accounts
GET /user - Verify user status before deactivation
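A matching offboarding sketch, under the same placeholder base URL and auth assumptions. How GET /user and DELETE /user identify the target account (query parameter vs. request body) and the "active" status field are assumptions here, so verify against the endpoint reference.

```python
import requests

BASE_URL = "https://api.decube.example"  # placeholder
HEADERS = {"Authorization": "Bearer YOUR_API_TOKEN"}  # auth scheme assumed

def offboard_user(email: str) -> None:
    # Verify the user's current status first (GET /user);
    # identifying the user by an email query parameter is an assumption.
    resp = requests.get(f"{BASE_URL}/user", headers=HEADERS, params={"email": email})
    resp.raise_for_status()
    user = resp.json()
    if not user.get("active", True):  # "active" field is illustrative
        return  # already deactivated
    # Deactivate the account (DELETE /user)
    resp = requests.delete(f"{BASE_URL}/user", headers=HEADERS, json={"email": email})
    resp.raise_for_status()
```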
Data Asset Discovery and Management
Automated Asset Catalog Updates
Scenario: Keep your data catalog synchronized with metadata changes from your data infrastructure.
APIs Used: Data API - Assets
POST /assets/search - Find existing assets
PATCH /assets - Update asset metadata and descriptions
GET /assets - Retrieve current asset details
Example Workflow:
1. Data pipeline completion triggers a metadata update
2. Search for the affected assets using POST /assets/search
3. Update asset descriptions and ownership using PATCH /assets
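The following sketch shows this search-then-patch pattern. The search filter shape, the "assets"/"id" response fields, and the PATCH payload are illustrative assumptions; consult the Assets endpoint reference for the real schema.

```python
import requests

BASE_URL = "https://api.decube.example"  # placeholder
HEADERS = {"Authorization": "Bearer YOUR_API_TOKEN"}  # auth scheme assumed

def update_asset_metadata(asset_name: str, description: str, owner: str) -> None:
    # Step 2: locate the asset (POST /assets/search); filter shape is illustrative
    resp = requests.post(f"{BASE_URL}/assets/search", headers=HEADERS, json={"query": asset_name})
    resp.raise_for_status()
    assets = resp.json().get("assets", [])  # assumed response shape
    if not assets:
        raise ValueError(f"no asset matching {asset_name!r}")
    asset_id = assets[0]["id"]  # "id" field is an assumption
    # Step 3: patch description and ownership (PATCH /assets)
    resp = requests.patch(
        f"{BASE_URL}/assets",
        headers=HEADERS,
        json={"id": asset_id, "description": description, "owner": owner},
    )
    resp.raise_for_status()
```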
Data Asset Discovery
Scenario: Build custom dashboards or integrations that surface relevant data assets to users.
APIs Used: Data API - Assets
POST /assets/search - Search assets by type, tags, or ownership
GET /assets - Get detailed asset information
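For a dashboard-style integration, a filtered search plus a per-asset detail lookup might look like the sketch below. The filter names (asset_type, tags, owner) and the id-based detail lookup are assumptions for illustration only.

```python
import requests

BASE_URL = "https://api.decube.example"  # placeholder
HEADERS = {"Authorization": "Bearer YOUR_API_TOKEN"}  # auth scheme assumed

# Search by type, tags, and ownership (POST /assets/search); filter names are illustrative
resp = requests.post(
    f"{BASE_URL}/assets/search",
    headers=HEADERS,
    json={"asset_type": "table", "tags": ["pii"], "owner": "data-platform"},
)
resp.raise_for_status()
for asset in resp.json().get("assets", []):
    # Fetch full detail per asset (GET /assets); id-based lookup is an assumption
    detail = requests.get(f"{BASE_URL}/assets", headers=HEADERS, params={"id": asset["id"]})
    detail.raise_for_status()
    print(detail.json().get("name"), detail.json().get("description"))
```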
Glossary Management
Automated Glossary Synchronization
Scenario: Maintain consistent business terminology across multiple systems.
APIs Used: Data API - Glossary
GET /catalog/glossary/list - List existing terms and categories
POST /catalog/glossary - Create new terms and categories
PATCH /catalog/glossary - Update existing definitions
POST /catalog/glossary/documentation - Attach documentation to terms
Example Workflow:
1. Business stakeholders update terms in an external system
2. A sync process retrieves the updated definitions
3. Create or update the corresponding glossary terms in Decube
4. Attach rich documentation to the terms
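A minimal upsert sketch for steps 3 and 4, assuming the same placeholder base URL and auth, and assuming the list response exposes "terms" with "name"/"id" fields and the documentation endpoint takes a term_id and content body; the real payloads may differ.

```python
import requests

BASE_URL = "https://api.decube.example"  # placeholder
HEADERS = {"Authorization": "Bearer YOUR_API_TOKEN"}  # auth scheme assumed

def sync_term(name: str, definition: str, doc_markdown: str) -> None:
    # List existing terms (GET /catalog/glossary/list); response shape is assumed
    resp = requests.get(f"{BASE_URL}/catalog/glossary/list", headers=HEADERS)
    resp.raise_for_status()
    terms = {t["name"]: t for t in resp.json().get("terms", [])}
    if name in terms:
        # Update the existing definition (PATCH /catalog/glossary)
        term_id = terms[name]["id"]
        requests.patch(f"{BASE_URL}/catalog/glossary", headers=HEADERS,
                       json={"id": term_id, "definition": definition}).raise_for_status()
    else:
        # Create a new term (POST /catalog/glossary)
        resp = requests.post(f"{BASE_URL}/catalog/glossary", headers=HEADERS,
                             json={"name": name, "definition": definition})
        resp.raise_for_status()
        term_id = resp.json()["id"]  # assumed response field
    # Attach rich documentation (POST /catalog/glossary/documentation)
    requests.post(f"{BASE_URL}/catalog/glossary/documentation", headers=HEADERS,
                  json={"term_id": term_id, "content": doc_markdown}).raise_for_status()
```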
Data Lineage Tracking
Manual Lineage Documentation
Scenario: Document data transformations and dependencies that aren't automatically detected.
APIs Used: Data API - Lineage
POST /catalog/lineage/manual_lineage - Create lineage relationships
GET /catalog/lineage/manual_lineage - List existing lineage connections
DELETE /catalog/lineage/manual_lineage - Remove outdated lineage
Example Workflow:
1. A data engineer completes a new transformation pipeline
2. Create manual lineage connections between the source and target datasets
3. Document the transformation logic in the lineage metadata
4. Update the lineage when pipelines change
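The sketch below covers creating an edge and pruning a stale one. The asset identifier format, the source/target field names, and the "lineage" response shape are all assumptions; check the Lineage endpoint reference before relying on them.

```python
import requests

BASE_URL = "https://api.decube.example"  # placeholder
HEADERS = {"Authorization": "Bearer YOUR_API_TOKEN"}  # auth scheme assumed

# Step 2: create a manual lineage edge (POST /catalog/lineage/manual_lineage);
# field names and asset identifiers are illustrative.
resp = requests.post(
    f"{BASE_URL}/catalog/lineage/manual_lineage",
    headers=HEADERS,
    json={
        "source_asset_id": "raw.orders",
        "target_asset_id": "analytics.daily_orders",
        "description": "Daily aggregation pipeline",  # step 3: transformation notes
    },
)
resp.raise_for_status()

# Step 4: when a pipeline is retired, list and prune stale edges
edges = requests.get(f"{BASE_URL}/catalog/lineage/manual_lineage", headers=HEADERS)
edges.raise_for_status()
for edge in edges.json().get("lineage", []):  # assumed response shape
    if edge["target_asset_id"] == "analytics.daily_orders_deprecated":
        requests.delete(f"{BASE_URL}/catalog/lineage/manual_lineage",
                        headers=HEADERS, json={"id": edge["id"]}).raise_for_status()
```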
Access Control and Security
Group-Based Permission Management
Scenario: Automate user access provisioning based on organizational roles.
APIs Used: Data API - ACL Groups
GET /acl/group/list - List available groups
POST /acl/group/add_user - Add users to appropriate groups
POST /acl/group/remove_user - Remove users when roles change
GET /acl/group - Verify group memberships
Example Workflow:
1. An employee's role changes in an internal system
2. Determine the appropriate Decube groups based on the new role
3. Add the user to the new groups and remove them from the old ones
4. Verify the permissions are correctly applied
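A sketch of this role-to-group reconciliation follows. The role-to-group mapping is your own configuration; the group list response shape and the add/remove request bodies (group_id, email) are assumptions to validate against the ACL Groups reference.

```python
import requests

BASE_URL = "https://api.decube.example"  # placeholder
HEADERS = {"Authorization": "Bearer YOUR_API_TOKEN"}  # auth scheme assumed

# Step 2: map internal roles to Decube group names (your own configuration)
ROLE_TO_GROUPS = {"analyst": ["analytics-readers"], "engineer": ["data-engineering"]}

def apply_role(user_email: str, new_role: str) -> None:
    # Resolve group names to IDs (GET /acl/group/list); response shape is assumed
    resp = requests.get(f"{BASE_URL}/acl/group/list", headers=HEADERS)
    resp.raise_for_status()
    groups = {g["name"]: g["id"] for g in resp.json().get("groups", [])}
    wanted = {groups[n] for n in ROLE_TO_GROUPS[new_role]}
    # Step 3: reconcile membership across every managed group
    for name, gid in groups.items():
        body = {"group_id": gid, "email": user_email}  # illustrative fields
        if gid in wanted:
            requests.post(f"{BASE_URL}/acl/group/add_user", headers=HEADERS, json=body).raise_for_status()
        else:
            requests.post(f"{BASE_URL}/acl/group/remove_user", headers=HEADERS, json=body).raise_for_status()
    # Step 4: verify memberships (GET /acl/group)
    check = requests.get(f"{BASE_URL}/acl/group", headers=HEADERS, params={"email": user_email})
    check.raise_for_status()
    print(check.json())
```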
Data Quality Reporting
Automated Data Quality Scorecard Generation
Scenario: Generate regular data quality reports for compliance and governance purposes.
APIs Used: Data API - Data Quality Scorecard
POST /data_quality_scores/report/generate - Request report generation
GET /data_quality_scores/report/result - Poll for report completion and download results
Use Cases:
- Generate periodic data quality reports for compliance and governance
- Export quality metrics for external dashboards and analytics
- Track data quality trends over time across different data sources
- Create automated alerts based on data quality thresholds
- Audit data quality performance by data owner, schema, or asset type
Example Workflow:
1. A scheduled job triggers the data quality report request
2. Submit the request with the desired filters (time range, data sources, dimensions)
3. Receive a job_id for tracking the asynchronous report generation
4. Poll the results endpoint until the report is ready
5. Download the complete JSON report with quality scores and metrics
6. Process the results for dashboards, alerts, or compliance documentation
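Here is a sketch of the submit-then-poll pattern. The job_id field comes from the workflow above; the filter field names and the "status"/"report" result fields are assumptions, so confirm them against the Data Quality Scorecard reference.

```python
import time
import requests

BASE_URL = "https://api.decube.example"  # placeholder
HEADERS = {"Authorization": "Bearer YOUR_API_TOKEN"}  # auth scheme assumed

# Step 2: request report generation with filters (POST /data_quality_scores/report/generate);
# the filter field names shown here are illustrative.
resp = requests.post(
    f"{BASE_URL}/data_quality_scores/report/generate",
    headers=HEADERS,
    json={"from_date": "2024-01-01", "to_date": "2024-01-31"},
)
resp.raise_for_status()
job_id = resp.json()["job_id"]  # step 3: handle for the asynchronous job

# Steps 4-5: poll until the report is ready, then read the JSON payload
while True:
    result = requests.get(
        f"{BASE_URL}/data_quality_scores/report/result",
        headers=HEADERS,
        params={"job_id": job_id},
    )
    result.raise_for_status()
    body = result.json()
    if body.get("status") == "completed":  # "status"/"report" fields are assumptions
        report = body["report"]
        break
    time.sleep(10)  # modest poll interval to avoid hammering the API
# Step 6: feed "report" into dashboards, alerts, or compliance documentation
```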
Monitoring & Alerting Automation
Discover & Visualize Monitors
1. A data catalog UI or external tool calls POST /monitors/search for a given asset.
2. Show the available scheduled and on-demand monitors to users (name, type, incident level, last run status).
3. Link into GET /monitors?monitor_id={id} to display the full configuration and a recent execution summary.
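A sketch of this discovery flow, assuming the placeholder base URL and auth used throughout, a search filter keyed by asset, and a "monitors" list with "id"/"name"/"type" fields; the real search filter and response shapes may differ.

```python
import requests

BASE_URL = "https://api.decube.example"  # placeholder
HEADERS = {"Authorization": "Bearer YOUR_API_TOKEN"}  # auth scheme assumed

# Step 1: find monitors attached to an asset (POST /monitors/search); filter shape assumed
resp = requests.post(f"{BASE_URL}/monitors/search", headers=HEADERS,
                     json={"asset": "analytics.daily_orders"})
resp.raise_for_status()
for m in resp.json().get("monitors", []):
    # Step 3: pull the full configuration for display (GET /monitors?monitor_id={id})
    detail = requests.get(f"{BASE_URL}/monitors", headers=HEADERS,
                          params={"monitor_id": m["id"]})
    detail.raise_for_status()
    cfg = detail.json()
    # Step 2: surface name, type, and last run status in your UI
    print(cfg.get("name"), cfg.get("type"), cfg.get("last_run_status"))
```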
Trigger On-Demand Monitors after Upstream Change
1. Upstream pipeline emits an event after data load/transform.
2. Integration calls POST /monitors/trigger with the list of monitor IDs or an asset identifier to run the relevant monitors immediately.
3. Poll GET /monitors/{monitor_id}/status to wait for completion.
4. On completion, fetch details with GET /monitors/{monitor_id}/history and surface failures to downstream alerting or orchestration systems.
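The trigger-poll-fetch sequence might look like the sketch below. The trigger body ("monitor_ids") and the run state names ("queued", "running") are assumptions; the endpoint paths come from the list above.

```python
import time
import requests

BASE_URL = "https://api.decube.example"  # placeholder
HEADERS = {"Authorization": "Bearer YOUR_API_TOKEN"}  # auth scheme assumed

def run_monitor(monitor_id: str) -> dict:
    # Step 2: kick off the run (POST /monitors/trigger); body shape is illustrative
    requests.post(f"{BASE_URL}/monitors/trigger", headers=HEADERS,
                  json={"monitor_ids": [monitor_id]}).raise_for_status()
    # Step 3: wait for completion (GET /monitors/{monitor_id}/status)
    while True:
        status = requests.get(f"{BASE_URL}/monitors/{monitor_id}/status", headers=HEADERS)
        status.raise_for_status()
        if status.json().get("status") not in ("queued", "running"):  # state names assumed
            break
        time.sleep(15)
    # Step 4: fetch the latest run details (GET /monitors/{monitor_id}/history)
    history = requests.get(f"{BASE_URL}/monitors/{monitor_id}/history", headers=HEADERS)
    history.raise_for_status()
    return history.json()

result = run_monitor("mon_123")
# Hand any failures in "result" to your alerting or orchestration layer here
```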
Temporarily Pause Scheduled Monitors for Maintenance
1. Schedule a maintenance window or detect noisy false positives.
2. Call POST /monitors/enable-disable with { "monitor_id": <id>, "enabled": false } to disable the scheduled monitor.
3. Re-enable when maintenance completes and verify the enabled flag via GET /monitors?monitor_id={id}.
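A small helper that wraps both steps, using the request body shown above; the placement of the enabled flag in the GET response is an assumption worth verifying.

```python
import requests

BASE_URL = "https://api.decube.example"  # placeholder
HEADERS = {"Authorization": "Bearer YOUR_API_TOKEN"}  # auth scheme assumed

def set_monitor_enabled(monitor_id: str, enabled: bool) -> None:
    # Step 2: toggle the scheduled monitor (POST /monitors/enable-disable)
    resp = requests.post(f"{BASE_URL}/monitors/enable-disable", headers=HEADERS,
                         json={"monitor_id": monitor_id, "enabled": enabled})
    resp.raise_for_status()
    # Step 3: confirm the flag took effect (GET /monitors?monitor_id={id});
    # a top-level "enabled" field in the response is an assumption.
    check = requests.get(f"{BASE_URL}/monitors", headers=HEADERS,
                         params={"monitor_id": monitor_id})
    check.raise_for_status()
    assert check.json().get("enabled") is enabled

set_monitor_enabled("mon_123", False)  # start of maintenance window
set_monitor_enabled("mon_123", True)   # after maintenance completes
```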
Audit & Reporting
1. A regular job calls POST /monitors/search to enumerate the monitors for a tenant or team.
2. For each monitor, call GET /monitors/{monitor_id}/history to collect run results and incident counts.
3. Aggregate the results into dashboards, compliance reports, or SLA measurements.