
Dynamic Date Partitioning for Cloud Storage Destinations

Currently, when setting up a Cloud Storage destination (like Google Cloud Storage), it is difficult to organize data into a standard partitioned folder structure (e.g., folder/YYYY/MM/DD/file.parquet) because the "Upload Path" field often treats date tags as literal strings or lacks support for granular date variables.

The Solution: I would like to suggest native support for date variables within the Upload Path field. Ideally, this would include:

- Standardized Tags: Support for tags like {YYYY}, {MM}, and {DD} that resolve based on the data's date range or the execution date.
- Dynamic Subfolder Creation: The ability to combine these tags with the forward slash (/) character to automatically generate the directory structure in the bucket.
- Hive Partitioning Format: Letting users define paths like year={YYYY}/month={MM}/day={DD}/ for seamless integration with data lake tools such as BigQuery external tables, AWS Athena, or Spark (see the sketch below).

Why this is important:

- Data Organization: Manually managing massive amounts of data in a single root folder is not scalable.
- Query Performance: Partitioning is essential for optimizing query costs and speed in BigQuery/Athena.
- Automation: It eliminates the need for middleman scripts (like Cloud Functions or Glue) just to move files into the correct date-based folders.

Use Case Example: A user exports Google Ads data daily. With this feature, instead of every file landing in google_ads/data/, the path would automatically resolve to google_ads/2026/01/28/data.parquet without manual intervention.
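To make the proposal concrete, here is a minimal Python sketch of how such tag resolution could behave. The resolve_upload_path helper is hypothetical and only mirrors the {YYYY}/{MM}/{DD} syntax suggested above; it is not an existing Supermetrics feature.

```python
from datetime import date

def resolve_upload_path(template: str, run_date: date) -> str:
    """Replace date tags in an Upload Path template with zero-padded values.

    Hypothetical helper for illustration only; the tag names follow the
    {YYYY}/{MM}/{DD} syntax proposed in this feature request.
    """
    return (
        template.replace("{YYYY}", f"{run_date.year:04d}")
                .replace("{MM}", f"{run_date.month:02d}")
                .replace("{DD}", f"{run_date.day:02d}")
    )

# Plain date-based subfolders
print(resolve_upload_path("google_ads/{YYYY}/{MM}/{DD}/data.parquet",
                          date(2026, 1, 28)))
# -> google_ads/2026/01/28/data.parquet

# Hive-style partitioning for BigQuery external tables / Athena / Spark
print(resolve_upload_path("google_ads/year={YYYY}/month={MM}/day={DD}/data.parquet",
                          date(2026, 1, 28)))
# -> google_ads/year=2026/month=01/day=28/data.parquet
```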

geo_su

Feature Request: Data Transfer Guardrails & Alerts for Access Failures (Meta Ads/BM & Other)

Hi everyone,

I'm reaching out to share a feature idea that would significantly improve the reliability of data pipelines. After discussing this with the Supermetrics support team, they suggested I post it here to gauge community interest and visibility.

The Problem: Currently, when a data transfer is scheduled between a platform (like Meta Ads) and a data warehouse (BigQuery, Snowflake, etc.), the process can fail because:

- Meta Business Manager (BM) access is revoked or restricted.
- Individual client account access is lost (common in agency models).

The Proposal: Data Transfer Guardrails. I would love to see Supermetrics implement native guardrails within the transfer settings that could provide:

- Proactive Connectivity Checks: Automatically verify that the underlying token still has access to all selected accounts before the transfer begins.
- Specific Error Classification: Differentiate between a "successful transfer with 0 rows" and a "failed transfer due to permission/auth errors."
- Custom Notifications: Allow us to set up alerts (email/Slack) triggered specifically by "Access Revoked" errors, separate from general system errors (see the sketch below).

Why this matters: For those of us managing dozens of client accounts, catching a "BM restricted" error 5 minutes after it happens, rather than 5 days later during a client presentation, is a game changer for data integrity.

Has anyone else faced similar issues with Meta Ads or other connectors? If you think these guardrails would save you time and headaches, please upvote or share your thoughts below!
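For illustration, here is a minimal Python sketch of the check-and-alert flow described above. Every name in it (check_account_access, send_alert, the status values) is a hypothetical stand-in, not an actual Supermetrics or Meta API.

```python
from dataclasses import dataclass
from enum import Enum, auto

class TransferStatus(Enum):
    OK = auto()            # transfer ran and returned rows
    EMPTY_RESULT = auto()  # successful transfer with 0 rows
    AUTH_FAILURE = auto()  # token lost access to the BM or a client account

@dataclass
class AccessCheck:
    account_id: str
    has_access: bool
    reason: str = ""

def check_account_access(account_id: str) -> AccessCheck:
    """Stub: a real check would call the platform API with the stored token."""
    revoked = {"act_123": "Business Manager access restricted"}  # sample data
    return AccessCheck(account_id, account_id not in revoked,
                       revoked.get(account_id, ""))

def send_alert(kind: str, details: list[str]) -> None:
    """Stub: a real notifier would post to email/Slack."""
    print(f"[{kind}] " + "; ".join(details))

def classify(row_count: int, auth_error: bool) -> TransferStatus:
    """Differentiate a 'successful transfer with 0 rows' from an auth failure."""
    if auth_error:
        return TransferStatus.AUTH_FAILURE
    return TransferStatus.EMPTY_RESULT if row_count == 0 else TransferStatus.OK

def run_with_guardrails(accounts: list[str]) -> None:
    # Proactive connectivity check before the transfer begins.
    failures = [c for c in map(check_account_access, accounts) if not c.has_access]
    if failures:
        # Access-revoked alerts fire separately from general system errors.
        send_alert("ACCESS_REVOKED",
                   [f"{c.account_id}: {c.reason}" for c in failures])
        return  # skip the run instead of silently producing partial data
    print("All accounts reachable; starting transfer...")

run_with_guardrails(["act_123", "act_456"])
# -> [ACCESS_REVOKED] act_123: Business Manager access restricted
```

The key design point is the early return: a failed pre-flight check stops the transfer entirely and raises a distinct alert, so an access problem can never be mistaken for a legitimate zero-row day.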