# BI Connectors
Qarion integrates with popular BI tools to scrape query definitions, dashboard structures, and usage data. These connectors use the same infrastructure as metadata connectors and follow the same configuration patterns.
## Supported BI Tools
| BI Tool | Scrape Targets | Status |
|---|---|---|
| Apache Superset | Saved queries, chart SQL, dashboard-to-dataset mappings | ✅ Available |
| Metabase | Questions/cards, query history, dashboard definitions | ✅ Available |
| Looker | Explore definitions, Look SQL, dashboard tile references | ✅ Available |
| Tableau | Workbook/datasource definitions via REST API | ✅ Available |
| Power BI | Dataset/report lineage via Scanner API | 🔜 Planned |
## Setting Up a BI Connector
BI connectors are configured in the same way as source system connectors:
- Navigate to Source Systems in your space
- Click Add Connector
- Select the BI tool type (e.g., "Superset", "Metabase")
- Enter the connection details:
  - URL — Base URL of your BI tool instance
  - Credentials — API key, username/password, or service account token
- Configure a sync schedule using a cron expression (e.g., `0 2 * * *` for daily at 2 AM)
- Click Save and Test
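To illustrate how a five-field cron expression drives the schedule, here is a minimal matcher. This is a sketch only — it supports just `*` and plain numbers (no ranges, steps, or lists), treats the weekday field with Python's Monday-based numbering, and is not Qarion's actual scheduler:

```python
from datetime import datetime

def cron_matches(expr: str, when: datetime) -> bool:
    """Check a five-field cron expression (minute hour day month weekday)
    against a datetime. Supports only '*' and plain numbers -- a sketch,
    not a full cron parser (weekday uses Python's 0=Monday convention)."""
    fields = expr.split()
    values = [when.minute, when.hour, when.day, when.month, when.weekday()]
    for field, value in zip(fields, values):
        if field != "*" and int(field) != value:
            return False
    return True

# "0 2 * * *" fires daily at 02:00
print(cron_matches("0 2 * * *", datetime(2024, 5, 1, 2, 0)))   # True
print(cron_matches("0 2 * * *", datetime(2024, 5, 1, 14, 0)))  # False
```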
## What Gets Scraped
### Saved Queries & Charts
The connector extracts SQL from saved queries and chart definitions. Each query is parsed to identify referenced tables and columns, which are then linked to products in your catalog.
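Table extraction from saved SQL can be approximated with a pattern match. The sketch below is illustrative only — a production parser must also handle CTEs, subqueries, and quoted identifiers, which this does not:

```python
import re

def referenced_tables(sql: str) -> set[str]:
    """Naively collect table names that follow FROM/JOIN keywords.
    A sketch only -- ignores CTEs, subqueries, and quoted identifiers."""
    pattern = r"\b(?:FROM|JOIN)\s+([A-Za-z_][\w.]*)"
    return {m.lower() for m in re.findall(pattern, sql, flags=re.IGNORECASE)}

sql = "SELECT o.id, c.name FROM orders o JOIN customers c ON o.cid = c.id"
print(sorted(referenced_tables(sql)))  # ['customers', 'orders']
```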
### Dashboards
Dashboard definitions are scraped to build dashboard-to-table lineage:
- Each dashboard becomes a lineage node with the BI tool's icon
- Charts within the dashboard create edges to the underlying tables
- A link-back URL points to the original dashboard in the BI tool
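Conceptually, each scraped dashboard becomes a small lineage graph. A sketch of the shape (the field names and input structure here are hypothetical, not Qarion's actual schema):

```python
def dashboard_lineage(dashboard: dict) -> dict:
    """Build a lineage node for a dashboard plus edges to the tables
    its charts read from. Field names are illustrative only."""
    node = {
        "id": f"dashboard:{dashboard['id']}",
        "label": dashboard["title"],
        "link_back": dashboard["url"],  # points back to the BI tool
    }
    edges = [
        {"from": node["id"], "to": f"table:{table}"}
        for chart in dashboard["charts"]
        for table in chart["tables"]
    ]
    return {"node": node, "edges": edges}

example = {
    "id": 42,
    "title": "Revenue Overview",
    "url": "https://bi.example.com/dashboard/42",
    "charts": [{"tables": ["orders"]}, {"tables": ["orders", "customers"]}],
}
print(dashboard_lineage(example)["node"]["id"])  # dashboard:42
```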
### Query History
Where supported, the connector pulls recent query history for aggregate usage metrics — query count per table, unique users, and popular columns.
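Rolling query history up into those metrics might look like the following sketch. The input row shape is an assumption; only query count per table and unique users are shown:

```python
from collections import defaultdict

def usage_metrics(history: list[dict]) -> dict:
    """Aggregate query-history rows into per-table usage stats:
    query count and distinct users. Row shape is an assumption."""
    stats = defaultdict(lambda: {"queries": 0, "users": set()})
    for row in history:
        for table in row["tables"]:
            stats[table]["queries"] += 1
            stats[table]["users"].add(row["user"])
    return {t: {"queries": s["queries"], "unique_users": len(s["users"])}
            for t, s in stats.items()}

history = [
    {"user": "alice", "tables": ["orders"]},
    {"user": "bob", "tables": ["orders", "customers"]},
    {"user": "alice", "tables": ["orders"]},
]
print(usage_metrics(history))
# {'orders': {'queries': 3, 'unique_users': 2}, 'customers': {'queries': 1, 'unique_users': 1}}
```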
## Sync Schedule
BI connectors follow the same cron scheduling as metadata connectors. You can configure:
- Manual sync — Trigger on demand from the connector detail page
- Scheduled sync — Set a cron schedule for automatic periodic scraping
## Query Log Mining
In addition to BI tool scraping, Qarion can mine query history directly from your data warehouse. This is configured on your existing warehouse connectors (Snowflake, BigQuery, PostgreSQL) and runs as a separate task that:
- Pulls recent query history from system views
- Parses SQL to extract table and column references
- Attributes queries to BI tool service accounts where identifiable
- Aggregates results into per-product usage statistics
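The attribution step above can be sketched as a membership check against known service-account names. The account names below are made up for illustration; in practice they would come from your deployment's configuration:

```python
# Hypothetical service-account names -- configure to match your deployment.
BI_SERVICE_ACCOUNTS = {"superset_svc", "metabase_svc", "looker_svc"}

def attribute_query(username: str) -> str:
    """Label a warehouse query as BI-tool traffic or direct usage,
    based on the executing user. A sketch of the attribution step."""
    return "bi_tool" if username.lower() in BI_SERVICE_ACCOUNTS else "direct"

print(attribute_query("SUPERSET_SVC"))  # bi_tool
print(attribute_query("jane.doe"))      # direct
```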
## Privacy & Anonymization
All query text is anonymized before storage — string and numeric literals are replaced with `?` placeholders. This produces normalized SQL patterns suitable for grouping without exposing sensitive filter values.
```sql
-- Original:
SELECT * FROM orders WHERE customer_id = 'C-1234' AND amount > 1000

-- Anonymized:
SELECT * FROM orders WHERE customer_id = ? AND amount > ?
```
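This literal replacement can be approximated with two regex passes — strings first, then bare numbers, so digits inside quotes are not handled twice. A sketch, not Qarion's actual anonymizer (it skips edge cases like escaped quotes in some dialects and hex literals):

```python
import re

def anonymize(sql: str) -> str:
    """Replace string and numeric literals with '?' placeholders.
    Strings go first so numbers inside quotes are already gone."""
    sql = re.sub(r"'(?:[^']|'')*'", "?", sql)      # 'C-1234' -> ?
    sql = re.sub(r"\b\d+(?:\.\d+)?\b", "?", sql)   # 1000 -> ?
    return sql

print(anonymize("SELECT * FROM orders WHERE customer_id = 'C-1234' AND amount > 1000"))
# SELECT * FROM orders WHERE customer_id = ? AND amount > ?
```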