Power BI Composite Models Explained


Combine Import and DirectQuery modes in Power BI composite models for optimal performance. Balance speed with real-time data access at enterprise scale.

By Errin O'Connor, Chief AI Architect

Composite models in Power BI combine Import and DirectQuery storage modes within a single semantic model, giving you the performance of in-memory analytics for frequently accessed data while maintaining real-time connectivity to large-scale data sources that would be impractical to fully import. This capability fundamentally changes how enterprises approach data modeling, eliminating the forced choice between fast dashboards and current data. Our Power BI consulting team implements composite model architectures for organizations processing billions of rows across data warehouses, lakehouses, and operational databases.

This guide covers composite model design patterns, performance optimization, aggregation strategies, and production deployment best practices for enterprise environments.

Understanding Composite Models

Before composite models, every Power BI dataset operated in a single mode: Import (fast but stale between refreshes) or DirectQuery (current but slower). Composite models eliminate this binary choice.

Storage mode options within a composite model:

| Storage Mode | Data Location | Query Performance | Data Freshness | Best For |
|---|---|---|---|---|
| Import | In-memory VertiPaq | Sub-second | Refresh-dependent | Dimensions, aggregations, frequently queried data |
| DirectQuery | Source database | Depends on source | Real-time | Large fact tables, operational data |
| Dual | Both locations | Sub-second (when possible) | Hybrid | Dimension tables referenced by both Import and DQ tables |

How composite models work internally:

When a user interacts with a report, the Power BI engine determines the optimal execution plan. If all required data is in Import mode, the query executes against the in-memory VertiPaq engine (fastest). If DirectQuery tables are involved, the engine sends queries to the source database. If the query involves both, the engine retrieves DirectQuery results and joins them with Import data in memory.
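As a sketch, consider the kind of query a report visual might generate (table and column names here are hypothetical). If `DimProduct` is Import or Dual and `Sales_Detail` is DirectQuery, the engine resolves the dimension in VertiPaq, pushes a SQL `GROUP BY` to the source for the fact data, and joins the results in memory:

```dax
// Hypothetical visual query against a composite model.
// DimProduct (Import/Dual) resolves in the VertiPaq engine;
// Sales_Detail (DirectQuery) is translated to SQL and sent to the source.
EVALUATE
SUMMARIZECOLUMNS (
    DimProduct[Category],
    "Total Revenue", SUM ( Sales_Detail[Revenue] )
)
```

Running a query like this in DAX Studio with Server Timings enabled shows which storage engine answered each part of the plan.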

When to Use Composite Models

Composite models solve specific architectural problems. Using them unnecessarily adds complexity without benefit.

Use composite models when:

  • Your fact table exceeds practical Import limits (over 1 billion rows or exceeding capacity memory)
  • Users need real-time data for operational dashboards alongside historical analysis
  • You want aggregation tables that automatically accelerate common queries
  • Multiple data sources must appear in a single report (data warehouse + CRM + operational DB)
  • You are migrating from DirectQuery to Import incrementally
  • Fabric Direct Lake is not yet available for your data sources

Do NOT use composite models when:

  • All data fits comfortably in Import mode (under 1 GB compressed)
  • A single DirectQuery source with acceptable performance serves all needs
  • You can use Direct Lake mode in Fabric (preferred approach for Fabric lakehouses)
  • The complexity would confuse your development team without measurable performance benefit

Design Pattern 1: Aggregation Acceleration

The most powerful composite model pattern uses Import-mode aggregation tables to accelerate queries against large DirectQuery fact tables.

Architecture:

```
User Query -> Power BI Engine -> Check: Can aggregation table answer this?
  -> YES: Query Import aggregation (sub-second response)
  -> NO:  Query DirectQuery detail table (2-5 second response)
```

Implementation steps:

  1. **Identify aggregation candidates** — Analyze query patterns using Performance Analyzer to find the most common grouping dimensions
  2. **Create the aggregation table** — Summarize the fact table at the appropriate grain (e.g., daily instead of transactional)
  3. **Set storage modes** — Aggregation table = Import, detail fact table = DirectQuery
  4. **Configure aggregation mappings** — In model view, map aggregation columns to their detail table counterparts
  5. **Set dimension tables to Dual** — Dimensions referenced by both aggregation and detail tables must use Dual mode
  6. **Hide the aggregation table** — Users should never see or directly query aggregation tables
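To make the grain concrete, here is a sketch of the aggregation table as a DAX calculated table (names hypothetical). In production you would typically build this table in the source database or Power Query, import it, and then map its columns in the Manage aggregations dialog; the DAX below only illustrates the intended daily grain:

```dax
// Illustrative only: the daily grain of the aggregation table.
// In practice, materialize this upstream (source view or Power Query),
// set it to Import mode, and configure "Manage aggregations" mappings.
Agg_Sales_Daily =
SUMMARIZECOLUMNS (
    Sales_Detail[DateKey],
    Sales_Detail[ProductKey],
    Sales_Detail[RegionKey],
    "TotalRevenue", SUM ( Sales_Detail[Revenue] ),
    "TransactionCount", COUNT ( Sales_Detail[TransactionID] )
)
```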

Aggregation mapping example:

| Aggregation Column | Detail Column | Aggregation Function |
|---|---|---|
| Agg_Sales_Daily.TotalRevenue | Sales_Detail.Revenue | Sum |
| Agg_Sales_Daily.TransactionCount | Sales_Detail.TransactionID | Count |
| Agg_Sales_Daily.DateKey | Sales_Detail.DateKey | GroupBy |
| Agg_Sales_Daily.ProductKey | Sales_Detail.ProductKey | GroupBy |
| Agg_Sales_Daily.RegionKey | Sales_Detail.RegionKey | GroupBy |

**Aggregation hit rate is critical.** Monitor the percentage of queries served by aggregations versus falling through to DirectQuery. Target 80%+ aggregation hit rate for a well-designed model. Use DAX Studio to analyze query plans and verify aggregation usage.

Design Pattern 2: Real-Time Operational Dashboard

Combine historical Import data with real-time DirectQuery data for operational monitoring.

Use case: A manufacturing floor dashboard shows today's production metrics in real-time (DirectQuery against IoT database) alongside month-over-month trends (Import from data warehouse).

Architecture:

```
Historical Warehouse (Import) <-> Date Dimension (Dual) <-> Operational DB (DirectQuery)
     Monthly aggregates            Shared dimension            Live sensor data
```

**Design considerations:**

  • The Date dimension must be Dual mode since both Import and DirectQuery tables reference it
  • Incremental refresh keeps historical data current without a full refresh
  • The DirectQuery source must support the query load from concurrent users
  • Implement monitoring to track DirectQuery source performance
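As a sketch, the split between live and historical data often comes down to a pair of measures routed to different storage modes (table and column names hypothetical):

```dax
// Hits the DirectQuery IoT table: evaluated live at query time.
Units Today =
CALCULATE (
    SUM ( SensorReadings[Units] ),
    SensorReadings[ReadingDate] = TODAY ()
)

// Hits the Import warehouse table: fast, refreshed on schedule.
// Assumes a marked date table related to WarehouseSales.
Units Prior Month =
CALCULATE (
    SUM ( WarehouseSales[Units] ),
    DATEADD ( 'Date'[Date], -1, MONTH )
)
```

A single report page can then show both measures side by side, with only the "today" visuals generating load on the operational database.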

Design Pattern 3: Multi-Source Federation

Connect multiple data sources in a single model without moving all data to a central warehouse.

Common scenario: Combine Azure SQL data warehouse (primary analytics) with Salesforce CRM (customer context) and SharePoint lists (manual data entry).

Implementation guidance:

| Source | Storage Mode | Rationale |
|---|---|---|
| Azure SQL DW | Import or DirectQuery | Primary data source; mode depends on volume |
| Salesforce | Import | API rate limits make DirectQuery impractical |
| SharePoint lists | Import | Small data volumes, infrequent changes |
| SAP HANA | DirectQuery | Real-time requirement; SAP supports DQ natively |

**Limitations of multi-source composite models:**

  • Cross-source joins happen in Power BI engine memory, not at the source
  • Query folding does not work across source boundaries
  • Relationship cardinality and cross-filter direction affect performance significantly
  • Security contexts do not propagate across sources (each source authenticates independently)
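When a physical cross-source relationship is too expensive, a virtual relationship via `TREATAS` is one alternative worth testing. It still materializes the join inside the Power BI engine, so it only works well on low-cardinality keys (all names here are hypothetical):

```dax
// Applies the warehouse customer selection to the Salesforce table
// without a physical cross-source relationship. The join happens in
// the Power BI engine, so keep CustomerKey cardinality low.
CRM Pipeline (Selected Customers) =
CALCULATE (
    SUM ( Salesforce[OpportunityAmount] ),
    TREATAS ( VALUES ( DimCustomer[CustomerKey] ), Salesforce[AccountKey] )
)
```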

Performance Optimization

Composite model performance depends on careful design decisions. Poor choices create models slower than pure DirectQuery.

Performance optimization checklist:

| Optimization | Impact | Implementation |
|---|---|---|
| Minimize DirectQuery tables | High | Only use DQ for tables that genuinely need real-time or exceed Import limits |
| Use Dual mode for shared dimensions | High | Prevents unnecessary DQ queries for dimension lookups |
| Design effective aggregations | High | Cover 80%+ of query patterns with Import aggregations |
| Limit cross-source relationships | Medium | Each cross-source join adds processing overhead |
| Optimize the DirectQuery source | Medium | Ensure proper indexing, statistics, and query optimization at the source |
| Reduce relationship complexity | Medium | Avoid many-to-many relationships between DQ tables |
| Implement DAX best practices | Medium | Efficient measures reduce engine workload |
| Use query caching | Low-Medium | Enable dataset-level caching for DirectQuery results |

VertiPaq Analyzer diagnostics:

Use VertiPaq Analyzer to understand the Import portion of your composite model:

  • Total model size and per-table breakdown
  • Column cardinality and encoding efficiency
  • Relationship memory consumption
  • Dictionary size for text columns
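A quick way to audit which tables actually run in which mode is to query the model's own metadata. The sketch below assumes a recent engine that supports the DAX `INFO` functions (otherwise, the equivalent `TMSCHEMA_PARTITIONS` DMV in DAX Studio returns the same rows):

```dax
// Lists each partition with its storage mode (assumes DAX INFO
// function support). The Mode value distinguishes Import,
// DirectQuery, and Dual partitions.
EVALUATE
SELECTCOLUMNS (
    INFO.PARTITIONS (),
    "Table ID", [TableID],
    "Partition", [Name],
    "Mode", [Mode]
)
```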

Security Considerations

Composite models introduce security complexity because data flows through multiple paths.

Row-level security in composite models:

  • RLS filters defined on Import tables execute in the VertiPaq engine
  • RLS filters on DirectQuery tables get translated to source queries (pushed to database)
  • Dynamic RLS patterns work in both modes but test thoroughly
  • Cross-source RLS requires careful design — a filter on an Import dimension does not automatically filter a DirectQuery fact table unless the relationship is configured correctly

Security best practices:

  1. Test RLS in the DAX query view with role impersonation active ("View as" in Desktop, or an effective identity over XMLA) and validate visible row counts with queries such as `EVALUATE ROW("Visible Rows", COUNTROWS(FactTable))`
  2. Verify DirectQuery source receives the expected WHERE clauses (use source-side query monitoring)
  3. Document which security boundaries are enforced by Power BI versus the source database
  4. Audit regularly using the Power BI Scanner API
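A typical dynamic RLS setup and its sanity check might look like the following sketch (all names hypothetical; the role filter assumes a `SecurityMap` table of user-to-region assignments, and the check must run under "View as" role impersonation or an XMLA effective identity):

```dax
// Role filter expression on DimRegion: each user sees only the
// regions assigned to them in the SecurityMap table.
DimRegion[RegionKey]
    IN CALCULATETABLE (
        VALUES ( SecurityMap[RegionKey] ),
        SecurityMap[UserPrincipalName] = USERPRINCIPALNAME ()
    )

// Sanity check, run while impersonating the role: the visible
// fact row count should match what the source-side filter returns.
EVALUATE ROW ( "Visible Rows", COUNTROWS ( Sales_Detail ) )
```

On the DirectQuery side, the same filter should surface as a WHERE clause in the source database's query log, which is worth confirming directly.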

Migration Path: DirectQuery to Composite to Import

Many organizations start with DirectQuery for quick deployment and migrate to composite models for performance. The migration path is incremental.

**Phase 1: Pure DirectQuery** — All tables in DirectQuery mode. Fast to deploy, slower query performance.

**Phase 2: Add aggregations** — Create Import aggregation tables for common query patterns. Most queries accelerate while detail queries remain real-time.

**Phase 3: Migrate dimensions to Dual** — Move dimension tables to Dual mode. Reduces DirectQuery load significantly.

**Phase 4: Migrate cold facts to Import** — Use incremental refresh to import historical fact partitions. Only the current period remains DirectQuery.

**Phase 5: Full Import (if appropriate)** — If data volume allows and real-time is not required, migrate entirely to Import for maximum performance.

Composite Models with Microsoft Fabric

In the Fabric ecosystem, Direct Lake mode offers an alternative to traditional composite models by reading Parquet files directly from OneLake without Import or DirectQuery overhead.

When to use Direct Lake vs. Composite Models:

| Scenario | Recommended Approach |
|---|---|
| Data in Fabric Lakehouse/Warehouse | Direct Lake (preferred) |
| Data in Azure SQL or on-premises | Composite model |
| Mixed Fabric + external sources | Composite model with Direct Lake tables |
| Need aggregation acceleration | Composite model |
| Need real-time operational data | Composite model with DirectQuery |

Frequently Asked Questions

**Can I convert an existing Import model to composite?** Yes. Change individual table storage modes in the model view. No need to rebuild from scratch. However, test thoroughly — performance characteristics change significantly.

**Do composite models work with Power BI Pro?** Limited. Composite models connecting to Power BI datasets or Azure Analysis Services require Premium or PPU. Basic composite models (Import + DirectQuery to a single source) work with Pro.

**What is the maximum model size for composite models?** Import portions are limited by capacity memory. DirectQuery portions have no Power BI size limit but depend on source database capabilities. For Fabric capacities, memory limits vary by SKU.

**How do I troubleshoot slow composite model queries?** Use Performance Analyzer in Power BI Desktop to identify which queries hit DirectQuery versus Import. Check aggregation hit rates in DAX Studio. Monitor source database query performance for DirectQuery operations.

Next Steps

Composite models are the bridge between Import performance and DirectQuery flexibility. Designing them correctly requires understanding your data volumes, query patterns, and freshness requirements. Our Power BI architecture team conducts composite model design workshops that analyze your current environment and produce an optimized model architecture. Contact us to schedule a consultation.

**Related resources:**

  • Composite Models and Aggregations Deep Dive
  • DirectQuery Optimization
  • Power BI Direct Lake Mode Guide
  • DAX Performance Optimization


More Frequently Asked Questions

**Which data sources support composite models in Power BI?**

Most DirectQuery-capable sources work with composite models, including Azure SQL Database, Azure Synapse Analytics, SQL Server, Oracle, SAP HANA, Snowflake, Databricks, Google BigQuery, Amazon Redshift, and Microsoft Fabric (via Direct Lake). You can also create composite models by connecting to published Power BI datasets.

**Do composite models require Premium licensing?**

Basic composite models (mixing Import and DirectQuery) work with Power BI Pro. However, composite models that connect to published Power BI datasets (chaining models) and some advanced aggregation features require Power BI Premium or Premium Per User (PPU) licensing.

**How do aggregations work with composite models?**

Aggregation tables are pre-summarized Import tables that automatically answer high-level queries, while detailed queries fall through to DirectQuery source tables. You configure aggregation mappings in the Manage aggregations dialog. The Power BI query engine transparently routes each query to the most efficient storage; users never see the difference, but performance improves dramatically for summarized queries.

