Microsoft Fabric SQL Analytics Endpoint: Query Lakehouse Data with T-SQL
Use T-SQL to query Fabric lakehouse Delta tables through the auto-generated SQL analytics endpoint. Build views, security, and Power BI integration.
<h2>What is the SQL Analytics Endpoint?</h2> <p>Every Fabric lakehouse automatically generates a SQL analytics endpoint — a read-only T-SQL interface to your Delta Lake tables. This enables SQL Server professionals, BI tools, and applications to query lakehouse data using familiar T-SQL syntax without learning PySpark or managing compute infrastructure.</p> <p><a href="/services/microsoft-fabric">Microsoft Fabric consulting</a> helps organizations leverage the SQL analytics endpoint as a bridge between traditional SQL-based workflows and modern lakehouse architecture.</p>
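<p>As a first taste, here is a minimal sketch of querying the endpoint from any T-SQL client (SSMS, Azure Data Studio, or sqlcmd) using its SQL connection string. The table and column names are hypothetical stand-ins for whatever Delta tables your lakehouse exposes:</p>

```sql
-- Hypothetical table dbo.SalesOrders; any lakehouse Delta table
-- appears the same way through the endpoint.
SELECT TOP (10)
    OrderId,
    CustomerId,
    OrderDate,
    TotalAmount
FROM dbo.SalesOrders
WHERE OrderDate >= '2026-01-01'
ORDER BY OrderDate DESC;
```

<p>Nothing here is endpoint-specific syntax — that is the point: existing T-SQL skills and tools work unchanged.</p>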
<h2>Auto-Generated Schema</h2> <p>When you create tables in a Fabric lakehouse (via notebooks, dataflows, or Data Pipelines), the SQL analytics endpoint automatically reflects those tables with their schemas. Delta Lake metadata (column names, data types, partitioning) is translated to SQL Server-compatible definitions. The endpoint stays synchronized — new tables and schema changes in the lakehouse appear automatically.</p>
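<p>You can inspect what the endpoint has reflected using the standard SQL Server metadata views, which are available read-only on the endpoint:</p>

```sql
-- List every auto-generated table and its columns, as translated
-- from the Delta Lake metadata.
SELECT
    t.TABLE_SCHEMA,
    t.TABLE_NAME,
    c.COLUMN_NAME,
    c.DATA_TYPE
FROM INFORMATION_SCHEMA.TABLES AS t
JOIN INFORMATION_SCHEMA.COLUMNS AS c
    ON  c.TABLE_SCHEMA = t.TABLE_SCHEMA
    AND c.TABLE_NAME   = t.TABLE_NAME
WHERE t.TABLE_TYPE = 'BASE TABLE'
ORDER BY t.TABLE_SCHEMA, t.TABLE_NAME, c.ORDINAL_POSITION;
```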
<h2>T-SQL Query Capabilities</h2> <p>The SQL analytics endpoint supports a wide range of T-SQL operations:</p> <ul> <li><strong>SELECT queries</strong> — Full SELECT syntax including JOINs, CTEs, subqueries, window functions</li> <li><strong>Aggregations</strong> — GROUP BY, HAVING, aggregate functions (SUM, AVG, COUNT, etc.)</li> <li><strong>Views</strong> — CREATE VIEW for reusable query definitions and semantic layer</li> <li><strong>Functions</strong> — Inline table-valued functions for parameterized queries</li> <li><strong>Cross-database queries</strong> — Query tables across multiple lakehouses in the same workspace</li> <li><strong>OPENROWSET</strong> — Query files in OneLake directly (Parquet, CSV, Delta)</li> </ul> <p>Note: INSERT, UPDATE, DELETE, and DDL operations on tables are <strong>not supported</strong> — the endpoint is read-only. Table creation and data modification must happen through the lakehouse interface (notebooks, dataflows, pipelines).</p>
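<p>A short sketch combining two of the capabilities above — a CTE and a window function — against a hypothetical <code>dbo.SalesOrders</code> table:</p>

```sql
-- Rank customers by monthly revenue. DATETRUNC is part of the
-- modern T-SQL surface; table and column names are illustrative.
WITH MonthlySales AS (
    SELECT
        CustomerId,
        DATETRUNC(month, OrderDate) AS SalesMonth,
        SUM(TotalAmount)            AS MonthlyTotal
    FROM dbo.SalesOrders
    GROUP BY CustomerId, DATETRUNC(month, OrderDate)
)
SELECT
    CustomerId,
    SalesMonth,
    MonthlyTotal,
    RANK() OVER (PARTITION BY SalesMonth
                 ORDER BY MonthlyTotal DESC) AS MonthRank
FROM MonthlySales;
```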
<h2>Views and Stored Procedures</h2> <p>Create views to build a governed semantic layer on top of raw lakehouse tables:</p> <ul> <li><strong>Business views</strong> — Rename columns, add calculated fields, and join tables for business-friendly access</li> <li><strong>Security views</strong> — Implement row-level filtering for different user groups</li> <li><strong>Aggregation views</strong> — Pre-define common aggregations for faster query performance</li> </ul> <p>Views are stored in the SQL analytics endpoint metadata and do not duplicate data. They execute against the underlying Delta tables at query time.</p>
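<p>A minimal business-view sketch, assuming hypothetical <code>dbo.SalesOrders</code> and <code>dbo.Customers</code> tables:</p>

```sql
-- Business-friendly view: joins two raw tables and renames a column.
-- The view stores only its definition; data stays in the Delta tables.
CREATE VIEW dbo.vw_SalesByRegion
AS
SELECT
    o.OrderId,
    o.OrderDate,
    c.Region,
    o.TotalAmount AS Revenue
FROM dbo.SalesOrders AS o
JOIN dbo.Customers   AS c
    ON c.CustomerId = o.CustomerId;
```

<p>Because the view executes at query time, it always reflects the current state of the underlying tables — no refresh step is needed.</p>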
<h2>Cross-Database Queries</h2> <p>One of the most powerful features: query across multiple lakehouses within the same workspace using three-part naming (database.schema.table). This enables:</p> <ul> <li>Joining data from different domains (sales lakehouse + finance lakehouse)</li> <li>Creating unified views across organizational boundaries</li> <li>Building a virtual data warehouse without physical data movement</li> </ul>
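<p>A cross-lakehouse join might look like the following sketch; the lakehouse, schema, and table names are hypothetical:</p>

```sql
-- Join a sales lakehouse to a finance lakehouse in the same
-- workspace using three-part names. No data is copied or moved.
SELECT
    s.Region,
    SUM(s.Revenue)    AS TotalRevenue,
    SUM(f.CostAmount) AS TotalCost
FROM SalesLakehouse.dbo.RegionalSales   AS s
JOIN FinanceLakehouse.dbo.RegionalCosts AS f
    ON f.Region = s.Region
GROUP BY s.Region;
```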
<h2>Security Model</h2> <p>Security operates at multiple levels:</p> <ul> <li><strong>Workspace-level</strong> — Fabric workspace roles (Admin, Member, Contributor, Viewer) control access to the endpoint</li> <li><strong>Object-level security (OLS)</strong> — GRANT/DENY/REVOKE on views and functions</li> <li><strong>Row-level security (RLS)</strong> — Implement via views with security predicates</li> <li><strong>Column-level security (CLS)</strong> — Restrict column access through view definitions</li> </ul> <p>For enterprise security patterns, see <a href="/blog/power-bi-security-best-practices-enterprise-2026">security best practices</a>.</p>
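<p>Sketches of the OLS and RLS patterns above. The role name and the <code>dbo.UserRegionMap</code> mapping table are hypothetical examples, not built-in objects:</p>

```sql
-- Object-level security: grant a group SELECT on a curated view only,
-- leaving the raw tables inaccessible to that group.
GRANT SELECT ON OBJECT::dbo.vw_SalesByRegion TO [SalesAnalysts];

-- Row-level filtering via a view: each user sees only rows for
-- regions mapped to their login in a hypothetical mapping table.
CREATE VIEW dbo.vw_MySales
AS
SELECT s.*
FROM dbo.SalesOrders   AS s
JOIN dbo.UserRegionMap AS m
    ON  m.Region   = s.Region
    AND m.UserName = USER_NAME();
```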
<h2>Integration with Power BI</h2> <p>The SQL analytics endpoint is the foundation for <strong>Direct Lake</strong> mode in Power BI — the optimal way to build semantic models on Fabric lakehouse data. Direct Lake reads Delta Lake files directly from OneLake, providing import-like performance with DirectQuery-like freshness. The SQL analytics endpoint enables:</p> <ul> <li>Automatic table discovery in Power BI semantic model creation</li> <li>View-based semantic layer for Power BI</li> <li>T-SQL-based row-level security that propagates to Power BI</li> </ul> <p><a href="/blog/power-bi-composite-models-import-directquery-guide-2026">Composite models</a> can combine Direct Lake tables from the SQL analytics endpoint with other data sources.</p>
<h2>Performance Optimization</h2> <p>Optimize SQL analytics endpoint performance:</p> <ul> <li><strong>V-Order optimization</strong> — Ensure lakehouse tables use V-Order for optimal read performance</li> <li><strong>Table maintenance</strong> — Run OPTIMIZE and VACUUM on Delta tables to compact small files</li> <li><strong>Partitioning</strong> — Partition large tables by date or another low-cardinality column so queries can prune partitions (high-cardinality partition columns create many small files and hurt performance)</li> <li><strong>Statistics</strong> — Delta Lake maintains file-level statistics for predicate pushdown</li> </ul>
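<p>The maintenance commands above run as Spark SQL in a lakehouse notebook, not on the read-only endpoint. A sketch, with a hypothetical table name and an illustrative retention window:</p>

```sql
-- Run in a lakehouse notebook (Spark SQL).
-- Compact small files and apply V-Order for faster reads:
OPTIMIZE SalesOrders VORDER;

-- Remove unreferenced data files older than 7 days:
VACUUM SalesOrders RETAIN 168 HOURS;
```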
<h2>SQL Analytics Endpoint vs Fabric Data Warehouse</h2> <p>The SQL analytics endpoint is read-only and auto-generated from lakehouse tables. The Fabric Data Warehouse is a full read-write T-SQL engine supporting INSERT, UPDATE, DELETE, stored procedures, and multi-statement transactions. Choose the warehouse when you need write operations or complex T-SQL logic; choose the SQL analytics endpoint when read-only access to lakehouse data is sufficient. See <a href="/blog/microsoft-fabric-warehouse-vs-lakehouse-comparison-2026">warehouse vs lakehouse comparison</a> for detailed guidance.</p>
<h2>Migration from Synapse Serverless</h2> <p>Organizations using Azure Synapse serverless SQL pools can migrate to Fabric's SQL analytics endpoint. The T-SQL syntax is largely compatible, and OPENROWSET queries against OneLake files work similarly. Key migration steps: move data to OneLake, recreate views and security, and update connection strings in downstream tools.</p>
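<p>The OPENROWSET pattern carries over largely as-is — the main change is pointing at OneLake instead of an external storage account. A sketch with an illustrative workspace, lakehouse, and file path:</p>

```sql
-- Query a Parquet file in OneLake directly, without defining a table.
-- Workspace, lakehouse, and path are hypothetical placeholders.
SELECT TOP (100) *
FROM OPENROWSET(
    BULK 'https://onelake.dfs.fabric.microsoft.com/MyWorkspace/MyLakehouse.Lakehouse/Files/exports/sales.parquet',
    FORMAT = 'PARQUET'
) AS src;
```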
<p>Ready to leverage the SQL analytics endpoint? <a href="/contact">Contact EPC Group</a> for a free consultation on Fabric lakehouse architecture.</p>
<h2>Frequently Asked Questions</h2>
<h3>Can I write data through the SQL analytics endpoint?</h3> <p>No. The SQL analytics endpoint is read-only. You can query data with SELECT statements and create views and functions, but INSERT, UPDATE, DELETE, and table creation must happen through the lakehouse interface (notebooks, dataflows, or Data Pipelines). For read-write T-SQL, use a Fabric Data Warehouse instead.</p>
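<p>A quick sketch of the distinction, with a hypothetical table:</p>

```sql
-- Against the SQL analytics endpoint, a write fails:
-- INSERT INTO dbo.SalesOrders (OrderId, TotalAmount)
-- VALUES (1001, 250.00);   -- error: data modification not supported

-- Against a Fabric Data Warehouse, the same statement succeeds:
INSERT INTO dbo.SalesOrders (OrderId, TotalAmount)
VALUES (1001, 250.00);
```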
<h3>How does the SQL analytics endpoint compare to Synapse serverless?</h3> <p>The SQL analytics endpoint is similar in concept — T-SQL access to lake data without provisioned compute. Key differences: the endpoint is auto-generated from lakehouse tables (no manual external table definitions), it integrates tightly with Direct Lake for Power BI, and it runs on Fabric capacity rather than per-query billing. T-SQL compatibility is high but not 100% identical.</p>
<h3>Do I need to create the SQL analytics endpoint manually?</h3> <p>No. It is automatically created and maintained for every Fabric lakehouse. When you create or modify tables in the lakehouse, the SQL analytics endpoint schema updates automatically. You only need to manually create views and functions on top of the auto-generated tables.</p>
<h3>How does cross-database querying work?</h3> <p>You can reference tables and views across multiple lakehouses within the same Fabric workspace using three-part naming: LakehouseName.SchemaName.TableName. This enables joining data across organizational domains without physically moving or copying data.</p>
<h3>What is the performance of the SQL analytics endpoint compared to a dedicated warehouse?</h3> <p>For read queries, performance is comparable for well-optimized Delta tables (V-Order, compacted files, proper partitioning). The Fabric Data Warehouse may perform better for complex analytical queries with multiple joins and aggregations due to its dedicated SQL engine optimizations. For Power BI, Direct Lake mode via the SQL analytics endpoint typically outperforms DirectQuery to either option.</p>