Introduction 

The Azure data landscape has transformed dramatically since our analysis comparing Microsoft Fabric and Azure Databricks ten months ago. Power BI’s integration into Fabric has fundamentally shifted licensing models, Microsoft has poured resources into evolving Fabric’s capabilities, and organizations are increasingly asking not “Which platform should we choose?” but “How do we get the best of both?”

The stakes for getting this right have never been higher. As AI initiatives accelerate and data volumes explode, your lakehouse architecture impacts everything from operational costs to development velocity to your ability to capitalize on emerging AI opportunities. Technical and business leaders must navigate these waters thoughtfully–one path leads to a robust, future-proof foundation, while another risks creating technical debt that could take years to unwind.

Our assessment was clear ten months ago: while Microsoft Fabric showed promise, it wasn’t enterprise-ready, and Azure Databricks was the stronger choice for production lakehouse implementations. That analysis resonated strongly with organizations doing their platform evaluations, but much has changed since then. Microsoft’s ambitions for Fabric have become clearer, the platform has evolved significantly, and the recent changes to Power BI licensing have created new considerations that every Azure customer must address.

This follow-up article aims to:

  1. Assess whether Fabric has matured enough for enterprise production workloads
  2. Re-examine our recommendation of Databricks as the preferred lakehouse platform
  3. Present practical architectural patterns for organizations that need to leverage both platforms

For technical and business leaders charting their Azure data strategy, understanding how to optimally combine these platforms (or whether to choose between them) has become increasingly critical. Let’s examine what’s changed, what hasn’t, and how to architect a robust analytics foundation on Azure that will serve your organization today and into the future.

Has Fabric Matured Enough for Production? 

This year, Microsoft has demonstrated clear commitment to evolving Fabric through updates and new features. The platform has seen meaningful improvements across multiple areas, from expanded Data Factory capabilities to enhanced Git integration and new enterprise security features like Managed Private Endpoints. However, three findings from our original analysis continue to give us pause:

Performance and Scalability

Fabric’s performance becomes increasingly unpredictable as scale increases. Recent tests reveal significant degradation with complex queries and larger datasets (1TB+), which can be exacerbated by Capacity Unit throttling. The shared resource model means concurrent workloads compete for the same capacity, leading to “noisy neighbor” effects that can impact critical production workloads. While bursting can temporarily alleviate bottlenecks, it doesn’t address the core architectural limitations affecting consistent performance at scale.

Cost Management and Resource Optimization

Fabric’s capacity-based pricing, while appearing simple, introduces complex challenges for enterprise cost management. The platform uses a shared pool of Capacity Units (CUs) that requires careful capacity planning to use efficiently. When workloads exceed capacity allowances, Fabric implements a graduated throttling approach that creates a difficult choice: either over-provision capacity (leading to waste) or risk production impacts ranging from 20-second query delays to complete request rejections (tantamount to an outage). The challenge is compounded by Fabric’s coarse-grained capacity scaling, which only allows doubling or halving of resources rather than more granular adjustments.
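To make the capacity trade-off concrete, the sketch below works through a sizing example in Python. The Fabric F-SKU sizes (F2 through F2048, each double the last) are real; the sustained demand figure is a made-up illustration, not a benchmark.

```python
# Illustrative sketch: Fabric capacities scale only by doubling (F2, F4, ... F2048),
# so a workload that slightly exceeds one SKU forces a jump to twice the capacity
# (and roughly twice the cost) or accepts throttling risk. The demand figure below
# is a hypothetical example, not a measured value.

FABRIC_F_SKUS = [2, 4, 8, 16, 32, 64, 128, 256, 512, 1024, 2048]  # Capacity Units (CUs)

def smallest_sku_covering(avg_cu_demand: float) -> int:
    """Return the smallest F-SKU whose CU allowance covers the sustained demand."""
    for cu in FABRIC_F_SKUS:
        if cu >= avg_cu_demand:
            return cu
    raise ValueError("Demand exceeds the largest available F-SKU")

avg_demand_cu = 70  # hypothetical sustained demand, just above an F64
sku = smallest_sku_covering(avg_demand_cu)
print(
    f"A sustained demand of {avg_demand_cu} CUs forces an F{sku} capacity: "
    f"{avg_demand_cu / sku:.0%} utilized, with {sku - avg_demand_cu} CUs paid for but idle."
)
```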

Security and Governance

While Fabric has made strides with deeper Purview integration and preview support for mirroring Azure Databricks Catalogs, its security model remains fragmented and less mature than enterprise requirements demand. SQL granular permissions only work in specific experiences like Lakehouse and Warehouse, creating security blind spots when data moves between services. Shortcuts, while convenient, can circumvent governance since they inherit security from target locations rather than maintaining consistent policies.

These limitations are particularly apparent when compared to Databricks’ mature enterprise feature set. Where Fabric struggles with consistent performance at scale, Databricks’ optimized runtime and intelligent cluster management deliver predictable performance even for complex workloads. While Fabric requires careful capacity planning to avoid throttling, Databricks’ consumption-based model with granular controls provides better cost efficiency. Most importantly, Databricks’ Unity Catalog offers the comprehensive, unified governance that enterprises require–something Fabric’s fragmented security model has yet to match.
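To illustrate what that unified model looks like in practice, the snippet below applies fine-grained Unity Catalog grants from a Databricks notebook (where a `spark` session is provided); the catalog, schema, table, and group names are placeholders.

```python
# Minimal sketch of Unity Catalog's fine-grained, SQL-based access control.
# Catalog/schema/table and group names are hypothetical placeholders.

grants = [
    # Let the analysts group discover and query a single table, nothing more.
    "GRANT USE CATALOG ON CATALOG sales TO `analysts`",
    "GRANT USE SCHEMA ON SCHEMA sales.retail TO `analysts`",
    "GRANT SELECT ON TABLE sales.retail.orders TO `analysts`",
]

for stmt in grants:
    spark.sql(stmt)  # `spark` is the session Databricks provides in notebooks and jobs

# The same grants apply across every access path (SQL warehouses, notebooks, jobs),
# which is precisely the consistency Fabric's per-experience permissions lack today.
```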

Our analysis suggests that while Fabric has evolved, its core architectural choices continue to present challenges for production enterprise workloads that require guaranteed performance, predictable costs, and comprehensive security. Organizations should carefully evaluate these considerations against their specific requirements when making platform decisions.

Why Databricks Remains the Premier Enterprise Lakehouse for Azure in 2024

When evaluating lakehouse platforms on Azure, our analysis continues to position Databricks as the clear leader for enterprise implementations. While Fabric has made strides, Databricks’ maturity and proven track record in production environments set it apart in several crucial ways:

  • A Truly Unified Platform for All Data Teams: Databricks provides a single collaborative environment supporting data engineers, data scientists, analysts, and business users. It offers a seamless experience across data engineering, analytics, SQL warehousing, and AI/ML workloads with consistent governance and security. Built-in collaboration features break down silos between teams.
  • Enterprise-Grade Performance and Scalability: The Photon engine delivers consistently superior query performance, with benchmarks showing 4-14x faster processing than Fabric for complex workloads. Advanced query optimization and intelligent caching ensure predictable performance under heavy loads. Native support for clustering, partitioning, and data layout optimization prevents the “noisy neighbor” issues prevalent in Fabric.
  • Production-Ready Cost Management: Granular control over compute resources allows right-sizing for specific workloads. True consumption-based pricing means you pay only for what you use, and auto-termination prevents idle resource waste. Unlike Fabric’s “use it or lose it” model, there’s no need to overprovision capacity to avoid throttling (see the cluster sketch after this list).
  • Battle-tested Security and Governance: Unity Catalog provides comprehensive, unified governance across all platform capabilities with fine-grained access controls that work consistently. Built-in data lineage and impact analysis seamlessly integrate with enterprise security tools and standards.
  • Open Architecture with Rich Ecosystem: Databricks natively supports popular open-source frameworks and tools and has an extensive partner network with pre-built integrations. The active community contributes innovations and best practices. Open formats and standards provide freedom from vendor lock-in.
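To ground the cost-management point above, here is a minimal cluster definition of the kind accepted by the Databricks Clusters API; the VM size, worker range, and idle timeout are illustrative assumptions you would tune per workload, not recommended values.

```python
# Sketch of a right-sized, self-terminating Databricks cluster definition.
# Field names follow the Databricks Clusters API (clusters/create); the specific
# values (VM size, worker range, 30-minute idle timeout) are illustrative only.

import json

cluster_spec = {
    "cluster_name": "etl-medium",                # placeholder name
    "spark_version": "15.4.x-scala2.12",         # pick an LTS runtime available in your workspace
    "node_type_id": "Standard_D4ds_v5",          # assumed Azure VM size; size it to the workload
    "autoscale": {                               # pay only for the workers actually needed
        "min_workers": 2,
        "max_workers": 8,
    },
    "autotermination_minutes": 30,               # idle clusters shut themselves down
    "data_security_mode": "USER_ISOLATION",      # shared access mode governed by Unity Catalog
}

# This JSON body could be posted to the Clusters API or folded into an
# infrastructure-as-code definition; printing it keeps the sketch self-contained.
print(json.dumps(cluster_spec, indent=2))
```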

While Fabric shows promise particularly for analytics scenarios, Databricks remains the clear choice for organizations building production lakehouse implementations. Its unified approach, proven architecture, predictable performance, and comprehensive governance make it the most reliable foundation for Azure enterprise data and AI initiatives.

The differences become especially apparent at scale–while Fabric may work well for smaller implementations, Databricks consistently delivers as data volumes grow and workload complexity increases. For organizations building mission-critical data platforms, Databricks provides the enterprise capabilities and operational maturity required for success.

Most importantly, choosing Databricks doesn’t preclude future Fabric adoption. As a first-class Azure service built on open standards, Databricks integrates seamlessly with other Azure services including Fabric. Organizations can confidently build on Databricks today while maintaining flexibility for their future Azure analytics strategy.

Power BI and the New Reality of Platform Coexistence

Power BI, long considered the crown jewel of Microsoft’s analytics stack, has undergone significant changes with Microsoft Fabric. In July 2024, Microsoft retired standalone Power BI Premium capacity, though existing Premium customers and Enterprise Agreement (EA) holders can maintain their current terms until their contracts expire. Non-EA customers with Premium capacity have until January 2025 to transition. Going forward, many organizations must run their Power BI workloads on Fabric capacity, which means some Power BI customers will adopt Fabric whether they want to or not.

It’s important to note that Azure Databricks and Power BI have long enjoyed a strong integration story. Even before Fabric, organizations could seamlessly connect Power BI to Databricks SQL warehouses, leverage DirectQuery for large datasets, publish Databricks catalogs to Power BI datasets, and build end-to-end analytics solutions. These integration capabilities remain robust and continue to improve, with recent enhancements supporting Unity Catalog security, composite models, and performance optimizations.
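For reference, the same SQL warehouse endpoint that Power BI’s Databricks connector is pointed at (workspace hostname plus warehouse HTTP path) can also be queried programmatically. Here is a small sketch using the databricks-sql-connector package with placeholder connection details.

```python
# Minimal sketch: querying a Databricks SQL warehouse with the
# databricks-sql-connector package (pip install databricks-sql-connector).
# Hostname, HTTP path, token, and table name below are placeholders; Power BI's
# Databricks connector is configured with the same hostname and HTTP path.

from databricks import sql

with sql.connect(
    server_hostname="adb-1234567890123456.7.azuredatabricks.net",  # placeholder workspace host
    http_path="/sql/1.0/warehouses/<warehouse-id>",                # placeholder warehouse path
    access_token="<personal-access-token>",                        # or an Entra ID token
) as connection:
    with connection.cursor() as cursor:
        # Unity Catalog permissions apply here exactly as they do for BI tools.
        cursor.execute("SELECT order_id, amount FROM sales.retail.orders LIMIT 10")
        for row in cursor.fetchall():
            print(row)
```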

The licensing changes, however, create a new reality where Fabric and Databricks may need to coexist rather than compete. This isn’t necessarily a bad thing–while it does introduce some complexity, it also presents an opportunity to leverage the best aspects of both platforms. The key is developing a thoughtful architecture that maintains Databricks’ strengths in data processing, governance, and AI/ML while taking advantage of Fabric’s analytics capabilities and Microsoft ecosystem integration.

Getting the Best of Both Worlds 

Databricks and Fabric have significant overlap in capabilities, which can create confusion for organizations trying to determine which tool to use when. Both offer data engineering, SQL analytics, machine learning, and business intelligence features. However, a deeper examination reveals clear architectural principles that can guide these decisions and help organizations leverage the best of both platforms.

We believe that to address the previously mentioned production readiness issues, three “non-negotiables” should anchor your architecture:

  1. Unity Catalog Must Be Your Governance Foundation: Unity Catalog provides comprehensive, fine-grained security and governance across all data assets–including tables, files, ML models, and applications.  Most importantly, Unity Catalog can extend its governance to Fabric through Purview integration, providing a single, consistent security model across your entire data estate.
  2. Databricks SQL Should Power Your Core Analytics: Databricks SQL, powered by the Photon engine, consistently outperforms Fabric’s warehouse capabilities in both performance and scalability. Benchmarks show 4-14x faster processing for complex workloads, and unlike Fabric, Databricks SQL provides crucial enterprise features like clustering, partitioning, and workload isolation. Power BI can connect directly to Databricks SQL while maintaining Unity Catalog’s governance controls, providing the best of both worlds for enterprise analytics.
  3. Standardize on Mosaic AI for Enterprise AI/ML: Mosaic AI provides a comprehensive suite of AI capabilities including vector search, model serving, and streamlined Azure OpenAI integration through AI Gateway. While Fabric offers some AI features, Mosaic AI delivers production-grade capabilities that integrate seamlessly with your data platform. This becomes especially important as organizations scale their AI initiatives and require enterprise-grade model governance, serving, and monitoring (a model-registration sketch follows this list).
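As one small, concrete piece of that third point, the sketch below trains a toy model and registers it in the Unity Catalog–backed MLflow registry so models inherit the same governance as tables. The catalog, schema, and model names are placeholders, and serving through Mosaic AI Model Serving would be configured separately.

```python
# Minimal sketch: log a model with MLflow and register it in the Unity Catalog
# model registry. Catalog/schema/model names are hypothetical placeholders, and
# this assumes a Databricks workspace with Unity Catalog enabled.

import mlflow
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression

mlflow.set_registry_uri("databricks-uc")  # use Unity Catalog as the model registry

X, y = make_classification(n_samples=200, n_features=5, random_state=42)
model = LogisticRegression().fit(X, y)

with mlflow.start_run():
    mlflow.sklearn.log_model(
        model,
        artifact_path="model",
        registered_model_name="sales.retail.churn_model",  # three-level UC name (placeholder)
        input_example=X[:5],                               # lets MLflow infer a model signature
    )

# Access to the registered model is now controlled by the same Unity Catalog grants
# used for data, and the model can be exposed through Mosaic AI Model Serving.
```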

Let’s examine how to leverage each platform across key architectural components:

Core Platform Foundation

Governance
  Recommendation: Standardize on Unity Catalog
  Why this matters:
    • Battle-tested enterprise governance
    • Consistent security across all data assets
    • Seamless integration with Purview
    • More mature than Fabric’s OneSecurity

Storage
  Recommendation: Use ADLS Gen2 as the common storage layer for both platforms (not OneLake for the time being)
  Why this matters:
    • Provides consistent access across both platforms
    • Enables unified governance through Unity Catalog
    • Avoids governance gaps
    • Future-proofs your architecture as integration capabilities evolve

Notes:

  • Databricks can currently read from and write to OneLake only via ADLS credential passthrough and cluster-level configuration, which is not Unity Catalog compatible and is therefore not recommended.
  • OneLake offers simplified management but introduces limitations with Databricks integration and potential governance gaps. While OneLake is designed to be Fabric’s default storage, ADLS Gen2 can be used alongside OneLake via multicloud shortcuts and connectors. Databricks catalogs can be mirrored to Fabric, but at the cost of losing Row-Level Security and Column-Level Masking. (An external-location sketch follows these notes.)
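To show what the ADLS Gen2 recommendation looks like in practice, the snippet below registers an external location and a catalog over an ADLS Gen2 path in Unity Catalog. The storage account, container, and credential names are placeholders, and the storage credential (backed by an Azure access connector) is assumed to exist already.

```python
# Sketch: anchoring Unity Catalog on ADLS Gen2 so Databricks governs the data and
# Fabric can reach the same Delta files without copies. Account, container, and
# credential names are hypothetical placeholders; run from a Databricks notebook.

abfss_root = "abfss://lakehouse@contosodatalake.dfs.core.windows.net/gold"

spark.sql(f"""
    CREATE EXTERNAL LOCATION IF NOT EXISTS gold_adls
    URL '{abfss_root}'
    WITH (STORAGE CREDENTIAL adls_access_connector)
""")

spark.sql(f"""
    CREATE CATALOG IF NOT EXISTS gold
    MANAGED LOCATION '{abfss_root}'
""")

# Tables created under the `gold` catalog now live in ADLS Gen2 with Unity Catalog
# permissions attached; Fabric can reference the same storage through shortcuts.
```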

Workload Recommendations

Core Data Engineering
  Primary platform: Databricks Workflows, Lakeflow (Delta Live Tables)
  Supporting platform: Fabric Data Factory
  Best practice:
    • Use Databricks for heavy data processing (a minimal Lakeflow pipeline sketch follows these recommendations)
    • Leverage Fabric for Azure service integration and low-code orchestration

Enterprise Analytics
  Primary platform: Databricks SQL + Power BI, Databricks AI/BI Dashboards
  Supporting platform: Microsoft 365
  Best practice:
    • Power core analytics with Databricks SQL’s superior performance
    • Connect Power BI directly to maintain governance
    • Use Fabric for self-service analytics

AI/ML Development
  Primary platform: Mosaic AI, Azure OpenAI, Databricks Assistant, MLflow
  Supporting platform: SynapseML, Copilot
  Best practice:
    • Build production AI/ML on Databricks and serve through Mosaic AI Gateway and Model Serving
    • Use Databricks AutoML or AzureML for citizen data science
    • Maintain consistent MLflow governance

Specialized Analytics
  Primary platform: Case-by-case
  Supporting platform: Synapse Real-Time Analytics, Azure AI Services, Fabric engines for specialized, purpose-fit analytics
  Best practice:
    • Use Fabric for Kusto (logs) and real-time analytics
    • Leverage Fabric-native Azure service integration
    • Consider Databricks when scale or performance is critical
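To ground the core data engineering recommendation, here is a minimal Lakeflow (Delta Live Tables) pipeline definition in Python. The landing path and table names are placeholders, and the file would be attached to a DLT pipeline rather than run directly.

```python
# Minimal sketch of a Lakeflow (Delta Live Tables) pipeline: one raw ingest table
# and one cleaned table with a data-quality expectation. The ADLS path and table
# names are placeholders; this code runs as part of a DLT pipeline, not standalone.

import dlt
from pyspark.sql import functions as F

@dlt.table(comment="Raw orders landed from ADLS Gen2 (placeholder path).")
def orders_raw():
    return (
        spark.readStream.format("cloudFiles")   # Auto Loader incremental ingestion
        .option("cloudFiles.format", "json")
        .load("abfss://lakehouse@contosodatalake.dfs.core.windows.net/landing/orders")
    )

@dlt.table(comment="Orders with basic quality checks applied.")
@dlt.expect_or_drop("valid_amount", "amount > 0")   # drop rows that fail the expectation
def orders_clean():
    return (
        dlt.read_stream("orders_raw")
        .withColumn("ingested_at", F.current_timestamp())
    )
```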

Integration Patterns

Data Movement
  Approach:
    • Minimize data movement
    • Use Delta format (a publishing sketch follows these patterns)
    • Leverage shortcuts only as a last resort
  Key considerations:
    • Avoid copying data between platforms
    • Ensure consistent governance across platforms
    • Be cautious with shortcuts that might bypass security

Power BI Integration
  Approach:
    • Connect directly to Databricks SQL
    • Use DirectQuery when possible
    • Publish to the Power BI Service from Unity Catalog
  Key considerations:
    • Maintains Unity Catalog governance and keeps data models in sync
    • Leverages Photon engine performance
    • Provides a consistent security model

Azure Service Integration
  Approach:
    • Use Fabric for native Azure services
    • Connect through managed endpoints
  Key considerations:
    • Simplifies Azure service connectivity
    • Maintains security best practices
    • Reduces development complexity
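As a small illustration of the “publish Delta once, read from both platforms” pattern, the snippet below writes a curated Unity Catalog table whose underlying files sit in ADLS Gen2. The table and column names are placeholders.

```python
# Sketch: publish curated data once as a governed Unity Catalog table on ADLS Gen2,
# then let Power BI (via Databricks SQL) and Fabric (via shortcuts, if needed) read
# it in place instead of copying it. Names below are hypothetical placeholders.

from pyspark.sql import functions as F

curated = (
    spark.table("sales.retail.orders")          # governed source table
    .groupBy("region")
    .agg(F.sum("amount").alias("total_amount"))
)

(
    curated.write
    .format("delta")
    .mode("overwrite")
    .saveAsTable("sales.retail.orders_by_region")  # files land in the catalog's ADLS location
)

# Power BI connects to this table through Databricks SQL (DirectQuery or import),
# keeping Unity Catalog permissions in force; no cross-platform copy is required.
```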

This architecture allows organizations to leverage each platform’s strengths while maintaining consistent governance and performance. Databricks provides the robust foundation for enterprise data and AI initiatives, while Fabric excels at business user enablement and Microsoft ecosystem integration. The key is maintaining Unity Catalog as your governance foundation while thoughtfully selecting the right tool for each specific use case.

Conclusion 

2024 has brought significant changes to the Azure data landscape. While Microsoft Fabric has made notable strides in analytics and Azure service integration, our analysis confirms that Azure Databricks remains the more mature and production-ready platform for enterprise lakehouse implementations. Its unified approach, proven performance at scale, granular cost controls, and comprehensive governance through Unity Catalog provide the robust foundation that organizations need for mission-critical data and AI initiatives.

However, the reality of Power BI’s integration into Fabric means many organizations will need to leverage both platforms. Fortunately, with thoughtful architecture, this presents an opportunity rather than a compromise. The path to success lies in clear architectural principles:

  1. Build your foundation on Azure Databricks
    • Use ADLS as your primary storage layer (until OneLake integration is ready)
    • Leverage Unity Catalog for comprehensive governance
    • Power core analytics with Databricks SQL and Power BI
    • Standardize on Mosaic AI for production AI/ML workloads
  2. Complement with Fabric’s strengths
    • Self-service analytics and business user enablement
    • Low-code integration capabilities
    • Specialized Azure service integration (Kusto, Cosmos DB, etc.)
    • Microsoft ecosystem integration

For organizations able to standardize on a single platform, Databricks remains our unequivocal recommendation. For those choosing to use both platforms, this architecture provides a clear blueprint for success – combining Databricks’ enterprise-grade lakehouse capabilities with Fabric’s analytics strengths and Microsoft ecosystem integration.

The future of data and AI on Azure is not about choosing sides – it’s about architecting intelligently to leverage the best of both worlds. By building on Databricks’ proven foundation while thoughtfully incorporating Fabric’s unique capabilities, organizations can create a robust, future-proof data platform that drives innovation and growth.



