At IKEA, within the Customer Fulfilment, Inventory & Logistics Insights domain, I serve as the semantic model architect responsible for the Microsoft Fabric and Power BI analytics infrastructure serving 30,000+ users across the global supply chain organization. Since February 2023, I have led the technical architecture for the organization's transition from legacy Import-mode Power BI to modern Microsoft Fabric, designing patterns that became the standard for enterprise-scale BI governance. My work spans semantic modeling, Direct Lake migration, CI/CD pipeline design, performance engineering, and AI-augmented development, with each initiative driven by measurable business outcomes.
Legacy Import-mode semantic models required full data refresh cycles, causing hours of data latency, high Fabric capacity (CU) consumption, and user dissatisfaction across 30,000+ supply chain users. Decision-makers in inventory and fulfilment operations were working with stale data, delaying critical supply chain decisions.
Architected the migration to composite models with Direct Lake mode on Microsoft Fabric. Designed a hybrid architecture that balanced real-time data access via Direct Lake with calculated table requirements that still needed Import mode. Implemented an incremental migration strategy—model by model—to avoid service disruption for active users.
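The core of that hybrid design is a per-table storage-mode decision: Direct Lake does not support calculated tables or calculated columns, so those stay on Import within the composite model. The sketch below illustrates that routing rule; the table names and attributes are hypothetical, not the actual IKEA model.

```python
from dataclasses import dataclass

@dataclass
class ModelTable:
    name: str
    is_calculated: bool           # calculated tables require Import mode
    has_calculated_columns: bool  # calculated columns also block Direct Lake

def storage_mode(table: ModelTable) -> str:
    """Route a table to Direct Lake when its feature set allows it.

    Direct Lake reads delta tables from OneLake directly; calculated
    tables and calculated columns are not supported there, so those
    tables remain on Import within the composite model.
    """
    if table.is_calculated or table.has_calculated_columns:
        return "Import"
    return "DirectLake"

# Hypothetical tables, for illustration only
tables = [
    ModelTable("FactInventory", False, False),
    ModelTable("DimDateExtended", True, False),
    ModelTable("DimArticle", False, True),
]
plan = {t.name: storage_mode(t) for t in tables}
```

Applying such a rule mechanically, table by table, is what makes an incremental migration tractable: each model can be split and moved without disrupting the reports that sit on top of it.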
Delivered near real-time analytics for the entire supply chain organization, enabling faster inventory decisions and reducing data latency from hours to minutes.
The development process was ad-hoc and environment-dependent. Deployments were manual, error-prone, and differed across dev, test, and production environments. A model that worked in development would frequently fail in production due to hard-coded connection strings and inconsistent configurations.
Implemented systematic parameterization—source control parameters, row count control, and data integrity parameters—making the development process controllable and reproducible. Built a framework where any developer could deploy identically across all three environments without manual configuration changes.
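A minimal sketch of that parameterization idea, assuming one central mapping of environment names to parameter sets. The environment names, sources, and values below are illustrative, not the production configuration.

```python
import json

# Hypothetical parameter sets; the keys echo the kinds of parameters
# described above (data source, row counts, integrity checks).
ENVIRONMENTS = {
    "dev":  {"source": "lakehouse-dev",  "row_limit": 100_000,   "integrity_checks": False},
    "test": {"source": "lakehouse-test", "row_limit": 1_000_000, "integrity_checks": True},
    "prod": {"source": "lakehouse-prod", "row_limit": None,      "integrity_checks": True},
}

def resolve_parameters(environment: str) -> dict:
    """Return the full parameter set for one environment.

    Deployment tooling reads this mapping instead of hard-coding
    connection details, so the same artifact deploys identically
    across dev, test, and production.
    """
    if environment not in ENVIRONMENTS:
        raise ValueError(f"unknown environment: {environment}")
    return ENVIRONMENTS[environment]

config = resolve_parameters("test")
print(json.dumps(config))
```

The design choice is that the artifact itself stays environment-agnostic; only the parameter lookup changes per deployment target.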
Reproducible deployments across all environments, eliminating deployment failures and substantially shortening the release cycle.
Power BI artifacts had no version control, no audit trail, and no governance. Changes were deployed directly without review, creating compliance risks. Onboarding new team members was slow because there was no documented history of what changed, when, or why.
Designed an end-to-end Git-based source control architecture using PBIP/TMDL format with CI/CD pipelines via Azure DevOps and GitHub. Established branching strategies, pull request workflows, automated validation gates, and deployment pipelines that enforced governance at every stage.
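One kind of automated validation gate that fits this pipeline is a check that rejects hard-coded connection details in TMDL source files before merge. The sketch below is illustrative: the regex patterns and the sample expressions are assumptions, not the actual gate or the Power BI connector contract.

```python
import re

# Hypothetical CI gate: fail the pull request when a TMDL source
# expression hard-codes a SQL endpoint or database name instead of
# referencing a deployment parameter.
BAD_PATTERNS = [
    re.compile(r"\.database\.windows\.net", re.IGNORECASE),
    re.compile(r"Initial Catalog\s*=", re.IGNORECASE),
]

def connection_gate(tmdl_text: str) -> list[str]:
    """Return offending lines; an empty list means the gate passes."""
    hits = []
    for number, line in enumerate(tmdl_text.splitlines(), start=1):
        if any(pattern.search(line) for pattern in BAD_PATTERNS):
            hits.append(f"line {number}: {line.strip()}")
    return hits

# Illustrative expressions
hardcoded = 'source = Sql.Database("srv.database.windows.net", "CFDB")'
parameterized = "source = Sql.Database(ServerParam, DatabaseParam)"
```

Running `connection_gate` as a pull-request check turns a governance rule into an enforced, auditable step rather than a convention developers must remember.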
Full audit trail for compliance, zero-downtime deployments, and significantly faster onboarding for new analysts joining the team.
Semantic models had grown organically over time, accumulating redundant measures, unoptimized relationships, and bloated table structures. Report load times were degrading as data volumes increased, and users were experiencing slow dashboard responsiveness.
Conducted a systematic performance audit across all production semantic models. Refactored DAX calculations at the formula and storage engine level, removed unused measures, optimized relationships, and restructured tables for query efficiency.
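One step in such an audit, removing unused measures, has a subtlety worth showing: a measure referenced only by other unused measures is itself unused, so the analysis must follow the dependency graph from report usage. A minimal sketch, with hypothetical measure names:

```python
# Hypothetical audit step: find measures no report reaches, including
# measures referenced only by other unused measures.
def unused_measures(deps: dict[str, set[str]], used_in_reports: set[str]) -> set[str]:
    """deps maps each measure to the measures its DAX references."""
    keep: set[str] = set()
    stack = [m for m in used_in_reports if m in deps]
    while stack:  # walk the dependency graph from report usage
        measure = stack.pop()
        if measure not in keep:
            keep.add(measure)
            stack.extend(deps.get(measure, set()))
    return set(deps) - keep

# Illustrative dependency graph
deps = {
    "Sales": set(),
    "Sales YoY": {"Sales"},
    "Old KPI": {"Sales"},
    "Legacy Ratio": {"Old KPI"},
}
removable = unused_measures(deps, used_in_reports={"Sales YoY"})
```

Here "Sales" survives because "Sales YoY" depends on it, while "Old KPI" and "Legacy Ratio" are safe to delete even though they reference live measures.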
Leaner, faster models that improved the end-user experience and reduced infrastructure load.
Traditional BI development cycles were slow: DAX authoring, debugging, and documentation consumed a disproportionate share of development hours, creating bottlenecks in delivery timelines.
Embedded AI copilots into DAX authoring, performance diagnostics, and documentation workflows. Adopted the Power BI MCP (Model Context Protocol) server from its earliest public preview, enabling AI agents to interact directly with semantic models—creating measures, validating DAX, executing bulk operations, and enforcing best practices through natural language. Developed prompt engineering patterns specific to Power BI and DAX context. Established team guidelines for effective AI-augmented development that balanced speed with quality.
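MCP itself is a JSON-RPC 2.0 protocol, so an agent's measure-creation request reduces to a `tools/call` message. The sketch below shows that wire shape; the tool name, argument schema, table, and measure are illustrative assumptions, not the actual Power BI MCP server contract.

```python
import json

def mcp_tool_call(request_id: int, tool: str, arguments: dict) -> str:
    """Build a JSON-RPC 2.0 `tools/call` request, the message shape MCP uses.

    The tool name and argument schema passed in are illustrative; they
    are not the actual contract of the Power BI MCP server.
    """
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool, "arguments": arguments},
    })

request = mcp_tool_call(1, "create_measure", {
    "table": "FactInventory",  # hypothetical table and measure
    "name": "Stock Availability %",
    "expression": "DIVIDE ( [In Stock Units], [Total Units] )",
})
```

Because every agent action is a structured request like this, bulk operations and best-practice enforcement become scriptable and reviewable rather than opaque chat interactions.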
Fundamentally accelerated the development cycle without sacrificing code quality or governance standards. The MCP integration shifted AI from a code-suggestion tool to an autonomous development partner across the full semantic model lifecycle.
Fabric Premium capacity (CU) costs were high and growing, with no systematic approach to usage monitoring or optimization. Refresh patterns were wasteful, and capacity allocation was not aligned with actual workload demands.
Conducted hands-on CU usage analysis combining AI-supported performance diagnostics with manual tuning. Identified and eliminated wasteful refresh patterns, optimized query distribution across capacity, and right-sized allocation based on actual usage patterns.
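One wasteful pattern is easy to state precisely: a model refreshed far more often than it is queried burns capacity units for data nobody reads. A minimal sketch of that check, with hypothetical model names, log shapes, and an illustrative threshold:

```python
from collections import Counter

def wasteful_refreshes(refresh_log: list[str], query_log: list[str],
                       ratio: float = 5.0) -> list[str]:
    """Flag models refreshed far more often than they are queried.

    Each log entry is one model name per event; the 5x threshold is
    illustrative, not a tuned production value.
    """
    refreshes, queries = Counter(refresh_log), Counter(query_log)
    return sorted(
        model for model, count in refreshes.items()
        if count > ratio * max(queries.get(model, 0), 1)
    )

# Hypothetical 24-hour activity logs
refresh_log = ["InventoryModel"] * 48 + ["FulfilmentModel"] * 4
query_log = ["InventoryModel"] * 2 + ["FulfilmentModel"] * 100
flagged = wasteful_refreshes(refresh_log, query_log)
```

Models flagged this way are candidates for longer refresh intervals or event-driven refresh, which is where much of the CU saving comes from.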
Significant cost reduction while maintaining—and in many cases improving—performance for end users.
Analysts lacked semantic modeling expertise, DAX proficiency, and awareness of AI-augmented workflows—creating bottlenecks and single points of failure within the team.
Established an ongoing mentoring program covering semantic modeling best practices, DAX optimization patterns, and AI-powered report generation. Created documentation and training materials. Conducted hands-on workshops tailored to each analyst's skill level and responsibilities.
Upskilled the analyst team, eliminating single points of failure and building a self-sustaining BI engineering culture within the organization.