As a Senior Data Engineer, you will be at the forefront of our data transformation, architecting and building the next generation of our Financial Transaction Processing engine within Microsoft Fabric. You will lead the migration of legacy financial pipelines into a unified Lakehouse architecture, ensuring that every penny of revenue and every byte of roaming data is accounted for with 100% accuracy.

This role requires a blend of high-level architectural design, hands-on pipeline engineering, and a deep understanding of telecom financial domains (billing, reconciliation, and inter-carrier settlements).

Key Responsibilities

- Architectural Design: Lead the design and implementation of a Medallion Architecture (Bronze, Silver, Gold) within MS Fabric to handle high-velocity telecom transactions.
- Data Pipeline Engineering: Develop end-to-end ETL/ELT workflows using Fabric Data Factory, Spark Notebooks (PySpark), and Dataflows Gen2.
- Financial Migration: Lead the migration of legacy SQL-based financial workloads into Fabric Lakehouses and Warehouses, ensuring zero data loss and maintaining historical integrity.
- Configuration & Governance: Configure Fabric workspaces, capacities, and OneLake security. Implement rigorous data governance and Row-Level Security (RLS) to protect sensitive financial PII.
- Performance Optimization: Tune large-scale Spark jobs and SQL queries to process millions of Call Detail Records (CDRs) and financial events with minimal latency.
- Analytics & Reporting: Build high-performance semantic models and Direct Lake datasets for real-time financial reporting and margin analysis.

Mandatory Skills & Qualifications

Technical Expertise

- Microsoft Fabric: Expert-level knowledge of the Fabric ecosystem (Lakehouse, Eventhouse, Data Factory, and OneLake).
- Pipeline Development: Strong experience with Azure Data Factory or Fabric Pipelines, including complex orchestration and incremental loading patterns.
- Coding & Transformation: Advanced proficiency in PySpark/Python and T-SQL for complex financial logic (e.g., multi-currency conversion, tax calculations).
- Architecture: Deep understanding of Delta Lake, star schema modelling, and SaaS-based data platform design.
- DevOps: Hands-on experience with Git integration, CI/CD for Fabric items, and Azure DevOps deployment pipelines.

Domain Knowledge

- Telecom Finance: Familiarity with telecom-specific data such as CDRs, roaming settlements, prepaid/postpaid billing cycles, and churn analytics.
- Financial Accuracy: Experience building automated reconciliation frameworks to catch revenue leakage or transaction mismatches.

Required Experience

- 8+ years in Data Engineering
- 3+ years hands-on with Microsoft Fabric
- Proven experience delivering large-scale data migration programmes in UK telecom or retail
- Strong SQL & PySpark expertise
- Experience with financial data reconciliation and ledger alignment
- Strong stakeholder engagement across Finance, Billing, and Technology teams
- Experience operating within UK enterprise governance frameworks

Preferred Experience

- Certifications: DP-600 (Microsoft Fabric Analytics Engineer) or DP-203 (Azure Data Engineer).
- Legacy Knowledge: Experience migrating from on-premises SQL Server or older Azure Synapse environments.
- Real-Time Intelligence: Knowledge of KQL (Kusto) for monitoring real-time transaction streams.

Skills to be evaluated on

- Microsoft Fabric
- Azure Data Factory
- Fabric Analytics
- Databricks

Mandatory Skills: Microsoft Fabric