As a Senior Data Engineer, you will be at the forefront of our data transformation, architecting and building the next generation of our Financial Transaction Processing engine within Microsoft Fabric. You will lead the migration of legacy financial pipelines into a unified Lakehouse architecture, ensuring that every penny of revenue and every byte of roaming data is accounted for with 100% accuracy.
This role requires a blend of high-level architectural design, hands-on pipeline engineering, and a deep understanding of telecom financial domains (billing, reconciliation, and inter-carrier settlements).
Key Responsibilities
Architectural Design: Lead the design and implementation of a Medallion Architecture (Bronze, Silver, Gold) within Microsoft Fabric to handle high-velocity telecom transactions.
Data Pipeline Engineering: Develop end-to-end ETL/ELT workflows using Fabric Data Factory, Spark Notebooks (PySpark), and Dataflows Gen2.
Financial Migration: Lead the migration of legacy SQL-based financial workloads into Fabric Lakehouses and Warehouses, ensuring zero data loss and maintaining historical integrity.
Configuration & Governance: Configure Fabric workspaces, capacities, and OneLake security. Implement rigorous data governance and Row-Level Security (RLS) to protect sensitive financial PII.
Performance Optimization: Tune large-scale Spark jobs and SQL queries to process millions of Call Detail Records (CDRs) and financial events with minimal latency.
Analytics & Reporting: Build high-performance Semantic Models and Direct Lake datasets for real-time financial reporting and margin analysis.
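To illustrate the kind of Bronze→Silver promotion the Medallion responsibility above involves, here is a minimal pure-Python sketch (a real pipeline would use PySpark over Delta tables; the record shape and field names are hypothetical):

```python
# Hypothetical raw CDR records as they might land in a Bronze layer.
bronze_cdrs = [
    {"cdr_id": "A1", "msisdn": "447700900001", "duration_s": 120, "charge": 0.45},
    {"cdr_id": "A1", "msisdn": "447700900001", "duration_s": 120, "charge": 0.45},  # duplicate
    {"cdr_id": "A2", "msisdn": "447700900002", "duration_s": -5, "charge": 0.10},   # invalid
    {"cdr_id": "A3", "msisdn": "447700900003", "duration_s": 60, "charge": 0.20},
]

def promote_to_silver(records):
    """Deduplicate on cdr_id and drop records failing basic validation."""
    seen, silver = set(), []
    for rec in records:
        if rec["cdr_id"] in seen:
            continue  # drop exact/late-arriving duplicates
        if rec["duration_s"] < 0 or rec["charge"] < 0:
            continue  # a real pipeline would quarantine these for review
        seen.add(rec["cdr_id"])
        silver.append(rec)
    return silver

print([r["cdr_id"] for r in promote_to_silver(bronze_cdrs)])  # ['A1', 'A3']
```

The same dedupe-and-validate pattern maps directly onto a PySpark `dropDuplicates` plus filter step when run at CDR scale.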
Mandatory Skills & Qualifications
Technical Expertise
Microsoft Fabric: Expert-level knowledge of the Fabric ecosystem (Lakehouse, Eventhouse, Data Factory, and OneLake).
Pipeline Development: Strong experience with Azure Data Factory or Fabric Pipelines, including complex orchestration and incremental loading patterns.
Coding & Transformation: Advanced proficiency in PySpark/Python and T-SQL for complex financial logic (e.g., multi-currency conversion, tax calculations).
Architecture: Deep understanding of Delta Lake, Star Schema modeling, and SaaS-based data platform design.
DevOps: Hands-on experience with Git integration, CI/CD for Fabric items, and Azure DevOps deployment pipelines.
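The "complex financial logic" called out above typically hinges on exact decimal arithmetic rather than floats. A minimal Python sketch, assuming hypothetical settlement-date FX rates and a flat VAT rate:

```python
from decimal import Decimal, ROUND_HALF_UP

# Hypothetical settlement-date FX rates to GBP (illustrative values only).
FX_TO_GBP = {"EUR": Decimal("0.8550"), "USD": Decimal("0.7900"), "GBP": Decimal("1")}

def to_gbp(amount: str, currency: str, vat_rate: str = "0.20") -> Decimal:
    """Convert a charge to GBP, add VAT, and round half-up to whole pence."""
    gross = Decimal(amount) * FX_TO_GBP[currency] * (Decimal("1") + Decimal(vat_rate))
    return gross.quantize(Decimal("0.01"), rounding=ROUND_HALF_UP)

print(to_gbp("10.00", "EUR"))  # 10.26
```

Using `Decimal` with an explicit rounding mode avoids the float rounding drift that breaks penny-level reconciliation downstream.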
Domain Knowledge
Telecom Finance: Familiarity with telecom-specific data such as CDRs, roaming settlements, prepaid/postpaid billing cycles, and churn analytics.
Financial Accuracy: Experience building automated reconciliation frameworks to catch revenue leakage or transaction mismatches.
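An automated reconciliation framework of the kind described above boils down to set and amount comparisons between two sources of truth. A minimal sketch, assuming hypothetical billing and ledger extracts keyed by transaction ID:

```python
from decimal import Decimal

def reconcile(billing, ledger):
    """Compare two transaction extracts (dicts of id -> Decimal amount).

    Returns IDs missing from the ledger, IDs missing from billing,
    and IDs present in both but with mismatched amounts.
    """
    missing_in_ledger = sorted(set(billing) - set(ledger))
    missing_in_billing = sorted(set(ledger) - set(billing))
    mismatched = sorted(
        tid for tid in set(billing) & set(ledger) if billing[tid] != ledger[tid]
    )
    return missing_in_ledger, missing_in_billing, mismatched

billing = {"T1": Decimal("9.99"), "T2": Decimal("4.50"), "T3": Decimal("1.00")}
ledger  = {"T1": Decimal("9.99"), "T2": Decimal("4.55")}
print(reconcile(billing, ledger))  # (['T3'], [], ['T2'])
```

In production this runs per billing cycle, with the three output lists feeding alerting and a revenue-leakage report rather than a simple print.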
Required Experience
8+ years in Data Engineering
3+ years hands-on with Microsoft Fabric
Proven experience delivering large-scale data migration programmes in UK telecom or retail
Strong SQL & PySpark expertise
Experience with financial data reconciliation and ledger alignment
Strong stakeholder engagement across Finance, Billing, and Technology teams
Experience operating within UK enterprise governance frameworks
Preferred Experience
Certifications: DP-600 (Microsoft Fabric Analytics Engineer) or DP-203 (Azure Data Engineer).
Legacy Knowledge: Experience migrating from on-prem SQL Server or older Azure Synapse environments.
Real-Time Intelligence: Knowledge of KQL (Kusto) for monitoring real-time transaction streams.
Skills to be evaluated on
Microsoft Fabric, Azure Data Factory, Fabric Analytics, Databricks
Mandatory Skills: Microsoft Fabric
Experience
6 - 10 Years
No. of Openings
10
Education
Graduate
Role
Data Engineer
Industry Type
IT-Hardware & Networking
Gender
[ Male / Female ]
Job Country
India
Type of Job
Contractor
Work Location Type
Work from Home