Desired Role:
Solid knowledge of data warehouse and big data frameworks, preferably Databricks.
Experience in large-scale cloud DWH migration; define and administer standard architecture methodologies, processes, tools, and best practices across the engagement.
Experience working in a cloud-based AWS tech stack that includes Databricks, Spark, Airflow, Python, and Scala. Design big data infrastructure to run large-scale, complex data pipelines that collect, organize, and standardize data.
Lead a team of data engineers and design engineers.
Mandatory skills*
AWS, Databricks, Python, Spark, Scala
Desired skills*
Exasol
Domain*
Retail
New onsite position on a 6-month contract, for candidates with 10-13 years of experience in big data technologies / Databricks.
Open to Bangalore / Pune / Chennai / Hyderabad / Gurgaon once back from Germany.