Primary skillset:
- Experience working with distributed technology tools for developing batch and streaming pipelines using SQL, Spark, Python, PySpark [4+ years], Airflow [3+ years], and Scala [2+ years]. Able to write code that is optimized for performance.
- Experience with cloud platforms, e.g., AWS, GCP, Azure.
- Able to quickly pick up new programming languages, technologies, and frameworks.
- Strong skills in building positive relationships across Product and Engineering.
- Able to influence and communicate effectively, both verbally and in writing, with team members and business stakeholders.
- Experience creating and configuring Jenkins pipelines for a smooth CI/CD process for managed Spark jobs, building Docker images, etc.
- Working knowledge of data warehousing, data modelling, governance, and data architecture.