- Performs analysis, design, and development of ETL processes to support project requirements.
- Develops Informatica BDM mappings, SQL and stored procedures, PowerExchange data maps, and Unix shell scripts.
- Develops Sqoop scripts to move data between RDBMS systems and Hadoop.
- Develops CDC jobs using DES and creates data ingestion pipelines using the DEI component.
- Develops Hive and HBase tables and writes Impala queries.
- Develops Spark jobs (Scala/Python/Java) to stream, publish, or consume data from Hadoop.
- Performs unit testing and QA, and works with business partners to resolve issues discovered during UAT.
- Peer-reviews mappings and workflows when required.
- Maintains development and test data environments by populating data based on project requirements.
- Works with production control and operations as needed to promote mappings/workflows, implement schedules, and resolve issues.
- Reviews ETL performance and tunes mappings, workflows, or SQL as required.
- Maintains all documentation applicable to the specific SDLC phases.
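
To illustrate the Sqoop extraction duty mentioned above, a typical RDBMS-to-Hadoop import might look like the sketch below. Every connection detail here is a hypothetical placeholder (host, service name, user, table, and HDFS path are invented for illustration, not taken from this description); an actual job would substitute the project's own source and target values.

```shell
# Illustrative Sqoop import sketch -- all connection values are placeholders.
# Pulls one RDBMS table into HDFS as Parquet using four parallel mappers.
sqoop import \
  --connect jdbc:oracle:thin:@//db-host:1521/ORCLPDB \
  --username etl_user \
  --password-file /user/etl/.db_password \
  --table CUSTOMER_ORDERS \
  --target-dir /data/raw/customer_orders \
  --num-mappers 4 \
  --as-parquetfile
```

Using `--password-file` rather than `--password` keeps credentials out of the shell history and process list, which matters when such scripts are promoted through production control.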