Responsibilities and Requirements:
Implement and enhance complex big data solutions with a focus on collecting, parsing, managing, analyzing, and visualizing large data sets that produce valuable business insights and discoveries
Expertise in ETL and automation processes. Experience developing Tabular models in SQL Server 2012–2016 environments (T-SQL).
High proficiency in Python. Experience with tools such as Jupyter, Zeppelin, Hue, and RStudio.
Scrape structured/unstructured data from various vendors/sources and normalize it into structured database formats. Comfortable working with APIs and web scraping.
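As an illustration of this kind of task, the sketch below parses a vendor HTML table and normalizes it into a database, using only the standard library. The page snippet, table name, and fields are hypothetical; a real job would fetch the page from a vendor API or URL.

```python
import sqlite3
from html.parser import HTMLParser

# Hypothetical vendor page snippet -- in practice this HTML would be
# fetched from a vendor site or API rather than hard-coded.
SAMPLE_HTML = """
<table>
  <tr><td>ACME Corp</td><td>120.5</td></tr>
  <tr><td>Globex</td><td>98.1</td></tr>
</table>
"""

class TableScraper(HTMLParser):
    """Collect <td> text into rows, one row per <tr>."""
    def __init__(self):
        super().__init__()
        self.rows, self._row, self._in_td = [], [], False

    def handle_starttag(self, tag, attrs):
        if tag == "tr":
            self._row = []
        elif tag == "td":
            self._in_td = True

    def handle_endtag(self, tag):
        if tag == "td":
            self._in_td = False
        elif tag == "tr" and self._row:
            self.rows.append(self._row)

    def handle_data(self, data):
        if self._in_td and data.strip():
            self._row.append(data.strip())

def normalize_to_db(html):
    """Parse vendor HTML and load it into a structured table."""
    parser = TableScraper()
    parser.feed(html)
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE vendor_prices (vendor TEXT, price REAL)")
    conn.executemany(
        "INSERT INTO vendor_prices VALUES (?, ?)",
        [(name, float(price)) for name, price in parser.rows],
    )
    return conn

conn = normalize_to_db(SAMPLE_HTML)
print(conn.execute("SELECT COUNT(*) FROM vendor_prices").fetchone()[0])
```

The same parse-then-load pattern scales to messier sources: the parser handles the unstructured side, and the typed database schema enforces the structured format.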
Develop SQL queries and stored procedures to retrieve and manipulate data from databases.
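A minimal sketch of this pattern, using sqlite3 so it is self-contained; in the SQL Server environment described above the same approach applies via a driver such as pyodbc, with stored procedures invoked through EXEC. The table and column names are illustrative.

```python
import sqlite3

# Illustrative in-memory database standing in for a production server.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
conn.executemany("INSERT INTO sales VALUES (?, ?)",
                 [("East", 100.0), ("West", 250.0), ("East", 50.0)])

# Parameterized query: aggregate and filter server-side rather than in Python.
query = """
    SELECT region, SUM(amount) AS total
    FROM sales
    GROUP BY region
    HAVING SUM(amount) > ?
    ORDER BY total DESC
"""
rows = conn.execute(query, (120.0,)).fetchall()
print(rows)  # -> [('West', 250.0), ('East', 150.0)]
```

Pushing the aggregation into SQL rather than fetching raw rows and looping in Python is the usual design choice here: less data on the wire and the database's query planner does the work.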
Build reports/dashboards in Tableau. Experience with Tableau, Power BI, or TIBCO Spotfire.
Experience with SQL Database, Data Factory, and BI solutions on the Microsoft Azure platform.
Automate daily processes using Python. Develop Python code for data manipulation and model building.
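A typical daily automation of this kind cleans an incoming feed before loading it. The sketch below is a self-contained example with a made-up CSV feed; in production the raw data would come from a file drop or API, and the script would run on a scheduler such as cron or Task Scheduler.

```python
import csv
import io
from datetime import date

# Hypothetical daily feed: messy whitespace, inconsistent casing, a blank field.
RAW_FEED = "id,amount,currency\n1, 10.50 ,usd\n2,7.25,USD\n3,,usd\n"

def clean_feed(raw):
    """Strip whitespace, normalize currency codes, drop incomplete rows."""
    cleaned = []
    for row in csv.DictReader(io.StringIO(raw)):
        if not row["amount"].strip():
            continue  # skip rows with missing amounts
        cleaned.append({
            "id": int(row["id"]),
            "amount": float(row["amount"]),
            "currency": row["currency"].strip().upper(),
            "loaded_on": date.today().isoformat(),
        })
    return cleaned

records = clean_feed(RAW_FEED)
print(len(records))  # 2 valid rows survive cleaning
```

Stamping each record with a load date, as above, is a common convention so that downstream reports can tell which daily run produced which rows.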
Working experience with CI/CD pipelines (test, build, deployment, and monitoring automation)
Knowledgeable in basic machine learning and automation tasks
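"Basic machine learning" at this level means something like the sketch below: ordinary least squares on a single feature, written with the standard library only so it runs anywhere. In practice the role would use a library such as scikit-learn; the data points here are made up for illustration.

```python
# Toy dataset, roughly y = 2x with noise.
xs = [1.0, 2.0, 3.0, 4.0]
ys = [2.1, 4.0, 6.2, 7.9]

def fit_line(xs, ys):
    """Closed-form OLS fit: slope = cov(x, y) / var(x)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    intercept = my - slope * mx
    return slope, intercept

slope, intercept = fit_line(xs, ys)
print(round(slope, 2))  # close to 2, as expected for y ~ 2x
```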
Strong grasp of data structures, algorithms, and design patterns
Education and Background:
At least a Bachelor's degree in Computer Science
Minimum of 5 years of programming experience, including ETL automation
Ability to communicate clearly and quickly develop solutions to meet client needs
Demonstrates excellent problem-solving skills