ETL Jobs in Chennai, ETL Job Vacancies in Chennai, Tamil Nadu

ETL Job Vacancies in Chennai

  • 4 - 6 yrs
  • 7.0 Lac/Yr
  • Chennai
Data Governance Data Lake Data Loading Data Pipelines Data Transformation Query Optimization Performance Tuning Data Architecture Data Warehousing SQL ETL Scripting Data Integration Data Migration Data Modeling Database Design Snowflake Python Big Data Cloud Computing
We are looking for a Certified Snowflake Developer with 4 to 6 years of experience in Chennai.
  • Strong knowledge of SQL
  • Experience with Snowflake architecture
  • Understanding of data warehousing concepts
  • Experience with ETL/ELT tools
  • Knowledge of cloud platforms (AWS, Azure, GCP)
  • Programming knowledge in Python, Java, or Scala

Looking For Data Engineer

BSRI Solutions Pvt Ltd

  • 3 - 5 yrs
  • 16.0 Lac/Yr
  • Chennai
Python Pyspark Developer Scala SQL Hive Hadoop Google Cloud Platform Kafka Developer Infrastructure AS Code GitHub Agile Methodology ETL
Required Qualifications:
  • 3+ years of demonstrated ability with Hive, Python, Spark/Scala, SQL, etc.
  • Google Cloud Platform experience: BigQuery, Cloud Storage, Dataproc, Dataflow, Cloud Composer, Cloud SQL, Pub/Sub, Terraform, etc.
  • Experience with the Hadoop ecosystem, Kafka, and PCF cloud services
  • Familiarity with big data and machine learning tools and platforms
  • Experience with BI tools such as Alteryx, DataStage, QlikSense, etc.
  • Design data pipelines and data robots; take a vision and bring it to life
  • Master data engineer; mentors others; works closely with IT architects to set strategy and design projects
  • Provide extensive technical and strategic advice and guidance to key stakeholders around data transformation efforts
  • Redesign data flows to prevent recurring data issues
  • Strong analytical and problem-solving skills
  • Excellent oral and written communication skills, as well as facilitation and presentation skills, and an engaging presentation style
  • Ability to work as a global team member, as well as independently, in a changing environment, and to prioritize
  • Ability to establish and maintain coordinated and effective working relationships with application implementation teams, IT project teams, business customers, and end users
  • Ability to deliver work within deadlines
  • Experience with agile/lean methodologies
  • Experience working independently and with minimal supervision
  • Experience with Test-Driven Development and software craftsmanship
  • Experience with GitHub, AccuRev, or other version-control systems
  • Experience with PuTTY
  • Experience with DataStage
  • Strong communication skills
  • Ability to illustrate and convey ideas and prototypes effectively with team and partners
  • Presence demonstrating confidence, ability to learn quickly, influence, and shape ideas
Key Skills Required:
  • Python / PySpark / Scala
  • SQL & Hive
  • Hadoop ecosystem
  • Data pipeline design & ETL development
  • Google Cloud Platform (BigQuery, Dataproc, Dataflow, Cloud Storage)
  • Kafka / streaming data processing
  • Terraform (Infrastructure as Code)
  • DataStage or similar ETL tools
  • Version control (GitHub or equivalent)
  • Agile methodologies
  • Strong analytical & problem-solving skills
  • Stakeholder collaboration & communication
Nice to Have:
  • Cloud Composer, Cloud SQL, Pub/Sub
  • BI tools (Alteryx, QlikSense)
  • Machine learning platform exposure
  • Test-Driven Development (TDD)
  • Mentoring & technical leadership

Snowflake Developer

Firstwave Technology

  • 4 - 9 yrs
  • Chennai
SQL Azure ETL Tool Snowflake Developer
Job Summary: We are seeking a Snowflake Data Engineer to join our Data & Analytics team. This role involves designing, implementing, and optimizing Snowflake-based data solutions. The ideal candidate will have proven, hands-on data engineering expertise in Snowflake, cloud data platforms, ETL/ELT processes, and Medallion data architecture best practices. The data engineer role has a day-to-day focus on implementation, performance optimization, and scalability. This is a tactical role requiring independent data analysis and data discovery to understand our existing source systems and fact and dimension data models, and to implement an enterprise data warehouse solution in Snowflake. This role takes direction from the Lead Snowflake Data Engineer and the Director of Data Engineering while bringing its own domain expertise and experience.
Essential Functions and Tasks:
  • Participate in the design, development, and maintenance of a scalable Snowflake data solution serving our enterprise data & analytics team.
  • Implement data pipelines, ETL/ELT workflows, and data warehouse solutions using Snowflake and related technologies.
  • Optimize Snowflake database performance.
  • Collaborate with cross-functional teams of data analysts, business analysts, data scientists, and software engineers to define and implement data solutions.
  • Ensure data quality, integrity, and governance.
  • Troubleshoot and resolve data-related issues, ensuring high availability and performance of the data platform.
Education and Experience Requirements:
  • Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.
  • 4+ years of in-depth data engineering experience, with at least 1 year of dedicated experience engineering solutions in an enterprise-scale Snowflake environment.
  • Tactical expertise in ANSI SQL, performance tuning, and data modeling techniques.
  • Strong experience with cloud platforms (preference for Azure) and their data services.
  • Experience in ETL/ELT development using tools such as Azure Data Factory, dbt, Matillion, Talend, or Fivetran.
  • Hands-on experience with scripting languages like Python for data processing.
  • Snowflake SnowPro certification; preference for the engineering course path.
  • Experience with CI/CD pipelines, DevOps practices, and Infrastructure as Code (IaC).
  • Knowledge of streaming data processing frameworks such as Apache Kafka or Spark Streaming.
  • Familiarity with BI and visualization tools such as Power BI.
Knowledge, Skills, and Abilities:
  • Familiarity working in an agile scrum team, including sprint planning, daily stand-ups, backlog grooming, and retrospectives.
  • Ability to self-manage medium-complexity deliverables and document user stories and tasks through Azure DevOps.
  • Personal accountability for committed sprint user stories and tasks.
  • Strong analytical and problem-solving skills with the ability to handle complex data challenges.
  • Ability to read, understand, and apply state/federal laws, regulations, and policies.
  • Ability to communicate with diverse personalities in a tactful, mature, and professional manner.
  • Ability to remain flexible and work within a collaborative and fast-paced environment.
  • Understand and comply with company policies and procedures.
  • Strong oral, written, and interpersonal communication skills.
  • Strong time management and organizational skills.
Physical Demands:
  • 40 hours per week
  • Occasional standing and walking
  • Sitting for prolonged periods of time
  • Frequent hand and finger movement
  • Communicate verbally and in writing
  • Extensive use of computer keyboard and viewing of computer screen
  • Specific vision abilities required by this job include close vision

Urgent Requirement For ETL Automation

E2E Infoware Management Services

Automation Pyspark SQL
Skill Name: ETL Automation Testing
Location: Bangalore, Chennai, and Pune (work from office)
Experience: 4+ years
Required:
  • Experience in ETL Automation Testing
  • Strong experience in Python
  • Strong experience in PySpark


Hiring For Data Engineer

International Recruiters

Data Pipelines ETL Processes ETL Developer Business Intelligence Business Intelligence Tool Data Integration BI Analyst Business Intelligence Analyst SQL SQL Developer Data Engineer Python For ETL API Development Data Scripting Data Warehousing Azure Administrator SSIS Developer Azure Data Engineering
Location: Guindy, Chennai, Tamil Nadu, India
Experience: 3 - 6 years
Reports To: BI Manager
Job Type: Full-time
Key Responsibilities:
  • Build and maintain scalable data pipelines and ETL processes for BI needs.
  • Optimize data models and ensure clean, reliable, and well-structured data for Power BI.
  • Integrate data from various internal systems (web apps, databases, accounting platforms).
  • Collaborate with BI analysts and managers to meet reporting requirements.
  • Automate data refreshes and performance-tune dashboards.
  • Maintain metadata, data dictionaries, and lineage documentation.
Required Skills:
  • Strong SQL development and performance tuning skills.
  • Hands-on experience with Power BI dataset structuring and integration.
  • Familiarity with Python for ETL, automation, or data cleaning tasks.
  • Knowledge of APIs and scripting for data ingestion.
  • Understanding of data warehousing and modelling techniques (star/snowflake schemas).
  • Experience working with Azure, SSIS, or any cloud-based data services is a plus.
  • Familiarity with version control systems like Git.
AWS ETL Consultant ETL Tool
Job openings for 5 AWS Developer positions with a minimum of 8 years of experience in Chennai, Pune, Greater Hyderabad, and Delhi NCR. Educational qualification: a professional degree, with good knowledge of AWS, ETL consulting, ETL tools, etc.

AWS Data Engineer

Hexaware Technologies

SQL AWS Python ETL Terraform Lambda
Work Mode: Hybrid
  • 6-9 years of overall IT experience, preferably in cloud environments.
  • Minimum of 5 years of hands-on experience with AWS cloud development projects.
  • Design and develop AWS data architectures and solutions.
  • Build robust data pipelines and ETL processes using big data technologies.
  • Utilize AWS data services such as Glue, Lambda, Redshift, and Athena effectively.
  • Implement infrastructure as code (IaC) using Terraform.
  • Proficiency in SQL, Python, and other relevant programming/scripting languages.
  • Experience with orchestration tools like Apache Airflow or AWS Step Functions.
  • Strong understanding of data warehousing concepts, data lakes, and data governance frameworks.
  • Expertise in data modeling for both relational and non-relational databases.
  • Excellent communication skills are essential for this role.
  • 2 - 6 yrs
  • 6.5 Lac/Yr
  • Bangalore Highway Chennai
Data Management Apache Data Integration SQL ETL Tool ETL Hadoop AWS Snowflake Azure
Invitation for B2B Partnerships: Seeking Software Development Support! We are looking to collaborate with companies that can provide 10 skilled Data Engineers to support our data engineering requirements pipeline for European clients on a contract basis.
Requirements:
  • Expertise in data engineering, including data processing and ETL.
  • Proficiency in SQL and NoSQL databases.
  • Experience with Hadoop or Spark.
  • Experience in AWS or Azure.
  • Snowflake is an added advantage.
If your company is equipped to provide top-notch Data Engineers, we invite you to submit your proposal with terms and conditions. Please contact us by email or WhatsApp: srividya032001@gmail.com (or) 8884752389
  • 2 - 6 yrs
  • 6.5 Lac/Yr
  • Bangalore National Highway Chennai
Data Warehousing Apache Data Integration Data Management SQL ETL Tool Hadoop AWS Big Data Python Java Java-script React Js
Invitation for B2B Partnerships: Seeking Software Development Support! We are looking to collaborate with companies that can provide 10 skilled Data Engineers to support our data engineering requirements pipeline for European clients on a contract basis.
Requirements:
  • Expertise in data engineering, including data processing and ETL.
  • Proficiency in SQL and NoSQL databases.
  • Experience with Hadoop or Spark.
  • Experience in AWS or Azure.
  • Snowflake is an added advantage.
If your company is equipped to provide top-notch Data Engineers, we invite you to submit your proposal with terms and conditions. Please contact us by email or WhatsApp: rakanalytics@gmail.com (or) 9900173022

Data Analyst

Emarlex Multiventure LLP

  • 3 - 8 yrs
  • 15.0 Lac/Yr
  • Chennai
Data Management Data Analysis Informatica SQL Data Mining SSIS Python Power BI ETL Developer
Key Responsibility Areas:
  • Engage with clients to precisely identify and fulfill their data engineering needs.
  • Lead and manage special projects to meet strategic goals.
  • Develop advanced models with Snowflake and SSIS to enhance data accuracy and utility.
  • Continuously refine data processing rules and procedures for optimal results.
  • Design and implement scalable data architectures using Snowflake.
  • Maintain and enhance data pipelines, integrating new data sources and APIs as needed.
  • Monitor and ensure high data quality across systems for reliable decision-making.
  • Utilize SSIS for efficient data extraction, transformation, and loading processes.
  • Leverage Informatica knowledge to improve data management capabilities.
Eligibility Criteria (knowledge, tools, technical knowledge, certifications, etc.):
  • Demonstrated expertise in SQL and strong proficiency in programming languages such as Python.
  • Extensive experience in designing, building, and maintaining robust data pipelines and complex data architectures.
  • Prior experience with Informatica or equivalent ETL tools is a significant advantage.
  • In-depth understanding of ETL processes, data modeling techniques, and data warehousing principles.
  • Familiarity with big data frameworks like Hadoop or Spark adds value to the role.
  • Strong knowledge of data serialization formats including Parquet, Avro, and JSON, essential for efficient data processing and storage.
  • Exceptional analytical skills, adept at problem-solving and technical troubleshooting.
  • Excellent communication and interpersonal skills, essential for effective client engagement and team collaboration.
  • Proven ability to independently manage multiple projects, demonstrating strong organizational and leadership skills.
  • Relevant professional certifications in Snowflake, SSIS, Informatica, or comparable data management technologies are highly regarded.
Senior Software Tester ETL Developer Selenium AWS Developer Full Stack Developer Angular Developer
We are looking for Software Engineers, both freshers and experienced.

ETL Architect (2-6 Years)

VERVENEST TECHNOLOGIES

ETL Developer SQL Informatica
ETL Architects: We are looking for ETL Architects (any ETL tool) with vast experience in ETL. The candidate should possess the below:
  • Expert level of experience in ETL
  • Nice to have: good experience in XML and web service transformations
  • Good understanding of SQL and stored procedures
  • Experience in B2B integration projects
  • Good analytical and problem-solving skills
  • A good team player, able to lead a team of 6+ members
  • Able to understand legacy ETL code and re-design it to meet performance and best-practice standards
  • Able to troubleshoot and identify translation issues when data is transferred from source to target
  • Able to write technical designs for the requirements, which can be used by the developers
  • Support developers to produce ETL code that meets standards and best practices
  • Consult with users and their management and technical personnel to clarify and validate business requirements
  • Able to implement data integration best practices and conventions
  • Able to identify potential problems and proactively suggest changes and solutions
  • Able to act as a reference for the data integration and reporting solution
  • Interact with and communicate detailed technical requirements to the project development team
Requirements:
  • ETL Architect: minimum 2+ years; hands-on development experience is mandatory
  • ETL tool (Informatica): 5 years
  • SQL experience is mandatory: 5 years
Notice period: Immediate to 30 days
Location: Chennai / Thiruvananthapuram / Kochi
Interview process: 1 technical, 1 managerial, and 1 HR discussion
Hybrid work model
Salary: as per market standards
If you are looking for a job or a job change, send a contact number or email.
Regards, Nandini, HR
ETL Architect
  • Expert level of experience in ETL
  • Experience in XML and web service transformations
  • Experience in B2B integration projects
  • Good understanding of SQL and stored procedures
  • Good analytical and problem-solving skills
  • Understand legacy ETL code and re-design it to meet performance and best-practice standards
  • Write technical designs for the requirements, which can be used by the developers
  • Support developers to produce ETL code that meets standards and best practices
  • Act as a reference for the data integration and reporting solution
  • Implement data integration best practices and conventions
  • Be able to lead a team of 6+ members
  • Interact with and communicate detailed technical requirements to the project development team
  • Identify potential problems and proactively suggest changes and solutions
  • Consult with users and their management and technical personnel to clarify and validate business requirements

Datastage Developer

Cloud BC Labs

Teradata Informatica SQL ETL Tool
DataStage Sr. Developer - Job Description:
  • Strong hands-on experience in DataStage development
  • Knowledge of PL/SQL
  • Strong analytical skills
  • Knowledge of Teradata, Snowflake, and SnapLogic are add-ons to the primary skills
  • Good communication skills
  • Ready to work in second shift
Experience: 5-7 years

AWS Glue Developer

Opportunity Next

AWS Glue ETL Development ETL Developer Python Spark SQL s3 Athena Redshift Data Modeling Data Lakes Work From Home
We are looking for a highly skilled AWS Glue Developer with a minimum of 5-10 years of experience in the industry to join our team at IBM India. The ideal candidate will be responsible for developing and maintaining data integration solutions using AWS Glue, building ETL pipelines, and working on data lakes.
Responsibilities:
  • Develop and maintain ETL pipelines using AWS Glue.
  • Design and develop data integration solutions using AWS Glue.
  • Develop and maintain data ingestion, transformation, and integration solutions.
  • Work on data lakes and build efficient data models.
  • Collaborate with cross-functional teams to understand business requirements and design solutions accordingly.
  • Optimize ETL performance and troubleshoot issues.
  • Develop and maintain technical documentation.
  • Stay up to date with the latest AWS Glue developments and incorporate new features and functionality into existing solutions.
Requirements:
  • Bachelor's degree in Computer Science or a related field.
  • Minimum of 5-10 years of experience in AWS Glue development.
  • Strong experience with ETL development using AWS Glue.
  • Proficient in Python, Spark, and SQL.
  • Experience with AWS services such as S3, Athena, and Redshift.
  • Familiarity with data modeling and designing data lakes.
  • Experience working with cross-functional teams and collaborating on projects.
  • Excellent communication and documentation skills.
  • Ability to troubleshoot and optimize ETL performance.

PySpark Developer

Sadup Softech Pvt Ltd

  • 4 - 10 yrs
  • Chennai
Spark Python Pyspark ETL HDFS
  • Professional experience with a cloud platform.
  • Sound knowledge of Apache Spark and Python programming.
  • Deep experience in developing data processing tasks using PySpark, such as reading data from external sources, merging data, performing data enrichment, and loading into target data destinations.
  • Ability to design, build, and unit test applications in Spark/PySpark.
  • In-depth knowledge of Hadoop, Spark, and similar frameworks.
  • Ability to understand existing ETL logic and convert it into Spark/PySpark/Spark SQL.
  • Knowledge of Unix shell scripting, RDBMS, Hive, the HDFS file system, HDFS file types, and HDFS compression codecs.
  • Experience in processing large amounts of structured and unstructured data, including integrating data from multiple sources.

ETL Consultant

Sapwood Ventures Pvt Ltd

Informatica ETL Consultant
Requirements:
1. Proficient in the design and development of data pipelines/integration jobs.
2. Experience in Azure Data Factory / AWS Glue / Talend / SSIS / Informatica PowerCenter.
3. Experience in Python data transformation with S3/Blob storage, etc.
4. Experience in constructing logical and physical data models to build a data warehouse/data lake.
5. Experience in ETL methodology for performing data profiling, data migration, extraction, transformation, and loading using ETL tools.

SR Data Engineer

Proto talent consultants

Data Engineer Software Developer Business Intelligence Data Scientist SQL Database Administrator Python Data Modeling ETL Developer Work From Home
Basic Qualifications:
  • 5+ years of experience as a Data Engineer or in a similar role
  • 3+ years of industry experience in software development, data engineering, business intelligence, data science, or a related field
  • 3+ years of experience in SQL & Python
  • Bachelor's degree in computer science, engineering, mathematics, or a related technical discipline
  • Experience with data modeling, data warehousing, and building ETL pipelines
  • Proven track record of solving complex business problems with high quality and automated testing in an agile environment
  • Experience using one of the big data technologies (Spark, EMR, dbt Cloud, etc.)
  • Excellent communication & analytical skills
Preferred Qualifications:
  • Experience working with AWS big data technologies (EMR, Redshift, S3, Lambda, RDS)
  • Experience working with Snowflake
  • Demonstrated strength in data modeling, ETL development, and data warehousing
  • Excellent understanding of other programming languages like Java or JavaScript
  • Experience working with one of the big data schedulers like Airflow, Oozie, Azkaban, etc.
  • Experience in data streaming with Kafka, Spark Streaming, and Kinesis
  • Basic knowledge of containerization and orchestration, such as Docker and Kubernetes
  • Understanding of Python data libraries like NumPy, scikit-learn, etc.
  • Open to learning new technologies & AI libraries

SR Data Engineer

Spright technologies Inc

Python Spark Azure ETL Data Warehousing Master Data Management Data Engineer
Position: Data Engineer with Python
Location: Remote
Duration: 6 to 12 months+
Work hours: 2/3 pm to 10/11 pm IST
Type: Contract
  • Minimum 5+ years in a Data Engineer role.
  • Expert in Python.
  • Familiarity with Spark and PySpark.
  • Knowledge of at least one cloud platform: Azure, GCP, or AWS.
  • Experience in writing complex and efficient ETL jobs and data pipelines.
  • Excellent at writing complex SQL queries and scripts.
  • Proven expertise in modern data architecture, data modeling, database architecture, database design, and database programming (SQL, Python, etc.).
  • Experience with analytics platforms in the cloud (Azure, GCP, AWS, etc.).
  • Experience designing and developing ETL and ELT processes on a variety of platforms (e.g. Azure Data Factory, Databricks, etc.).
  • Experience with data collaboration platforms (e.g. Collibra, Alation) and automated metadata management solutions.
  • Expertise in architecting Master Data Management, operational data stores, data lakes, and data warehousing solutions.

DATA ENGINEER (Informatica BDM)

KGP Manpower Consulting Pvt Ltd

Informatica Big Data Management SAS-Statistical Analysis System ETL Hadoop Data Engineer Azure JSON XML Scala Spark GitHub DevOps Data Migration
DATA ENGINEER (Informatica BDM)
Job Location: Dubai & offshore (Chennai, Hyderabad, Bangalore)
Experience: 5+ years
Notice: 30 days or less
Max CTC: 15K AED per month / 18 LPA for offshore
Job Description:
  • Strong expertise in Extraction, Transformation and Loading (ETL) using Informatica Big Data Management 10.2.x and the various pushdown modes using the Spark, Blaze, and Hive execution engines.
  • Strong expertise in dynamic mapping use cases, development, and deployment using Informatica Big Data Management 10.2.x.
  • Experience transforming and loading complex data source types such as unstructured and NoSQL data sources.
  • Strong expertise in the Hive database, including Hive DDL, partitions, and Hive Query Language.
  • Good understanding of the Hadoop ecosystem (HDFS, Spark, Hive).
  • Strong expertise in SQL/PLSQL.
  • Good knowledge of working with Oracle/Sybase/SQL databases.
  • Good knowledge of data lake and dimensional data modelling implementation.
  • Able to understand requirements and write Functional Specification Documents, Design Documents, and Mapping Specifications.

Opening For GCP Developer

Hexaware Technologies

GCP Developer Database SQL Python Dataflow Dataprep ETL
Experience: 6-9 yrs
Notice Period: immediate to 60 days
Work location: Pune, Bangalore, Chennai, Mumbai
Work mode: Hybrid
Key Requirements:
  • Strong experience in cloud migrations and pipelines
  • Good understanding of database and data engineering concepts
  • Hands-on experience in SQL and Python
  • Experience in Java development
  • Proficiency in Google Cloud Platform tools like Dataflow, Data Transfer services, and Airflow
  • Working knowledge of data preprocessing techniques using Dataflow, Dataproc, and Dataprep
  • Familiarity with BigQuery, Kafka, Pub/Sub, GCS, and schedulers
  • Proficiency in PostgreSQL is preferred
  • Experience with real-time and scheduled pipelines
  • Cloud certification is a plus
  • Experience in implementing ETL pipelines
  • Familiarity with microservices or enterprise application integration patterns is advantageous

Data Engineer

United Technology

  • 1 - 3 yrs
  • 4.0 Lac/Yr
  • Chennai
Data Integration Data Engineer Hadoop ETL SQL Informatica Apache AWS Big Data Python
We are looking for a Data Engineer with 1 to 3 years of experience in Chennai. Immediate joiners preferred.

Apply to 26 ETL Job Vacancies in Chennai
