
ETL Developer Jobs

Certified Snowflake Developer
  • 4 - 6 yrs
  • 7.0 Lac/Yr
  • Chennai
Data Governance Data Lake Data Loading Data Pipelines Data Transformation Query Optimization Performance Tuning Data Architecture Data Warehousing SQL ETL Scripting Data Integration Data Migration Data Modeling Database Design Snowflake Python Big Data Cloud Computing
We are looking for a Certified Snowflake Developer with 4 to 6 years of experience in Chennai.
  • Strong knowledge of SQL
  • Experience with Snowflake architecture
  • Understanding of Data Warehousing concepts
  • Experience with ETL / ELT tools (see the sketch after this list)
  • Knowledge of Cloud Platforms (AWS, Azure, GCP)
  • Programming knowledge in Python, Java, or Scala
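For illustration only, not part of the posting: a minimal sketch of the SQL-plus-Snowflake combination this role calls for, using the snowflake-connector-python package. The account, credentials, and table names are invented placeholders.

```python
# Hypothetical sketch: account, credentials, and table names are invented.
# Assumes the snowflake-connector-python package is installed.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",
    user="etl_user",
    password="***",
    warehouse="ETL_WH",
    database="ANALYTICS",
    schema="STAGING",
)
try:
    # ELT pattern: land raw data first, then transform inside Snowflake.
    conn.cursor().execute("""
        INSERT INTO STAGING.CLEAN_ORDERS
        SELECT order_id, customer_id, TRY_TO_DATE(order_date) AS order_date
        FROM RAW.ORDERS
        WHERE order_id IS NOT NULL
    """)
finally:
    conn.close()
```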

Looking For Data Engineer

BSRI Solutions Pvt Ltd

  • 3 - 5 yrs
  • 16.0 Lac/Yr
  • Chennai
Python PySpark Developer Scala SQL Hive Hadoop Google Cloud Platform Kafka Developer Infrastructure as Code GitHub Agile Methodology ETL
Required Qualifications:
  • 3+ years of demonstrated ability with Hive, Python, Spark/Scala, SQL, etc.
  • Google Cloud Platform experience: BigQuery, Cloud Storage, Dataproc, Dataflow, Cloud Composer, Cloud SQL, Pub/Sub, Terraform, etc. (a PySpark sketch follows this list)
  • Experience with the Hadoop ecosystem, Kafka, and PCF cloud services
  • Familiar with big data and machine learning tools and platforms
  • Experience with BI tools such as Alteryx, DataStage, QlikSense, etc.
  • Design data pipelines and data robots; take a vision and bring it to life
  • Master data engineer who mentors others and works closely with IT architects to set strategy and design projects
  • Provide extensive technical and strategic advice and guidance to key stakeholders around data transformation efforts
  • Redesign data flows to prevent recurring data issues
  • Strong analytical and problem-solving skills
  • Excellent oral and written communication skills, as well as facilitation and presentation skills, and an engaging presentation style
  • Ability to work as a global team member, as well as independently, in a changing environment, and to prioritize
  • Ability to establish and maintain coordinated and effective working relationships with application implementation teams, IT project teams, business customers, and end users
  • Ability to deliver work within deadlines
  • Experience with agile/lean methodologies
  • Experience working independently and with minimal supervision
  • Experience with Test-Driven Development and software craftsmanship
  • Experience with GitHub, AccuRev, or other version-control systems
  • Experience with PuTTY
  • Experience with DataStage
  • Strong communication skills; ability to illustrate and convey ideas and prototypes effectively with team and partners
  • Presence demonstrating confidence, ability to learn quickly, influence, and shape ideas

Key Skills Required:
  • Data Engineer: Python / PySpark / Scala
  • SQL & Hive
  • Hadoop ecosystem
  • Data pipeline design & ETL development
  • Google Cloud Platform (BigQuery, Dataproc, Dataflow, Cloud Storage)
  • Kafka / streaming data processing
  • Terraform (Infrastructure as Code)
  • DataStage or similar ETL tools
  • Version control (GitHub or equivalent)
  • Agile methodologies
  • Strong analytical & problem-solving skills
  • Stakeholder collaboration & communication

Nice to Have:
  • Cloud Composer, Cloud SQL, Pub/Sub
  • BI tools (Alteryx, QlikSense)
  • Machine learning platform exposure
  • Test-Driven Development (TDD)
  • Mentoring & technical leadership
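For illustration, not from the posting: a minimal PySpark sketch of the Hive-to-Cloud-Storage pattern implied by the skills list. It assumes a Hive-enabled Spark cluster with the GCS connector configured (as on Dataproc); database, table, and bucket names are invented.

```python
# Hypothetical sketch: aggregate a Hive table with Spark and land the result
# on Cloud Storage; names are placeholders.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("hive-to-gcs")
    .enableHiveSupport()      # lets spark.sql() read managed Hive tables
    .getOrCreate()
)

daily = spark.sql("""
    SELECT event_date, COUNT(*) AS events
    FROM analytics.click_stream
    GROUP BY event_date
""")

# Parquet on GCS is a convenient staging format for a later BigQuery load.
daily.write.mode("overwrite").parquet("gs://example-bucket/daily_events/")
```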

Ab Initio Developer

  • 2 - 5 yrs
  • 20.0 Lac/Yr
  • Bhubaneswar
Ab Initio ETL SQL UNIX Shell Scripting
We are hiring Ab Initio professionals with 3-6 years of experience in any domain who have knowledge of Ab Initio & PL/SQL. Selected candidates will undergo a mandatory 3-month training program to align with project standards. After successful completion of training and clearing the interview, candidates will be onboarded as full-time Ab Initio Developers.

Training location: Bhubaneswar

Training Details
  • Training Duration: 3 months
  • Stipend During Training: ₹8,000 - ₹10,000 per month

Post-Training Employment
  • Role: Ab Initio Developer
  • CTC After Selection: ₹9 - 10 LPA
  • Employment Type: Full-Time, Permanent

Interested candidates can share their CV to Rekha.C@eagledrift.com

Snowflake Developer

Firstwave Technology

  • 4 - 9 yrs
  • Chennai
SQL Azure ETL Tool Snowflake Developer
Job Summary: We are seeking a Snowflake Data Engineer to join our Data & Analytics team. This role involves designing, implementing, and optimizing Snowflake-based data solutions. The ideal candidate will have proven, hands-on data engineering expertise in Snowflake, cloud data platforms, ETL/ELT processes, and Medallion data architecture best practices (see the sketch at the end of this description). The data engineer role has a day-to-day focus on implementation, performance optimization, and scalability. This is a tactical role requiring independent data analysis and data discovery to understand our existing source systems and fact and dimension data models, and to implement an enterprise data warehouse solution in Snowflake. This role will take direction from the Lead Snowflake Data Engineer and the Director of Data Engineering while bringing its own domain expertise and experience.

Essential Functions and Tasks:
  • Participate in the design, development, and maintenance of a scalable Snowflake data solution serving our enterprise data & analytics team.
  • Implement data pipelines, ETL/ELT workflows, and data warehouse solutions using Snowflake and related technologies.
  • Optimize Snowflake database performance.
  • Collaborate with cross-functional teams of data analysts, business analysts, data scientists, and software engineers to define and implement data solutions.
  • Ensure data quality, integrity, and governance.
  • Troubleshoot and resolve data-related issues, ensuring high availability and performance of the data platform.

Education and Experience Requirements:
  • Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.
  • 4+ years of in-depth data engineering experience, with at least 1 year of dedicated experience engineering solutions in an enterprise-scale Snowflake environment.
  • Tactical expertise in ANSI SQL, performance tuning, and data modeling techniques.
  • Strong experience with cloud platforms (preference to Azure) and their data services.
  • Experience in ETL/ELT development using tools such as Azure Data Factory, dbt, Matillion, Talend, or Fivetran.
  • Hands-on experience with scripting languages like Python for data processing.
  • Snowflake SnowPro certification; preference to the engineering course path.
  • Experience with CI/CD pipelines, DevOps practices, and Infrastructure as Code (IaC).
  • Knowledge of streaming data processing frameworks such as Apache Kafka or Spark Streaming.
  • Familiarity with BI and visualization tools such as Power BI.

Knowledge, Skills, and Abilities:
  • Familiarity working in an agile scrum team, including sprint planning, daily stand-ups, backlog grooming, and retrospectives.
  • Ability to self-manage medium-complexity deliverables and document user stories and tasks through Azure DevOps.
  • Personal accountability to committed sprint user stories and tasks.
  • Strong analytical and problem-solving skills with the ability to handle complex data challenges.
  • Ability to read, understand, and apply state/federal laws, regulations, and policies.
  • Ability to communicate with diverse personalities in a tactful, mature, and professional manner.
  • Ability to remain flexible and work within a collaborative and fast-paced environment.
  • Understand and comply with company policies and procedures.
  • Strong oral, written, and interpersonal communication skills.
  • Strong time management and organizational skills.

Physical Demands:
  • 40 hours per week
  • Occasional standing and walking
  • Sitting for prolonged periods of time
  • Frequent hand and finger movement
  • Communicate verbally and in writing
  • Extensive use of computer keyboard and viewing of computer screen
  • Specific vision abilities required by this job include close vision
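For illustration, not from the posting: a hedged sketch of a Medallion-style bronze-to-silver promotion run against Snowflake from Python. Layer, table, and connection values are invented placeholders.

```python
# Hypothetical sketch: a bronze -> silver promotion executed in Snowflake.
# All names and credentials are placeholders.
import snowflake.connector

SILVER_SQL = """
CREATE OR REPLACE TABLE SILVER.ORDERS AS
SELECT
    order_id,
    UPPER(TRIM(customer_name)) AS customer_name,
    TRY_TO_DATE(order_date)    AS order_date
FROM BRONZE.ORDERS_RAW
WHERE order_id IS NOT NULL          -- basic quality gate between layers
"""

conn = snowflake.connector.connect(
    account="my_account", user="etl_user", password="***",
    warehouse="ETL_WH", database="EDW",
)
try:
    conn.cursor().execute(SILVER_SQL)
finally:
    conn.close()
```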


Hiring For Data Engineer

International Recruiters

Data Pipelines ETL Processes ETL Developer Business Intelligence Business Intelligence Tool Data Integration BI Analyst Business Intelligence Analyst SQL SQL Developer Data Engineer Python For ETL API Development Data Scripting Data Warehousing Azure Administrator SSIS Developer Azure Data Engineering
Location: Guindy, Chennai, Tamil Nadu, India
Experience: 3 - 6 years
Reports To: BI Manager
Job Type: Full-time

Key Responsibilities
  • Build and maintain scalable data pipelines and ETL processes for BI needs.
  • Optimize data models and ensure clean, reliable, and well-structured data for Power BI.
  • Integrate data from various internal systems (web apps, databases, accounting platforms).
  • Collaborate with BI analysts and managers to meet reporting requirements.
  • Automate data refreshes and performance-tune dashboards.
  • Maintain metadata, data dictionaries, and lineage documentation.

Required Skills
  • Strong SQL development and performance tuning skills.
  • Hands-on experience with Power BI dataset structuring and integration.
  • Familiarity with Python for ETL, automation, or data cleaning tasks.
  • Knowledge of APIs and scripting for data ingestion.
  • Understanding of data warehousing and modelling techniques (star/snowflake schemas; see the sketch after this list).
  • Experience working with Azure, SSIS, or any cloud-based data services is a plus.
  • Familiarity with version control systems like Git.
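For illustration, not from the posting: a small pandas sketch of star-schema shaping, splitting a flat extract into a dimension and a fact table. All table and column names are invented.

```python
# Hypothetical sketch: derive a minimal star schema from a flat extract.
import pandas as pd

flat = pd.DataFrame({
    "invoice_id": [1, 2, 3],
    "customer":   ["Acme", "Beta", "Acme"],
    "amount":     [120.0, 80.0, 45.5],
})

# Dimension: one row per customer, with a surrogate key.
dim_customer = flat[["customer"]].drop_duplicates().reset_index(drop=True)
dim_customer["customer_key"] = dim_customer.index + 1

# Fact: measures plus the foreign key into the dimension.
fact_sales = flat.merge(dim_customer, on="customer")[
    ["invoice_id", "customer_key", "amount"]
]
print(fact_sales)
```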

AWS Developer

AWS ETL Consultant ETL Tool
Job openings for 5 AWS Developer positions, with a minimum of 8 years of experience, in Chennai, Pune, Greater Hyderabad, and Delhi NCR. Educational qualification: a professional degree, with good knowledge of AWS, ETL consulting, ETL tools, etc.

Opening For GCP Developer

Hexaware Technologies

GCP Developer Database SQL Python Dataflow Dataprep ETL
Experience: 6-9 yrs
Notice Period: Immediate to 60 days
Work Location: Pune, Bangalore, Chennai, Mumbai
Work Mode: Hybrid

Key Requirements:
  • Strong experience in cloud migrations and pipelines
  • Good understanding of database and data engineering concepts
  • Hands-on experience in SQL and Python
  • Experience in Java development
  • Proficiency in Google Cloud Platform tools like Dataflow, Data Transfer services, and Airflow
  • Working knowledge of data preprocessing techniques using Dataflow, Dataproc, and Dataprep
  • Familiarity with BigQuery, Kafka, Pub/Sub, GCS, and schedulers (a streaming sketch follows this list)
  • Proficiency in PostgreSQL is preferred
  • Experience with real-time and scheduled pipelines
  • Cloud certification is a plus
  • Experience in implementing ETL pipelines
  • Familiarity with microservices or Enterprise Application Integration patterns is advantageous
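For illustration, not from the posting: a hedged Apache Beam sketch of the Pub/Sub-to-BigQuery streaming path listed above. Project, topic, and table names are invented, and the target table is assumed to already exist.

```python
# Hypothetical sketch: a streaming Beam pipeline from Pub/Sub into BigQuery.
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

options = PipelineOptions(streaming=True)  # runner/project set via CLI flags

with beam.Pipeline(options=options) as p:
    (
        p
        | "Read" >> beam.io.ReadFromPubSub(topic="projects/my-proj/topics/events")
        | "Parse" >> beam.Map(lambda msg: json.loads(msg.decode("utf-8")))
        | "Write" >> beam.io.WriteToBigQuery(
            "my-proj:analytics.events",
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            create_disposition=beam.io.BigQueryDisposition.CREATE_NEVER,
        )
    )
```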

Power BI Developer

  • 7 - 8 yrs
  • 22.5 Lac/Yr
  • Bangalore
Microsoft Power BI Power Pivot Data Modeling SQL Server ETL
  • Designing reports and dashboards as per client requirements; created reports using Excel files and SQL (SSMS) as data sources.
  • Imported data from SQL Server DB, SSMS, and Excel into Power BI to generate reports and dashboards.
  • Familiar with JIRA and Zendesk for creating and maintaining project workflows.
  • Hands-on experience writing SQL queries to extract data from a SQL Server database (see the sketch after this list).
  • Troubleshoot and debug automated processes and workflows.
  • Monitor and maintain the performance of automated processes and workflows.
  • Develop and maintain documentation for Power Automate processes and workflows.
  • Created and modified various stored procedures used in the application using T-SQL.
  • Blended data from multiple databases into one report by selecting primary keys from each database for data validation.
  • Engagement with metrics, analysis, reviews, and QC.
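For illustration, not from the posting: a minimal pyodbc sketch of extracting a report dataset from SQL Server, the typical upstream step for a Power BI import. Driver, server, database, and table names are invented placeholders.

```python
# Hypothetical sketch: pull an aggregated dataset out of SQL Server.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=reports-db;DATABASE=Sales;Trusted_Connection=yes;"
)
cursor = conn.cursor()
cursor.execute("""
    SELECT region, SUM(amount) AS total_sales
    FROM dbo.Orders
    GROUP BY region
""")
for region, total in cursor.fetchall():
    print(region, total)
conn.close()
```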

Data Analyst

Emarlex Multiventure LLP

  • 3 - 8 yrs
  • 15.0 Lac/Yr
  • Chennai
Data Management Data Analysis Informatica SQL Data Mining SSIS Python Power BI ETL Developer
Key Responsibility Areas (specifies key result areas for the incumbent):
  • Engage with clients to precisely identify and fulfill their data engineering needs.
  • Lead and manage special projects to meet strategic goals.
  • Develop advanced models with Snowflake and SSIS to enhance data accuracy and utility.
  • Continuously refine data processing rules and procedures for optimal results.
  • Design and implement scalable data architectures using Snowflake.
  • Maintain and enhance data pipelines, integrating new data sources and APIs as needed.
  • Monitor and ensure high data quality across systems for reliable decision-making.
  • Utilize SSIS for efficient data extraction, transformation, and loading processes.
  • Leverage Informatica knowledge to improve data management capabilities.

Eligibility Criteria (skill set required to do the job: knowledge, tools, technical knowledge, certifications, etc.):
  • Demonstrated expertise in SQL and strong proficiency in programming languages such as Python.
  • Extensive experience in designing, building, and maintaining robust data pipelines and complex data architectures.
  • Prior experience with Informatica or equivalent ETL tools, providing a significant advantage.
  • In-depth understanding of ETL processes, data modeling techniques, and data warehousing principles.
  • Familiarity with big data frameworks like Hadoop or Spark, adding value to the role.
  • Strong knowledge of data serialization formats including Parquet, Avro, and JSON, essential for efficient data processing and storage (see the sketch after this list).
  • Exceptional analytical skills, adept at problem-solving and technical troubleshooting.
  • Excellent communication and interpersonal skills, essential for effective client engagements and team collaboration.
  • Proven ability to independently manage multiple projects, demonstrating strong organizational and leadership skills.
  • Relevant professional certifications in Snowflake, SSIS, Informatica, or comparable data management technologies are highly regarded.
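For illustration, not from the posting: a short pandas sketch contrasting two of the serialization formats named above, JSON and Parquet (assumes pandas with pyarrow installed). File and column names are invented.

```python
# Hypothetical sketch: the same frame written as JSON and as Parquet.
import pandas as pd

df = pd.DataFrame({"id": range(1000), "value": [x * 0.5 for x in range(1000)]})

df.to_json("events.json", orient="records", lines=True)  # row-oriented text
df.to_parquet("events.parquet")  # columnar, compressed, schema-carrying

# Parquet is typically smaller and faster to scan column-wise, which is why
# it is commonly preferred over plain JSON for warehouse staging.
print(pd.read_parquet("events.parquet").head())
```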

SQL Database Administrator

PostgreSQL SQL SQL Database Administrator SQL Server Developer ETL Designer ETL Developer ETL Testing ETL Tool
Key Responsibilities:

Database Design and Architecture:
  • Design and implement SQL database structures to meet the organization's data storage and retrieval needs.
  • Collaborate with software developers and stakeholders to understand data requirements and ensure efficient data modelling.
  • Optimize schema design for data integrity, normalization, and performance.

Database Performance Optimization:
  • Identify and analyse performance bottlenecks and query optimization opportunities.
  • Develop and implement strategies to improve database performance, including indexing, caching, and query optimization.
  • Monitor and analyse database performance metrics, making recommendations for improvements.

Query and Index Optimization:
  • Analyse and optimize SQL queries to reduce execution times and resource consumption.
  • Identify and create appropriate indexes to speed up data retrieval and processing (see the sketch after this list).
  • Provide guidance and best practices to application developers for writing efficient SQL queries.

Data Migration and ETL:
  • Plan and execute data migration and ETL processes, ensuring data integrity and performance.
  • Optimize data transformation and loading processes for efficient ETL operations.

Database Security:
  • Implement and maintain database security measures to protect sensitive data.
  • Audit and enforce data access controls and encryption as necessary.

Backup and Recovery:
  • Develop and maintain robust backup and recovery procedures to ensure data availability and integrity.

Documentation and Training:
  • Maintain detailed documentation of database structures, configurations, and performance optimization strategies.
  • Provide training and knowledge transfer to team members and stakeholders as needed.

Stay Informed:
  • Stay up to date with the latest database technologies and trends to ensure the organization's databases remain competitive and efficient.
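For illustration, not from the posting: a self-contained sketch of index-driven query optimization using SQLite's EXPLAIN QUERY PLAN; the same reasoning carries over to server RDBMSs. Table and index names are invented.

```python
# Hypothetical sketch: compare query plans before and after adding an index.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, customer_id INTEGER, amount REAL)")
conn.executemany("INSERT INTO orders VALUES (?, ?, ?)",
                 [(i, i % 100, i * 1.5) for i in range(10_000)])

query = "SELECT SUM(amount) FROM orders WHERE customer_id = 42"

# Without an index: full table scan.
print(conn.execute("EXPLAIN QUERY PLAN " + query).fetchall())

conn.execute("CREATE INDEX idx_orders_customer ON orders (customer_id)")

# With the index: the plan switches to an index search.
print(conn.execute("EXPLAIN QUERY PLAN " + query).fetchall())
```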
Senior Software Tester ETL Developer Selenium AWS Developer Full Stack Developer Angular Developer
We are looking for Software Engineers, both freshers and experienced.

ETL Developer

Avenir Innovative Solutions

  • 2 - 4 yrs
  • Navi Mumbai
Data Warehousing SAS Statistical Analysis System Informatica Computer Science Finite Element Analysis ETL IBM Storage ETL Tool PLSQL Netezza SQL Server Database Microsoft Office Sharepoint Server SSAS SSIS
This is a full-time on-site role for a Software Application Developer at SBI, Navi Mumbai. The Software Application Developer will be responsible for handling and resolving technical issues, conducting troubleshooting, and debugging.

Opening For Software Developer

Nexperia Technologies Private Limited

SQL Azure ETL Developer Software Developer Work From Home
NEXPERIA TECHNOLOGIES (OPC) PRIVATE LIMITED is striving to be in the spotlight as a company adept in the next big technologies. What we can offer you is a space to explore varied technologies and quench your techie soul. We are hiring a Software Developer for a remote position.

Desired Experience Range: 0-2 years
Location: Remote

Desired Competencies (Technical/Behavioral Competency)
Job Description:
  • Graduate.
  • Working experience in Azure Data Factory.
  • Working experience in SQL code development; understands tables, indexes, functions, and other basic database objects.
  • Knowledge of ETL job creation, runs, and monitoring.

Preferred Qualifications: Bachelor's degree in Computer Science / B.Sc. in Computer Science with practical knowledge, Computer Engineering, or a closely related field.

Required Skills: Java/J2EE, JavaScript, HTML, CSS, full-stack development.

Nice to have: understanding of integrating different software systems; Adobe Experience Manager knowledge.

ETL Architect (2-6 Years)

VERVENEST TECHNOLOGIES

ETL Developer SQL Informatica
ETL Architects: We are looking for ETL Architects (any ETL tool) with vast experience in ETL. The candidate should possess the below:
  • Expert level of experience in ETL
  • Nice to have: good experience in XML and web service transformations
  • Good understanding of SQL and stored procedures
  • Experience in B2B integration projects
  • Good analytical and problem-solving skills
  • Good team player, able to lead a team of 6+ members
  • Able to understand legacy ETL code and redesign it to meet performance and best-practice standards
  • Able to troubleshoot and identify translation issues when data is transferred from source to target (a reconciliation sketch follows this list)
  • Able to write technical designs from the requirements, which can be used by the developers
  • Support developers to produce ETL code that meets standards and best practices
  • Consult with users and their management & technical personnel to clarify and validate business requirements
  • Able to implement data integration best practices and conventions
  • Able to identify potential problems and proactively suggest changes and solutions
  • Able to act as a reference for the data integration and reporting solution
  • Interact with and communicate detailed technical requirements to the project development team

Requirements:
  • ETL Architect: minimum 2+ years; hands-on development experience is mandatory
  • ETL tool (Informatica): 5 years
  • SQL experience is mandatory: 5 years

Notice period: Immediate to 30 days
Location: Chennai / Thiruvananthapuram / Kochi
Interview process: 1 technical, 1 managerial, and 1 HR discussion
Hybrid work model
Salary: as per market standards

If you are looking for a job or a job change, send a contact number or email.
Regards,
Nandini, HR
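For illustration, not from the posting: a hedged sketch of a source-to-target row-count reconciliation, the kind of check used to spot translation issues in an ETL path. SQLite stands in for the real source and target systems; table names are invented.

```python
# Hypothetical sketch: compare row counts between source and target.
import sqlite3

def row_count(conn: sqlite3.Connection, table: str) -> int:
    return conn.execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]

# Stand-ins for the real source and target connections.
source = sqlite3.connect(":memory:")
target = sqlite3.connect(":memory:")
for conn, rows in ((source, 100), (target, 98)):   # simulate 2 dropped rows
    conn.execute("CREATE TABLE orders (id INTEGER)")
    conn.executemany("INSERT INTO orders VALUES (?)", [(i,) for i in range(rows)])

src, tgt = row_count(source, "orders"), row_count(target, "orders")
if src != tgt:
    # A mismatch points at rows dropped or duplicated between source and target.
    print(f"Reconciliation failed: source={src}, target={tgt}")
else:
    print("Row counts match")
```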

ETL Developer

  • 8 - 12 yrs
  • 1.3 Lac/Yr
  • United States
Capital Market SnowSQL Azure Data Factory ETL Developer Python
You will be responsible for designing and creating the data warehouse and all related extraction, transformation, and loading functions in the company. After the groundwork has been laid, you will also test your designs to ensure the system runs smoothly. Furthermore, you must be expert at taking a big-picture view of the company's data situation. A skill set that is a big plus is data modeling: you will need to be able to read, analyze, and digest what a business wants to accomplish with its data, and design the best possible ETL process around those goals. Database designs take many forms, including star and snowflake schemas (see the sketch below).

Qualifications:
  • Capital markets and derivatives business knowledge preferred
  • Technical expertise in SQL Server and Python
  • Git change management
  • SnowSQL
  • Azure Data Factory
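For illustration, not from the posting: a small sketch contrasting star and snowflake schema DDL, executed against SQLite only to show the shapes. All table and column names are invented.

```python
# Hypothetical sketch: star vs. snowflake dimension design.
import sqlite3

# Star schema: the product dimension is denormalized (category folded in).
STAR = """
CREATE TABLE dim_product (
    product_key   INTEGER PRIMARY KEY,
    product_name  TEXT,
    category_name TEXT
);
CREATE TABLE fact_sales (
    product_key INTEGER REFERENCES dim_product (product_key),
    amount      REAL
);
"""

# Snowflake schema: the category is normalized into its own table.
SNOWFLAKE = """
CREATE TABLE dim_category (
    category_key  INTEGER PRIMARY KEY,
    category_name TEXT
);
CREATE TABLE dim_product (
    product_key  INTEGER PRIMARY KEY,
    product_name TEXT,
    category_key INTEGER REFERENCES dim_category (category_key)
);
CREATE TABLE fact_sales (
    product_key INTEGER REFERENCES dim_product (product_key),
    amount      REAL
);
"""

sqlite3.connect(":memory:").executescript(STAR)
sqlite3.connect(":memory:").executescript(SNOWFLAKE)
```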

Datastage Developer

Cloud BC Labs

Teradata Informatica SQL ETL Tool
DataStage Sr. Developer - Job Description:
  • Strong hands-on experience in DataStage development
  • Knowledge of PL/SQL
  • Strong analytical skills
  • Knowledge of Teradata, Snowflake, and SnapLogic are add-ons to the primary skills
  • Good communication skills
  • Ready to work in the second shift
  • Experience: 5-7 years

Data Integration Developer

Podium Systems Private Limited

  • 5 - 8 yrs
  • Thane
Data Integration ETL Tool
A Data Integration Developer is responsible for designing, developing, and maintaining systems and processes that facilitate the seamless flow of data between different systems, databases, and applications.

Some of the key responsibilities of a Data Integration Developer are:
1. Gathering requirements from stakeholders regarding data integration needs.
2. Mapping and transforming data elements from source systems to target systems.
3. Developing Extract, Transform, Load (ETL) processes to extract, cleanse, validate, and load data (a minimal sketch follows this list).
4. Designing and implementing integration architectures and patterns.
5. Monitoring data integration processes, identifying and resolving issues.
6. Optimizing performance and efficiency of data integration processes.
7. Ensuring data accuracy, consistency, and quality throughout the integration process.
8. Collaborating with team members and stakeholders to deliver effective data integration solutions.
9. Documenting data integration processes, configurations, and workflows.
10. Hands-on experience with ETL tools like SSIS, Power Query, Apache NiFi, or Tableau Prep is desirable.

A Data Integration Developer plays a crucial role in designing and implementing data integration solutions, enabling organizations to efficiently and reliably exchange data between various systems and applications.

Qualification: BE / B.Tech / MCA
Compensation: Based on experience & qualification
Job Location: Head Office, Thane
This is a work-from-office role with 5 days working.
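For illustration, not from the posting: a minimal standard-library sketch of responsibility 3, the extract, cleanse/validate, and load steps. The CSV file, table, and field names are invented; a sample file is created inline so the sketch is self-contained.

```python
# Hypothetical sketch: extract -> cleanse/validate -> load with the stdlib.
import csv
import sqlite3

# Create a tiny sample source file (note one row missing its id).
with open("customers.csv", "w", newline="") as f:
    f.write("id,name\n1, Alice \n,Bob\n2,Carol\n")

def extract(path: str) -> list[dict]:
    with open(path, newline="") as f:
        return list(csv.DictReader(f))

def cleanse_validate(rows: list[dict]) -> list[dict]:
    # Drop rows missing an id and trim stray whitespace.
    return [
        {"id": r["id"].strip(), "name": r["name"].strip()}
        for r in rows
        if (r.get("id") or "").strip()
    ]

def load(rows: list[dict], conn: sqlite3.Connection) -> None:
    conn.execute("CREATE TABLE IF NOT EXISTS customers (id TEXT, name TEXT)")
    conn.executemany("INSERT INTO customers VALUES (:id, :name)", rows)

conn = sqlite3.connect(":memory:")
load(cleanse_validate(extract("customers.csv")), conn)
print(conn.execute("SELECT * FROM customers").fetchall())  # Bob's row dropped
```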

Informatica ETL Developer

KGP Manpower Consultancy

  • 3 - 9 yrs
  • 100.0 Lac/Yr
  • United Arab Emirates (+1 more)
Informatica Kafka Cloud Computing Powercenter Hadoop Developer Informatica ETL Developer
  • Strong expertise in Extraction, Transformation and Loading (ETL) using Informatica Big Data Management 10.2.x and various pushdown modes using the Spark, Blaze, and Hive execution engines.
  • Strong expertise in dynamic mapping use cases, development, and deployment mechanisms using Informatica Big Data Management 10.2.x.
  • Experience transforming and loading complex data source types such as unstructured and NoSQL data sources.
  • Strong expertise in the Hive database, including Hive DDL, partitioning, and Hive Query Language (see the sketch after this list).
  • Good understanding of the Hadoop ecosystem (HDFS, Spark, Hive).
  • Strong expertise in SQL/PLSQL.
  • Good knowledge of working with Oracle/Sybase/SQL databases.
  • Good knowledge of data lake and dimensional data modeling implementation.
  • Able to understand requirements and write Functional Specification Documents, Design Documents, and Mapping Specifications.
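For illustration, not from the posting: a hedged sketch of Hive DDL with partitioning issued through Spark SQL. Database, table, and column names are invented, and a Hive-enabled Spark session is assumed.

```python
# Hypothetical sketch: a partitioned Hive table created via Spark SQL.
from pyspark.sql import SparkSession

spark = SparkSession.builder.enableHiveSupport().getOrCreate()

spark.sql("""
    CREATE TABLE IF NOT EXISTS lake.transactions (
        txn_id STRING,
        amount DOUBLE
    )
    PARTITIONED BY (txn_date STRING)   -- enables partition pruning on date
    STORED AS PARQUET
""")

# Queries filtering on the partition column scan only matching partitions.
spark.sql("SELECT SUM(amount) FROM lake.transactions WHERE txn_date = '2024-01-01'")
```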

AWS Glue Developer

Opportunity Next

AWS Glue ETL Development ETL Developer Python Spark SQL s3 Athena Redshift Data Modeling Data Lakes Work From Home
We are looking for a highly skilled AWS Glue Developer with a minimum of 5-10 years of industry experience to join our team at IBM India. The ideal candidate will be responsible for developing and maintaining data integration solutions using AWS Glue, building ETL pipelines, and working on data lakes.

Responsibilities:
  • Develop and maintain ETL pipelines using AWS Glue (a job skeleton follows this list).
  • Design and develop data integration solutions using AWS Glue.
  • Develop and maintain data ingestion, transformation, and integration solutions.
  • Work on data lakes and build efficient data models.
  • Collaborate with cross-functional teams to understand business requirements and design solutions accordingly.
  • Optimize ETL performance and troubleshoot issues.
  • Develop and maintain technical documentation.
  • Stay up to date with the latest AWS Glue developments and incorporate new features and functionality into existing solutions.

Requirements:
  • Bachelor's degree in Computer Science or a related field.
  • Minimum of 5-10 years of experience in AWS Glue development.
  • Strong experience with ETL development using AWS Glue.
  • Proficient in Python, Spark, and SQL.
  • Experience with AWS services such as S3, Athena, and Redshift.
  • Familiarity with data modeling and designing data lakes.
  • Experience working with cross-functional teams and collaborating on projects.
  • Excellent communication and documentation skills.
  • Ability to troubleshoot and optimize ETL performance.
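For illustration, not from the posting: a hedged skeleton of an AWS Glue job, assuming the awsglue libraries available inside the Glue runtime. The catalog database, table, and S3 bucket are invented placeholders.

```python
# Hypothetical sketch: read from the Glue Data Catalog, transform with Spark,
# write Parquet to S3. All resource names are placeholders.
import sys

from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext.getOrCreate())
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

dyf = glue_context.create_dynamic_frame.from_catalog(
    database="sales_db", table_name="raw_orders"
)
df = dyf.toDF().dropDuplicates(["order_id"])   # simple cleansing step

df.write.mode("overwrite").parquet("s3://example-bucket/clean/orders/")
job.commit()
```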

ETL Lead/ Informatica

Acentus Consulting Services LLP

  • 7 - 13 yrs
  • Pune
ETL Informatica Developer
Data Platform Lead
Location: Pune
Experience: 7+ years

We are looking to hire an ETL Technical Lead with strong hands-on experience in leading complex and large-scale data integration and consolidation initiatives. This role involves understanding business requirements, analyzing technical options, and providing end-to-end ETL solutions. It is a challenging role with the opportunity to build innovative cloud data integration solutions.

Required Past Experience:
  • 7+ years of experience in Informatica or DataStage ETL.
  • Handled at least 4+ development projects from an ETL perspective.
  • Designed and developed ETL pipelines for heterogeneous source file (CSV, XML, JSON) processing (a file-ingestion sketch follows this list).
  • Expertise in shell/Python scripting.
  • Hands-on experience writing ETL logic in SQL or PL/SQL.
  • ETL testing and troubleshooting.
  • Performed performance optimization of ETL pipelines.
  • Provided technical guidance to team members and performed code reviews.
  • Experience in data modeling and involvement in data pipeline design with an architect.

Required Skills and Abilities:
  • Mandatory skills: hands-on and deep experience working as a lead in ETL development with Informatica or DataStage; strong in SQL queries.
  • Secondary skills: data modeling and shell scripting; pipeline design and code optimization.
  • Good to have: ability to design and build a cloud ETL pipeline.
  • Good verbal and written communication skills; ability to communicate with customers, developers, and other stakeholders.
  • Mentor and guide team members.
  • Good presentation skills; strong team player.

About Us: A global leader in data warehouse migration and modernization to the cloud, we empower businesses by migrating their data, workloads, ETL, and analytics to the cloud by leveraging automation.
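For illustration, not from the posting: a minimal standard-library sketch of heterogeneous file ingestion (CSV / JSON / XML) into a single list of records. Paths, field names, and the JSON/XML shapes are invented; sample files are created inline so the sketch is self-contained.

```python
# Hypothetical sketch: normalize three source formats into flat records.
import csv, json
import xml.etree.ElementTree as ET

# Invented sample sources so the sketch runs as-is.
open("orders.csv", "w").write("order_id,amount\n1,10.5\n")
open("orders.json", "w").write('[{"order_id": "2", "amount": "7.0"}]')
open("orders.xml", "w").write('<orders><order order_id="3" amount="4.2"/></orders>')

def read_any(path: str) -> list[dict]:
    """Normalize one source file into a list of flat records."""
    if path.endswith(".csv"):
        with open(path, newline="") as f:
            return list(csv.DictReader(f))
    if path.endswith(".json"):
        with open(path) as f:
            return json.load(f)                  # assumes a top-level JSON array
    if path.endswith(".xml"):
        root = ET.parse(path).getroot()
        return [dict(el.attrib) for el in root]  # element attributes as fields
    raise ValueError(f"Unsupported source file: {path}")

records = []
for source in ("orders.csv", "orders.json", "orders.xml"):
    records.extend(read_any(source))
print(records)   # three normalized records from three formats
```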