
ETL Job Vacancies in Mumbai

  • 3 - 5 yrs
  • 7.5 Lac/Yr
  • Andheri East Mumbai +1
.NET SQL Server Oracle PostgreSQL NoSQL ASP.NET C# Web Services JavaScript JSON ETL HTML CSS XSLT MVC jQuery
Job Description: Technical Consultant
Trishiv AI Tech Services is hiring Technical Consultants for software development on enterprise projects with our client E4 Software Services Pvt. Ltd in Mumbai. The role involves contributing to CRMNEXT solutions as part of an experienced tech team.
Responsibilities:
  • Develop, test, and maintain .NET software applications
  • Work with databases: SQL Server, PostgreSQL, Oracle, NoSQL
  • Design and integrate APIs and web services
  • Collaborate with internal and client teams for scalable solutions
  • Troubleshoot and optimize software performance
Skills:
  • .NET, ASP.NET, C#, SQL, Web Services, JavaScript, JSON
  • Database management (SQL Server/PostgreSQL/Oracle/NoSQL)
  • ETL concepts; HTML, CSS, XSLT, MVC, jQuery (preferred)
  • Front-end and cloud/DevOps exposure is a plus
Eligibility: B.Tech / B.E / BCA / B.Sc (CS/IT/Engg); 0-3 years experience in software development or consulting
Type: Full-time
Location: Mumbai (Hybrid / On-site)
Salary: 30,000 - 60,000 per month
Join innovative teams at Trishiv AI Tech and E4 Software to work on enterprise AI and technology projects.
Functions SQL Tuning ETL Oracle Fusion Middleware Triggers Oracle Reports Oracle Forms
Skills Required: Oracle, SQL Query, Stored Objects, UTL packages, DML statement development and execution, ETL activities, Fusion Middleware, SQL Tuning, Oracle Forms 11g/12c, Oracle Reports 11g/12c

AWS Data Engineer

Hexaware Technologies

SQL AWS Python ETL Terraform Lambda
Work Mode: Hybrid
  • 6-9 years of overall IT experience, preferably in cloud environments
  • Minimum of 5 years of hands-on experience with AWS cloud development projects
  • Design and develop AWS data architectures and solutions
  • Build robust data pipelines and ETL processes using big data technologies
  • Utilize AWS data services such as Glue, Lambda, Redshift, and Athena effectively
  • Implement infrastructure as code (IaC) using Terraform
  • Proficiency in SQL, Python, and other relevant programming/scripting languages
  • Experience with orchestration tools like Apache Airflow or AWS Step Functions
  • Strong understanding of data warehousing concepts, data lakes, and data governance frameworks
  • Expertise in data modeling for both relational and non-relational databases
  • Excellent communication skills are essential for this role
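Postings like the one above expect familiarity with the basic extract-transform-load pattern. A minimal sketch using only Python's standard library `sqlite3` module; the "orders" data and column names are invented for illustration, not taken from any posting:

```python
import sqlite3

def extract(rows):
    """Extract: yield raw records from a source (here, an in-memory list)."""
    yield from rows

def transform(records):
    """Transform: normalise currency strings and drop malformed rows."""
    for rec in records:
        try:
            amount = float(rec["amount"].replace(",", ""))
        except (KeyError, ValueError):
            continue  # skip rows that fail validation
        yield (rec["order_id"], amount)

def load(conn, rows):
    """Load: bulk-insert the cleaned rows into the target table."""
    conn.execute("CREATE TABLE IF NOT EXISTS orders (order_id TEXT, amount REAL)")
    conn.executemany("INSERT INTO orders VALUES (?, ?)", rows)
    conn.commit()

source = [
    {"order_id": "A1", "amount": "1,200.50"},
    {"order_id": "A2", "amount": "oops"},  # malformed, dropped by transform
    {"order_id": "A3", "amount": "300"},
]

conn = sqlite3.connect(":memory:")
load(conn, transform(extract(source)))
total = conn.execute("SELECT COUNT(*), SUM(amount) FROM orders").fetchone()
print(total)  # (2, 1500.5)
```

Production pipelines add the pieces the posting names on top of this skeleton: orchestration (Airflow, Step Functions), scale-out compute (Glue, EMR), and a warehouse target (Redshift) instead of SQLite.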

Opening For Software Developer

Nexperia Technologies Private Limited

SQL Azure ETL Developer Software Developer Work From Home
NEXPERIA TECHNOLOGIES (OPC) PRIVATE LIMITED aims to be in the spotlight as a company adept in the next big technologies. What we can offer you is a space to explore varied technologies and quench your techie soul. We are hiring a Software Developer for a remote position.
Desired Experience Range: 0-2 years
Location: Remote
Desired Competencies (Technical/Behavioral):
  • Graduate with working experience in Azure Data Factory
  • Working experience in SQL code development; understands tables, indexes, functions and other basic database objects
  • Knowledge of ETL job creation, running and monitoring
Preferred Qualifications: Bachelor's degree in Computer Science / B.Sc in Computer Science with practical knowledge, Computer Engineering, or a closely related field
Required Skills: Java J2EE, JavaScript, HTML, CSS full stack development
Nice to have: understanding of integrating different software systems; Adobe Experience Manager knowledge


AWS Glue Developer

Opportunity Next

AWS Glue ETL Development ETL Developer Python Spark SQL S3 Athena Redshift Data Modeling Data Lakes Work From Home
We are looking for a highly skilled AWS Glue Developer with a minimum of 5-10 years of experience in the industry to join our team at IBM India. The ideal candidate will be responsible for developing and maintaining data integration solutions using AWS Glue, building ETL pipelines, and working on data lakes.
Responsibilities:
  • Develop and maintain ETL pipelines using AWS Glue
  • Design and develop data integration solutions using AWS Glue
  • Develop and maintain data ingestion, transformation, and integration solutions
  • Work on data lakes and build efficient data models
  • Collaborate with cross-functional teams to understand business requirements and design solutions accordingly
  • Optimize ETL performance and troubleshoot issues
  • Develop and maintain technical documentation
  • Stay up to date with the latest AWS Glue developments and incorporate new features and functionality into existing solutions
Requirements:
  • Bachelor's degree in Computer Science or a related field
  • Minimum of 5-10 years of experience in AWS Glue development
  • Strong experience with ETL development using AWS Glue
  • Proficient in Python, Spark, and SQL
  • Experience with AWS services such as S3, Athena, and Redshift
  • Familiarity with data modeling and designing data lakes
  • Experience working with cross-functional teams and collaborating on projects
  • Excellent communication and documentation skills
  • Ability to troubleshoot and optimize ETL performance
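The "data lakes" this role mentions are usually laid out with Hive-style partitioning (directory names like `dt=2024-06-01`) so that engines such as Athena or Glue can prune partitions at query time. A standard-library-only sketch of the idea, writing CSV instead of Parquet; the event records and field names are invented:

```python
import csv
import tempfile
from collections import defaultdict
from pathlib import Path

events = [
    {"dt": "2024-06-01", "user": "u1", "action": "click"},
    {"dt": "2024-06-01", "user": "u2", "action": "view"},
    {"dt": "2024-06-02", "user": "u1", "action": "buy"},
]

def write_partitioned(rows, root: Path, partition_key: str):
    """Group rows by the partition key and write one file per partition dir."""
    groups = defaultdict(list)
    for row in rows:
        groups[row[partition_key]].append(row)
    for value, part_rows in groups.items():
        part_dir = root / f"{partition_key}={value}"  # Hive-style dir name
        part_dir.mkdir(parents=True, exist_ok=True)
        with open(part_dir / "part-0000.csv", "w", newline="") as f:
            writer = csv.DictWriter(f, fieldnames=["user", "action"])
            writer.writeheader()
            writer.writerows(
                {k: r[k] for k in ("user", "action")} for r in part_rows
            )

root = Path(tempfile.mkdtemp())
write_partitioned(events, root, "dt")
partitions = sorted(p.name for p in root.iterdir())
print(partitions)  # ['dt=2024-06-01', 'dt=2024-06-02']
```

A query filtered on `dt` then only has to open the matching directory, which is what makes partition design central to data-lake modeling.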
  • 10 - 12 yrs
  • Mumbai
TIBCO Clarity Tibco EBX TIBCO Jaspersoft TIBCO SingleStore ETL Tool Experience Informatica
1. Participate in the implementation of the strategy, solution, design, build and operations for the Data Governance program.
2. Define and oversee data governance processes within the customer environment for the data platforms.
3. Participate in the implementation, evolution, and promotion of adoption and use of standard metadata in coordination with customer data stewards, architects, and developers.
4. Participate in the definition of and maintain the Reference Data Governance Asset Data Model and Standards, ensuring that governance assets are understood, developed, and used in an efficient and functional manner.
5. Ensure that security and privacy are foundational in designing and building out data strategies.
6. Evaluate and recommend technologies and tools to implement the data governance strategy, including DBMSs, data mastering tools, data integration tools, analytic tools, etc.
7. Bring experience with different data use cases and design data models that meet various needs: customer-facing analytics, internal advanced analytics, and internal reporting. Be familiar with different design and modeling approaches for those use cases.
8. Stay abreast of information management trends and standards: master data management, data services, self-service business intelligence, metadata management, data quality, and data governance.
Requirements: 12+ yrs experience in the IT industry with experience in migration, preferably in Oil and Gas. BE / Master's Degree. ETL tool experience/exposure. Experience in the TIBCO technology stack: TIBCO Jaspersoft, TIBCO Spotfire ADS, TIBCO Clarity, TIBCO EBX, BusinessWorks; or Informatica. Experience in TIBCO SingleStore. Experience in Informatica or a similar ETL and data governance tool. Desired domain exposure: Oil & Gas.

Data Engineer

Caliber Hunt

  • 1 - 7 yrs
  • 12.0 Lac/Yr
  • Mumbai +1 Pune
ETL Hadoop Python AWS Spark Data Engineer Walk in
Technologies / Skills: Advanced SQL, Python and associated libraries like Pandas, NumPy etc., PySpark, shell scripting, data modelling, big data, Hadoop, Hive, ETL pipelines, and IaC tools like Terraform.
Responsibilities:
  • Efficient communication skills to coordinate with users, technical teams and data solution architects.
  • Document technical design documents for given requirements or JIRA stories.
  • Communicate results and business impacts of insight initiatives to key stakeholders to collaboratively solve business problems.
  • Work closely with the overall Enterprise Data & Analytics Architect and Engineering practice leads to ensure adherence to best practices and design principles.
  • Assure quality, security and compliance requirements are met for the supported area.
  • Develop fault-tolerant data pipelines running on a cluster.
  • Ability to come up with scalable and modular solutions.
Required Qualifications:
  • 1-8 yrs of hands-on experience developing data pipelines for data ingestion or transformation using Python (PySpark) / Spark SQL in AWS cloud.
  • Experience in development of data pipelines and processing of data at scale using technologies like EMR, Lambda, Glue, Athena, Redshift, Step Functions.
  • Advanced experience in writing and optimizing efficient SQL queries with Python and Hive, handling large data sets in big-data environments.
  • Experience in debugging, tuning and optimizing PySpark data pipelines.
  • Should have implemented concepts and have good knowledge of PySpark data frames, joins, partitioning, parallelism etc.
  • Understanding of Spark UI, event timelines, DAGs, and Spark config parameters in order to tune long-running data pipelines.
  • Experience working in Agile implementations.
  • Experience with Git and CI/CD pipelines to deploy cloud applications.
  • Good knowledge of designing Hive tables with partitioning for performance.
Thanks and Regards,
HR Team
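The "joins, partitioning, parallelism" this posting asks about come down to one idea: Spark hash-partitions both sides of a join on the join key so that matching rows land in the same partition and each partition can be joined independently. A pure-Python sketch of that mechanism (the user/order data is made up):

```python
from collections import defaultdict

N_PARTITIONS = 4

def partition(rows, key_index):
    """Hash-partition rows on the value at key_index, like a Spark shuffle."""
    parts = defaultdict(list)
    for row in rows:
        parts[hash(row[key_index]) % N_PARTITIONS].append(row)
    return parts

users = [(1, "asha"), (2, "ravi"), (3, "meera")]   # (user_id, name)
orders = [(101, 1), (102, 3), (103, 1)]            # (order_id, user_id)

# Both sides are partitioned on user_id, so co-partitioned rows match up.
user_parts = partition(users, 0)
order_parts = partition(orders, 1)

joined = []
for p in range(N_PARTITIONS):  # each partition joins independently (in parallel on a cluster)
    lookup = {user_id: name for user_id, name in user_parts.get(p, [])}
    for order_id, user_id in order_parts.get(p, []):
        if user_id in lookup:
            joined.append((order_id, lookup[user_id]))

print(sorted(joined))  # [(101, 'asha'), (102, 'meera'), (103, 'asha')]
```

Skew (one hot key overloading a single partition) and partition count are exactly the tuning knobs the posting's "Spark UI / config parameters" requirement refers to.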

Data Engineer

Deputize Consultancy

ETL Data Engineer Work From Home
Responsibilities for Data Engineer:
1. Create and maintain optimal data pipeline architecture; assemble large, complex data sets that meet functional and non-functional business requirements.
2. Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
3. Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and AWS big data technologies. Build analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency and other key business performance metrics. Work with stakeholders including the Executive, Product, Data and Design teams to assist with data-related technical issues and support their data infrastructure needs.
4. Keep our data separated and secure across national boundaries through multiple data centers and AWS regions. Create data tools for analytics and data scientist team members that assist them in building and optimizing our product into an innovative industry leader. Work with data and analytics experts to strive for greater functionality in our data systems.
Qualifications for Data Engineer:
  • Advanced working SQL knowledge and experience working with relational databases, query authoring (SQL), as well as working familiarity with a variety of databases.
  • Experience building and optimizing big data pipelines, architectures and data sets.
  • Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement.
  • Strong analytic skills related to working with unstructured datasets.
  • Build processes supporting data transformation, data structures, metadata, dependency and workload management.
  • A successful history of manipulating, processing and extracting value from larg
Big Data React JS Python AWS C++ Angular Spark Programming ETL SQL Work From Home
**Preference will be given to candidates who can join on or before 1st of October, 2022**
You will:
  • Write excellent production code and tests, and help others improve in code reviews
  • Analyze high-level requirements to design, document, estimate, and build systems
  • Coordinate across teams to identify, resolve, mitigate and prevent technical issues
  • Coach and mentor engineers within the team to develop their skills and abilities
  • Continuously improve the team's practices in code quality, reliability, performance, testing, automation, logging, monitoring, alerting, and build processes
You have:
For (Full stack): 2 - 10 years of experience; strong with DS & Algorithms; hands-on experience in the programming languages JavaScript (React or Angular), Python, SQL; experience with AWS.
For (Backend): 2 - 10 years of experience; hands-on product development experience using Java / C++ / Python; experience with AWS, SQL, Git; strong with data structures and algorithms.
Additional nice-to-have skills/certifications:
For the Java skill set: Mockito, Grizzly, Netty, Vert.x, Jersey / JAX-RS, Swagger / OpenAPI, Nginx, Protocol Buffers, Thrift, Aerospike, Redis, Kinesis, Sed, Awk, Perl
For the Python skill set: data engineering experience, Athena, Lambda, EMR, Spark, Glue, Step Functions, Hadoop, Kinesis, ORC, Parquet, Perl, Awk, Redshift
For (Data Engineering): 2 - 10 years of experience; experience with object-oriented / object-function scripting languages: Python; experience with AWS cloud services: EC2, RDS, Redshift, S3, Athena, Glue; must be proficient in Git, Jenkins, CI/CD (Continuous Integration / Continuous Deployment); experience in big data technologies like Hadoop, MapReduce, Spark, etc.; experience with Amazon Web Services and Docker.
For (Geo Team): 4 - 10 years of experience; experience with big data technologies like Hadoop, Spark, MapReduce, Kafka, etc.; experience using object-oriented languages (Java, Python); experience in working with different AWS technologies; experience in software

SR Data Engineer

Proto talent consultants

Data Engineer Software Developer Business Intelligence Data Scientist SQL Data Base Administrator Python Data Modeling ETL Developer Work From Home
Basic Qualifications:
  • 5+ years of experience as a Data Engineer or in a similar role
  • 3+ years of industry experience in software development, data engineering, business intelligence, data science, or a related field
  • 3+ years of experience in SQL & Python
  • Bachelor's degree in computer science, engineering, mathematics, or a related technical discipline
  • Experience with data modelling, data warehousing, and building ETL pipelines
  • Proven track record of solving complex business problems with high quality & automated testing in an agile environment
  • Experience using one of the big data technologies (Spark, EMR, dbt Cloud etc.)
  • Excellent communication & analytical skills
Preferred Qualifications:
  • Experience working with AWS big data technologies (EMR, Redshift, S3, Lambda, RDS)
  • Experience working with Snowflake
  • Demonstrated strength in data modeling, ETL development, and data warehousing
  • Excellent understanding of other programming languages like Java or JavaScript
  • Experience working with one of the big data schedulers like Airflow, Oozie, Azkaban etc.
  • Experience in data streaming with Kafka, Spark Streaming and Kinesis
  • Basic knowledge of containerization and orchestration like Docker, Kubernetes
  • Understanding of Python data libraries like NumPy, scikit-learn etc.
  • Open to learning new technologies & AI libraries

Alteryx Developer

Netrtech Solutions

  • 4 - 9 yrs
  • 25.0 Lac/Yr
  • Mumbai
Alteryx Developer Python Azure Database Analyst Warehousing SQL Server Informatica Testing ETL Consult
Job responsibility:
  • 2+ years Alteryx experience as an Administrator and Developer
  • Hands-on skills and experience in Alteryx Designer, Alteryx Server, and the tools within Alteryx
  • Work experience with similar tools like Informatica and other ETL tools
  • Strong data warehousing and business intelligence experience
  • Knowledge of Alteryx and Tableau Server and dashboard/report optimization
  • Advanced knowledge of performance tuning of dashboards, workbooks, data sources etc.
  • Working experience with data extracts
  • Excellent requirements gathering skills
  • Data modelling and advanced reporting experience
  • Strong knowledge of writing SQL queries and database concepts
  • Strong problem solving and analytical skills
  • Ability to engage and establish relationships with business users, peers and managers across different departments, geographies and time zones
  • Strong team player with an inclination towards knowledge sharing
  • Self-starter with the ability to adapt to changing business priorities
Add-ons:
  • 4+ years writing SQL queries against any RDBMS with query optimization
  • Good understanding of unit testing, software change management, and software release management
  • Good understanding of the star schema and data models in the existing data warehouse
Salary up to 25,00,000 pa. Connect to HR.

Urgent Required For Data Engineer Executive

Perfect Solution Group (Spectrum Placement Services)

Data Engineer Executive Computer Operator SAS-Statistical Analysis System ETL Hadoop Data Engineer Azure JSON XML Scala Spark GitHub DevOps Data Migration Walk in
Profile: Data Engineer Executive
Qualification: Graduate with good communication skills
Experience: Minimum 1 year required
Candidate should have knowledge of AWS, Spark, PySpark, Python, Hark
Salary: 24 LPA to 42 LPA
Gender: Male & Female can apply
Location: Pan India
Duties & Responsibilities:
  • Analyze and organize raw data.
  • Build data systems and pipelines.
  • Evaluate business needs and objectives.
  • Interpret trends and patterns.
  • Conduct complex data analysis and report on results.
  • Prepare data for prescriptive and predictive modeling.
  • Build algorithms and prototypes.
Only serious candidates apply.

SSIS Developer

Enrich & Enlight

ETL SQL Server Integration Services SSRS SSIS Developer
Job opening for SSIS Developer with a top MNC company across India.
Dear Candidate,
Greetings from Enrich & Enlight! Enrich & Enlight is an executive search consulting company specializing in middle and senior management positions for selected clientele. We are associated as a recruitment partner with reputed organizations in the IT, ITES, KPO, publishing, consulting and manufacturing industries across India. Currently we are looking for SSIS Developer professionals. Please find below the details of the position and reply with your interest.
Job Summary:
  • Yrs of Exp: 2 to 8 yrs
  • Notice Period: Immediate, 15 days & 30 days
  • Job Location: Across India
  • Position: Permanent
  • Role: SSIS Developer
Job Description: Experience as an SSIS developer and in SQL.
Required Skills: ETL, SSIS, SSRS, SQL
Preferred Skills: .NET, SSRS, SDLC, C#, C
Excellent communication skills.
Eligibility Criteria: Candidates with good communication skills. At least 15 years of regular education. Candidate should not have a career gap of more than 2 years / 24 months in the profile.
Interested candidates, please share your updated resume.
Note: Only candidates available for a telephonic round should share their resumes.
Good Day!
Thanks & Regards,
Archana

Informatica ETL Developer

Whiteklay Technologies Pvt ltd

  • 4 - 7 yrs
  • 12.0 Lac/Yr
  • Mumbai
Informatica Informatica Big Data ETL Tool Oracle SQL Hive MapReduce Hadoop
- At least 3 years of experience developing ETL processes.
- Strong in Informatica design concepts using its products.
- Hands-on knowledge of Mapplets, Mappings, Workflows, and Applications.
- Proficient in creating mappings and workflows and implementing ETL concepts.
- Solid data warehousing concepts: dimensional modelling, facts, dimensions, helper tables, SCD concepts etc.
- Strong ETL and data modelling experience.
- Experience in development of database processes using Oracle SQL, Hive, MapReduce.
- Sound Unix shell scripting and command-level experience.
- Knowledge of Hadoop, MapReduce, Hive, Spark is an advantage.
- Excellent knowledge of debugging, tuning and optimising the performance of database queries.
- Thorough knowledge of software methodologies, distributed networking, databases, communications, and multiprocessing applications.
- Experience in Netezza.
- Actively participate in business requirements sessions, design reviews and test case review meetings.
- Basic understanding of any programming language. As a developer, should have worked on change requests or enhancements by making code changes.
- Good to have: understanding of the SAS programming language.
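Among the "SCD concepts" this posting lists, Type 2 is the one most often probed in interviews: instead of overwriting a changed dimension attribute, the current row is closed out and a new current row is inserted, preserving history. A minimal sketch of the upsert logic; the customer/city table layout is illustrative, not from any specific tool:

```python
from datetime import date

# Dimension rows: (customer_id, city, valid_from, valid_to, is_current)
dimension = [
    ("C1", "Mumbai", date(2023, 1, 1), None, True),
]

def scd2_upsert(dim, customer_id, city, as_of):
    """Apply an incoming record with SCD Type 2 semantics."""
    for i, (cid, old_city, start, end, current) in enumerate(dim):
        if cid == customer_id and current:
            if old_city == city:
                return  # no change in the tracked attribute: nothing to do
            # Close out the old version by setting valid_to and is_current...
            dim[i] = (cid, old_city, start, as_of, False)
            break
    # ...then insert the new current version (also handles brand-new keys).
    dim.append((customer_id, city, as_of, None, True))

scd2_upsert(dimension, "C1", "Pune", date(2024, 6, 1))   # attribute changed
scd2_upsert(dimension, "C2", "Thane", date(2024, 6, 1))  # new customer

for row in dimension:
    print(row)
```

In Informatica this pattern maps to a lookup on the current row plus an update strategy transformation; the mechanics are the same close-and-insert shown here.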

SSIS Developer

SK Infotech Spectrum

MS SQL Server ETL SSAS SSIS Developer SQL ETL Process Work From Home Walk in
  • 3-5 years of good ETL/SQL experience.
  • Strong in data warehousing concepts (SSIS).
  • Responsible for designing, developing, testing, deploying and troubleshooting SSIS packages that implement complex ETL processes.
  • Translate business requirements into technical designs to map and load data from different sources to data warehouses.
  • Ensure quality and accuracy of the data.
  • Should have knowledge of SQL queries.

Opening For GCP Developer

Hexaware Technologies

GCP Developer Database SQL Python Dataflow Dataprep ETL
Experience: 6-9 yrs
Notice Period: Immediate - 60 days
Work Location: Pune, Bangalore, Chennai, Mumbai
Work Mode: Hybrid
Key Requirements:
- Strong experience in cloud migrations and pipelines
- Good understanding of database and data engineering concepts
- Hands-on experience in SQL and Python
- Experience in Java development
- Proficiency in Google Cloud Platform tools like Dataflow, Data Transfer services, and Airflow
- Working knowledge of data preprocessing techniques using Dataflow, Dataproc, and Dataprep
- Familiarity with BigQuery, Kafka, Pub/Sub, GCS, and schedulers
- Proficiency in PostgreSQL is preferred
- Experience with real-time and scheduled pipelines
- Cloud certification is a plus
- Experience in implementing ETL pipelines
- Familiarity with microservices or enterprise application integration patterns is advantageous

Apply to 15 ETL Job Vacancies in Mumbai
