Hadoop Jobs in Chennai, Hadoop Job Vacancies in Chennai, Tamil Nadu
17 Hadoop Job Vacancies in Chennai


Looking For Data Engineer

BSRI Solutions Pvt Ltd

  • 3 - 5 yrs
  • 16.0 Lac/Yr
  • Chennai
Python PySpark Developer Scala SQL Hive Hadoop Google Cloud Platform Kafka Developer Infrastructure as Code GitHub Agile Methodology ETL
Required Qualifications:
  • 3+ years of demonstrated ability with Hive, Python, Spark/Scala, SQL, etc.
  • Google Cloud Platform experience: BigQuery, Cloud Storage, Dataproc, Dataflow, Cloud Composer, Cloud SQL, Pub/Sub, Terraform, etc.
  • Experience with the Hadoop ecosystem, Kafka, and PCF cloud services
  • Familiarity with big data and machine learning tools and platforms
  • Experience with BI tools such as Alteryx, DataStage, QlikSense, etc.
  • Design data pipelines and data robots; take a vision and bring it to life
  • Master data engineer who mentors others and works closely with IT architects to set strategy and design projects
  • Provide extensive technical and strategic advice and guidance to key stakeholders around the data transformation efforts
  • Redesign data flows to prevent recurring data issues
  • Strong analytical and problem-solving skills
  • Excellent oral and written communication skills, as well as facilitation and presentation skills, and an engaging presentation style
  • Ability to work as a global team member, as well as independently, in a changing environment, and to prioritize
  • Ability to establish and maintain coordinated and effective working relationships with application implementation teams, IT project teams, business customers, and end users
  • Ability to deliver work within deadlines
  • Experience with agile/lean methodologies
  • Experience working independently and with minimal supervision
  • Experience with Test-Driven Development and software craftsmanship
  • Experience with GitHub, AccuRev, or other version-control systems
  • Experience with PuTTY
  • Experience with DataStage
  • Strong communication skills; ability to illustrate and convey ideas and prototypes effectively with team and partners
  • Presence demonstrating confidence, ability to learn quickly, influence, and shape ideas
Key Skills Required: Data Engineer; Python/PySpark/Scala; SQL & Hive; Hadoop ecosystem; data pipeline design & ETL development; Google Cloud Platform (BigQuery, Dataproc, Dataflow, Cloud Storage); Kafka/streaming data processing; Terraform (Infrastructure as Code); DataStage or similar ETL tools; version control (GitHub or equivalent); agile methodologies; strong analytical & problem-solving skills; stakeholder collaboration & communication
Nice to Have: Cloud Composer, Cloud SQL, Pub/Sub; BI tools (Alteryx, QlikSense); machine learning platform exposure; Test-Driven Development (TDD); mentoring & technical leadership

Data Engineer

United Technology

  • 1 - 3 yrs
  • 4.0 Lac/Yr
  • Chennai
Data Integration Data Engineer Hadoop ETL SQL Informatica Apache AWS Big Data Python
We are looking for a Data Engineer with 1 to 3 years of experience in Chennai. Immediate joiners preferred.
  • 2 - 6 yrs
  • 6.5 Lac/Yr
  • Bangalore Highway Chennai
Data Management Apache Data Integration SQL ETL Tool ETL Hadoop AWS Snowflake Azure
Invitation for B2B Partnerships: Seeking Software Development Support! We are looking to collaborate with companies that can provide 10 skilled Data Engineers to support our data engineering requirements pipeline for European clients on a contract basis.
Requirements:
  • Expertise in data engineering, including data processing and ETL
  • Proficiency in SQL and NoSQL databases
  • Experience with Hadoop or Spark
  • Experience in AWS or Azure
  • Snowflake is an added advantage
If your company is equipped to provide top-notch Data Engineers, we invite you to submit your proposal with terms and conditions. Please contact us by email or WhatsApp: srividya032001@gmail.com (or) 8884752389
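The "data processing and ETL" skills this listing asks for can be illustrated in miniature. The sketch below is purely hypothetical: the CSV columns, table name, and rupees-to-paise conversion are invented, and in-memory SQLite stands in for whatever warehouse a real Hadoop or Snowflake pipeline would load.

```python
import csv
import io
import sqlite3

# Stand-in for a real source file; a production pipeline would read
# from HDFS, S3, or a database instead of an in-memory string.
raw_csv = io.StringIO(
    "order_id,amount,currency\n"
    "1,100.50,INR\n"
    "2,75.25,INR\n"
)

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (order_id INTEGER, amount_paise INTEGER)")

# Extract each row, transform the amount from rupees to integer paise
# (a common trick to avoid float money columns), then load it.
for row in csv.DictReader(raw_csv):
    conn.execute(
        "INSERT INTO orders VALUES (?, ?)",
        (int(row["order_id"]), round(float(row["amount"]) * 100)),
    )

total = conn.execute("SELECT SUM(amount_paise) FROM orders").fetchone()[0]
print(total)  # 17575
```

The same extract-transform-load shape scales up directly: swap the CSV reader for a Spark DataFrame source and the SQLite insert for a warehouse writer.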
  • 2 - 6 yrs
  • 6.5 Lac/Yr
  • Bangalore National Highway Chennai
Data Warehousing Apache Data Integration Data Management SQL ETL Tool Hadoop AWS Big Data Python Java JavaScript React JS
Invitation for B2B Partnerships: Seeking Software Development Support! We are looking to collaborate with companies that can provide 10 skilled Data Engineers to support our data engineering requirements pipeline for European clients on a contract basis.
Requirements:
  • Expertise in data engineering, including data processing and ETL
  • Proficiency in SQL and NoSQL databases
  • Experience with Hadoop or Spark
  • Experience in AWS or Azure
  • Snowflake is an added advantage
If your company is equipped to provide top-notch Data Engineers, we invite you to submit your proposal with terms and conditions. Please contact us by email or WhatsApp: rakanalytics@gmail.com (or) 9900173022

  • 5 - 11 yrs
  • Chennai
Hadoop Cloud Computing GCP SQL Spark Hive DataFlow DataProc
Job Title: Hadoop Developer
Location: Chennai, India
Duration: Permanent position
Work Type: Onsite role
Industry: Financial
Technical Experience: 5 Years
Please share the resume at victor@theqctech.com
Job Description: 4 years of relevant experience is mandatory. Chennai is a preferred location, but anywhere is also fine. Experience extracting data from a variety of sources, and a desire to expand those skills (working knowledge in SQL and Spark is mandatory). Excellent data analysis skills; must be comfortable with querying and analyzing large amounts of data on Hadoop HDFS using Hive and Spark. Candidates should preferably have experience building applications using Google Cloud Platform frameworks such as Dataflow, Dataproc, and Pub/Sub. Excellent communication skills to understand and pass on requirements. Pure GCP with the above experience is also acceptable.

Azure Data Engineer

Prevaj Consultants Pvt Ltd

  • 5 - 10 yrs
  • 10.0 Lac/Yr
  • Chennai
Big Data HDFS Hadoop Hive Yarn Pig HBase Sqoop Flume Azure Work From Home
Role: Azure Data Engineer
Experience Required: Minimum 1+ years of experience as an Azure Data Engineer.
Skills Required:
  • As a Data Engineer, you will collaborate with a team of business domain experts, data scientists, and application developers to identify relevant data for analysis and develop the Big Data solution
  • Analyze business problems and help develop solutions for near-real-time stream processing as well as batch processing on the Big Data platform
  • Set up and run Hadoop development frameworks
  • Explore and learn new technologies for creative business problem solving
  • Experience as an Azure Data Engineer
  • Ability to develop and manage scalable Hadoop cluster environments
  • Experience in Big Data technologies like HDFS, Hadoop, Hive, YARN, Pig, HBase, Sqoop, Flume, etc.
  • Working experience with Big Data services in any cloud-based environment
  • Experience in Spark, Scala, Kafka, ADF, Akka, core or advanced Java, and Databricks
  • Experience in NoSQL technologies like HBase, Cassandra, and MongoDB; Cloudera or Hortonworks Hadoop distribution (good to have)
  • Familiarity with data warehousing concepts, distributed systems, data pipelines, and ETL
  • Good communication and interpersonal skills
  • Minimum 5 years of professional experience with 3 years of Big Data project experience
Java Druid Hadoop Hive Kubernetes
Requirement: Strong NoSQL resource with good knowledge of Java. Development experience is mandatory. Exposure to or working knowledge of Druid is ideal; someone with a high level of skill in Hadoop, Hive, etc. will also be able to learn Druid quickly.
  • Experience with Druid data modeling
  • Experience in performance tuning of Druid configuration and query optimization
  • Experience working with Kafka, Pulsar, and Druid for real-time streaming data ingestion
  • Experience working with cloud platforms and Kubernetes
  • Good knowledge of Java
Key Skills: Java, Druid, Hadoop, Hive, Kubernetes
Job Type: Permanent
No. of Vacancies: 03
Gender: Both
Experience Required: 4-8 Years
Salary: INR 10,00,000 - 20,00,000 PA
Qualification: Any Graduate
Candidate Profile:
  • Minimum 6 months to 1 year of experience in Druid is a MUST
  • Experience in Java programming is a MUST
  • Strong NoSQL experience
  • Candidate must have experience in Hadoop and Hive
Notice Period: Immediate to 30 Days
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance (Software Developer)

Hadoop Developer

Sadup Softech Pvt Ltd

  • 5 - 11 yrs
  • Chennai
Unix Hadoop Hive Spark Python
6+ years of industry work experience. Experience extracting data from a variety of sources, and a desire to expand those skills (working knowledge in SQL and Spark is mandatory). Excellent data analysis skills; must be comfortable with querying and analyzing large amounts of data on Hadoop HDFS using Hive and Spark. Experience with object-oriented programming using Python and its design patterns. Experience handling Unix systems for optimal usage to host enterprise web applications. Excellent communication skills to understand and pass on requirements.

DATA ENGINEER (Informatica BDM)

KGP Manpower Consulting Pvt Ltd

Informatica Big Data Management SAS-Statistical Analysis System ETL Hadoop DATA ENGINEER Azure JSON XML Scala Spark GitHub DevOps Data Migration
DATA ENGINEER (Informatica BDM)
Job Location: Dubai & Offshore (Chennai, Hyderabad, Bangalore)
Experience: 5+ Years
Notice: 30 days or less
Max CTC: 15K AED per month / 18 LPA for offshore
Job Description:
  • Strong expertise in Extraction, Transformation and Loading (ETL) using Informatica Big Data Management 10.2.x and its various pushdown modes using the Spark, Blaze, and Hive execution engines
  • Strong expertise in dynamic mapping use cases, development, and deployment using Informatica Big Data Management 10.2.x
  • Experience transforming and loading complex data source types such as unstructured data sources and NoSQL data sources
  • Strong expertise in the Hive database, including Hive DDL, partitions, and Hive Query Language
  • Good understanding of the Hadoop ecosystem (HDFS, Spark, Hive)
  • Strong expertise in SQL/PLSQL
  • Good knowledge of working with Oracle/Sybase/SQL databases
  • Good knowledge of Data Lake and dimensional data modelling implementation
  • Able to understand requirements and write Functional Specification Documents, Design Documents, and Mapping Specifications

Senior Data Engineer

Darshini management solutions

  • 5 - 10 yrs
  • Chennai
SAS-Statistical Analysis System ETL Hadoop DATA ENGINEER Azure JSON XML Scala Spark GitHub DevOps Data Migration Work From Home Walk in
Senior Data Engineer
Full Time, Chennai, Remote
About OptiSol: OptiSol is a full-service digital solution partner for start-ups and enterprises worldwide. We work on cutting-edge digital solutions: AI/ML, modern web and mobile, enterprise cloud, DevOps, and microservices. We are 100% Agile, embracing modern application development strategies. People and culture are our greatest differentiators. We are growing stronger with a 300+ team focused on new-age software development for the digital world. We invite people with a growth mindset to join us in our endeavor to create a better world through digital solutions for our customers.
Position Summary: The OptiSol DataOps Team engineers award-winning cloud-native applications for our worldwide customers. We provide simple digital solutions to complex business problems of our enterprise and startup customers. We look for agile developers with team spirit to continuously deliver value to our customers. We are a tribe of happy people excited about problem solving. Our team is geographically distributed, we speak many languages, and we come from a variety of cultural backgrounds. We are rapidly expanding, and we are looking for passionately curious people obsessed with customer happiness.
Experience Required: 5 to 10 years
Skills Required: As a Data Engineer you will implement foundational, robust, and production-ready data platforms to enable business data discovery, self-service, and AI/ML functions across a range of client types and industries, allowing them to do more with their data. Key responsibilities include:
  • Deploying data repositories such as lakes and warehouses
  • Contributing to our growing portfolio of data solutions
  • Ongoing optimisation and management of data platforms
  • Development of transformational logic for data pipelines
  • Data evangelism: we want to show our clients how to follow best practices for data
  • A passion for data!
  • Focused data experience working with SQL and/or NoSQL solutions
  • Solid exposure

Salesforce Developer

Amrsen Solutions Pvt. Ltd.

  • 4 - 10 yrs
  • 22.5 Lac/Yr
  • Chennai
Triggers Hadoop Salesforce CRM Lightning Apex Java Software Development Testing SFDC Visualforce Eclipse HTML XML JavaScript SQL CSS
Primary Responsibilities:
  • Analyze potential Salesforce enhancements and requests to determine the most optimal way to develop and build a best-in-class solution within the stipulated timelines
  • Bridge the technology and business worlds by breaking down complex problems into technical requirements to drive prioritized feature implementation
  • Assist in determining the best technical implementation methods; track Salesforce Sales and Service Cloud development; create reports and dashboards
  • Develop the integrations and data loads needed for Salesforce implementation
  • Apply and promote the agile process
  • Experience using source-control tools such as Git or Bitbucket
  • Experience using project management tools such as JIRA and Confluence
Qualifications:
  • Bachelor's degree in Computer Science or equivalent
  • 7 years of experience in technical development of CRM/Salesforce
  • Experience in customizing the Salesforce Sales Cloud and Service Cloud platform on a continuing basis
  • Implementation experience of omni-channels for customer service
  • Strong understanding of Salesforce architecture
  • Experience in custom development using Visualforce, Apex code, Triggers, Lightning Flow, Lightning components, Salesforce Designer, Community Builder, etc.
  • Strong proficiency in HTML, XML, JavaScript, SQL, CSS, Java, and REST-based web services
  • Experience in debugging issues, performance improvement, etc.
  • Salesforce certifications preferred
  • Handle inbound user requests, errors, and process questions
  • Proficient with no-code components (Process Builder, Flow, Workflow Rules)
  • Good understanding and implementation experience with order and billing systems
  • Create and maintain fields, objects, formulas, workflows, views, reports, dashboards, validations, assignment rules, and templates
  • Proactively seek out and identify needed system changes, updates, inefficiencies, and/or inaccuracies that impact the efficiency and reliability of the system
  • Ability to work independently

Data Engineer

Harishree Recruitment Services Pvt Ltd

  • 4 - 10 yrs
  • 25.0 Lac/Yr
  • Chennai
Big Data Kafka Spark Beam EC2 Ecosystems SAS-Statistical Analysis System ETL Hadoop DATA ENGINEER Azure JSON XML Scala GitHub DevOps Data Migration Work From Home Walk in
Position: Junior & Senior Data Engineer
Role: Data Engineer
Experience: Jr.: 4+ years; Sr.: 7+ years
Salary: Jr.: up to 20 LPA; Sr.: up to 25 LPA
Location: WFH
Job Description:
  • 7+ years of experience in IT programming and application/product development
  • At least 5 years' experience working in any of the big data cloud ecosystems like AWS/GCP
  • Strong in SQL/RDBMS and any of the programming languages like Java/Python/Scala
  • Experience with one or more of the big data tools like Hadoop, Kafka, Spark, Beam, etc.
  • Good experience with AWS services like EC2, EMR, Red
Note: notice period of max 30 days can apply

Data Engineer

Acies Global

  • 3 - 8 yrs
  • Chennai
SQL Database SSIS MySQL SQL Server Hive Hadoop Amazon Redshift Data Engineer Big Data Work From Home
Looking For: SSIS developer with 3+ years of experience working in SQL Server and its related technologies. SSIS and other pivotal platforms are a key area of interest and work.
  • Should have extensive working knowledge of various ETL processes and data warehousing
  • Must be experienced in designing and building complete ETL/SSIS process pipelines for moving and transforming data for data warehousing
  • Must be able to follow a streamlined process flow and ensure accuracy at every step of the development process, starting from design and planning
  • Should have experience in logging, configuration, and deployment of SSIS packages
  • Should have extensive working knowledge of various data sources for integration, viz. APIs, Excel, flat files, CSV, XML, SQL Server, Access DB, etc.
  • Should be proficient in data warehouse concepts, viz. star schema, facts, and dimensions
  • Should possess expertise in Data Flow Task components, viz. Conditional Split, Merge, Merge Join, Union All, Lookup, Derived Column, etc.
  • Should have design, planning, and implementation knowledge of ETL-A processes using SSIS/SSRS/SSAS
  • Should be efficient and knowledgeable in identifying issues and risks in data migration activities
  • Should be proficient and spontaneous in identifying a mitigation plan for any risk discovered
  • Should be able to design a quality management scheme and plan for data migration activities, including validation of loaded data, to ensure maximum efficiency and accuracy of data populated within the new solution
  • Should be able to create and maintain a progress tracker for all data migration activities and stages
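The star-schema concepts this listing names (facts and dimensions) can be sketched in a few lines. All table and column names below are invented for illustration, and in-memory SQLite stands in for SQL Server.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# One dimension table and one fact table: the smallest possible star.
# The fact table holds measures (qty) plus a foreign key into the dimension.
conn.executescript("""
CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE fact_sales (
    product_key INTEGER REFERENCES dim_product(product_key),
    qty INTEGER
);
INSERT INTO dim_product VALUES (1, 'widget'), (2, 'gadget');
INSERT INTO fact_sales VALUES (1, 10), (1, 5), (2, 7);
""")

# A typical star-schema query: join the fact to its dimension and aggregate.
rows = conn.execute("""
    SELECT p.name, SUM(f.qty)
    FROM fact_sales f JOIN dim_product p USING (product_key)
    GROUP BY p.name ORDER BY p.name
""").fetchall()
print(rows)  # [('gadget', 7), ('widget', 15)]
```

A real warehouse would add surrogate keys, slowly changing dimensions, and more dimensions per fact, but the join-then-aggregate shape stays the same.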

Data Specialist

Perex Engineering Private Limited

Snowflake Python ETL Hadoop Big Data Data Specialist Work From Home
We have 20 vacant Data Engineer jobs in Hyderabad, Bangalore, Chennai, and Pune. Experience Required: 3 to 10 years. Educational Qualification: Other Bachelor Degree. Skills: Snowflake, Python, ETL, Hadoop, Big Data, etc.
  • 4 - 8 yrs
  • 10.0 Lac/Yr
  • Chennai
C# C++ C JAVA Oracle PL SQL MSSQL MAPR Hadoop Weblogic Tomcat Software Engineer
Designation: Software Engineer
Role: Permanent position (full time)
Location: Kandanchavadi, OMR, Chennai (WFH as of now)
Shifts: Open to work in shifts
Years of Experience: 3 to 5 years
Desired Profile:
  • 3 - 5 years of operational/technical experience
  • Experience with web service requests using WSDL
  • Operating systems: RHEL Linux, UNIX (HP-UX, Sun OS, AIX)
  • Programming languages: C#, C++, C, Java (Java required)
  • Database: Oracle (PL/SQL), MSSQL, MapR Hadoop
  • ETL/DQ tools: SSIS and MSSQL, Informatica PowerCenter, DataStage, IDQ
  • MDM tools: any MDM, and IDD
  • Scripting & middleware: ksh, bash, WAS, WebLogic, JBoss, Tomcat
  • Business analysis with experience in IBM MDM and with governance considerations
  • Related work experience in Master Data Management and Data Governance, working with data and databases, including data profiling and analysis
  • Good hands-on experience with any MDM tool required
  • Knowledge of a Data Quality tool
  • Knowledge of any ETL tool
  • Understanding of SOA, data integration, data quality, data architecture, and Master Data Management
  • Excellent analytical and problem-solving skills
  • Excellent verbal and written communication skills
  • Excellent data integration skills, understanding and solutioning the ETL and Data Quality components of MDM solutions
  • Successful teamwork experience and demonstrated leadership abilities
  • Ability to work on multiple tasks/opportunities concurrently
  • Ability to excel in a performance-based environment
About us: Ducen is a trusted technology solutions provider working with Fortune 1000 companies to drive business outcomes and enhance their customer experience. We build and deploy custom advanced analytics solutions with our enterprise analytics platform and offer a comprehensive services portfolio covering data management, cybersecurity, and application development services to help clients stay ahead of the technology curve.

Salesforce Developer

Consultomia Business Solutions Private Limited

Hadoop Salesforce CRM APEX Trigger Visualforce Lightning Salesforce Developer Jquery Walk in
  • Minimum 2 years of hands-on experience in developing and delivering solutions on the Salesforce Force.com platform, including Apex, triggers, Batch Apex, Visualforce, and integration with external systems
  • A self-starter with the ability to work effectively with limited supervision
  • Ability to work with functional teams through all phases of the Software Development Lifecycle, independent of the delivery methodology chosen
  • Strong in mapping business requirements to Salesforce product features
  • Strong consulting skills to engage the business for pipeline work
  • In-depth knowledge of out-of-the-box features, proposing solutions as per the business needs
  • Experience with Agile development practices will be a plus
  • Experience in HTML, JavaScript, and CSS
  • Solid knowledge of SOQL
  • Strong oral and written communication skills
  • Experience with jQuery will be a plus
  • Good grasping power to learn new things quickly and to understand the client's requirements
  • Multitasking abilities, solid technical (domain) skills, and excellent interpersonal ability
  • Strong analytical, problem-solving, and troubleshooting skills
  • Knowledge of SOAP and REST APIs and JSON
  • Good knowledge of Salesforce-recommended best practices around design and development
  • Ability to work independently and as part of a team
  • Salesforce.com Certified Developer DEV401, Platform Developer I
Role Description:
  • Business requirement gathering and client interaction; able to work in an offshore/onshore model
  • Design, develop, test, document, and deploy high-quality business solutions on the Force.com platform leveraging design patterns (standard and solution-specific)
  • Customize, develop, and support solutions on the salesforce.com Force.com platform
  • Engage with the client and participate in process flow analysis and design
  • Communicate with other resources and with clients regarding status, technical issues, and creative solutions
Python SCALA JAVA AWS - EMR Hadoop Spark Kafka SQL NoSQL Data Architecture Data Structures Storm Flink
Responsibilities:
  • Create and maintain optimal data pipeline architecture
  • Assemble large, complex data sets that meet functional/non-functional business requirements
  • Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using open-source and AWS big data technologies
  • Build analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency, and other key business performance metrics
  • Work with stakeholders, including the executive, product, data, and design teams, to assist with data-related technical issues and support their data infrastructure needs
  • Work with data and analytics experts to strive for greater functionality in our data systems
Qualifications:
  • Experience building and optimizing big data pipelines, architectures, and datasets
  • Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement
  • Experience interacting with customers and various stakeholders
  • Strong analytical skills related to working with unstructured datasets
  • Build processes supporting data transformation, data structures, metadata, dependency, and workload management
  • Working knowledge of message queuing, stream processing, and highly scalable big data lakes
  • Strong project management and organizational skills
  • Experience supporting and working with cross-functional teams in a dynamic environment
Candidates should also have experience using the following software/tools:
  • Big data technologies: Hadoop, Spark, Kafka, etc.
  • Relational SQL and NoSQL databases, including Postgres and Cassandra
  • Data pipeline and workflow management tools: Airflow, NiFi, etc.
  • Cloud services: AWS (EMR, RDS, Redshift, Glue), Azure (Databricks, Data Factory), GCP (Dataproc, Pub/Sub)
  • Stream-processing systems: Storm, Spark Streaming, Flink, etc.
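The stream-processing systems this listing mentions (Storm, Spark Streaming, Flink) share one core idea: aggregating an unbounded stream over windows. A toy tumbling-window aggregation can be sketched in plain Python; the event list and window size below are stand-ins for a real message stream such as a Kafka topic.

```python
from itertools import islice

def tumbling_windows(events, size):
    """Group an event stream into fixed-size tumbling windows,
    yielding one aggregate (here: a sum) per window."""
    it = iter(events)
    while True:
        window = list(islice(it, size))  # take the next `size` events
        if not window:
            break
        yield sum(window)

# Hypothetical stream of numeric readings; a real system would consume
# these incrementally rather than from an in-memory list.
stream = [1, 2, 3, 4, 5, 6, 7]
sums = list(tumbling_windows(stream, 3))
print(sums)  # [6, 15, 7]
```

Real engines add event-time semantics, watermarks, and fault tolerance on top, but the window-then-aggregate pattern is the same one Flink and Spark Streaming expose.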
  • 5 - 11 yrs
  • 14.0 Lac/Yr
  • Chennai
Hadoop Cloud Computing GCP SQL Spark Hive DataFlow DataProc
Job Title: GCP Developer
Location: Chennai, India
Duration: Permanent position
Work Type: Onsite role
Industry: Financial
Technical Experience: 5 Years
Please share the resume at victor@theqctech.com
Job Description: 4 years of relevant experience is mandatory. Chennai is a preferred location, but anywhere is also fine. Experience extracting data from a variety of sources, and a desire to expand those skills (working knowledge in SQL and Spark is mandatory). Excellent data analysis skills; must be comfortable with querying and analyzing large amounts of data on Hadoop HDFS using Hive and Spark. Candidates should preferably have experience building applications using Google Cloud Platform frameworks such as Dataflow, Dataproc, and Pub/Sub. Excellent communication skills to understand and pass on requirements. Pure GCP with the above experience is also acceptable.

Apply to 17 Hadoop Job Vacancies in Chennai
