
Big Data Job Vacancies in Mumbai


Big Data Lead

Hexaware Technologies

Snowflake Python SQL
  • Must have 4-6 years of experience in data warehouse, ETL, and BI projects
  • Must have at least 4 years of experience in Snowflake; expertise in Snowflake architecture is a must
  • Must have at least 3 years of experience and a strong hold on Python/PySpark
  • Must have experience implementing complex stored procedures and standard DWH and ETL concepts
  • Proficient in Oracle database, complex PL/SQL, Unix shell scripting, performance tuning, and troubleshooting
  • Good to have: experience with AWS services and creating DevOps templates for various AWS services
  • Experience in using GitHub and Jenkins
  • Good communication and analytical skills
  • Snowflake certification is desirable

Corporate Trainer

The Magic Data

Java Full Stack Software Testing Life Cycle Machine Learning Python Trainer Big Data Robotic Process Automation RPA Developer Azure Deep Learning Corporate Trainer
We are looking for a trainer who can conduct training in any one or more of the topics mentioned below:
  1. Java Full Stack
  2. Software Testing
  3. Machine Learning (Python)
  4. Robotic Process Automation (RPA)
  5. Big Data
Hadoop Hive Kafka Python Big Data Engineer JSON Work From Home
Brief about the Company:
AdZapier Corporation is a global technology and enablement services company with a vision to transform data into value for everyone. Through a simple, open approach to connecting systems and data, we provide the data foundation for the world's best marketers by making it safe and easy to activate, validate, enhance, and unify data. We give marketers the ability to deliver relevant messages at scale and tie those messages back to actual results. Our products and services enable individual-based marketing, allowing our clients to generate a higher ROI and drive better omni-channel customer experiences.

Position Description:
Join our Information Technology team, where you will work on new technologies and find ways to meet our customers' needs and make it easy for them to do business with us. You will use functional expertise to act as an advisor to management and make recommendations on more complex projects. You will use professional concepts and company policies and procedures to solve a wide range of difficult problems creatively and practically.

Responsibilities:
You will be responsible for operations and administration of the Cloudera Hadoop platform. You will work independently on day-to-day monitoring and operations of the Data Analytics platform, and you will be required to develop automation using scripting languages. After initial training, you will be able to handle critical operations tasks as well as on-demand requests.

Minimum Requirements:
  • 5+ years of experience in software development, including the Big Data analytics area
  • Experience in Hadoop Big Data platform operations and administration
  • High proficiency working with the Hadoop platform, including Hadoop, Hive, Spark/Scala, Java, Kafka, Flume, etc.
  • Experience with any scripting language such as Bash, Scala, or Python
  • Good understanding of file formats including JSON, Parquet, Avro, and others

Work Hours: 2:30 pm to 11:30 pm (Mon-Fri) (US shift)
Big Data React JS Python AWS C++ Angular Spark Programming ETL SQL Work From Home
**Preference will be given to candidates who can join on or before 1st October 2022.**

You will:
  • Write excellent production code and tests, and help others improve in code reviews
  • Analyze high-level requirements to design, document, estimate, and build systems
  • Coordinate across teams to identify, resolve, mitigate, and prevent technical issues
  • Coach and mentor engineers within the team to develop their skills and abilities
  • Continuously improve the team's practices in code quality, reliability, performance, testing, automation, logging, monitoring, alerting, and build processes

You have:

For Full Stack:
  • 2-10 years of experience
  • Strong with data structures and algorithms
  • Hands-on experience in the programming languages: JavaScript (React or Angular), Python, SQL
  • Experience with AWS

For Backend:
  • 2-10 years of experience
  • Hands-on product development experience using Java/C++/Python
  • Experience with AWS, SQL, Git
  • Strong with data structures and algorithms

Additional nice-to-have skills/certifications:
  • For the Java skill set: Mockito, Grizzly, Netty, Vert.x, Jersey/JAX-RS, Swagger/OpenAPI, Nginx, Protocol Buffers, Thrift, Aerospike, Redis, Kinesis, Sed, Awk, Perl
  • For the Python skill set: data engineering experience, Athena, Lambda, EMR, Spark, Glue, Step Functions, Hadoop, Kinesis, ORC, Parquet, Perl, Awk, Redshift

For Data Engineering:
  • 2-10 years of experience
  • Experience with object-oriented/object-function scripting languages: Python
  • Experience with AWS cloud services: EC2, RDS, Redshift, S3, Athena, Glue
  • Must be proficient in Git, Jenkins, CI/CD (Continuous Integration / Continuous Deployment)
  • Experience in big data technologies like Hadoop, MapReduce, Spark, etc.
  • Experience with Amazon Web Services and Docker

For Geo Team:
  • 4-10 years of experience
  • Experience with big data technologies like Hadoop, Spark, MapReduce, Kafka, etc.
  • Experience using object-oriented languages (Java, Python)
  • Experience working with different AWS technologies
  • Experience in software


Data Engineer

Maxdata Solutions

Big Data Spark SCALA Impala HBase Kafka MongoDB PostgreSQL Rabbitmq Sqoop
Currently we are hiring a Data Engineer.
Job Location: Mumbai / Bangalore / Noida
  • Hands-on experience with a programming language: Python, Java, Scala
  • Passionate and knowledgeable about big data stacks:
      • Distributed systems: Spark (PySpark), Hadoop, Presto, Hive, etc.
      • Message queueing systems: Kafka, RabbitMQ, NSQ, etc. are good to have
      • Databases (relational and NoSQL): PostgreSQL, MySQL, MongoDB, etc.
  • Experience gathering and analyzing system requirements
  • In-depth understanding of database structure principles, data warehousing, data mining concepts, and segmentation techniques
  • Experience with cloud computing platforms (AWS, GCP, etc.) and the UNIX environment
  • Experience in AWS services, e.g. EMR, Lambda, Step Functions, S3, Redshift, etc., is a plus
  • Experience in designing, implementing, and monitoring big data analytics solutions
  • Fast learning capability and natural curiosity about big data
  • DevOps/DataOps skills are plus points
  • Background: field of study is Computer Science (preferred) or any other graduation degree
If you are interested, please share your updated resume with Prakash Rathod.
Python SCALA JAVA AWS - EMR Hadoop Spark Kafka SQL NoSQL Data Architecture Data Structures Storm Flink
Responsibilities:
  • Create and maintain optimal data pipeline architecture
  • Assemble large, complex data sets that meet functional/non-functional business requirements
  • Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using open source and AWS big data technologies
  • Build analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency, and other key business performance metrics
  • Work with stakeholders including the Executive, Product, Data, and Design teams to assist with data-related technical issues and support their data infrastructure needs
  • Work with data and analytics experts to strive for greater functionality in our data systems

Qualifications:
  • Experience building and optimizing big data pipelines, architectures, and datasets
  • Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement
  • Experience interacting with customers and various stakeholders
  • Strong analytical skills related to working with unstructured datasets
  • Ability to build processes supporting data transformation, data structures, metadata, dependency, and workload management
  • Working knowledge of message queuing, stream processing, and highly scalable big data lakes
  • Strong project management and organizational skills
  • Experience supporting and working with cross-functional teams in a dynamic environment

Candidates should also have experience using the following software/tools:
  • Big data technologies: Hadoop, Spark, Kafka, etc.
  • Relational SQL and NoSQL databases, including Postgres and Cassandra
  • Data pipeline and workflow management tools: Airflow, NiFi, etc.
  • Cloud services: AWS (EMR, RDS, Redshift, Glue), Azure (Databricks, Data Factory), GCP (Dataproc, Pub/Sub)
  • Stream-processing systems: Storm, Spark Streaming, Flink, etc.

Backend Developer C# - Remote

Smart Apartment Data

C# .Net Backend Developer Backend Big Data API Work From Home
Are you looking to work with the latest technologies in Big Data for an award-winning software company? Smart Apartment Data is an employee-rated five-star company and a winner of the Houston Best & Brightest Companies to Work For. We place strong emphasis on offering a fun work environment, generous pay with bonuses, individual growth, and a healthy work-life balance.

Who are we?
Founded 16 years ago, Smart is now the premier data analytics platform for the housing industry in the United States. We are utilizing Big Data to offer market intelligence to real estate brokers, investors, and analysts.

Who are we looking for?
We are looking for someone fluent in English with the ability to express their creativity and talents. You will work on interesting projects with experienced developers using the AWS cloud.

Sneak peek of projects:
  • Big Data orchestration
  • Data warehousing in AWS
  • API design with C# 9

We are looking for someone who:
  • Is fluent in English (C1 advanced)
  • Can work flexible hours
  • Has 2+ years of experience as a C# developer
  • Has a good understanding of REST API design methodologies using C#

We offer:
  • Location independent / 100% remote
  • Flexible working hours
  • Competitive base salary with guaranteed annual bonus
  • Work with the latest and cutting-edge technologies
  • Training and mentoring by senior developers

Think you fit the description? Apply as soon as possible and send an email and CV.

Informatica ETL Developer

Whiteklay Technologies Pvt ltd

  • 4 - 7 yrs
  • 12.0 Lac/Yr
  • Mumbai
Informatica Informatica Big Data ETL Tool Oracle SQL Hive MapReduce Hadoop
  • At least 3 years of experience developing ETL processes
  • Strong in Informatica design concepts using its products
  • Hands-on knowledge of Mapplets, Mappings, Workflows, and Applications
  • Proficient in creating mappings and workflows and implementing ETL concepts
  • Solid data warehousing concepts: dimensional modelling, facts, dimensions, helper tables, SCD concepts, etc.
  • Strong ETL and data modelling experience
  • Experience in development of database processes using Oracle SQL, Hive, MapReduce
  • Sound Unix shell scripting and command-level experience
  • Knowledge of Hadoop, MapReduce, Hive, Spark is an advantage
  • Excellent knowledge of debugging, tuning, and optimising the performance of database queries
  • Thorough knowledge of software methodologies, distributed networking, databases, communications, and multiprocessing applications
  • Experience in Netezza
  • Actively participate in business requirements sessions, design reviews, and test case review meetings
  • Basic understanding of any programming language; as a developer, should have worked on change requests or enhancements by making code changes
  • Good to have: understanding of the SAS programming language
Frontend Frontend Developer MVC Big Data Angular Angular Developer Typescript Javascript API HTML CSS Express Framework Web Developer Work From Home
Are you looking to work with the latest technologies in Big Data for an award-winning software company? Smart Apartment Data is an employee-rated five-star company and a winner of the Houston Best & Brightest Companies to Work For. We place strong emphasis on offering a fun work environment, generous pay with bonuses, individual growth, and a healthy work-life balance.

Who are we?
Founded 16 years ago, Smart is now the premier data analytics platform for the housing industry in the United States. We are utilizing Big Data to offer market intelligence to real estate brokers, investors, and analysts.

Who are we looking for?
We are looking for someone fluent in English with the ability to express their creativity and talents. You will work on interesting projects with experienced developers using the AWS cloud.

Sneak peek of projects:
  • Big Data analytics
  • Spatial map reporting

We are looking for someone who:
  • Is fluent in English
  • Has a minimum of 2 years with Angular 9+
  • Has a good understanding of front-end enterprise app design

Benefits:
  • Location independent / 100% remote
  • Flexible working hours
  • Competitive base salary with guaranteed annual bonus
  • Work with the latest and cutting-edge technologies
  • Training and mentoring by senior developers

Think you fit the description? Apply as soon as possible and send an email and CV.