
Data Architect Job Vacancies in Greater Noida

AWS Data Engineer Lead / Architect

Vision Excel Career Solutions

Python Data Architect Data Engineer AWS
Are you a mid/senior-level T-shaped AWS expert specializing in DevOps and Data Engineering? If yes, we have an exciting opportunity just for you. One of our reputed European clients is looking for AWS engineers to help them build secure, resilient, and cost-effective solutions on the AWS platform and reap the benefits of their investment in AWS services. We are looking for self-motivated, highly experienced engineers with strong analytical and excellent communication skills for this client-facing role.

What do we expect from you?
Role: Data Engineer (AWS)

*Mandatory*
  • Experience in developing data pipelines that process large volumes of data using Python, PySpark, Pandas, etc., preferably on AWS.
  • Experience in ingesting batch and streaming data from various data sources.
  • Experience in writing complex SQL using any RDBMS (Oracle, PostgreSQL, SQL Server, etc.).
  • Experience in developing ETL, OLAP-based, and analytical applications.
  • Ability to quickly learn and develop expertise in existing, highly complex applications and architectures.
  • Comfortable working in Agile projects.

*Desirable*
  • Exposure to the AWS platform's data services (AWS Lambda, Glue, Athena, Redshift, Kinesis, etc.).
  • Knowledge of DevOps and CI/CD tools.
  • Experience in handling unstructured data.
  • Knowledge of the financial markets domain.

Keywords: Data Engineer, Data Pipelines, Data Ingestion, AWS Lambda, AWS Athena
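The batch-ingestion skill described above can be illustrated without any AWS dependency. This is a minimal, standard-library-only sketch of chunked ingestion; the file contents, column names, and batch size are invented for the example:

```python
import csv
import io

# Toy input standing in for a large source file (hypothetical columns).
SAMPLE_CSV = """order_id,amount
1,10.50
2,7.25
3,99.00
4,3.10
5,42.00
"""

def read_batches(fileobj, batch_size):
    """Yield lists of row dicts with at most batch_size rows each,
    so the whole file never has to sit in memory at once."""
    reader = csv.DictReader(fileobj)
    batch = []
    for row in reader:
        batch.append(row)
        if len(batch) == batch_size:
            yield batch
            batch = []
    if batch:
        yield batch

def total_per_batch(fileobj, batch_size=2):
    """Toy transform stage: parse amounts and sum each batch."""
    return [sum(float(r["amount"]) for r in batch)
            for batch in read_batches(fileobj, batch_size)]

totals = total_per_batch(io.StringIO(SAMPLE_CSV))
```

In a real pipeline the generator would wrap an S3 or database cursor rather than an in-memory string, but the batching pattern is the same.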

Hiring For AWS Data Engineer

Right Time Placement

Data Engineer Data Architect AWS Data Warehousing
Job Description: AWS Data Engineer with a minimum of 5 to 7 years of experience.
  • Collaborate with business analysts to understand and gather requirements for existing or new ETL pipelines.
  • Connect with stakeholders daily to discuss project progress and updates.
  • Work within an Agile process to deliver projects in a timely and efficient manner.
  • Design and develop Airflow DAGs to schedule and manage ETL workflows.
  • Transform SQL queries into Spark SQL code for ETL pipelines.
  • Develop custom Python functions to handle data quality and validation.
  • Write PySpark scripts to process data and perform transformations.
  • Perform data validation and ensure data accuracy and completeness by creating automated tests and implementing data validation processes.
  • Run Spark jobs on an AWS EMR cluster using Airflow DAGs.
  • Monitor and troubleshoot ETL pipelines to ensure smooth operation.
  • Implement best practices for data engineering, including data modeling, data warehousing, and data pipeline architecture.
  • Collaborate with other members of the data engineering team to improve processes and implement new technologies.
  • Stay up to date with emerging trends and technologies in data engineering and suggest ways to improve the team's efficiency and effectiveness.
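The "custom Python functions to handle data quality and validation" duty above can be sketched with plain Python; the field names and rules here are illustrative, not from any real schema:

```python
def validate_rows(rows, required, numeric=()):
    """Split rows into (valid, errors).

    A row is valid when every `required` field is present and non-empty
    and every `numeric` field parses as a float. Errors are reported as
    (row_index, [bad_fields]) pairs for downstream quarantine or logging."""
    valid, errors = [], []
    for i, row in enumerate(rows):
        problems = [f for f in required if not row.get(f)]
        for f in numeric:
            try:
                float(row.get(f, ""))
            except ValueError:
                problems.append(f)
        if problems:
            errors.append((i, problems))
        else:
            valid.append(row)
    return valid, errors

# Hypothetical sample rows exercising both failure modes.
rows = [
    {"id": "1", "amount": "10.5"},
    {"id": "", "amount": "3.0"},    # missing required id
    {"id": "3", "amount": "oops"},  # non-numeric amount
]
valid, errors = validate_rows(rows, required=("id", "amount"), numeric=("amount",))
```

In a PySpark job the same checks would typically be expressed as DataFrame filters, but a pure-Python version like this is what the automated tests mentioned in the posting would exercise.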

Blockchain Developer

PLANET WEB IT SERVICES

Blockchain Architecture Cryptography Data Structures Web Development Javascript AWS Azure Cloud Java Walk in
We are looking for a highly capable blockchain developer to design, implement, and distribute a secure blockchain-based network. You will analyze our blockchain needs, design customized blockchain technologies, and launch and maintain our blockchain network. To ensure success as a blockchain developer, you should possess extensive knowledge of the programming languages used for blockchain development and experience in cryptography. An outstanding blockchain developer is someone whose expertise translates into secure, fast, and efficient digital transactions.
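The posting does not name a stack, so as a hedged illustration of the core idea only (a tamper-evident hash chain, not a production network or consensus protocol), here is a minimal sketch in Python using the standard library's hashlib; all names are invented for the example:

```python
import hashlib
import json

def block_hash(block):
    """SHA-256 over the block's payload fields, serialized deterministically."""
    payload = {k: block[k] for k in ("index", "data", "prev_hash")}
    return hashlib.sha256(json.dumps(payload, sort_keys=True).encode()).hexdigest()

def make_block(index, data, prev_hash):
    block = {"index": index, "data": data, "prev_hash": prev_hash}
    block["hash"] = block_hash(block)
    return block

def build_chain(items):
    """Chain each block to its predecessor via the predecessor's hash."""
    chain = [make_block(0, "genesis", "0" * 64)]
    for i, item in enumerate(items, start=1):
        chain.append(make_block(i, item, chain[-1]["hash"]))
    return chain

def is_valid(chain):
    """Detect tampering: every stored hash must match a recomputed one,
    and every block must point at its predecessor's hash."""
    for i, block in enumerate(chain):
        if block["hash"] != block_hash(block):
            return False
        if i and block["prev_hash"] != chain[i - 1]["hash"]:
            return False
    return True

chain = build_chain(["tx1", "tx2"])
```

Editing any block's data breaks `is_valid`, which is the property that makes the structure tamper-evident; real networks add proof-of-work or another consensus mechanism on top of this linkage.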
MDM Azure Server Data Warehousing
Job Description: As a Data & Analytics Architect, you will lead key data initiatives, including cloud transformation, data governance, and AI projects. You'll define cloud architectures, guide data science teams in model development, and ensure alignment with data architecture principles across complex solutions. Additionally, you will create and govern architectural blueprints, ensuring standards are met and promoting best practices for data integration and consumption.
  • Strong cloud data architecture knowledge (preference for Microsoft Azure).
  • 8-10+ years of experience in data architecture, with proven experience in cloud data transformation, MDM, data governance, and data science capabilities.
  • Design reusable data architecture and best practices to support batch/streaming ingestion; efficient batch, real-time, and near-real-time integration/ETL; integration of quality rules; and structuring of data for analytic consumption by end users.
  • Ability to lead software evaluations, including RFP development, capability assessments, formal scoring models, and delivery of executive presentations supporting a final recommendation.
  • Well versed in the data domains (Data Warehousing, Data Governance, MDM, Data Quality, Data Standards, Data Catalog, Analytics, BI, Operational Data Store, Metadata, Unstructured Data, non-traditional and multimedia data, ETL, ESB).
  • Experience with cloud data technologies such as Azure Data Factory, Azure Data Fabric, Azure Storage, Azure Data Lake Storage, Azure Databricks, Azure AD, Azure ML, etc.
  • Experience with big data technologies such as Cloudera, Spark, Sqoop, Hive, HDFS, Flume, Storm, and Kafka.

Data Bricks Data Governance Data Warehouse Developer
  • 10+ years of experience in a technical role with expertise in data governance and data warehousing, such as setting up Databricks as a service model.
  • Production deployment experience with data governance solutions and hands-on experience with cloud data lakes.
  • Experience with the design and implementation of data warehousing technologies.
  • Deep specialty expertise in scaling big data workloads that are performant and cost-effective, including technologies such as Delta Lake.
  • Support customers by authoring reference architectures, how-tos, and demo applications.
  • Experience working with enterprise accounts.
  • Integrate Databricks with third-party applications to support customer architectures.
  • Experience designing and implementing architectures within public clouds (AWS, Azure, or GCP).
  • Good communication skills.
If you are interested, kindly drop us a mail.

Big Data Architect

NMS Consultant

  • 8 - 14 yrs
  • Gurgaon
Architect Hadoop CI CD Design Development Big Data
Required Skills (must have):
  • Strong knowledge of and hands-on experience with the offerings and features of big data technologies (especially Hadoop, Hortonworks).
  • Strong experience of development using Spark Scala, Java, JavaScript, NiFi, Kafka, Hive, HBase.
  • Strong knowledge of API development.
  • Strong knowledge of Java frameworks (Spring MVC, Spring Security).
  • Hands-on knowledge of implementing multi-staged CI/CD with tools like AWS DevOps, Jenkins, Bitbucket.
  • Experience in CI/CD integration within the Java/JavaScript ecosystem with build tools like Maven, Grunt, Gulp and other DevOps tooling: Jenkins, GitLab, SonarQube, Gerrit, SBT, Nexus, Docker.
  • Experience of Agile methods (Scrum, Kanban).
  • Active contributions to forums and the dev community.
Required Skills (should have):
  • Knowledge of Elasticsearch/Kibana (ELK).
  • Knowledge of Linux, Unix, and Windows environments.
  • Strong knowledge of various app monitoring tools.
  • Strong knowledge of web services (WSDL/SOAP, RESTful).
  • Exposure to various data visualization tools such as Power BI, Tableau, Pentaho, etc.
  • Experience with MySQL and NoSQL (MongoDB, Redis, DynamoDB).
  • Scripting skills: strong scripting (e.g. Python) and automation skills.
  • Operating systems: Windows and Linux system administration.
  • Monitoring tools: experience with system monitoring tools (e.g. Nagios).
  • Problem solving: ability to analyze and resolve complex infrastructure resource and application deployment issues.