Data Engineer Jobs

  • 6 - 10 yrs
  • Jamshedpur
Microsoft Fabric Python Pyspark Azure Devops Architecture SQL ETL
As a Senior Data Engineer, you will be at the forefront of our data transformation, specifically architecting and building the next generation of our Financial Transaction Processing engine within Microsoft Fabric. You will lead the migration of legacy financial pipelines into a unified Lakehouse architecture, ensuring that every pence of revenue and every byte of roaming data is accounted for with 100% accuracy. This role requires a blend of high-level architectural design, hands-on pipeline engineering, and a deep understanding of telecom financial domains (billing, reconciliation, and inter-carrier settlements).

Key Responsibilities:
  • Architectural Design: Lead the design and implementation of a Medallion Architecture (Bronze, Silver, Gold) within MS Fabric to handle high-velocity telecom transactions.
  • Data Pipeline Engineering: Develop end-to-end ETL/ELT workflows using Fabric Data Factory, Spark Notebooks (PySpark), and Dataflows Gen2.
  • Financial Migration: Lead the migration of legacy SQL-based financial workloads into Fabric Lakehouses and Warehouses, ensuring zero data loss and maintaining historical integrity.
  • Configuration & Governance: Configure Fabric workspaces, capacities, and OneLake security. Implement rigorous data governance and Row-Level Security (RLS) to protect sensitive financial PII.
  • Performance Optimization: Tune large-scale Spark jobs and SQL queries to process millions of Call Detail Records (CDRs) and financial events with minimal latency.
  • Analytics & Reporting: Build high-performance Semantic Models and Direct Lake datasets for real-time financial reporting and margin analysis.

Mandatory Skills & Qualifications:

Technical Expertise
  • Microsoft Fabric: Expert-level knowledge of the Fabric ecosystem (Lakehouse, Eventhouse, Data Factory, and OneLake).
  • Pipeline Development: Strong experience with Azure Data Factory or Fabric Pipelines, including complex orchestration and incremental loading patterns.
  • Coding & Transformation: Advanced proficiency in PySpark/Python and T-SQL for complex financial logic (e.g., multi-currency conversion, tax calculations).
  • Architecture: Deep understanding of Delta Lake, Star Schema modeling, and SaaS-based data platform design.
  • DevOps: Hands-on experience with Git integration, CI/CD for Fabric items, and Azure DevOps deployment pipelines.

Domain Knowledge
  • Telecom Finance: Familiarity with telecom-specific data such as CDRs, roaming settlements, prepaid/postpaid billing cycles, and churn analytics.
  • Financial Accuracy: Experience building automated reconciliation frameworks to catch revenue leakage or transaction mismatches.

Required Experience:
  • 8+ years in Data Engineering
  • 3+ years hands-on with Microsoft Fabric
  • Proven experience delivering large-scale data migration programmes in UK telecom or retail
  • Strong SQL & PySpark expertise
  • Experience with financial data reconciliation and ledger alignment
  • Strong stakeholder engagement across Finance, Billing, and Technology teams
  • Experience operating within UK enterprise governance frameworks

Preferred Experience:
  • Certifications: DP-600 (Microsoft Fabric Analytics Engineer) or DP-203 (Azure Data Engineer).
  • Legacy Knowledge: Experience migrating from on-prem SQL Server or older Azure Synapse environments.
  • Real-Time Intelligence: Knowledge of KQL (Kusto) for monitoring real-time transaction streams.

Skills to be evaluated on: Microsoft Fabric, Azure Data Factory, Fabric Analytics, Databricks
Mandatory Skills: Microsoft Fabric
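The automated reconciliation this posting asks for can be illustrated with a minimal, pure-Python sketch. The record layout, account IDs, and zero tolerance below are invented for illustration; a real implementation would run against ledger tables in Fabric rather than in-memory dictionaries.

```python
from decimal import Decimal

def reconcile(source_totals, target_totals, tolerance=Decimal("0.00")):
    """Compare per-account totals from a source ledger against the migrated
    target and report mismatches, missing accounts, and unexpected accounts."""
    mismatches = {}
    for account, expected in source_totals.items():
        actual = target_totals.get(account)
        if actual is None:
            mismatches[account] = ("missing", expected, None)
        elif abs(actual - expected) > tolerance:
            mismatches[account] = ("amount", expected, actual)
    # Accounts present in the target but absent from the source count as leakage too.
    for account in target_totals.keys() - source_totals.keys():
        mismatches[account] = ("unexpected", None, target_totals[account])
    return mismatches

# Hypothetical sample data: ACC2 is off by a penny, ACC3 should not exist.
source = {"ACC1": Decimal("100.00"), "ACC2": Decimal("250.50")}
target = {"ACC1": Decimal("100.00"), "ACC2": Decimal("250.49"), "ACC3": Decimal("5.00")}
issues = reconcile(source, target)
```

Using `Decimal` rather than `float` is the design point: binary floating point cannot represent amounts like 0.01 exactly, which matters when the goal is penny-level accuracy.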
  • Fresher
  • 4.0 Lac/Yr
  • Central Railway Station Chennai
Data Management Data Processing Data Sheets Data Migration Data Analysis Data Modeling ETL Tools ETL Processes Data Visualization Data Warehousing
We are looking for a Data Processing Engineer to join our team. This is a part-time position.
  • 8 - 11 yrs
  • 20.0 Lac/Yr
  • Coimbatore
Python SQL Pyspark Scala Snowflake Databricks Bigquery Airflow Dbt
We are looking for a Lead Data Engineer with 8 to 11 years of experience in Coimbatore.
  • Strong expertise in SQL, Python, and distributed processing (PySpark/Scala)
  • Hands-on experience with cloud data platforms (Snowflake, Databricks, BigQuery, Redshift)
  • Experience with workflow orchestration tools (Airflow, Dagster, Prefect)
  • Deep understanding of data modeling techniques (Kimball, Data Vault 2.0)
  • Experience with streaming systems (Kafka, Flink, Kinesis, Spark Streaming)
  • Familiarity with Infrastructure-as-Code (Terraform) and CI/CD pipelines
  • Strong knowledge of distributed systems, cloud architecture, and performance optimization
  • 0 - 2 yrs
  • 5.0 Lac/Yr
  • Chennai
Data Management Data Extraction Data Entry Data Validation
We are looking for a Data Engineer to join our team in Chennai. This part-time role allows you to work from home and is suitable for candidates with 0 to 2 years of experience.

Key Responsibilities:
  • Data Collection: Gather data from various sources to ensure we have accurate and relevant information for analysis. This includes pulling data from databases, APIs, and other data repositories.
  • Data Cleaning: Improve data quality by identifying errors and inconsistencies. This means ensuring that all data is accurate, complete, and usable for analysis.
  • Data Storage: Help design and manage data storage solutions, ensuring that data is stored securely and can be accessed efficiently by team members.
  • Data Transformation: Convert raw data into a usable format for analysis. This involves using tools and programming languages to modify and structure data appropriately.
  • Collaboration: Work with data scientists and analysts to understand their data needs. This role requires communicating effectively with team members to support their projects.

Required Skills and Expectations:
  • Basic understanding of data engineering concepts and tools, such as SQL, Python, or similar programming languages.
  • Strong attention to detail, as accuracy in data handling is crucial.
  • Ability to work independently and manage your time effectively while meeting deadlines.
  • Good communication skills to share ideas and collaborate with others in the team.
  • A curious mindset and willingness to learn about new technologies and data processes.


Gen AI Developer (5+ Yrs Exp)

Turn5tech HR Solution

  • 5 - 9 yrs
  • Guindy Chennai
Azure Developer Gen AI Developer Python Developer Data Engineer
We are looking for a Gen AI Developer with 5 to 9 years of experience to join our team in Guindy. The successful candidate will work on developing innovative artificial intelligence solutions.

Key Responsibilities:
  • AI Model Development: Create and implement machine learning models to solve complex problems, ensuring they meet performance expectations.
  • Data Analysis: Analyze large datasets to extract meaningful insights and improve AI functionalities, enhancing the effectiveness of AI solutions.
  • Collaboration with Teams: Work closely with cross-functional teams, including data scientists and software developers, to integrate AI capabilities into existing systems and products.
  • Testing and Optimization: Conduct thorough testing of AI models and applications, optimizing them for better performance and user experience.
  • Continuous Learning: Stay updated with the latest trends and advancements in AI technology, and apply new knowledge to improve current practices.

Required Skills and Expectations:
  • Candidates should hold a Bachelor of Computer Applications (B.C.A.), Bachelor of Science (B.Sc.), or Bachelor of Engineering (B.E.) degree.
  • A strong understanding of machine learning algorithms, natural language processing, and deep learning frameworks is essential.
  • Proficiency in programming languages such as Python, R, or Java is required to develop and implement AI models.
  • Excellent problem-solving skills and the ability to work independently as well as part of a team are necessary.
  • Strong communication skills to collaborate effectively with team members and stakeholders are important for success in this role.

Work Location: Guindy, Chennai, Tamil Nadu 626001
Interested candidates, please call immediately and share your updated profile.
Call/WhatsApp: Mr. Pradeep (HR), careerturn5techhrsolution@gmail.com

Looking For Big Data Engineer

Talent Zone Consultant

  • 6 - 12 yrs
  • Bangalore
Python SQL Spark Hadoop ETL Tools Data Warehousing Airflow Programming Data Visualization Data Lakes Data Modeling
Key Responsibilities:
  • Build and manage data pipelines and ETL processes
  • Work with large datasets using tools like Spark, Hadoop, or SQL
  • Ensure data quality and performance optimization

Requirements:
  • Experience in Python/SQL
  • Hands-on with ETL tools and big data technologies
  • Understanding of data warehousing concepts

Brief Summary: Develops scalable data systems to support analytics and business insights.
Biomedical Device Design Biomedical Signal Processing Biomaterials Medical Device Validation Project Management Healthcare Industry Knowledge Data Analysis Clinical Research Biomechanics Biocompatibility Testing Medical Imaging Technology Technical Documentation
We are seeking a Senior Biomedical Engineer with at least four years of experience to join our team in Perth. The ideal candidate will be responsible for designing, developing, and maintaining medical equipment and devices that improve patient care and safety.

Key Responsibilities:
1. Design and Development: Create and modify biomedical devices by applying engineering principles and knowledge of medical practices to meet healthcare needs.
2. Quality Assurance: Ensure that all products comply with regulatory standards and undergo rigorous testing to guarantee safety and effectiveness.
3. Collaboration: Work closely with cross-functional teams, including physicians, manufacturers, and regulatory bodies, to gather input and address challenges throughout the product lifecycle.
4. Troubleshooting and Maintenance: Diagnose issues with existing medical devices and implement timely solutions to maintain device efficiency and reliability.
5. Training and Support: Provide training and technical support to healthcare professionals on the use and maintenance of biomedical equipment.

The ideal candidate will possess strong problem-solving skills, attention to detail, and the ability to communicate complex ideas to a non-technical audience. A solid understanding of relevant regulations and quality standards in the biomedical field is essential. Candidates should have a passion for improving healthcare through technological innovations and a commitment to continuous learning and development.

Instrument & Control Engineer - Borivali East Mumbai

Pacific Placements and Business Consultancy Pvt. Ltd.

  • 1 - 7 yrs
  • Borivali East Mumbai
Control Systems Data Acquisition DCS Systems Electrical Engineering Field Instruments HMI Development Instrument Design Loop Tuning
We are looking for an Instrument & Control Engineer who will be responsible for designing, developing, and maintaining equipment used for controlling and monitoring engineering systems. The ideal candidate will have a strong background in instrumentation and control systems, with 1 to 7 years of experience.

Key Responsibilities:
  • Design Control Systems: Develop and design control systems for various engineering processes to ensure efficiency and compliance with industry standards.
  • Install and Calibrate Instruments: Oversee the installation and calibration of instrumentation equipment, ensuring everything functions correctly according to specifications.
  • Troubleshooting: Identify and resolve issues in control systems and instrumentation to maintain operational efficiency and minimize downtime.
  • Documentation and Reporting: Maintain accurate documentation of control systems, including design changes, procedures, and performance reports, facilitating clear communication within the team.
  • Collaboration: Work closely with engineers and technicians across departments to implement control strategies and improve system performance.

Required Skills and Expectations: Candidates should possess a diploma in engineering, preferably in Instrumentation or a related field, with a minimum of 1 year of relevant experience. Strong analytical and problem-solving skills are essential, along with the ability to work independently and in a team. Familiarity with instrumentation software and tools is crucial. The candidate should also have excellent communication skills to collaborate effectively with team members and other departments. A proactive attitude towards troubleshooting and a keen willingness to learn new technologies are highly valued.
  • Fresher
  • Female
  • Chennai
Data Cleansing Big Data Technologies Data Transformation Programming Data Warehousing
We are seeking a motivated and enthusiastic Data Processing Engineer to join our team. This role is perfect for recent graduates or individuals looking to start their career in data processing. As a Data Processing Engineer, you will be responsible for handling and organizing data efficiently.

Key Responsibilities:
  • Data Entry: Accurately input data into our systems while ensuring all information is correct and up to date.
  • Data Quality Assurance: Review and validate data to identify any errors or inconsistencies, and fix them promptly.
  • Data Maintenance: Regularly update and maintain databases to keep them organized and accessible for team members.
  • Reporting: Generate basic reports from the data you process, helping the team make informed decisions.

To be successful in this role, you should possess strong attention to detail and the ability to work independently from home. You must be comfortable using computers and familiar with basic data processing tools. Strong communication skills are essential to collaborate effectively with team members. Having a proactive approach to problem-solving will also be important as you work through data challenges.

We welcome applications from female candidates who have completed their 10th grade education and are eager to begin a full-time position in data processing. This is a fantastic opportunity to learn, grow, and launch your career in the field of data.
  • 4 - 10 yrs
  • Qatar
ADF Hana SAP SQL
Minimum 4+ years of hands-on experience in data engineering or data management, preferably in the energy or industrial domain.

Key Responsibilities:
  • Design, develop, and maintain scalable ETL/ELT pipelines to process structured and unstructured data from diverse sources, including SAP ECC / S/4HANA, SQL Server, Oracle, other RDBMS, Excel/CSV files, Parquet, JSON, XML, and APIs.
  • Collaborate with business and functional teams to define data integration and transformation strategies aligned with enterprise objectives.
  • Optimize data pipelines, storage, and query performance for high-volume datasets across cloud or on-prem environments.
  • Implement robust data quality, validation, and monitoring frameworks to ensure accuracy and reliability.
  • Develop and automate data workflows, versioning, and deployment pipelines to streamline data operations.
  • Support the deployment, monitoring, and governance of data infrastructure and warehouse/lakehouse environments.
  • Work closely with Data Architects to establish best practices for data modeling, warehousing, and lineage tracking.
  • Enable incremental data processing (CDC) and efficient handling of batch and near real-time data pipelines.
  • Collaborate with analytics teams to enable insightful visualization and reporting solutions.
  • Prepare and maintain technical documentation for pipelines, transformations, and data flows.
  • Collaborate with business stakeholders to understand requirements and ensure alignment with business goals.

Qualification and Experience:
  • BE/B.Tech or Science Graduate in Computer Science, Information Technology, or a related field.
  • Strong foundation in data modeling, performance tuning, and ETL/ELT orchestration.
  • Experience working with enterprise data sources such as SAP ECC / S/4HANA, SQL/Oracle databases, and file-based data (Excel, CSV, Parquet, JSON).
  • Experience with one or more cloud platforms such as Azure, AWS, or GCP.
  • Familiarity with tools such as Microsoft Fabric, Databricks, Power BI, or equivalent tools is an advantage.
  • Experience in data migration or transformation projects is a strong advantage.
  • Understanding of data governance, data quality, and metadata/lineage concepts is a plus.

Key Deliverables:
  • ETL/ELT pipelines for all assigned data sources
  • Data ingestion from SAP, databases, files, and APIs
  • Bronze/Silver/Gold data layer implementation
  • Data quality checks and monitoring setup
  • Optimized and scalable data pipelines
  • Power BI dashboard development
  • Technical documentation and runbooks
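The incremental (CDC) processing mentioned above is often implemented with a high-watermark pattern: persist the latest change timestamp seen, and each batch pulls only rows modified after it. The sketch below is a generic pure-Python illustration with invented row and column names, not the employer's actual stack.

```python
from datetime import datetime

def incremental_extract(rows, last_watermark):
    """Return only rows changed since the stored watermark, plus the
    new watermark to persist before the next batch run."""
    changed = [r for r in rows if r["modified_at"] > last_watermark]
    new_watermark = max((r["modified_at"] for r in changed), default=last_watermark)
    return changed, new_watermark

# Hypothetical source rows; in practice these come from a query or change feed.
rows = [
    {"id": 1, "modified_at": datetime(2024, 1, 1)},
    {"id": 2, "modified_at": datetime(2024, 1, 3)},
    {"id": 3, "modified_at": datetime(2024, 1, 5)},
]
batch, wm = incremental_extract(rows, datetime(2024, 1, 2))
```

The key operational detail is that the new watermark must be committed only after the batch lands successfully, so a failed run is simply retried from the old watermark.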

Opening For Data Engineer

Ratul Puri Travel

  • 5 - 7 yrs
  • 3.0 Lac/Yr
  • United States
Information Technology Computer Hardware Problem Solving Hardware Troubleshooting Hardware Support Data Engineer
We are seeking a skilled Mid-Level Data Engineer to design, build, and optimize scalable data pipelines and architectures. The ideal candidate will have strong experience in data processing, ETL development, and cloud-based data platforms, supporting data-driven decision-making across the organization.
Mining Operations Mining Mining Engineering Mining Technician Mining Geologist Data Mining
As a Geologist, you will play a key role in studying the Earth's structure and processes. Your insights will help in various sectors like natural resource exploration and environmental management. We are looking for someone with 2 to 8 years of relevant experience.

Key Responsibilities:
  • Conduct Field Studies: Gather geological samples and data from various locations, helping to analyze soil, rock, and mineral compositions.
  • Analyze Data: Interpret geological data using various software and techniques to understand the Earth's characteristics and identify natural resources.
  • Prepare Reports: Document your findings and create comprehensive reports that explain your studies and recommendations for stakeholders.
  • Collaborate with Teams: Work closely with other professionals, such as engineers and environmental scientists, ensuring safe and informed project practices.
  • Stay Updated: Keep abreast of the latest developments and technologies in geology to enhance our practices and efficiencies.

Required Skills and Expectations: You should have a degree in Geology or a related field. Strong analytical skills are essential, allowing you to interpret complex geological data effectively. Proficiency in geological software and tools is expected. You must be detail-oriented and able to work independently as well as in a team setting. Good communication skills are necessary to convey technical information clearly to various stakeholders. A commitment to safety and environmental sustainability is also crucial in this role.

Looking For Lead Process Engineer

Cynosure Corporate Solutions

  • 7 - 10 yrs
  • Chennai
DMAIC Statistical Analysis SPC FMEA Process Improvement Process Mining Power BI Python Minitab Time Series Forecasting Capacity Planning GenAI AI Tools Data Modeling Stakeholder Management
We are looking for a highly analytical Lead Process Engineer to drive operational excellence and process transformation within lending operations. The role involves leveraging advanced statistical methods, process intelligence, and AI-assisted tools to improve efficiency, reduce variation, and enable scalable business operations. This position also includes mentoring junior team members while collaborating with cross-functional stakeholders.

Key Responsibilities:
  • Lead end-to-end process improvement initiatives using DMAIC methodology
  • Apply statistical analysis, DoE, and SPC techniques to enhance process performance
  • Design and optimize operational workflows across lending value streams
  • Build and maintain process intelligence dashboards for real-time insights
  • Conduct FMEA-based risk assessments and implement mitigation strategies
  • Drive AI-assisted process improvements using GenAI tools and automation frameworks
  • Perform capacity forecasting and time series modeling for scaling operations
  • Collaborate with Product, Data, and Operations teams on transformation initiatives
  • Present ROI-driven insights and business cases to senior stakeholders
  • Mentor and develop Process Analysts and Junior Engineers

Required Skills & Qualifications:
  • 7-9 years of experience in process engineering, quality engineering, or operations improvement
  • Strong expertise in Lean Six Sigma methodologies (DMAIC execution experience mandatory)
  • Hands-on experience with statistical tools: DoE, SPC, regression, hypothesis testing
  • Experience with process mining/intelligence tools (Power BI, Celonis, etc.)
  • Proven experience in FMEA and risk assessment frameworks
  • Strong capability in capacity planning and time series forecasting
  • Exposure to AI/GenAI tools in operational environments
  • Proficiency in Excel (advanced), Python or Minitab, and data visualization tools
  • Strong stakeholder management and communication skills

Urgent Requirement For DR Automation Engineer

EPM Staffing Services (OPC) Private Limited

  • 8 - 14 yrs
  • 25.0 Lac/Yr
  • Mumbai
Salesforce Bulk API V2 BigQuery SQL GCP Airflow Orchestration Data Migration
Qualification: Data Migration Engineer

Mandatory Skills:
  • Salesforce Bulk API V2
  • BigQuery SQL
  • GCP
  • Airflow orchestration on GCP
  • SQL performance optimization (GCP BigQuery)
  • Root cause analysis for data issues

Additionally, a good understanding of Salesforce components (e.g., Object Model, Picklists) is required.
  • 0 - 1 yrs
  • 3.0 Lac/Yr
  • Nashik
Project Management Quality Assurance Software Development Team Collaboration Project Coordination Technical Support Data Analysis Documentation Risk Management
Key Responsibilities:
  • Technical Documentation: Maintaining project blueprints, technical manuals, and compliance records.
  • Progress Monitoring: Tracking the project timeline against the master schedule (using tools like MS Project or Excel) and highlighting potential delays.
  • Resource Coordination: Ensuring that parts, tools, and materials arrive at the site or production line exactly when the team needs them.
  • Quality Assurance: Assisting in site visits or shop floor inspections to ensure that work is being done according to the technical specifications.
  • Communication Bridge: Facilitating information flow between departments, for example, explaining design changes to the production team or clarifying technical issues to the client.
  • Risk Mitigation: Identifying small technical issues early, before they snowball into costly project delays.

Data Engineer Jobs For M.C.A Freshers

SECRET TECHNOLOGIES INDIA VMS GROUP

  • 0 - 4 yrs
  • 40.0 Lac/Yr
  • Pune
Data Management Data Analysis Data Mining Informatica PLSQL SQL Oracle SQL Data Collection
As a Data Engineer, your responsibilities will include collecting and analyzing data to help inform business decisions. You will be responsible for data management, ensuring that data is accurate and up-to-date. This will involve using tools such as Informatica, PLSQL, SQL, and Oracle SQL to manipulate and query large datasets.Your skills should include a strong understanding of data analysis techniques, such as data mining and statistical analysis.
  • 0 - 1 yrs
  • 8.0 Lac/Yr
  • Female
  • Mall Road Amritsar
Data Integration Data Warehousing SQL Informatica ETL Hadoop Big Data Python
We are looking for a motivated Data Engineer to join our team. This part-time position allows you to work from home and is suitable for individuals with little to no experience. The ideal candidate will help us manage and process data to ensure it meets the needs of the business.

Key Responsibilities:
  • Data Collection: Gather data from various sources to prepare for analysis. It's important to ensure the data is accurate and up to date.
  • Data Cleaning: Clean and organize raw data to make it usable. This involves removing errors and inconsistencies, which is crucial for reliable analysis.
  • Data Storage: Help store data in databases or cloud storage systems. Proper organization helps in easy access and retrieval of data when needed.
  • Collaboration: Work with other team members to understand their data needs. Communication is key to delivering the right data for their projects.
  • Support: Assist in monitoring data systems and providing technical support. Being proactive in identifying issues helps keep the data flow smooth.

Required Skills and Expectations: Candidates should have a basic understanding of data management principles. Familiarity with data cleaning tools and database management systems is a plus. The ability to learn new software quickly and a strong attention to detail are essential. Good communication skills are important for working with teammates and understanding project requirements. We encourage fresh graduates and those with relevant qualifications to apply.
  • 3 - 8 yrs
  • Bangalore
Web Scraping Python
A Data Extraction Engineer designs extraction systems, not just scripts. You will build and maintain a next-generation data acquisition platform that treats web scraping as a declarative, specification-driven discipline. Instead of hard-coding XPaths for every site, the developer defines what data is needed (using schemas, natural language descriptions, or visual blueprints) and lets intelligent pipelines figure out how to get it.

Key Responsibilities:

Specification-Driven Extraction Engineering:
  • Design and maintain declarative extraction specifications (using Pydantic models, JSON schemas, or domain-specific languages) that describe exactly which fields to capture, their types, and validation rules.
  • Implement pipelines that translate these specifications into executable extraction plans, leveraging both classical (Scrapy, Playwright) and AI-augmented (LLM-based semantic parsing) backends.
  • Build reusable specification libraries for recurring data types (product prices, tariff codes, regulatory texts) to accelerate onboarding of new sources.

Autonomous & Self-Healing Systems:
  • Deploy self-healing spiders that automatically detect website layout changes and repair themselves using Model Context Protocol (MCP) servers (e.g., Scrapy MCP Server, Playwright MCP).
  • Integrate semantic extraction (Scrapy-LLM, custom LLM pipelines) to eliminate selector brittleness: spiders rely on field descriptions, not fragile XPaths.
  • Orchestrate complex, multi-step browsing workflows with agentic frameworks (BMAD/TEA, AutoGPT-like agents) that reason about page state, adapt to anti-bot measures, and correct their own behaviour in real time.

Platform Thinking & Reusability:
  • Move beyond one-off scrapers: build a component-based extraction platform where selectors, login handlers, and pagination logic are shared, versioned, and tested.
  • Implement monitoring, alerting, and automatic rollback for failed extraction runs.
  • Champion ethical crawling by design: rate limiting, robots.txt respect, and compliance with GDPR/CCPA are built into the specification layer, not retrofitted.

Collaboration & Continuous Innovation:
  • Partner with data scientists and domain experts to refine extraction specifications for complex, unstructured domains (e.g., legal texts, tariff classifications).
  • Evaluate and pilot emerging tools to push automation coverage beyond 90%.
  • Document and evangelise specification-driven best practices across the engineering organisation.

Candidate Profile:

Education and Experience:
  • Bachelor's degree in Computer Science
  • 3+ years of experience in web scraping or data extraction

Skills and Competencies:
  • Specification-Driven Extraction: experience defining extraction requirements via schemas (Pydantic, JSON Schema) and executing them through both traditional crawlers and LLM-based semantic parsers.
  • Self-Healing & Semantic Extraction: hands-on use of Scrapy-LLM, Scrapy MCP Server, or similar systems that decouple field definitions from page structure.
  • Agentic Workflows: familiarity with frameworks that give LLMs browser control (Playwright + MCP, BMAD/TEA) to handle complex, non-deterministic crawling tasks.
  • Classical Scraping Fundamentals: you still know how to write a Scrapy spider or a Playwright script when needed, but you actively seek to replace that work with reusable, specification-driven components.
  • Data Validation & Storage: ability to define validation rules within specifications and land clean data into SQL/NoSQL databases or data lakes.
  • Python proficiency: the focus is on an extraction engineer who happens to use Python.
  • HTTP, DOM, XPath, CSS.
  • Basic API integration and authentication flows.

Preferred / Nice-to-Have Skills:
  • Contributions to open-source scraping or AI-automation projects.
  • Experience training or fine-tuning small LLMs for domain-specific extraction.
  • Familiarity with data privacy engineering (GDPR, CCPA) baked into specification design.
  • DevOps light: Docker, CI/CD for testing extraction specifications.

Mindset & Approach (Non-Negotiable): A strong belief that the future of scraping is declarative, not imperative. You'd rather write a schema that says "extract the price" than debug an XPath when a website redesigns. You are looking to shift from code that scrapes to systems that understand extraction.
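The declarative specification idea this posting centres on can be illustrated with a minimal pure-Python validator. In practice Pydantic or JSON Schema would play this role; the `PRODUCT_SPEC` field list and the record shapes below are invented examples, not the employer's actual schema.

```python
# A declarative spec: field name -> (expected type, required?).
# The extraction backend (Scrapy, Playwright, or an LLM parser) is free to
# obtain the data however it likes, as long as the result satisfies the spec.
PRODUCT_SPEC = {
    "title": (str, True),
    "price": (float, True),
    "currency": (str, False),
}

def validate(record, spec):
    """Check one extracted record against a field specification and
    return a list of human-readable violations (empty list = clean)."""
    errors = []
    for field, (ftype, required) in spec.items():
        if field not in record:
            if required:
                errors.append(f"missing required field: {field}")
        elif not isinstance(record[field], ftype):
            errors.append(f"{field}: expected {ftype.__name__}, "
                          f"got {type(record[field]).__name__}")
    return errors

good = {"title": "Widget", "price": 9.99}
bad = {"price": "9.99"}  # missing title, and price extracted as text
```

The point of the pattern is that the spec, not the scraper, is the unit of reuse: when a site redesigns, only the extraction backend changes, while validation and downstream consumers keep working against the same contract.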
  • 5 - 11 yrs
  • 25.0 Lac/Yr
  • Bangalore
Apache Kafka Azure Grafana Data Warehousing
Role: Data Engineer 2.0
Location: Remote
Experience: Min. 5 years
Notice Period: Immediate to 15 days, or serving notice period

Key Responsibilities:
  • Design and implement manual test strategies for real-time streaming use cases using Azure Service Bus, Event Hubs, Kafka, and Azure Functions.
  • Validate Spark Streaming applications, including unbounded data flows, streaming DataFrames, checkpoints, and streaming joins.
  • Develop test plans for containerized microservices deployed on Kubernetes, ensuring scalability and fault tolerance.
  • Test data ingestion and transformation workflows across open table formats like Delta Lake, Apache Iceberg, and Hudi.

Good to Have:
  • Monitoring and troubleshooting system performance using observability stacks such as Prometheus, Grafana, and ELK.
  • Functional and performance testing on analytical databases and query engines such as Trino, StarRocks, and ClickHouse.
  • Testing and validation of data products designed under data mesh architecture, ensuring domain-oriented data quality and governance.

AI/ML Engineer

Kasa Talent Pvt Ltd

  • Fresher
  • 4.0 Lac/Yr
  • Pune
Data Analysis C++ Python LLM AWS Google Cloud Azure AI SQL Data Cleaning
We are seeking a talented AI/ML Engineer to design, develop, and deploy machine learning models that solve real-world business problems.

Key Responsibilities:
  • Develop, train, and optimize machine learning and deep learning models.
  • Design and implement AI solutions for automation, prediction, and data analysis.
  • Work with large datasets to clean, preprocess, and engineer features.
  • Deploy models into production environments and monitor performance.
  • Build scalable ML pipelines and integrate models with applications.
  • Conduct experiments, model evaluations, and performance tuning.
  • Collaborate with cross-functional teams, including data engineers and product managers.
  • Stay updated with the latest research and advancements in AI/ML.

Note: Only Pune-based candidates are eligible to apply.

Interview For Software Developer || B.A - Freshers

SECRET TECHNOLOGIES INDIA VMS GROUP

Testing & Development Back Office Data Entry Technical Support Non-Voice Process BDE (canadian US UK)-In House Business Development Executive On Field Operations Customer Service Customer Service (Web) BPO MNC and Domestic ( Hindi Marathi or Any Local Language) Accounts & Finance Financial Analyst Java-script Software Project Management JEE Agile Methodology Application Software Support Mysql
Dear Candidate,

Give a dynamic start to your career with VMS Group SECRET TECHNOLOGIES INDIA. We are looking for fresher and experienced candidates who want to build a good career in the IT or non-IT field.

Job Type: Full time.
Eligibility Criteria: Fresher / 0-5 years of experience.
Location: Pune or PAN India.
Desired Qualification: Any bachelor's degree (minimum BA), B.Com, B.E, B.Tech, M.Sc, M.C.A, B.C.A; any graduates and post graduates can also apply. Should be comfortable working at any location in Pune or PAN India.

Your application for the following positions stood out to us:
  • Testing & Development
  • Back Office / Data Entry
  • Technical Support, Non-Voice Process
  • BDE (Canadian, US, UK) - In House
  • Business Development Executive, On Field
  • Operations / Customer Service, Customer Service (Web)
  • BPO MNC and Domestic (Hindi, Marathi, or any local language)
  • Accounts & Finance / Financial Analyst

We would like to invite you for a face-to-face or telephone interview at our office to get to know you a bit better. You will meet with Mr. Prem Sir or Aishwarya Ma'am. The interview will last about 20 minutes, and you'll have the chance to discuss the above position and learn more about our company. Please carry at least two copies of your updated resume.

Desired Candidate: Candidates should be dynamic, enterprising, and presentable, with excellent verbal and written communication skills.

If interested, please visit our office Monday to Saturday, anytime between 10 am and 5 pm. Meet: Prem Sir / Ms. Aishwarya Ma'am.
Interview Venue: Building A - 15th, City Vista, Foundation Road, Kharadi, Pune, Maharashtra 411014.
Contact No.: 7066165593 (Aishwarya Ma'am), 7066147637 (Prem Sir).

We will conduct two rounds of interviews:
1. HR round and simple logic-checking P.I. round.
2. Technical HR final round by the Senior HR Manager.

Thanks & Regards,
SECRET TECHNOLOGIES INDIA VMS GROUP
Placement / Contract Labour / Real Estate
Address: Building A - 15th, City Vista, Foundation Road, Kharadi, Pune, Maharashtra

Cloud Engineer - Full Time

Talent Zone Consultant

  • 9 - 15 yrs
  • Bangalore
AWS Azure Docker Kubernetes Terraform Ansible CICD Tools Linux DevOps Integration Network Security Data Management IT Security Windows Server Administration Statistical Programming Troubleshooting Skills
Key Responsibilities (Cloud Engineer):
  • Design, implement, and manage CI/CD pipelines.
  • Automate deployments and infrastructure using tools like Terraform/Ansible.
  • Monitor system performance and ensure high availability.

Requirements:
  • Experience with AWS/Azure.
  • Knowledge of Docker, Kubernetes, and scripting.
  • Strong problem-solving skills.

Brief Summary: Responsible for automating and optimizing cloud infrastructure and deployment processes.

  • 7 - 10 yrs
  • 35.0 Lac/Yr
  • Bangalore
Solution Architecting Data Engineering Design Architect Pipeline Management Data Pipeline Python AWS AWS Cloud Cloud Architect
Key Responsibilities:
  • Requirement Analysis: Collaborate with stakeholders to understand business requirements and data sources, and define the architecture and design of data engineering models to meet these requirements.
  • Architecture Design: Design scalable, reliable, and efficient data engineering models, including algorithms, data pipelines, and data processing systems, to support business requirements and quantitative analysis.
  • Technology Selection: Evaluate (using POCs) and recommend appropriate technologies, frameworks, and tools for building and managing data engineering models, considering factors like performance, scalability, and cost-effectiveness.
  • Data Processing: Develop and implement data processing logic, including data cleansing, transformation, and aggregation, using technologies such as AWS Glue, Batch, and Lambda.
  • Quantitative Analysis: Collaborate with data scientists and analysts to develop algorithms and models for quantitative analysis, using techniques such as regression analysis, clustering, and predictive modeling.
  • Model Evaluation: Evaluate the performance of data engineering models using metrics and validation techniques, and iterate on models to improve their accuracy and effectiveness.
  • Data Visualization: Create visualizations of data and model outputs to communicate insights and findings to stakeholders.

Required Skills:
  • Data Engineering: Understanding of data engineering principles and practices, including data ingestion, processing, transformation, and storage, using tools and technologies such as AWS Glue, Batch, and Lambda.
  • Quantitative Analysis: Proficiency in quantitative analysis techniques, including statistical modeling, machine learning, and data mining, with experience implementing algorithms for regression analysis, clustering, classification, and predictive modeling.
  • Programming Languages: Proficiency in programming languages commonly used in data engineering and quantitative analysis, such as Python, R, Java, or Scala.

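The regression-analysis duties listed in this posting can be illustrated with a minimal sketch. This is a generic closed-form ordinary-least-squares fit in plain Python (the function name and sample data are illustrative, not taken from the posting):

```python
def fit_line(xs, ys):
    """Ordinary least squares for y = a + b*x, using the closed-form
    solution: b = cov(x, y) / var(x), a = mean(y) - b * mean(x)."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    b = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
         / sum((x - mean_x) ** 2 for x in xs))
    a = mean_y - b * mean_x
    return a, b

# Perfectly linear data recovers the generating coefficients of y = 1 + 2x.
a, b = fit_line([1, 2, 3, 4], [3, 5, 7, 9])
```

In practice this would be done with scikit-learn or statsmodels, but the closed form above is what those libraries compute for the single-feature case.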
  • Fresher
  • 4.5 Lac/Yr
  • Bahadurgarh Patiala
Work From Home Data Transformation Big Data Technologies Data Visualization Programming ETL Processes ETL Tools Database Management Data Analysis Data Cleansing Data Quality
As a Data Processing Engineer, you will play a vital role in managing and processing data accurately and efficiently. This position is part-time and allows you to work from the comfort of your home. We welcome freshers who have completed their 10th standard.

Key Responsibilities:
  • Data Entry: Entering data from various sources into databases or spreadsheets accurately and swiftly, ensuring high-quality data for analysis.
  • Data Cleaning: Reviewing and correcting errors in datasets to maintain data integrity and quality, which is crucial for reliable results.
  • Data Analysis Support: Assisting in analyzing processed data to help identify trends and patterns that support decision-making in projects.
  • Reporting: Generating reports based on processed data to present findings clearly and effectively to stakeholders.
  • Collaboration: Working with team members to understand data requirements and provide necessary support for ongoing projects.

Required Skills and Expectations:
You should possess strong attention to detail, as accuracy is essential in data processing. Basic computer skills, including familiarity with spreadsheets and databases, are necessary. Effective communication skills will help you collaborate with teammates and understand project requirements. A willingness to learn and adapt to new tools and technologies will enhance your growth in this role. Being organized and managing time efficiently is important to meet deadlines and accomplish tasks successfully.

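The data-cleaning duties described in this posting usually amount to the kind of routine shown below. This is a generic sketch in plain Python (the field names and validation rules are illustrative assumptions, not part of the posting): it trims stray whitespace, drops exact duplicates, and separates rows missing required fields.

```python
def clean_records(rows, required=("name", "email")):
    """Normalise whitespace, drop exact duplicates, and split rows into
    usable records and records missing required fields."""
    seen, clean, rejected = set(), [], []
    for row in rows:
        normalised = {k: v.strip() for k, v in row.items()}
        key = tuple(sorted(normalised.items()))
        if key in seen:
            continue  # exact duplicate of an earlier row
        seen.add(key)
        if any(not normalised.get(field) for field in required):
            rejected.append(normalised)  # incomplete: route to manual review
        else:
            clean.append(normalised)
    return clean, rejected

raw = [
    {"name": " Asha ", "email": "asha@example.com"},
    {"name": " Asha ", "email": "asha@example.com"},  # duplicate after trimming
    {"name": "Ravi", "email": ""},                    # missing required email
]
clean, rejected = clean_records(raw)
```

Real workloads would use a spreadsheet tool or pandas, but the logic (normalise, deduplicate, validate) is the same.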
  • 2 - 5 yrs
  • 3.0 Lac/Yr
  • Nashik
Project Management Quality Assurance Software Development Team Collaboration Project Coordination Technical Support Data Analysis Documentation Risk Management Time Management Customer Service Reporting Problem Solving Troubleshooting Resource Planning Process Improvement Technical Skills Analytical Skills Project Planning
Key Responsibilities:
  • Technical Design & Drafting: Using CAD software to create layouts for Medical Gas Pipeline Systems (MGPS), nurse call systems, or ambulance interior configurations.
  • Site Supervision: Managing on-site installation teams (technicians, welders, and fitters) to ensure work meets ISO and AIS-125 safety standards.
  • Project Scheduling: Developing timelines (Gantt charts) to ensure hospital wings or ambulance fleets are delivered on time.
  • Quality Control & Testing: Conducting pressure tests on gas lines and load-testing stretchers/cots before handover to the client.
  • Procurement Coordination: Working with the supply chain to ensure specialized medical-grade materials (like degreased copper pipes or reinforced alloys) are available on-site.
  • Regulatory Compliance: Ensuring every aspect of the project adheres to healthcare building codes and medical device regulations.

  • 0 - 1 yrs
  • 3.3 Lac/Yr
  • Nashik
Software Tools Engineer Data Validation Data Analysis Data Manager Data Structures
Job Location: Nashik
Position: Sports Data Analyst
Number of Vacancies: 150-180

Eligibility:
  • Qualification: Any Graduate (completed), especially BCA / BCS / BCom / Diploma / Degree
  • Age Limit: 22 to 26 years
  • Basic computer knowledge
  • Gender: Male (priority) / Female
  • Experience: Freshers

Salary Package: 1.44 LPA to 3.5 LPA (starting salary of approximately 12K per month)

Job Responsibilities:
  • Analyze sports gameplay videos using our advanced software tools
  • Annotate key events, player movements, and in-game strategies
  • Provide accurate and detailed video analysis reports

Hardware Networking Engineer

Impact HR & KM Solutions

  • 1 - 3 yrs
  • 2.0 Lac/Yr
  • Nashik
Microsoft Word Excel Sheet Basic Computers Online Data Entry Computer Skills Data Management
System Monitoring & Operation:
  • Monitor computer systems, networks, and peripheral devices to ensure optimal performance and identify any anomalies or errors.
  • Execute routine batch jobs, scheduled tasks, and system processes as per established procedures.
  • Perform system startup and shutdown procedures as required.
  • Maintain logs of system activities, incidents, and resolutions.

Data Entry & Management:
  • Accurately input data into various software applications, databases, or spreadsheets (e.g., MS Excel, specialized ERP/CRM software, Tally).
  • Verify data for accuracy and completeness, correcting any errors or inconsistencies.
  • Organize and maintain digital files and documents in a structured manner.
  • Perform data backup and recovery procedures as per schedule to prevent data loss.

Printing & Documentation:
  • Manage and operate printers, scanners, and other office equipment.
  • Handle large-scale printing jobs, including reports, invoices, labels, and other essential documents.
  • Ensure proper paper handling, toner/ink replacement, and basic maintenance of printing devices.
  • Assist in maintaining organized physical and digital records.

Basic Troubleshooting & Support:
  • Perform first-level troubleshooting for common hardware and software issues (e.g., printer jams, network connectivity problems, application errors).
  • Escalate complex technical issues to IT support or senior personnel as needed.
  • Assist users with basic computer-related queries and provide guidance on software usage.

System Maintenance & Security:
  • Assist in routine system maintenance tasks, such as disk cleanup, anti-virus scans, and software updates.
  • Ensure adherence to data security protocols and confidentiality policies.
  • Report any suspicious activities or security breaches.

Compliance & Reporting:
  • Adhere to company policies and procedures regarding computer operations and data handling.
  • Generate routine operational reports as requested by management.

AI/ML Engineer

United Technology

  • 5 - 7 yrs
  • 12.0 Lac/Yr
  • Coimbatore
Python Numpy Pandas TensorFlow Docker Git CICD Azure AWS FastAPI Aiml Engineer Data Engineer
We are looking for an AI/ML Engineer with 5+ years of experience in Coimbatore.

Skill Set:
  • Strong Python (Pandas, NumPy, Scikit-learn)
  • Deep learning using TensorFlow or PyTorch
  • Expertise in time-series modeling and feature engineering
  • Experience with ETL/ELT pipelines, Docker, Git, CI/CD
  • Cloud exposure (Azure/AWS) and hybrid deployments
  • API development (FastAPI preferred)

  • 5 - 7 yrs
  • 12.0 Lac/Yr
  • Chennai
Snow Flake Developer Dbt Dagster SQL Python Git Cicd Pipelines Data Modeling Datawarehouse Architecture Claude Copilot Data Extraction
We are looking for a Senior Data Engineer (Snowflake / dbt / Dagster / AI-Assisted Development) with 5+ years of experience in Chennai.

Key Responsibilities:
  • Design and optimize data pipelines from SQL Server to Snowflake.
  • Work with healthcare data formats, including EDI 835 / 837, if applicable.
  • Use AI tools (LLMs, code assistants, automation agents) to improve engineering productivity and quality.

Data Engineer

United Technology

  • 1 - 3 yrs
  • 4.0 Lac/Yr
  • Chennai
Data Integration Data Engineer Hadoop ETL SQL Informatica Apache AWS Big Data Python
We are looking for a Data Engineer with 1 to 3 years of experience in Chennai. Immediate joiners preferred.

Python Architect (12-19 Years)

Cynosure Corporate Solutions

  • 12 - 19 yrs
  • Chennai
Python Architecture Advanced Python Development System Design Microservices Data Engineering AIML Platforms API Design Performance Optimization Code Governance
We are looking for a highly experienced Python Architect to design and lead large-scale, enterprise-grade Python solutions for AI, data, and analytics platforms. The role requires deep hands-on expertise in Python across architecture, development, and production systems, supporting end-to-end AI and data-driven solutions.

Key Responsibilities:
  • Architect scalable, high-performance systems using Python as the core technology
  • Define technical architecture for AI, data engineering, and analytics platforms
  • Lead design and development of microservices, APIs, and backend frameworks in Python
  • Ensure production-ready implementations with a strong focus on performance, security, and reliability
  • Collaborate with data science, ML, DevOps, and product teams to operationalize AI solutions
  • Establish coding standards, best practices, and architectural guidelines
  • Review code, mentor senior engineers, and provide technical leadership
  • Drive innovation and continuous improvement in Python-based system design

Required Skills & Qualifications:
  • 12-18 years of experience working fully and extensively in Python
  • Strong expertise in Python system architecture and large-scale backend design
  • Experience supporting AI/ML, data engineering, or analytics platforms
  • Solid understanding of microservices, REST APIs, and distributed systems
  • Hands-on experience with cloud platforms (AWS/Azure/GCP)
  • Strong problem-solving skills and ability to lead complex technical initiatives
  • Excellent communication and stakeholder collaboration skills

Opening For Data Engineer

Cynosure Corporate Solutions

  • 3 - 9 yrs
  • Delhi
Apache Python Hadoop SCALA
Job Description:
We are looking for Data Engineers to join our team. You will use various methods to transform raw data into useful data systems; for example, you'll create algorithms and conduct statistical analysis. Overall, you'll strive for efficiency by aligning data systems with business goals. To succeed in this position, you should have strong analytical skills and the ability to combine data from different sources. Data engineer skills also include familiarity with several programming languages and knowledge of machine learning methods.

Job Requirements:
  • Participate in the customer's system design meetings and collect the functional/technical requirements.
  • Build data pipelines for consumption by the data science team.
  • Skillful in the ETL process and tools.
  • Clear understanding of and experience with Python and PySpark, or Spark and Scala, along with Hive, Airflow, Impala, Hadoop, and RDBMS architecture.
  • Experience in writing Python programs and SQL queries.
  • Experience in SQL query tuning.
  • Experienced in shell scripting (Unix/Linux).
  • Build and maintain data pipelines in Spark/PySpark with SQL and Python or Scala.
  • Knowledge of cloud technologies (Azure/AWS/GCP, etc.) is an additional advantage.
  • Good to have: knowledge of Kubernetes, CI/CD concepts, and Apache Kafka.
  • Suggest and implement best practices in data integration.
  • Guide the QA team in defining system integration tests as needed.
  • Split the planned deliverables into tasks and assign them to the team.
  • Maintain and deploy the ETL code, following the Agile methodology.
  • Work on optimization wherever applicable.
  • Good oral, written, and presentation skills.

Preferred Qualifications:
  • Degree in Computer Science, IT, or a similar field; a Master's is a plus.
  • Hands-on experience with Python and PySpark, or with Spark and Scala.
  • Great numerical and analytical skills.
  • Working knowledge of cloud platforms such as MS Azure, AWS, etc.
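The extract-transform-load pattern this posting centers on can be sketched in a few lines. The example below is a minimal, self-contained illustration using Python's built-in sqlite3 module in place of a real warehouse (table and column names are illustrative assumptions): extract raw rows, reject bad records, aggregate, and load the result.

```python
import sqlite3

def run_etl(conn):
    """Extract raw order rows, transform (filter bad rows, aggregate per
    customer), and load the result into a reporting table."""
    cur = conn.cursor()
    # Extract: pull raw rows from the source table.
    rows = cur.execute("SELECT customer, amount FROM raw_orders").fetchall()
    # Transform: drop non-positive amounts, then aggregate per customer.
    totals = {}
    for customer, amount in rows:
        if amount is None or amount <= 0:
            continue  # reject bad records instead of loading them
        totals[customer] = totals.get(customer, 0) + amount
    # Load: write the aggregated result to the target table.
    cur.execute("CREATE TABLE IF NOT EXISTS customer_totals "
                "(customer TEXT PRIMARY KEY, total REAL)")
    cur.executemany("INSERT OR REPLACE INTO customer_totals VALUES (?, ?)",
                    sorted(totals.items()))
    conn.commit()

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE raw_orders (customer TEXT, amount REAL)")
conn.executemany("INSERT INTO raw_orders VALUES (?, ?)",
                 [("acme", 100.0), ("acme", 50.0),
                  ("globex", -5.0), ("globex", 75.0)])
run_etl(conn)
result = conn.execute(
    "SELECT customer, total FROM customer_totals ORDER BY customer").fetchall()
```

A production pipeline would express the same steps in Spark/PySpark or an orchestrated ETL tool; only the engine changes, not the shape of the work.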

Looking For Data Engineer

Cynosure Corporate Solutions

  • 8 - 14 yrs
  • Chennai
Data Engineer Python AWS
Responsibilities:
  • Designing and implementing data pipelines to collect, clean, and transform data from various sources.
  • Building and maintaining data storage and processing systems, such as databases, data warehouses, and data lakes.
  • Ensuring data is properly secured and protected.
  • Developing and implementing data governance policies and procedures.
  • Collaborating with business analysts, data analysts, data scientists, and other stakeholders to understand their data needs and ensure they have access to the data they need.
  • Sharing knowledge with the wider business, working with other BAs and technology teams to make sure processes and ways of working are documented.
  • Collaborating with Big Data Solution Architects to design, prototype, implement, and optimize data ingestion pipelines so that data is shared effectively across various business systems.
  • Ensuring the design, code, and procedural aspects of the solution are production ready in terms of operational, security, and compliance standards.
  • Participating in day-to-day project and agile meetings and providing technical support for faster resolution of issues.
  • Clearly and concisely communicating the status of items and blockers to the business.
  • Having end-to-end knowledge of the data landscape within the company.

Skills & Experience:
  • 10+ years of design and development experience with big data technologies like Azure, AWS, or GCP; Azure and Databricks preferred, with experience in Azure DevOps.
  • Experience in data visualization technology in DL, such as Power BI.
  • Proficient in Python, PySpark, and SQL.
  • Proficient in querying and manipulating data from various databases (relational and big data).
  • Experience writing effective and maintainable unit and integration tests for ingestion pipelines.
  • Experience using static analysis and code quality tools and building CI/CD pipelines.
  • Excellent communication, problem-solving, and leadership skills, with the ability to work well in a fast-paced, dynamic environment.
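The "maintainable unit and integration tests for ingestion pipelines" requirement above often boils down to reconciliation-style checks after each load. A generic sketch (all names and fields are illustrative assumptions) comparing row counts and amount totals between a source extract and the loaded target:

```python
def reconcile(source_rows, target_rows, amount_field="amount"):
    """Compare row counts and amount totals between a source extract and
    the loaded target; returns a dict of mismatches (empty means clean)."""
    issues = {}
    if len(source_rows) != len(target_rows):
        issues["row_count"] = (len(source_rows), len(target_rows))
    src_total = sum(r[amount_field] for r in source_rows)
    tgt_total = sum(r[amount_field] for r in target_rows)
    if abs(src_total - tgt_total) > 1e-9:
        issues["amount_total"] = (src_total, tgt_total)
    return issues

src = [{"id": 1, "amount": 10.0}, {"id": 2, "amount": 5.5}]
ok = reconcile(src, list(src))   # clean load: no mismatches
bad = reconcile(src, src[:1])    # a dropped row should be flagged
```

In a real pipeline these assertions would run inside the test suite (or a post-load data quality step), with the two row sets fetched from the actual source and target systems.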