
ETL Jobs

  • Fresher
  • 4.5 Lac/Yr
  • Gajularamaram Hyderabad
Data Quality Assurance, Data Transformation, ETL Processes, Data Manipulation, SQL Queries, Data Profiling, Structured Data Formats, ETL Tools, Database Management, Data Warehousing, Technical Documentation, Data Extraction, Data Validation, Data Analysis, Data Integration, Data Loading, Data Mapping, Data Cleansing, Data Conversion Tools, Data Migration
As a Data Conversion Specialist, you will fit into our team by transforming and organizing various types of data, ensuring its accuracy and usability. This part-time position allows you to work from home, making it suitable for recent graduates or individuals seeking flexible hours.

Key responsibilities include:
- **Data Input and Management**: You will be responsible for entering data into our systems accurately. This includes checking for errors and making sure that information is well-organized.
- **Data Conversion**: You will convert data from one format to another, ensuring that it is easy to access and use. Familiarity with various software tools may be beneficial.
- **Quality Assurance**: It will be your job to review data for any discrepancies. Checking your work meticulously helps maintain high standards of data quality.
- **Collaboration**: You will work closely with team members, communicating any issues or suggestions to improve processes and workflows effectively.

To excel in this position, you should have a keen eye for detail and be comfortable using computers. Strong organizational skills are essential to manage tasks efficiently. A positive attitude and the ability to work independently from home are also important. As a fresher, you will receive training and support as you grow into this role.
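The "convert data from one format to another" duty above can be illustrated with a minimal sketch. This is a hypothetical example (the `csv_to_json` helper and the sample data are invented, not from the posting), assuming CSV-to-JSON as one common conversion:

```python
import csv
import io
import json

def csv_to_json(csv_text: str) -> str:
    """Convert CSV text into a JSON array of row objects."""
    rows = list(csv.DictReader(io.StringIO(csv_text)))
    return json.dumps(rows, indent=2)

# Illustrative sample input; real conversion jobs would read from files.
sample = "name,city\nAsha,Hyderabad\nRavi,Chennai\n"
print(csv_to_json(sample))
```

The same pattern (parse one format, serialize another) generalizes to XML, Excel exports, and fixed-width files with the appropriate parser.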
  • 0 - 1 yrs
  • 8.0 Lac/Yr
  • Female
  • Mall Road Amritsar
Data Integration, Data Warehousing, SQL, Informatica, ETL, Hadoop, Big Data, Python
We are looking for a motivated Data Engineer to join our team. This part-time position allows you to work from home and is suitable for individuals with little to no experience. The ideal candidate will help us manage and process data to ensure it meets the needs of the business.

**Key Responsibilities:**
- **Data Collection:** Gather data from various sources to prepare for analysis. It's important to ensure the data is accurate and up-to-date.
- **Data Cleaning:** Clean and organize raw data to make it usable. This involves removing errors and inconsistencies, which is crucial for reliable analysis.
- **Data Storage:** Help in storing data in databases or cloud storage systems. Proper organization helps in easy access and retrieval of data when needed.
- **Collaboration:** Work with other team members to understand their data needs. Communication is key to delivering the right data for their projects.
- **Support:** Assist in monitoring data systems and providing technical support. Being proactive in identifying issues helps keep the data flow smooth.

**Required Skills and Expectations:**
Candidates should have a basic understanding of data management principles. Familiarity with data cleaning tools and database management systems is a plus. The ability to learn new software quickly and a strong attention to detail are essential. Good communication skills are important for working with teammates and understanding project requirements. We encourage fresh graduates and those with relevant qualifications to apply.
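The data-cleaning step described above (removing errors and inconsistencies) can be sketched in plain Python. This is an illustrative example only; the `clean_rows` helper and the sample records are hypothetical, not part of the posting:

```python
def clean_rows(rows):
    """Trim whitespace, drop incomplete rows, and remove duplicates."""
    seen = set()
    cleaned = []
    for row in rows:
        trimmed = {k: v.strip() for k, v in row.items() if v is not None}
        if len(trimmed) < len(row) or any(v == "" for v in trimmed.values()):
            continue  # a missing or empty field makes the row unusable
        key = tuple(sorted(trimmed.items()))
        if key in seen:
            continue  # exact duplicate of a row we already kept
        seen.add(key)
        cleaned.append(trimmed)
    return cleaned

raw = [
    {"id": "1", "name": " Asha "},
    {"id": "1", "name": "Asha"},   # duplicate after trimming
    {"id": "2", "name": None},     # missing value
]
print(clean_rows(raw))
```

In practice the same three checks (normalize, drop incomplete, de-duplicate) are what tools like pandas or Great Expectations automate at scale.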
  • 4 - 6 yrs
  • 7.0 Lac/Yr
  • Chennai
Data Governance, Data Lake, Data Loading, Data Pipelines, Data Transformation, Query Optimization, Performance Tuning, Data Architecture, Data Warehousing, SQL, ETL, Scripting, Data Integration, Data Migration, Data Modeling, Database Design, Snowflake, Python, Big Data, Cloud Computing
We are looking for a Certified Snowflake Developer with 4 to 6 years of experience in Chennai.
- Strong knowledge of SQL
- Experience with Snowflake architecture
- Understanding of data warehousing concepts
- Experience with ETL/ELT tools
- Knowledge of cloud platforms (AWS, Azure, GCP)
- Programming knowledge in Python, Java, or Scala
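The "strong knowledge of SQL" plus ETL/ELT combination above boils down to transforming staged data into reporting tables with SQL. A minimal sketch, using in-memory SQLite as a stand-in for Snowflake (the `stg_orders` and `fct_region_sales` names are invented for illustration):

```python
import sqlite3

# SQLite stands in for a cloud warehouse here; real Snowflake work
# would use its connector, but the SQL pattern is the same.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE stg_orders (order_id INTEGER, region TEXT, amount REAL);
INSERT INTO stg_orders VALUES
    (1, 'south', 120.0),
    (2, 'south', 80.0),
    (3, 'north', 50.0);
-- Transform step: aggregate staging data into a reporting table.
CREATE TABLE fct_region_sales AS
SELECT region, COUNT(*) AS order_count, SUM(amount) AS total_amount
FROM stg_orders
GROUP BY region;
""")
for row in conn.execute("SELECT * FROM fct_region_sales ORDER BY region"):
    print(row)
```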

Looking For Data Engineer

BSRI Solutions Pvt Ltd

  • 3 - 5 yrs
  • 16.0 Lac/Yr
  • Chennai
Python, Pyspark Developer, Scala, SQL, Hive, Hadoop, Google Cloud Platform, Kafka Developer, Infrastructure as Code, GitHub, Agile Methodology, ETL
Required Qualifications:
- 3+ years of demonstrated ability with Hive, Python, Spark/Scala, SQL, etc.
- Google Cloud Platform experience: BigQuery, Cloud Storage, Dataproc, Dataflow, Cloud Composer, Cloud SQL, Pub/Sub, Terraform, etc.
- Experience with the Hadoop ecosystem, Kafka, and PCF cloud services
- Familiar with big data and machine learning tools and platforms
- Experience with BI tools such as Alteryx, DataStage, QlikSense, etc.
- Design data pipelines and data robots; take a vision and bring it to life
- Master data engineer; mentors others; works closely with IT architects to set strategy and design projects
- Provide extensive technical and strategic advice and guidance to key stakeholders around data transformation efforts
- Redesign data flows to prevent recurring data issues
- Strong analytical and problem-solving skills
- Excellent oral and written communication skills, as well as facilitation and presentation skills, and an engaging presentation style
- Ability to work as a global team member, as well as independently, in a changing environment, and to prioritize
- Ability to establish and maintain coordinated and effective working relationships with application implementation teams, IT project teams, business customers, and end users
- Ability to deliver work within deadlines
- Experience with agile/lean methodologies
- Experience working independently and with minimal supervision
- Experience with Test Driven Development and software craftsmanship
- Experience with GitHub, AccuRev, or other version-control systems
- Experience with PuTTY
- Experience with DataStage
- Strong communication skills
- Ability to illustrate and convey ideas and prototypes effectively with team and partners
- Presence demonstrating confidence, the ability to learn quickly, and the ability to influence and shape ideas

Key Skills Required - Data Engineer:
- Python / PySpark / Scala
- SQL & Hive
- Hadoop Ecosystem
- Data Pipeline Design & ETL Development
- Google Cloud Platform (BigQuery, Dataproc, Dataflow, Cloud Storage)
- Kafka / Streaming Data Processing
- Terraform (Infrastructure as Code)
- DataStage or Similar ETL Tools
- Version Control (GitHub or equivalent)
- Agile Methodologies
- Strong Analytical & Problem-Solving Skills
- Stakeholder Collaboration & Communication

Nice to Have:
- Cloud Composer, Cloud SQL, Pub/Sub
- BI Tools (Alteryx, QlikSense)
- Machine Learning Platform Exposure
- Test Driven Development (TDD)
- Mentoring & Technical Leadership

  • 2 - 5 yrs
  • 20.0 Lac/Yr
  • Bhubaneswar
Ab Initio, ETL, SQL, UNIX Shell Scripting
We are hiring Ab Initio professionals with 3-6 years of experience in any domain who have knowledge of Ab Initio and PL/SQL. Selected candidates will undergo a mandatory 3-month training program to align with project standards. After successful completion of training and clearing the interview, candidates will be onboarded as full-time Ab Initio Developers. The training location is Bhubaneswar.

Training Details:
- Training Duration: 3 months
- Stipend During Training: ₹8,000 - ₹10,000 per month

Post-Training Employment:
- Role: Ab Initio Developer
- CTC After Selection: ₹9 - 10 LPA
- Employment Type: Full-Time, Permanent

Interested candidates can share their CV at Rekha.C@eagledrift.com
  • 4 - 10 yrs
  • 5.0 Lac/Yr
  • Bangalore
ETL, ELT, SQL, Python, dbt, Spark, Hadoop, Cloud Data, CI/CD, Data Security, Data Warehousing
Design, build, and maintain ETL/ELT data pipelines and data lake solutions to support analytics and AI/ML use cases. Ensure data quality, performance, and reliability across enterprise data platforms.

Key Responsibilities:
- Pipeline Development
- Data Lake Engineering
- Performance & Optimization
- Collaboration & Support

Required Skills & Experience:
- 4+ years of experience in data engineering or ETL development.
- Proficiency in SQL and Python (or Scala/Java) for data transformations.
- Hands-on with ETL tools (Informatica, Talend, dbt, SSIS, Glue, or similar).
- Exposure to big data technologies (Hadoop, Spark, Hive, Delta Lake).
- Familiarity with cloud data platforms (AWS Glue/Redshift, Azure Data Factory/Synapse, GCP Dataflow/BigQuery).
- Understanding of workflow orchestration (Airflow, Oozie, Prefect, or Temporal).

Preferred Knowledge:
- Experience with real-time data pipelines using Kafka, Kinesis, or Pub/Sub.
- Basic understanding of data warehousing and dimensional modeling.
- Exposure to containerization and CI/CD pipelines for data engineering.
- Knowledge of data security practices (masking, encryption, RBAC).

Education & Certifications:
- Bachelor's degree in Computer Science, IT, or a related field.
- Preferred certifications:
  - AWS Data Analytics Specialty / Azure Data Engineer Associate / GCP Data Engineer.
  - dbt or Informatica/Talend certifications.
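The extract-transform-load pattern that roles like this revolve around can be sketched in a few functions. A minimal illustration only (the record shapes and the `warehouse` list are invented; production pipelines use tools like those listed above):

```python
def extract(source):
    """Extract: read raw records (here, from an in-memory list)."""
    return list(source)

def transform(records):
    """Transform: normalize names and derive a total field."""
    out = []
    for r in records:
        out.append({
            "customer": r["customer"].strip().title(),
            "total": r["qty"] * r["unit_price"],
        })
    return out

def load(records, target):
    """Load: append transformed records to the target store."""
    target.extend(records)
    return len(records)

source = [
    {"customer": " asha rao ", "qty": 2, "unit_price": 50.0},
    {"customer": "RAVI KUMAR", "qty": 1, "unit_price": 30.0},
]
warehouse = []
loaded = load(transform(extract(source)), warehouse)
print(loaded, warehouse)
```

In an ELT variant, `load` runs before `transform` and the transformation happens inside the warehouse itself (e.g., with dbt-managed SQL).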

Looking For Data Engineer

InfiCare Technologies

  • 10 - 15 yrs
  • 22.5 Lac/Yr
  • Delhi
Azure, AWS, ETL, Data Factory, Data Warehousing, ETL Tool, SQL
Key Responsibilities:
- Design and manage data pipelines to transform and integrate structured and unstructured data.
- Ensure high data quality and performance.
- Support analytics, reporting, and business intelligence needs by preparing reliable data sets and models for stakeholders.
- Collaborate with Analysts, Digital Project Managers, Developers, and business teams to ensure data accessibility and usefulness.
- Enforce standards for data governance, security, and cost-effective operations.

Ideal candidates will thrive in a collaborative, mission-focused environment and excel in ETL/ELT engineering. They should have experience building scalable data solutions using modern data engineering technologies that impact organizational outcomes.

Required Qualifications:
- Strong proficiency in Structured Query Language (SQL) and at least one programming language such as Python or Scala.
- Hands-on experience developing ETL or ELT pipelines.
- Experience with cloud-native data services (e.g., AWS Glue, AWS Redshift, Azure Data Factory, Azure Synapse, Databricks).
- Good understanding of data modeling and data warehousing concepts.

Desired Qualifications:
- Design, build, and optimize scalable ETL or ELT pipelines handling both structured and unstructured data.
- Ingest and integrate data from internal and external sources into data lakes or data warehouses.
- Ensure that processed data is accurate, complete, and secure.

Outcomes include well-documented, automated pipelines that support downstream analytics without bottlenecks or data errors.
  • 2 - 8 yrs
  • 5.0 Lac/Yr
  • Delhi
Industrial Sales, B2B Sales, Industrial Product Sales, Plant Machinery, Packaging, ETL Tool, Construction, Engineering, Communication, MS Office, CRM Tools, Customer Relationship
Location: Okhla Phase 1, New Delhi

Job Description:
We are looking for a results-driven Industrial Sales Professional to promote and sell our range of industrial products and solutions to B2B clients. The candidate will be responsible for achieving sales targets, building long-term customer relationships, and identifying new business opportunities within the industrial sector.

Key Responsibilities:
- Develop and manage relationships with industrial clients, distributors, and key decision-makers.
- Generate new business leads through market research, cold calls, and client visits.
- Understand customer needs and recommend appropriate products or solutions.
- Achieve assigned sales targets and contribute to company growth.
- Prepare and present proposals, quotations, and technical specifications.
- Coordinate with internal teams (operations, logistics, accounts) to ensure smooth order execution.
- Maintain accurate records of client interactions and sales activities in CRM.
- Participate in industry exhibitions, trade shows, and networking events.
- Provide market feedback on pricing, competition, and emerging trends.

Required Skills & Qualifications:
- Bachelor's degree in Engineering, Science, or Business (preferred).
- Experience in industrial product sales / B2B sales (machinery, tools, packaging, construction, or engineering sectors preferred).
- Strong communication and negotiation skills.
- Self-motivated, target-oriented, and able to work independently.
- Willingness to travel as required.
- Good knowledge of MS Office and CRM tools.

Contact: Riya Mishra 8370014003
  • 4 - 10 yrs
  • Hyderabad
Data Warehouse, Data Testing, ETL, ELT
Data Tester - Hyderabad
Experience: 4+ years | Location: Hyderabad | Employment Type: Full-time

About the Role:
We're looking for a skilled Data Tester to ensure data accuracy, integrity, and reliability across data warehouses, integrations, and curated business-ready layers. The ideal candidate will have hands-on experience in ETL/ELT testing, SQL-based validation, and data quality assurance within modern cloud environments.

Key Responsibilities:
- Test and validate data warehouses and integrations across multiple data sources.
- Verify Business-Ready Datasets (BRD) for correctness and completeness.
- Develop and execute test cases for ETL/ELT pipelines, ensuring data consistency.
- Validate data models and data flow from source to analytical layer.
- Conduct end-to-end integration testing, ensuring compliance with business rules and transformation logic.
- Collaborate with data engineers and analysts to define testing strategies and resolve defects.
- Perform performance and scalability testing for data pipelines.
- Automate data validation and regression testing using suitable tools and frameworks.
- Maintain clear test documentation and reporting for transparency and traceability.

Required Skills:
- Proven experience in data warehouse testing and ETL/ELT validation.
- Hands-on with data testing tools and data quality frameworks.
- Strong SQL skills for querying and validating data (preferably in Google BigQuery or other cloud platforms).
- Excellent attention to detail and analytical thinking.
- Strong collaboration and communication skills.

Preferred Skills:
- Familiarity with Airflow for workflow orchestration.
- Working knowledge of Python for test automation.
- Understanding of metadata management and data governance processes.
- Exposure to automation tools for data testing.

Interested candidates can apply at: careers@rachisutech.com
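The SQL-based validation this role centers on often starts with source-versus-target checks after a load. A minimal sketch using in-memory SQLite as a stand-in for the warehouse (the `src_claims`/`tgt_claims` tables and `validate_row_counts` helper are invented for illustration):

```python
import sqlite3

def validate_row_counts(conn, source, target):
    """Compare row counts between a source and a target table."""
    src = conn.execute(f"SELECT COUNT(*) FROM {source}").fetchone()[0]
    tgt = conn.execute(f"SELECT COUNT(*) FROM {target}").fetchone()[0]
    return {"source_rows": src, "target_rows": tgt, "match": src == tgt}

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE src_claims (id INTEGER, amount REAL);
CREATE TABLE tgt_claims (id INTEGER, amount REAL);
INSERT INTO src_claims VALUES (1, 10.0), (2, 20.0), (3, 30.0);
-- Simulate a lossy load: one row never reaches the target.
INSERT INTO tgt_claims SELECT * FROM src_claims WHERE id <= 2;
""")
print(validate_row_counts(conn, "src_claims", "tgt_claims"))
```

Real suites extend the same idea to checksums, null-rate checks, and business-rule assertions, often wrapped in frameworks such as Great Expectations or dbt tests.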

Senior Database Analyst

Indievisa Immigration Services Pvt Ltd

Database Administration, Data Administrator, Data Analyst, Database Algorithm Engineer, Database Administration, Database Designer, Data Care Solutions, Data Conversion Operator, Data Analysis, Data Architect, Data Encoder, Data Operator, Backup and Recovery, Data Quality, Database Security, ETL Processes, Normalization, Performance Tuning, Query Optimization, Big Data Technologies, Data Mining, Indexing, Data Migration, Stored Procedures, Relational Databases, Database Design, Reporting Tools, Data Modeling, Data Warehousing
Database analysts design, develop and administer data management solutions using database management software. Data administrators develop and implement data administration policy, standards and models. They are employed in information technology consulting firms and in information technology units throughout the private and public sectors.

This group performs some or all of the following duties:

Database analysts:
- Collect and document user requirements
- Design and develop database architecture for information systems projects
- Design, construct, modify, integrate, implement and test data models and database management systems
- Conduct research and provide advice to other informatics professionals regarding the selection, application and implementation of database management tools
- Operate database management systems to analyze data and perform data mining analysis
- May lead, co-ordinate or supervise other workers in this group.

Data administrators:
- Develop and implement data administration policy, standards and models
- Research and document data requirements, data collection and administration policy, data access rules and security
- Develop policies and procedures for network and/or Internet database access and usage and for the backup and recovery of data
- Conduct research and provide advice to other information systems professionals regarding the collection, availability, security and suitability of data
- Write scripts related to stored procedures and triggers
- May lead and co-ordinate teams of data administrators in the development and implementation of data policies, standards and models.
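The "scripts related to stored procedures and triggers" duty can be illustrated with a small trigger that maintains an audit trail. SQLite stands in for a production RDBMS here, and the table and trigger names are invented for illustration:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE accounts (id INTEGER PRIMARY KEY, balance REAL);
CREATE TABLE audit_log (account_id INTEGER, old_balance REAL, new_balance REAL);
-- The trigger records every balance change automatically,
-- so application code cannot forget to write the audit row.
CREATE TRIGGER trg_balance_audit
AFTER UPDATE OF balance ON accounts
BEGIN
    INSERT INTO audit_log VALUES (OLD.id, OLD.balance, NEW.balance);
END;
""")
conn.execute("INSERT INTO accounts VALUES (1, 100.0)")
conn.execute("UPDATE accounts SET balance = 150.0 WHERE id = 1")
print(conn.execute("SELECT * FROM audit_log").fetchall())
```

Production databases (SQL Server, Oracle, PostgreSQL) add richer procedural languages on top of the same OLD/NEW trigger mechanism.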
  • 5 - 10 yrs
  • Bangalore
Snowflake, ETL Tool, ELT, Informatica, dbt, Azure, Server, AWS, GCP
Job Title: Technical Project Manager (Offshore)
Location: Remote
Engagement: Contract

Key Requirements:

Experience:
- 5+ years in technical project management
- Minimum 2 years in data migration projects

Technical Skills:
- Strong hands-on experience with Snowflake: architecture, performance tuning, and data modeling
- Solid understanding of ETL/ELT tools such as Informatica, dbt, etc.
- Familiarity with cloud platforms: AWS, Azure, or GCP

Leadership & Communication:
  • 3 - 5 yrs
  • 7.5 Lac/Yr
  • Andheri East Mumbai +1
.NET, SQL Server, Oracle, PostgreSQL, NoSQL, ASP.NET, C#, Web Services, JavaScript, JSON, ETL, HTML, CSS, XSLT, MVC, jQuery
Job Description: Technical Consultant

Trishiv AI Tech Services is hiring Technical Consultants for software development on enterprise projects with our client E4 Software Services Pvt. Ltd in Mumbai. The role involves contributing to CRMNEXT solutions as part of an experienced tech team.

Responsibilities:
- Develop, test, and maintain .NET software applications
- Work with databases: SQL Server, PostgreSQL, Oracle, NoSQL
- Design and integrate APIs and web services
- Collaborate with internal and client teams for scalable solutions
- Troubleshoot and optimize software performance

Skills:
- .NET, ASP.NET, C#, SQL, Web Services, JavaScript, JSON
- Database management (SQL Server/PostgreSQL/Oracle/NoSQL)
- ETL concepts; HTML, CSS, XSLT, MVC, jQuery (preferred)
- Front-end and cloud/DevOps exposure is a plus

Eligibility:
- B.Tech / B.E / BCA / B.Sc (CS/IT/Engg)
- 0-3 years of experience in software development or consulting

Type: Full-time
Location: Mumbai (Hybrid / On-site)
Salary: ₹30,000 - ₹60,000 per month

Join innovative teams at Trishiv AI Tech and E4 Software to work on enterprise AI and technology projects.
  • 8 - 12 yrs
  • 16.0 Lac/Yr
  • Hyderabad
Data Governance Officer, Stakeholder Management, Communication, SQL, Data Management, Asset Management, Relational Database, Investment Management, ELT Process, Data Dimensional Modelling, ETL
We are seeking an experienced Data Modeler with strong expertise in the Private Equity and Investment Management domain. The ideal candidate will have hands-on experience in canonical data modeling, Azure Databricks (Medallion Architecture), and financial data governance. This role requires close collaboration with data engineering, business stakeholders, and product owners to design scalable, accurate, and consistent data solutions for enterprise-wide usage.

Key Responsibilities:
- Design and implement canonical data models to support private equity and investment business processes.
- Develop logical and physical data models aligned with enterprise architecture and cloud strategy.
- Work within Azure Databricks Medallion Architecture (Bronze/Silver/Gold layers) to design efficient and scalable data flows.
- Collaborate with data engineering teams to implement models and optimize performance.
- Gather requirements from business stakeholders and product owners; translate them into robust data models.
- Ensure data standardization, lineage, and governance across applications and integrations.
- Create and maintain detailed documentation of data models, metadata, and business definitions.
- Provide thought leadership on canonical modeling, financial data integration, and cloud-based data architecture.

Required Qualifications & Skills:
- 8-12 years of experience as a Data Modeler or in a similar data architecture role.
- Strong domain expertise in Private Equity, Investment Management, or Asset Management (fund structures, portfolio data, investor reporting, valuations).
- Proven expertise in canonical models and developing logical/physical data models.
- Hands-on experience with Azure Databricks & Medallion architecture.
- Proficiency in SQL, relational databases, dimensional modeling, and ETL/ELT processes.
- Strong background in data governance, metadata management, and data cataloging.
- Knowledge of financial data standards and regulatory reporting frameworks.
- Excellent communication and stakeholder management skills.

Must-Have Experience:
- Private Equity / Investment domain expertise.
- Cloud-based architecture (Azure preferred).
- Financial regulatory reporting knowledge.

Educational Qualification:
- Bachelor of Engineering / Bachelor of Technology (B.E./B.Tech.) or equivalent in Computer Science, Engineering, or a related field.
  • 4 - 10 yrs
  • 50+ Lakh/Yr
  • Togo
Data Integration, ETL, ETL Tool, Data Warehousing, Scala
Job Description:
We are seeking a highly skilled and experienced Data Engineer to help shape and scale our supply chain and operations analytics infrastructure. In this role, you will work closely with cross-functional teams, including Operations, Finance, and Analytics, to design, build, and monitor scalable, production-grade data pipelines. Your work will be critical to driving data-informed decisions across the business.

What You'll Do:
- Develop and maintain automated ETL pipelines using Python, Snowflake SQL, and related technologies.
- Ensure robust data quality through unit testing, validation, and continuous monitoring.
- Collaborate with stakeholders to ingest and transform large healthcare datasets with accuracy and efficiency.
- Leverage AWS services such as S3, DynamoDB, Batch, and Step Functions for data integration and deployment.
- Optimize performance for pipelines processing large-scale datasets (1GB+).
- Translate business requirements into reliable, scalable data solutions.

What You Bring:
- 4+ years of hands-on experience as a Data Engineer or in a similar role.
- Proven expertise in Python, SQL, and Snowflake for data engineering tasks.
- Strong experience building and maintaining production-grade ETL pipelines.
- Solid understanding of data validation, transformation, and debugging practices.
- Prior experience with healthcare or claims datasets is highly preferred.
- Practical knowledge of AWS technologies: S3, DynamoDB, Batch, Step Functions.
- Experience working with large datasets and complex data environments.
- Excellent verbal and written English communication skills.

Work Schedule:
- Full-time remote position (40 hours/week).
- Working hours must align with U.S. Central Time Zone (CT).
  • 5 - 10 yrs
  • 40.0 Lac/Yr
  • Hyderabad
AWS, Python, AWS Data Engineer, Terraform, ETL Tool, CI/CD
About the Role:
We are looking for a highly skilled and experienced Senior Data Engineer to join our team in Hyderabad. The ideal candidate will bring strong technical expertise in building scalable data platforms and pipelines using modern technologies such as Python, Scala, AWS, Redshift, Terraform, Jenkins, and Docker. This role demands a hands-on professional who thrives in a fast-paced, collaborative environment and is eager to solve complex data problems.

Key Responsibilities:
- Design, build, and optimize robust, scalable, and secure data pipelines and platform components.
- Collaborate with data scientists, analysts, and engineering teams to ensure seamless data flow, integration, and availability across systems.
- Develop infrastructure as code using Terraform to automate provisioning and environment management.
- Manage containerized services and workflows using Docker.
- Set up, manage, and optimize CI/CD pipelines using Jenkins for continuous integration and deployment.
- Optimize performance, scalability, and reliability of large-scale data systems on AWS.
- Write clean, modular, and efficient code in Python and Scala to support ETL, data transformation, and processing tasks.
- Support data architecture planning and participate in technical reviews and design sessions.

Must-Have Skills:
- Strong hands-on experience with Python, Scala, SQL, and Amazon Redshift.
- Proven expertise in AWS cloud services and ecosystem (EC2, S3, Redshift, Glue, Lambda, etc.).
- Experience implementing Infrastructure as Code (IaC) with Terraform.
- Proficient in managing and deploying Docker containers in development and production environments.
- Hands-on experience with CI/CD pipelines using Jenkins.
- Strong understanding of data architecture, ETL pipelines, and distributed data processing systems.
- Excellent problem-solving skills and the ability to mentor junior engineers.

Nice-to-Have:
- Experience working in regulated domains like healthcare or finance.
- Exposure to Apache Airflow, Spark, or Databricks.
- Familiarity with data quality frameworks and observability tools.
Glue, Lambda, ETL
- 3+ years of AWS data engineering: Glue, Step Functions, Lambda, S3, DynamoDB, EC2
- Strong Python (boto3) scripting for automation
- Terraform or CloudFormation expertise
- Hands-on experience integrating RAG workflows or deploying LLM applications
- Solid SQL and NoSQL data-modeling skills
- Excellent written and verbal communication in client-facing contexts
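The Lambda-style automation this listing asks for typically means small, testable handler functions. A minimal sketch with an invented event shape (real Lambda/Glue payloads and the surrounding boto3 wiring differ and are omitted here):

```python
def handler(event, context=None):
    """A Lambda-style entry point: validate incoming records and
    report what was processed vs. skipped. The event shape is
    hypothetical, chosen only to illustrate the pattern."""
    records = event.get("records", [])
    valid = [r for r in records if r.get("id") is not None]
    # In a deployed function, `valid` would be written onward,
    # e.g. to DynamoDB or S3 via boto3.
    return {"processed": len(valid), "skipped": len(records) - len(valid)}

print(handler({"records": [{"id": 1}, {"id": None}, {"id": 2}]}))
```

Keeping the handler a pure function of its event makes it unit-testable without any AWS dependency, which is why this structure is a common convention.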

Opening For SR. Data Analytics

Firstwave Technology

  • 7 - 13 yrs
  • Hyderabad
ETL Testing, Data Validation, Data Quality, SQL, T-SQL
Contact: aman.tyagi@firstwave-tech.com | 7302599936 (DM)

Job Role: Sr. Data Analytics
Experience: 7 to 14 years
Location: Hyderabad
Mode: Hybrid
Notice Period: Immediate Joiner to 15 Days

Skillset for Sr. Data QA:
- Programming languages: SQL such as T-SQL or PL/SQL (must); Python (nice to have)
- Skills: QA process, mentoring, ETL testing, data validation, data quality, RCM or US healthcare knowledge, hands-on Agile process, test strategy and planning, CI/CD integration, data reconciliation experience
- Tools: SSMS, Toad, any BI tool (Tableau, Power BI, etc.), SSIS, ADF, Snowflake (nice to have)
- Data testing tools (nice to have): Great Expectations, Deequ, dbt (with tests), Pytest (for data scripts)
- US healthcare knowledge is not mandatory

Job Summary:
The Senior Quality Assurance Engineer will bring comprehensive quality testing expertise to a growing and innovative organization, designing and documenting testing scenarios, creating test plans, and reviewing quality specifications and technical design for both existing and new analytics products. The Sr. QA Engineer will be an integral part of our growing analytics product team, working with new technology in both manual and automation testing environments. The Sr. Quality Assurance Engineer will design testing procedures to ensure our analytics meet established quality standards using best practices and industry-standard practices, and will develop and write testing scripts to ensure our analytics perform as expected while monitoring and documenting testing results according to best-practice procedures.

Essential Functions & Tasks:
- Perform test execution (both manual and automated) for healthcare analytics, including extraction and load processes, data transformations, data models, and dashboarding.
- Create detailed, comprehensive, and well-structured test plans and test cases.
- Collaborate closely with Data & Analytics team members to ensure that production system defects are documented, an appropriate testing plan is established, and defects are resolved in a timely manner.
- Drive data quality programs and assist in the implementation of company automated test frameworks and solutions within an agile team structure.
- Perform special projects and other duties as assigned.

Education & Experience Requirements:
- Bachelor's degree in Computer Science, Information Technology, Data Science, Math, Finance, or a related field, or equivalent training and/or experience.
- Minimum 5 years of experience as a quality assurance engineer or data analyst with a strong data quality orientation.
- Experience with testing in cloud-native systems (MS Fabric preferred).

Preferred Qualifications:
- QA-related certifications preferred.
- Strong understanding of US healthcare revenue cycle and billing.

Knowledge, Skills & Abilities:
- Proficiency with a variety of test case management tools in Azure DevOps and Agile development tools and processes (Azure DevOps and Confluence).
- Proven QA experience designing quality assurance testing for ELT processes, dashboard tools (ideally Power BI), and large-scale data warehouse projects.
- Knowledge of data quality frameworks to monitor and enforce data quality standards.
- Experience with automated testing tools.
- Proven experience building test plans based on business requirements and technical specifications.
- The ability to test the performance and scalability of data systems, especially when handling large volumes of data, including checking for speed, reliability, and system bottlenecks in data processing and analytics.
- Expert SQL in relational databases (SQL Server, MS Fabric) with the ability to independently explore, query, and validate data; able to read and understand existing queries as well as create new ones.
- Strong analytical skills.
- Strong process improvement and organizational skills.
- Strong time management skills.
- Working knowledge of project management, specifically Azure DevOps.
- Ability to identify opportunities that drive execution of action plans to close gaps and move key priorities forward.
- Ability to influence and gain support from stakeholders through effective communication and relationship building.
- Ability to communicate technical information to technical and non-technical personnel at various levels in and across the organization.
- Ability to exercise sound judgment and handle highly sensitive and confidential information appropriately.
- Ability to remain results-oriented and work within a collaborative and dynamic fast-paced environment.

Data Engineer

Guiding Consulting

  • 10 - 12 yrs
  • Bangalore
SQL, Python, Spark, Data Integration, ETL, AWS, ETL Tool, Data Warehousing, Azure, Server
Job Description:
Years of Experience: 10+ years
Mode: 3 days a week
Location: Bangalore
Work Type: Permanent

Key Responsibilities:

Design and Development:
- Architect, implement, and optimize scalable data solutions.
- Develop and maintain data pipelines, ETL/ELT processes, and workflows to ensure the seamless integration and transformation of data.

Collaboration:
- Work closely with data scientists, analysts, and business stakeholders to understand requirements and deliver actionable insights.
- Partner with cloud architects and DevOps teams to ensure robust, secure, and cost-effective data platform deployments.

Data Management:
- Manage and maintain data lakes, data warehouses, and real-time analytics systems.
- Ensure high data quality, integrity, and security across the organization.

Performance Optimization:
- Monitor and enhance system performance, troubleshoot issues, and implement optimizations as needed.
- Leverage Microsoft Fabric's advanced analytics and AI capabilities for innovative data solutions.

Best Practices & Leadership:
- Lead and mentor junior engineers to foster a culture of technical excellence.
- Stay updated with industry trends and best practices, especially in the Microsoft ecosystem.

Required:
- Bachelor's or Master's degree in Computer Science, Data Engineering, or a related field.
- 10+ years of experience in data engineering, with a proven track record of working on large-scale data platforms.
- Expertise in Microsoft Fabric and its components (e.g., Synapse, Data Factory, Azure Data Lake, Power BI).
- Strong proficiency in SQL, Python, and Spark.
- Experience with cloud platforms, particularly Microsoft Azure.
- Solid understanding of data modeling, data warehousing, and ETL/ELT best practices.
- Excellent problem-solving, communication, team management and project management skills.

Preferred:
- Familiarity with other cloud platforms (e.g., AWS, GCP).
- Experience with machine learning pipelines or integrating AI into data workflows.
- Certifications in Microsoft Azure or related technologies.
  • 4 - 6 yrs
  • 18.0 Lac/Yr
  • Pune
SQL ETL Azure Pyspark Databricks Python
Responsibilities:
- Design, develop, and deploy data solutions on Azure, leveraging SQL Azure, Azure Data Factory, and Databricks.
- Build and maintain scalable data pipelines to ingest, transform, and load data from various sources into Azure data repositories.
- Implement data security and compliance measures to safeguard sensitive information.
- Collaborate with data scientists and analysts to support their data requirements and enable advanced analytics and machine learning initiatives.
- Optimize and tune data workflows for performance and efficiency.
- Troubleshoot data-related issues and provide timely resolution.
- Stay updated with the latest Azure data services and technologies, and recommend best practices for data engineering.

Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- Proven experience as a data engineer, preferably in a cloud environment.
- Strong proficiency in SQL Azure for database design, querying, and optimization.
- Hands-on experience with Azure Data Factory for ETL/ELT workflows.
- Familiarity with Azure Databricks for big data processing and analytics.
- Experience with other Azure data services such as Azure Synapse Analytics, Azure Cosmos DB, and Azure Data Lake Storage is a plus.
- Solid understanding of data warehousing concepts, data modeling, and dimensional modeling.
- Excellent problem-solving and communication skills.
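The "ingest, transform, and load" responsibility above follows the classic extract-transform-load pattern. The sketch below is purely illustrative and uses stdlib Python with in-memory lists; a real pipeline of the kind this posting describes would read from and write to Azure sources via Azure Data Factory or Databricks, and all function and field names here are hypothetical.

```python
# Minimal, illustrative ETL step: extract raw records, transform (validate and
# type-cast) them, then load the cleaned rows into a target store.

def extract(source):
    """Extract: yield raw records from a source (here, a list of dicts)."""
    yield from source

def transform(records):
    """Transform: drop incomplete rows and normalise field types."""
    for rec in records:
        if rec.get("id") is None or rec.get("amount") is None:
            continue  # discard rows that fail the completeness check
        yield {"id": int(rec["id"]), "amount": round(float(rec["amount"]), 2)}

def load(records, target):
    """Load: append cleaned rows to the target store (here, a list)."""
    for rec in records:
        target.append(rec)
    return target

raw = [{"id": "1", "amount": "10.50"}, {"id": None, "amount": "3"}]
warehouse = load(transform(extract(raw)), [])
# warehouse now holds only the validated, typed row for id 1
```

The generator-based stages mirror how pipeline frameworks chain ingest, transform, and load steps without materializing intermediate data.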

Urgent Requirement For ETL Automation

E2E Infoware Management Services

Automation Pyspark SQL
Skill Name: ETL Automation Testing
Location: Bangalore, Chennai, and Pune (work from office)
Experience: 4+ years

Required:
- Experience in ETL automation testing.
- Strong experience in Python.
- Strong experience in PySpark.
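A core task in ETL automation testing is reconciling a loaded target against its source. The following is a minimal stdlib-only sketch of one such check, comparing row counts and an order-independent fingerprint; the function name and sample data are hypothetical, and a real suite would query actual source and target databases (for example via PySpark) rather than in-memory lists.

```python
# Illustrative ETL reconciliation check: the source and target tables should
# contain the same rows, regardless of the order in which they were loaded.
import hashlib

def table_fingerprint(rows):
    """Return (row count, digest) computed over the rows in sorted order,
    so the fingerprint is independent of load order."""
    digest = hashlib.sha256()
    for row in sorted(map(repr, rows)):
        digest.update(row.encode("utf-8"))
    return len(rows), digest.hexdigest()

source = [("a", 1), ("b", 2)]
target = [("b", 2), ("a", 1)]  # same data, different load order

assert table_fingerprint(source) == table_fingerprint(target)
```

Count-plus-checksum comparisons like this scale better than row-by-row diffs, since only two small fingerprints cross the network.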
  • 0 - 1 yrs
  • 8.5 Lac/Yr
  • Female
  • Anna Nagar Pondicherry
Copy-Paste Data Accuracy Data Entry Automation Data Entry Speed Data Formatting Data Entry Software Data Cleansing Data Quality Control Spreadsheet Management Data Extraction Typing Speed Microsoft Excel Keyboard Shortcuts Numeric Keypad Data Collection Data Verification Google Sheets Work From Home Home Based Work Data Quality Data Security ETL Processes Metadata Management NoSQL D
We are looking for a Data Architect who is eager to begin a career in data management. This part-time position allows you to work from home, making it a great opportunity for those looking to start their professional journey. As a Data Architect, you will help design and organize data systems to ensure they run smoothly and efficiently.

Key Responsibilities:
1. **Data Structure Design**: You will create and implement data models that organize data in a way that meets business needs.
2. **Data Integration**: You will assist in bringing together different data sources, ensuring they work well together within the same system.
3. **Data Quality Management**: You will monitor data for accuracy and consistency, helping to maintain high quality in our data sets.
4. **Collaboration**: You will work with other team members to understand their data needs and help design solutions.

Required Skills and Expectations:
Candidates should have a strong willingness to learn about data architecture and management. While prior experience is not necessary, a basic understanding of data structures or computer systems is a plus. Good communication skills are essential, as you will be collaborating with others. Attention to detail and problem-solving abilities are important for managing data quality effectively. A proactive attitude and the ability to work independently in a remote setting are also highly valued.
  • Fresher
  • 8.5 Lac/Yr
  • Female
  • Kannur Cantonment, Kannur
Data Cleansing Data Entry Accuracy Data Entry Automation Data Entry Audit Data Entry Forms Data Entry Validation Data Formatting Data Accuracy Data Quality Control Data Entry Software Google Sheets Data Input Data Verification Keyboard Shortcuts Numeric Keypad Data Collection Microsoft Excel Spreadsheet Management Data Extraction Data Quality Data Governance Data Strategy Data Manipulation ETL Processes Metadata Management
We are looking for a motivated Data Architect to join our team. This part-time position allows you to work from home and is open to freshers who have completed at least their 10th grade. The ideal candidate is detail-oriented, eager to learn, and excited to shape data systems.

Key Responsibilities:
1. **Data Modeling**: Create and maintain data models that define how data is stored and organized within our systems. This involves envisioning how data flows and ensuring it meets business needs.
2. **Database Design**: Assist in designing databases that ensure data integrity and optimal performance. You will help structure how data is collected and accessed.
3. **Data Management**: Work on ensuring that data is accurate, secure, and easily accessible for various teams. This includes assisting in the implementation of data governance practices.
4. **Collaboration**: Collaborate with other team members to understand their data needs. Effective communication will help translate these needs into workable solutions.

Required Skills and Expectations:
Candidates should have a basic understanding of data concepts and a willingness to learn more. Strong attention to detail is essential, along with good problem-solving skills. The ability to communicate effectively, both in writing and verbally, is important for collaborating with team members. Candidates should be organized and capable of managing their time effectively while working independently from home. A genuine interest in technology and data architecture will be beneficial in this role.

Data Engineer

United Technology

  • 1 - 3 yrs
  • 4.0 Lac/Yr
  • Chennai
Data Integration Data Engineer Hadoop ETL SQL Informatica Apache AWS Big Data Python
We are looking for a Data Engineer with 1 to 3 years of experience in Chennai. Immediate joiners preferred.
Software Product Engineer SQL DEVELOPER SQL Server SQL Azure Azure T-SQL Stored Procedures Azure Data Factory ETL PROCESS SSIS Walk in
Responsibilities:
- Develop and maintain SQL and NoSQL databases in Azure, including schema design, stored procedures, and data integrity.
- Continuously improve data pipelines using Azure Data Factory as a foundation for developing insightful and interactive business reporting.

Requirements:
- Minimum of 4 years of Database Administrator experience with Microsoft SQL Server.
- Minimum of 2 years of Azure SQL Database experience.

Snowflake Developer

Firstwave Technology

  • 4 - 9 yrs
  • Chennai
SQL Azure ETL Tool Snowflake Developer
Job Summary:
We are seeking a Snowflake Data Engineer to join our Data & Analytics team. This role involves designing, implementing, and optimizing Snowflake-based data solutions. The ideal candidate will have proven, hands-on data engineering expertise in Snowflake, cloud data platforms, ETL/ELT processes, and Medallion data architecture best practices. The data engineer role has a day-to-day focus on implementation, performance optimization, and scalability. This is a tactical role requiring independent data analysis and data discovery to understand our existing source systems and fact and dimension data models, and to implement an enterprise data warehouse solution in Snowflake. This role will take direction from the Lead Snowflake Data Engineer and the Director of Data Engineering while bringing its own domain expertise and experience.

Essential Functions and Tasks:
- Participate in the design, development, and maintenance of a scalable Snowflake data solution serving our enterprise data & analytics team.
- Implement data pipelines, ETL/ELT workflows, and data warehouse solutions using Snowflake and related technologies.
- Optimize Snowflake database performance.
- Collaborate with cross-functional teams of data analysts, business analysts, data scientists, and software engineers to define and implement data solutions.
- Ensure data quality, integrity, and governance.
- Troubleshoot and resolve data-related issues, ensuring high availability and performance of the data platform.

Education and Experience Requirements:
- Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.
- 4+ years of in-depth data engineering experience, with at least 1 year of dedicated experience engineering solutions in an enterprise-scale Snowflake environment.
- Tactical expertise in ANSI SQL, performance tuning, and data modeling techniques.
- Strong experience with cloud platforms (preference for Azure) and their data services.
- Experience in ETL/ELT development using tools such as Azure Data Factory, dbt, Matillion, Talend, or Fivetran.
- Hands-on experience with scripting languages like Python for data processing.
- Snowflake SnowPro certification; preference for the engineering course path.
- Experience with CI/CD pipelines, DevOps practices, and Infrastructure as Code (IaC).
- Knowledge of streaming data processing frameworks such as Apache Kafka or Spark Streaming.
- Familiarity with BI and visualization tools such as Power BI.

Knowledge, Skills, and Abilities:
- Familiarity working in an agile scrum team, including sprint planning, daily stand-ups, backlog grooming, and retrospectives.
- Ability to self-manage medium-complexity deliverables and to document user stories and tasks through Azure DevOps.
- Personal accountability to committed sprint user stories and tasks.
- Strong analytical and problem-solving skills with the ability to handle complex data challenges.
- Ability to read, understand, and apply state/federal laws, regulations, and policies.
- Ability to communicate with diverse personalities in a tactful, mature, and professional manner.
- Ability to remain flexible and work within a collaborative and fast-paced environment.
- Understand and comply with company policies and procedures.
- Strong oral, written, and interpersonal communication skills.
- Strong time management and organizational skills.

Physical Demands:
- 40 hours per week.
- Occasional standing and walking; sitting for prolonged periods of time.
- Frequent hand and finger movement.
- Communicate verbally and in writing.
- Extensive use of a computer keyboard and viewing of a computer screen; specific vision abilities required by this job include close vision.
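The Medallion architecture this posting references organizes data into bronze (raw), silver (validated), and gold (business-level) layers. The sketch below is a hypothetical, stdlib-only illustration of that layering using plain Python structures in place of Snowflake tables; the record fields and function names are invented for the example.

```python
# Bronze layer: raw landed records, kept exactly as received (strings, bad rows and all).
bronze = [
    {"order_id": "1", "region": "south", "total": "10"},
    {"order_id": "2", "region": "south", "total": "bad"},  # malformed row
    {"order_id": "3", "region": "north", "total": "5"},
]

def to_silver(rows):
    """Silver layer: validated, typed records; rows that fail casting are dropped."""
    out = []
    for r in rows:
        try:
            out.append({"order_id": int(r["order_id"]),
                        "region": r["region"],
                        "total": float(r["total"])})
        except ValueError:
            continue  # quarantine-worthy row; excluded from silver
    return out

def to_gold(rows):
    """Gold layer: a business-level aggregate, here revenue per region."""
    totals = {}
    for r in rows:
        totals[r["region"]] = totals.get(r["region"], 0.0) + r["total"]
    return totals

gold = to_gold(to_silver(bronze))
```

Keeping the raw bronze copy untouched is the key design choice: silver and gold can always be rebuilt from it when validation or aggregation rules change.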
  • Fresher
  • 7.0 Lac/Yr
  • Chilkana Road Saharanpur
Data Visualization Data Quality Data Transformation ETL Processes ETL Tools Programming Data Warehousing Database Management Data Integration Data Analysis Data Modeling Big Data Technologies Statistical Analysis Data Mining Hadoop Machine Learning Data Cleansing Scripting SQL Python Data Sheets Data Migration Data Management
We are looking for a detail-oriented Data Processing Engineer to join our team. This part-time position is ideal for freshers who have passed their 10th grade. The role offers the opportunity to work from home, making it convenient and flexible.

Key Responsibilities:
- **Data Entry**: Accurately enter data into databases and spreadsheets, ensuring all information is correct and up-to-date.
- **Data Cleaning**: Identify and correct errors in datasets, organizing the information to improve clarity and accessibility.
- **Data Analysis**: Assist in analyzing data trends and patterns, providing insights that may help in decision-making processes.
- **Reporting**: Create simple reports that summarize findings and present the data in a clear and understandable format for the team.

Required Skills and Expectations:
Candidates should have strong attention to detail and be able to work independently. Basic computer skills, including familiarity with data entry software and spreadsheets, are essential. Strong time management abilities will help you complete tasks efficiently. Good communication skills are important, as you will need to report your findings to the team regularly. A willingness to learn and adapt is key, as you may encounter new tools and methods in data processing. Being proactive and responsive will help you succeed in this dynamic part-time role.