PySpark Developer

Key Skills

Spark Developer, Spark SQL, PySpark, Work From Home

Job Description

• In-depth knowledge of Hadoop, Spark, and similar frameworks.

• Ability to design, build, and unit test applications in Spark/PySpark (a minimal sketch follows this list).

• Ability to understand existing ETL/Ab Initio graphs and logic and convert them into Spark/PySpark.

• Good implementation experience with object-oriented programming (OOP) concepts.

• Knowledge of Unix shell scripting, RDBMS, Hive, the HDFS file system, HDFS file types, and HDFS compression codecs.

• Experience in processing large amounts of structured and unstructured data, including integrating data from multiple sources.

• Experience working with Bitbucket and CI/CD processes.

• Knowledge of Agile methodology for project delivery.
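
For illustration only, here is a minimal sketch of the kind of work listed above: a small PySpark transformation written as a pure function so it can be unit tested, with the equivalent Spark SQL query. All table and column names (orders, customer_id, amount) are hypothetical and not part of this role description.

```python
# Illustrative PySpark sketch: a unit-testable transformation plus the same
# logic in Spark SQL. Table and column names are hypothetical examples.
from pyspark.sql import DataFrame, SparkSession
from pyspark.sql import functions as F


def aggregate_order_totals(orders: DataFrame) -> DataFrame:
    """Sum order amounts per customer; kept as a pure function for easy unit testing."""
    return (
        orders
        .groupBy("customer_id")
        .agg(F.sum("amount").alias("total_amount"))
    )


if __name__ == "__main__":
    spark = SparkSession.builder.appName("orders-aggregation").getOrCreate()

    # Hypothetical input: in a real job this might instead come from Hive or HDFS,
    # e.g. spark.table("db.orders") or spark.read.parquet("hdfs:///data/orders").
    orders = spark.createDataFrame(
        [("c1", 100.0), ("c1", 50.0), ("c2", 75.0)],
        ["customer_id", "amount"],
    )

    totals = aggregate_order_totals(orders)

    # The same aggregation expressed with Spark SQL.
    orders.createOrReplaceTempView("orders")
    totals_sql = spark.sql(
        "SELECT customer_id, SUM(amount) AS total_amount "
        "FROM orders GROUP BY customer_id"
    )

    totals.show()
    totals_sql.show()
    spark.stop()
```

In a unit test, the same function could be exercised against a local SparkSession with small in-memory DataFrames and the result compared to expected rows.
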
  • Experience

    3 - 9 Years

  • No. of Openings

    5

  • Education

    Any Bachelor's Degree

  • Role

    PySpark Developer

  • Industry Type

    IT-Hardware & Networking / IT-Software / Software Services

  • Gender

    Male / Female

  • Job Country

    India

  • Type of Job

    Full Time

  • Work Location Type

    Work from Home
