
Data Structures Job Vacancies in Pune

Java Mainframe Testing Manual Testing Web Application Testing Data Structures
Selected intern's day-to-day responsibilities include:
  1. Test Case Design & Documentation
  2. Manual Testing
  3. Bug Identification & Reporting
Apply form here: https://forms.gle/7qiwjkNxYhL7eKET9
  • 5 - 10 yrs
  • Pune
Python Developer Python MySQL Django Data Structures
Candidate should have experience in development using Python and Django and a solid understanding of Data Structures, OOP, microservices-based architecture using REST APIs, and RESTful web services. Experience in designing data persistence systems using SQL/NoSQL, DBMS, MongoDB, Elasticsearch, etc. Good understanding of Scrum/Agile methodology.
Technology Stack: Python, Django/Flask Framework, Unix, GitHub, Jenkins, AWS or Azure.
Responsibilities:
  • Create solutions by developing, implementing, and maintaining Python/SQL-based components and interfaces.
  • Work with top-level stakeholders to solicit and detail requirements prior to development.
  • Perform end-to-end work: Analysis, Design, Development, Testing, Code Review, Unit Testing, RCA, Defect Fixing, Deployment, UAT Support.
  • Lead the development effort of RESTful web services.
  • Identify any potential risks and inform the PM and others on time.
Technical Skills:
  • Strong expertise in Python development and SQL.
  • Sound knowledge of software engineering design patterns and practices.
  • In-depth knowledge and experience with data structures and Collections.
  • Strong understanding of functional programming.
  • Strong grasp of data structures and algorithms, with a solid understanding of concepts like multithreading and concurrency.
  • Good knowledge of web-capable devices and browsers.
  • Good hands-on experience with RESTful APIs.
  • Excellent written and verbal communication; ability to multitask, work well under demanding situations, prioritize, and meet deadlines.
  • Good understanding of OOP concepts.
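The emphasis on data structures and Collections in the posting above can be illustrated with a short, hedged sketch using only Python's standard library; the function and variable names here are illustrative, not taken from the posting:

```python
from collections import Counter, deque

# Illustrative example of picking the right stdlib data structure:
# Counter gives O(n) frequency counting; deque with maxlen gives O(1)
# appends while automatically discarding the oldest items.

def top_skills(skill_mentions, k):
    """Return the k most frequently mentioned skills."""
    return [skill for skill, _ in Counter(skill_mentions).most_common(k)]

def most_recent(events, window=3):
    """Keep only the most recent `window` events, O(1) per append."""
    recent = deque(maxlen=window)
    for e in events:
        recent.append(e)
    return list(recent)

mentions = ["python", "sql", "python", "django", "sql", "python"]
print(top_skills(mentions, 2))       # ['python', 'sql']
print(most_recent([1, 2, 3, 4, 5]))  # [3, 4, 5]
```

Both helpers avoid hand-rolled loops over sorted lists: the standard-library containers already encode the complexity trade-offs interviewers for roles like this tend to probe.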
  • 2 - 5 yrs
  • Pune
Python Go Flask Fast API Data Structure Spark ETL Developer Cloud Engineer Work From Home
Responsibilities:
  • Create scalable systems and develop innovative algorithmic solutions, keeping the space/time complexity trade-off in consideration.
  • Experiment with innovative ideas and, where approved, contribute to open source.
  • Manage design, development, and deployment of scalable, high-volume, real-time systems.
  • Research algorithm improvements and implement high-speed APIs and fast data-processing pipelines.
  • Collaborate with other teams, including UI and product management, to ensure the solution follows the required specifications and is implementable.
  • Measure quality, performance, and security of code and document the results.
Skills:
  • Good foundation in data structures and algorithms, with the ability to decide the right approach as per the problem statement.
  • Ability to solve complex, multi-part problems and serve the solution as an API (through Flask, FastAPI, or a similar framework).
  • Exceptional programming skills in Python (and optionally openness to one more language, like Go).
  • Ability to critically assess a requirement and develop an algorithm that combs data sets to arrive at specific conclusions.
  • Open to assisting other engineers and team members in fulfilling project schedules.
  • Good communication, documentation, and reporting skills.
  • (Preferred) Exposure to distributed computing frameworks like Spark.
  • (Preferred) Research inclination, with interest in advanced hashing, scaling, and efficient data representation techniques.
  • (Optional but preferred) Some familiarity with cloud services.
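The space/time complexity trade-off this posting mentions can be made concrete with a small stdlib sketch (illustrative only, not part of the posting): selecting the k largest items via a size-k heap runs in O(n log k) time with O(k) extra space, versus O(n log n) time and O(n) space for a full sort.

```python
import heapq

def top_k_sort(values, k):
    """O(n log n) time, O(n) extra space: sort everything, then slice."""
    return sorted(values, reverse=True)[:k]

def top_k_heap(values, k):
    """O(n log k) time, O(k) extra space: maintain a size-k min-heap
    whose root is the smallest of the current top k."""
    heap = []
    for v in values:
        if len(heap) < k:
            heapq.heappush(heap, v)
        elif v > heap[0]:
            heapq.heapreplace(heap, v)  # pop smallest, push v
    return sorted(heap, reverse=True)

data = [7, 1, 9, 4, 8, 2]
print(top_k_sort(data, 3))  # [9, 8, 7]
print(top_k_heap(data, 3))  # [9, 8, 7]
```

For small inputs the sort is simpler and fine; the heap version wins when n is large and k is small, which is exactly the "decide the right approach as per the problem statement" judgment the skills list asks for.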

JAVA J2EE Developer

CleomeSoft Technologies

  • 2 - 5 yrs
  • 10.0 Lac/Yr
  • Pune
Java Programmer Spring MVC Springboot Hibernate Data Structures J2EE Singleton
Job Description:
  • Clear understanding and applied knowledge of Object-Oriented Programming (OOP).
  • Expert in Core Java, the Collection Framework, exception handling, logging frameworks, web services, and SOAP.
  • Must have very good knowledge of and experience in Struts, Spring, and ORM tools like Hibernate and JPA.
  • Should have hands-on coding and design experience with critical enterprise applications.
  • Should have experience with code review tools like PMD, Sonar, and Checkstyle.
  • Should have experience writing unit tests using JUnit, jMock, DbUnit, etc.
  • Should have experience writing build scripts using Ant and Maven.
  • Should be able to write complex SQL queries and perform query optimization and tuning.
CTC:
ECTC:
Current Location:
Preferred Location:
Notice Period:
Reason for Job Change:
Please note: candidates with a notice period of 30 days or less will be given preference.


GOOGLE CLOUD PLATFORM

Light Speed Web

Knowledge in Data Migration & Google Cloud Infrastructure Work From Home
Job Description
Qualifications:
  • Bachelor's degree in Computer Science or equivalent practical experience.
  • Experience serving as a technical sales engineer in a cloud computing environment, or equivalent experience in a customer-facing role.
  • Experience with legacy and modern application development, cloud service delivery, and deployment.
Preferred qualifications:
  • Master's degree in Computer Science or another technical field.
  • Experience developing scalable architectures using API management, microservice frameworks, PaaS, and container orchestration systems like Kubernetes, OpenShift, and Cloud Foundry.
  • Experience in advanced areas of networking, Linux, SDN, virtualization, open protocols, application acceleration and load balancing, DNS, and VPNs, and their application to PaaS/IaaS technologies and large-scale VMware deployments.
  • Experience with cloud-native software development methodologies and tools for application development/deployment stages, including CI/CD (Jenkins, Spinnaker, Concourse).
  • Experience with logging/monitoring of hybrid-cloud workloads.
  • Ability to learn and work with emerging technologies, methodologies, and solutions in the cloud/IT technology space.
Responsibilities:
  • Recommend integration strategies, enterprise architectures, platforms, and application infrastructure required to successfully implement a complete solution on Google Cloud.
  • Provide in-depth enterprise application/cloud modernization expertise to support the technical relationship with Google's partners, including product and solution briefings and proof-of-concept work, and partner with technology practices and product management to prioritize solutions impacting adoption of Google Cloud.
  • Guide partners through their customer assessments of existing legacy application environments, provide recommendations on a prioritization roadmap for application modernization, and identify applications for migration to hybrid cloud.
If interested, kindly share your resume.
  • 0 - 1 yrs
  • 3.5 Lac/Yr
  • Pune
Data Structures Algorithms
A Graduate Engineering Trainee (Fresher) at Tudip is responsible for the programming and development of applications and software using Java, Spring, and Batch. Individuals in this role should possess a strong logical and technical bent of mind.
Job Requirements/Qualifications:
  1. Must possess a strong logical and technical bent of mind.
  2. Should have a BE/BTech in Computer Science/IT or an MCA degree.
  3. Should be eager to learn, adapt, and excel in an extreme technology company like Tudip.
  4. A deep understanding of algorithms and data structures is a MUST.

Clinical Data Scientist

Ormak Bizserve LLP

  • 5 - 7 yrs
  • Pune
Database Structures Database Programming ICH GCP GCDMP FDA MS Windows Navigation Microsoft Word Microsoft Excel MS Office Powerpoint eCRF Design Work From Home Walk in
Duties and Responsibilities:
  • Serve as the technical leader on all data management aspects for projects, including start-up, maintenance, and completion activities, and develop Data Management Plans that will deliver accurate, timely, consistent, and quality clinical trial data.
  • Independently perform all activities related to data management per regulations and applicable standard operating procedures (SOPs).
  • Primary contact person for day-to-day data management activities, and ultimately responsible for all data management deliverables for assigned projects.
  • Primary contact person for communication and discussion of topics related to data management timelines and deliverables and requests for out-of-scope tasks, and first-line contact for technical or procedural issues.
  • Responsible for planning and implementing data management timelines and deliverables and for providing database and data management activity status reports; contributes to overall project planning, progress tracking, and reporting.
  • Design and review electronic Case Report Forms (eCRFs).
  • Develop and review eCRF Completion Instructions.
  • Generate and review annotated eCRFs.
  • Develop and maintain data validation specifications.
  • Fully involved in clinical study database User Acceptance Testing (UAT), ensuring proper documentation thereof.
  • Manage the process of database modifications (after go-live) due to protocol amendments or study needs.
  • Regularly communicate and/or respond to data collection sites, third-party service providers, and, when appropriate, sponsors, responding to queries in a timely manner.
Qualifications:
  • A minimum of 5 years of clinical data management-related experience in a CRO, pharmaceutical, biotech, or device company.
  • In-depth understanding of database structures and database programming.
  • In-depth knowledge of CDISC SDTM/CDASH standards.
  • In-depth knowledge of clinical trial processes and experience in ICH, GCP, and GCDMP (SCDM
React JS Javascript Angular JS Redux Frontend Developer Data Structure Work From Home
  • Experience in software engineering.
  • Very strong software engineering fundamentals.
  • The ability to take technical ownership of assigned projects, from high-level project scope down to the details of individual design, architecture, and coding.
  • Experience working with ReactJS, JavaScript, HTML, and CSS.
  • Ability to work independently or within a team environment and handle multiple projects simultaneously.
  • Strong analytical and problem-solving skills.
  • Implement and improve DevOps and engineering practices for all services.
  • Strong computer science fundamentals in object-oriented design and data structures.
  • Strong knowledge of how to build and work with RESTful APIs, and a working knowledge of Git.
  • Strong understanding of security and performance aspects when designing web interfaces for desktop/mobile.
  • 0 - 1 yrs
  • Baner Pune
ASP.NET .NET Java C C++ Data Structure Algorithms PL SQL SQL Work From Home
  • Should have good knowledge of skills like .NET, Java, C, C++, data structures, and algorithms.
  • A good understanding of software development and the implementation lifecycle, plus hands-on programming experience, would be advantageous.
  • Good knowledge of database PL/SQL and SQL.
  • Good aptitude, analytical, and problem-solving skills.
  • Passionate about the digital world, designing and building the logic.
  • Thrives in a dynamic, project-based working environment.
  • Must be independent, disciplined, and highly motivated.
  • Willing to learn new skills.
  • Good communication skills are a must.
  • During the training period, a stipend will be offered to selected candidates.
  • 2019 & 2020 pass-outs: MCA/M.Sc Computer Science.
Python SCALA JAVA AWS - EMR Hadoop Spark Kafka SQL NoSQL Data Architecture Data Structures Storm Flink
Responsibilities:
  • Create and maintain optimal data pipeline architecture.
  • Assemble large, complex data sets that meet functional/non-functional business requirements.
  • Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using open-source and AWS big data technologies.
  • Build analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency, and other key business performance metrics.
  • Work with stakeholders, including the Executive, Product, Data, and Design teams, to assist with data-related technical issues and support their data infrastructure needs.
  • Work with data and analytics experts to strive for greater functionality in our data systems.
Qualifications:
  • Experience building and optimizing big data pipelines, architectures, and datasets.
  • Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement.
  • Experience interacting with customers and various stakeholders.
  • Strong analytical skills related to working with unstructured datasets.
  • Build processes supporting data transformation, data structures, metadata, dependency, and workload management.
  • Working knowledge of message queuing, stream processing, and highly scalable big data lakes.
  • Strong project management and organizational skills.
  • Experience supporting and working with cross-functional teams in a dynamic environment.
Candidates should also have experience using the following software/tools:
  • Big data technologies: Hadoop, Spark, Kafka, etc.
  • Relational SQL and NoSQL databases, including Postgres and Cassandra.
  • Data pipeline and workflow management tools: Airflow, NiFi, etc.
  • Cloud services: AWS (EMR, RDS, Redshift, Glue), Azure (Databricks, Data Factory), GCP (Dataproc, Pub/Sub).
  • Stream-processing systems: Storm, Spark Streaming, Flink, etc.
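The extraction-transformation-loading flow this role centers on can be sketched in plain Python as a toy, hedged stand-in for the Spark/Airflow tooling the posting names; every function and field name here is illustrative:

```python
# Toy ETL sketch: extract raw records, transform (clean and validate),
# load into an aggregate sink. Real pipelines would use Spark, Airflow,
# Kafka, etc.; this only shows the shape of the three stages.

def extract():
    """Stand-in for reading from a source system (Kafka, S3, a DB...)."""
    return [
        {"city": " Pune ", "role": "python", "ctc": 10.0},
        {"city": "Pune", "role": "java", "ctc": None},   # invalid record
        {"city": "Mumbai", "role": "python", "ctc": 8.5},
    ]

def transform(records):
    """Normalize fields and drop records that fail validation."""
    cleaned = []
    for r in records:
        if r["ctc"] is None:  # validation rule: salary must be present
            continue
        cleaned.append({**r, "city": r["city"].strip()})
    return cleaned

def load(records):
    """Aggregate into a city -> total-CTC summary (the 'sink')."""
    summary = {}
    for r in records:
        summary[r["city"]] = summary.get(r["city"], 0.0) + r["ctc"]
    return summary

print(load(transform(extract())))  # {'Pune': 10.0, 'Mumbai': 8.5}
```

Keeping each stage a pure function over records is the same discipline that scales up to distributed frameworks, where the transform step becomes a map/filter over partitions instead of a Python loop.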