Key responsibilities
Requirements analysis: Collaborate with stakeholders to understand business
requirements and data sources, and define the architecture and design of data
engineering models that meet those requirements.
Architecture design: Design scalable, reliable, and efficient data engineering models,
including algorithms, data pipelines, and data processing systems, to support business
requirements and quantitative analysis.
Technology selection: Evaluate candidate technologies through proofs of concept (PoCs)
and recommend appropriate technologies, frameworks, and tools for building and managing
data engineering models, considering factors such as performance, scalability, and
cost-effectiveness.
Data processing: Develop and implement data processing logic, including data cleansing,
transformation, and aggregation, using technologies such as AWS Glue, AWS Batch, and
AWS Lambda.
Quantitative analysis: Collaborate with data scientists and analysts to develop
algorithms and models for quantitative analysis, using techniques such as regression
analysis, clustering, and predictive modeling.
Model evaluation: Evaluate the performance of data engineering models using metrics
and validation techniques, and iterate on models to improve their accuracy and
effectiveness.
Data visualization: Create visualizations of data and model outputs to communicate
insights and findings to stakeholders.
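The processing, modeling, and evaluation responsibilities above can be sketched end to end in plain Python. This is a minimal illustration only: the record fields and data values are hypothetical, and in practice the cleansing and transformation steps would typically run on managed services such as AWS Glue or AWS Lambda rather than in a single script.

```python
# Hypothetical raw records; one row has a missing feature.
raw = [
    {"x": 1.0, "y": 2.1},
    {"x": 2.0, "y": 3.9},
    {"x": None, "y": 5.0},  # incomplete record, dropped during cleansing
    {"x": 3.0, "y": 6.2},
    {"x": 4.0, "y": 7.8},
]

# Data cleansing: drop records with missing values.
clean = [r for r in raw if r["x"] is not None and r["y"] is not None]

# Quantitative analysis: ordinary least-squares fit of y = a*x + b.
n = len(clean)
sx = sum(r["x"] for r in clean)
sy = sum(r["y"] for r in clean)
sxx = sum(r["x"] ** 2 for r in clean)
sxy = sum(r["x"] * r["y"] for r in clean)
a = (n * sxy - sx * sy) / (n * sxx - sx ** 2)
b = (sy - a * sx) / n

# Model evaluation: root-mean-square error of the fitted line.
rmse = (sum((a * r["x"] + b - r["y"]) ** 2 for r in clean) / n) ** 0.5
print(f"slope={a:.2f} intercept={b:.2f} rmse={rmse:.3f}")
```

The same cleanse-fit-evaluate loop scales up directly: the cleansing step becomes a Glue job, the fit a model training step, and the RMSE check a validation gate before deployment.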
Required skills
Data engineering: Understanding of data engineering principles and practices, including
data ingestion, processing, transformation, and storage, using tools and technologies
such as AWS Glue, AWS Batch, and AWS Lambda.
Quantitative analysis: Proficiency in quantitative analysis techniques, including
statistical modeling, machine learning, and data mining, with experience implementing
algorithms for regression analysis, clustering, classification, and predictive modeling.
Programming languages: Proficiency in programming languages commonly used in data
engineering and quantitative analysis, such as Python, R, Java, or Scala.
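As an illustration of the clustering techniques named above, here is a minimal one-dimensional k-means sketch in plain Python. The data points, the two initial centers, and the iteration count are all hypothetical; production work would normally rely on a library such as scikit-learn rather than a hand-rolled loop.

```python
# Hypothetical 1-D observations forming two loose groups.
points = [1.0, 1.2, 0.8, 8.0, 8.3, 7.9]

def kmeans_1d(data, centers, iters=20):
    """Plain k-means: assign each point to its nearest center,
    then move each center to the mean of its assigned points."""
    for _ in range(iters):
        clusters = [[] for _ in centers]
        for p in data:
            nearest = min(range(len(centers)), key=lambda i: abs(p - centers[i]))
            clusters[nearest].append(p)
        centers = [sum(c) / len(c) if c else centers[i]
                   for i, c in enumerate(clusters)]
    return centers, clusters

centers, clusters = kmeans_1d(points, centers=[0.0, 10.0])
print(centers)  # the two learned cluster means
```

The same assign-then-update structure underlies the library implementations; the main differences in practice are vectorized distance computation and smarter center initialization.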