Job Summary:
We are seeking a Snowflake Data Engineer to join our Data & Analytics team. This role involves designing, implementing, and optimizing Snowflake-based data solutions. The ideal candidate will have proven, hands-on data engineering expertise in Snowflake, cloud data platforms, ETL/ELT processes, and Medallion data architecture best practices. The data engineer role has a day-to-day focus on implementation, performance optimization, and scalability. This is a tactical role requiring independent data analysis and data discovery to understand our existing source systems and fact and dimension data models, and to implement an enterprise data warehouse solution in Snowflake. This role will take direction from the Lead Snowflake Data Engineer and the Director of Data Engineering while bringing their own domain expertise and experience.

Essential Functions and Tasks:
- Participate in the design, development, and maintenance of a scalable Snowflake data solution serving our enterprise data & analytics team.
- Implement data pipelines, ETL/ELT workflows, and data warehouse solutions using Snowflake and related technologies.
- Optimize Snowflake database performance.
- Collaborate with cross-functional teams of data analysts, business analysts, data scientists, and software engineers to define and implement data solutions.
- Ensure data quality, integrity, and governance.
- Troubleshoot and resolve data-related issues, ensuring high availability and performance of the data platform.

Education and Experience Requirements:
- Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.
- 4+ years of in-depth data engineering experience, with at least 1 year of dedicated experience engineering solutions in an enterprise-scale Snowflake environment.
- Tactical expertise in ANSI SQL, performance tuning, and data modeling techniques.
- Strong experience with cloud platforms (preference for Azure) and their data services.
- Experience in ETL/ELT development using tools such as Azure Data Factory, dbt, Matillion, Talend, or Fivetran.
- Hands-on experience with scripting languages like Python for data processing.
- Snowflake SnowPro certification; preference for the engineering course path.
- Experience with CI/CD pipelines, DevOps practices, and Infrastructure as Code (IaC).
- Knowledge of streaming data processing frameworks such as Apache Kafka or Spark Streaming.
- Familiarity with BI and visualization tools such as Power BI.

Knowledge, Skills, and Abilities:
- Familiarity working in an agile Scrum team, including sprint planning, daily stand-ups, backlog grooming, and retrospectives.
- Ability to self-manage medium-complexity deliverables and document user stories and tasks through Azure DevOps.
- Personal accountability to committed sprint user stories and tasks.
- Strong analytical and problem-solving skills with the ability to handle complex data challenges.
- Ability to read, understand, and apply state/federal laws, regulations, and policies.
- Ability to communicate with diverse personalities in a tactful, mature, and professional manner.
- Ability to remain flexible and work within a collaborative and fast-paced environment.
- Understand and comply with company policies and procedures.
- Strong oral, written, and interpersonal communication skills.
- Strong time management and organizational skills.

Physical Demands:
- 40 hours per week
- Occasional standing
- Occasional walking
- Sitting for prolonged periods of time
- Frequent hand and finger movement
- Communicating verbally and in writing
- Extensive use of computer keyboard and viewing of computer screen
- Specific vision abilities required by this job include close vision