Job Description
Responsibilities:
- Lead the design, development, and implementation of data solutions using AWS and Snowflake.
- Collaborate with cross-functional teams to understand business requirements and translate them into technical solutions.
- Develop and maintain data pipelines, ensuring data quality, integrity, and security.
- Optimize data storage and retrieval processes to support data warehousing and analytics.
- Provide technical leadership and mentorship to junior data engineers.
- Ensure compliance with industry standards and best practices in data engineering.
- Utilize knowledge of insurance, particularly claims and loss, to enhance data solutions.
Must have:
- 6-8 years of relevant experience in Data Engineering and delivery.
- 5+ years of relevant work experience with Big Data concepts, including work on cloud implementations.
- Strong experience with SQL, Python, and PySpark.
- Good understanding of data ingestion and data processing frameworks.
- Good experience with Snowflake, SQL, and AWS (Glue, EMR, S3, Aurora, RDS, and overall AWS architecture).
- Good aptitude, strong problem-solving abilities, analytical skills, and the ability to take ownership as appropriate.
- Able to code, debug, performance-tune, and deploy applications to the production environment.
- Experience working in an Agile methodology.
- Proven expertise or certification in Palantir Foundry / TypeScript is highly preferred.