Job Description
Responsibilities:
· Design, implement, and maintain data pipelines, data warehouses, and other data solutions using AWS services.
· Develop and implement ETL (Extract, Transform, Load) processes to extract, transform, and load data into data storage systems.
· Create and manage data models to ensure data integrity and facilitate efficient data analysis.
· Implement and maintain data security and compliance measures, including access controls, encryption, and data masking.
· Ensure data quality, accuracy, and consistency through data validation, cleansing, and monitoring.
· Collaborate with data scientists, business analysts, SAP functional SMEs, and other stakeholders to understand requirements and deliver data solutions that meet business needs.
· Diagnose and resolve data-related issues and performance bottlenecks.
Required Education & Experience:
· Bachelor's degree in Information Technology or a related field. Additional years of experience may be considered in lieu of a degree.
· 2–4 years of experience in relevant roles.
· Proficiency in AWS services such as S3, Redshift, DMS, Glue, and Lambda.
· Experience with data warehousing concepts and technologies.
· Strong SQL skills for data querying and manipulation.
· Proficiency in programming languages like Python or Java.
· Understanding of data modeling principles and techniques.
· Experience with ETL processes and tools.
· Knowledge of data security best practices and compliance requirements.
· Strong problem-solving and analytical skills.
· Excellent communication and collaboration skills.