Job Description
Role Overview
The Software Engineer will support the design, development, and maintenance of enterprise-level data pipelines and integrations within AWS cloud environments. The engineer will collaborate closely with data architects and DevOps teams to build scalable, secure, and high-performing systems supporting Duke Energy’s enterprise data strategy.
Required Skills & Experience
- AWS Redshift Implementation – end-to-end data warehouse design and optimization
- Data Pipeline Development using Python and AWS native services
- Proficiency in AWS Services: Lambda, API Gateway, S3, RDS, CloudFormation/Terraform
- Strong understanding of Data Warehousing (DW/BI) methodologies and best practices
- CI/CD Pipeline Management using GitHub Actions, Jenkins, or CodePipeline
- Python Programming (ETL, APIs, data ingestion scripts)
- Database Design & Query Optimization in Redshift, PostgreSQL, or SQL Server
- Agile/Scrum development methodology experience
Preferred / Nice-to-Have Skills
- Java development experience
- webMethods (reverse engineering)
- Kafka or Flink for streaming data integration
- Infrastructure-as-Code (Terraform or CloudFormation)
- Monitoring & Logging: CloudWatch, Splunk, or similar
- Experience working with large-scale enterprise data environments
Candidate Submission Template
Please submit with the following information completed:
- Current Location:
- Work Authorization of Candidate (H-1B, F-1, EAD, US Citizen):
- Pay Rate:
- W2 or C2C:
- Medical Benefits (Y/N):
- Committed to working onsite 4 days a week (Y/N or N/A):
- Available to Interview:
- Legal First Name:
- Legal Last Name:
- Phone:
- Email:
- Last 4 SSN:
- DOB (MM-DD):
- Available to Start:
- Why is the candidate on the market?:
- Why does the candidate want this role?:
- Is the candidate currently working?:
- If required to travel or move, what is the motivation?:
- Has the candidate ever interviewed/worked at Duke Energy? If so, please explain: