Job Description
Overview
CTG is seeking a skilled Data Engineer to design, build, and maintain high-performance ETL pipelines and data platforms for our client. Candidates local to the tri-state area are preferred.
Location: Hybrid in NYC (minimum 3 days onsite weekly)
Duration: 12 months
What You’ll Do:
- Develop scalable ETL pipelines using Python and OOP principles.
- Write, optimize, and troubleshoot SQL queries and stored procedures (SQL Server, DB2, Oracle).
- Integrate applications via APIs and web technologies.
- Build and manage Airflow DAGs to automate workflows.
- Design and maintain data models, schemas, and entity relationships.
- Ingest, cleanse, govern, and deliver data in cloud warehouses such as BigQuery and Databricks Delta Lakehouse.
Skills & Experience:
- Strong Python programming and software development expertise.
- Deep knowledge of SQL, RDBMS, ETL processes, and data modeling.
- Experience with Airflow, cloud data warehouses, and workflow automation.
- Familiarity with Spark a plus.
- Independent problem-solver with strong analytical skills.
Nice to Have:
- Hands-on experience with IBM Apptio, or the ability to ramp up quickly.
- Willingness to learn new tools and technologies.
Education:
- Bachelor's degree in Computer Science, Information Systems, Engineering, or a related field preferred, or equivalent experience.
Excellent verbal and written English communication skills and the ability to interact professionally with a diverse group are required.
CTG does not accept unsolicited resumes from headhunters, recruitment agencies, or fee-based recruitment services for this role.
To Apply:
To be considered, please apply directly to this requisition using the link provided. For additional information, please contact Dallas Bell at Dallas.Bell@ctg.com. Kindly forward this to any other interested parties. Thank you!