
Data Engineer

On-Demand Group
Location: Minneapolis, MN, USA
Published: 6/14/2022
Technology
Full Time

Job Description

Job Title: Data Engineer

Job Location: On-site

Job Type: Contract to Hire

USC or GC holders only; this contract-to-hire position does not offer sponsorship.


  • Must-have requirements: GCP, SQL, Python, Airflow
  • System design mindset
  • Communication – ability to articulate what they are doing and what/how they are accomplishing their work. Accents are not an issue as long as they are comprehensible.
  • Healthcare experience is not required, but is a nice-to-have.
  • Location: Onsite at any of four offices – Minneapolis, MN; Arlington, VA; Portland, OR; or Raleigh, NC – with a focus on Minneapolis.
  • 100% onsite initially, moving to a 2-3x/week hybrid schedule based on performance.


Job Summary:

The Senior Cloud Data Engineer plays a key role in designing, building, and maintaining data pipelines and infrastructure using Google Cloud Platform (GCP) BigQuery. The incumbent will collaborate with data analysts, data scientists, and other engineers to ensure timely access to high-quality data for data-driven decision-making across the organization.


The Senior Cloud Data Engineer is a highly technical individual who has mastered hands-on coding of data processing solutions and scalable data pipelines that support analytics and exploratory analysis. This role ensures new business requirements are decomposed and implemented in cohesive end-to-end designs that enable data integrity and quality and best support the BI and analytic capabilities that power decision-making. This includes building data acquisition programs that handle the business's growing data volume as part of the data lake in the GCP BigQuery ecosystem, as well as maintaining a robust data catalog.


This is a Senior Data Engineering role within the Data & Analytics Data Core organization, working closely with Data & Analytics leaders. The incumbent will continually improve the business's data and analytic solutions, processes, and data engineering capabilities. The incumbent embraces industry best practices and trends and, through acquired knowledge, drives process and system improvement opportunities.


Responsibilities:

• Design, develop, and implement data pipelines using GCP BigQuery, Dataflow, and Airflow for data ingestion, transformation, and loading (see the sketch after this list).
• Optimize data pipelines for performance, scalability, and cost-efficiency.
• Ensure data quality through data cleansing, validation, and monitoring processes.
• Develop and maintain data models and schemas in BigQuery to support various data analysis needs.
• Automate data pipeline tasks using scripting languages like Python and tools like Dataflow.
• Collaborate with data analysts and data scientists to understand data requirements and translate them into technical data solutions.
• Leverage Terraform (IaC) within DevOps practices to ensure seamless integration of data pipelines with CI/CD workflows.
• Monitor and troubleshoot data pipelines and infrastructure to identify and resolve issues.
• Stay up to date with the latest advancements in GCP BigQuery and other related technologies.
• Document data pipelines and technical processes for future reference and knowledge sharing.
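
To make the day-to-day work concrete, below is a minimal, hypothetical sketch of the kind of Airflow-orchestrated BigQuery pipeline described in the first bullet: land raw files from Cloud Storage into a staging table, then run a SQL transform with basic validation into a curated table. All project, dataset, bucket, and table names are illustrative placeholders, and the DAG assumes the Google provider package for Airflow is installed; it is not a reference to any actual pipeline at this employer.

# Illustrative sketch only: ingest from GCS into BigQuery staging, then
# transform into a curated table. All names below are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import (
    BigQueryInsertJobOperator,
)
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import (
    GCSToBigQueryOperator,
)

with DAG(
    dag_id="daily_claims_load",          # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    # Ingestion: load the day's CSV drops from a landing bucket.
    load_raw = GCSToBigQueryOperator(
        task_id="load_raw_from_gcs",
        bucket="example-landing-bucket",               # placeholder bucket
        source_objects=["claims/{{ ds }}/*.csv"],
        destination_project_dataset_table="example-project.staging.claims",
        source_format="CSV",
        skip_leading_rows=1,
        write_disposition="WRITE_TRUNCATE",
    )

    # Transformation: basic cleansing/validation in SQL, written to curated.
    transform = BigQueryInsertJobOperator(
        task_id="transform_to_curated",
        configuration={
            "query": {
                "query": """
                    SELECT claim_id,
                           member_id,
                           DATE(service_date) AS service_date
                    FROM `example-project.staging.claims`
                    WHERE claim_id IS NOT NULL  -- drop malformed rows
                """,
                "destinationTable": {
                    "projectId": "example-project",
                    "datasetId": "curated",
                    "tableId": "claims",
                },
                "writeDisposition": "WRITE_TRUNCATE",
                "useLegacySql": False,
            }
        },
    )

    load_raw >> transform

In practice the transform SQL, schedules, and connection configuration would live in version control and be deployed through the Terraform/CI/CD workflow mentioned above.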


Basic Requirements:

• Bachelor's degree or equivalent experience in Computer Science, Mathematics, Information Technology, or a related field.
• 5+ years of solid experience as a data engineer.
• Strong understanding of data warehousing / data lake concepts and data modeling principles.
• Proven experience designing and implementing data pipelines using GCP BigQuery, Dataflow, and Airflow.
• Strong skills in SQL and scripting languages such as Python.
• Experience with data quality tools and techniques.
• Ability to work independently and as part of a team.
• Strong problem-solving and analytical skills.
• Passion for data and a desire to learn and adapt to new technologies.
• Experience with other GCP services such as Cloud Storage, Dataflow, and Pub/Sub.
• Experience with cloud deployment and automation tools like Terraform.
• Experience with data visualization tools like Tableau, Power BI, or Looker.
• Experience with healthcare data.
• Familiarity with machine learning, artificial intelligence, and data science concepts.
• Experience with data governance and healthcare PHI data security best practices.
• Ability to work independently on tasks and projects to deliver data engineering solutions.
• Ability to communicate effectively and convey complex technical concepts as well as task/project updates.


The projected hourly range for this position is $78 to $89.


On-Demand Group (ODG) provides employee benefits which includes healthcare, dental, and vision insurance. ODG is an equal opportunity employer that does not discriminate on the basis of race, color, religion, gender, sexual orientation, age, national origin, disability, or any other characteristic protected by law.
