Databricks Data Engineer

Equiliem
Location: Southern Maryland, MD, USA
Published: 6/14/2022
Technology
Full Time

Job Description

Job Title: Databricks Data Engineer

Location: 100% Remote (U.S.-based) with quarterly travel to Gaithersburg, MD

Security Clearance: Active Public Trust required

Pay: $60.00 - $63.00 per hour

26-00708

Job Summary:

Our client is seeking a Databricks Data Engineer to design, develop, and maintain scalable data pipelines and analytics solutions within an Azure cloud-based data lake. The ideal candidate will translate business requirements into robust data engineering solutions, support ETL operations, and maintain high-quality, enterprise-scale analytics environments. This role emphasizes Databricks expertise, cloud architecture, advanced AI workflows, and cross-functional collaboration to deliver innovative data solutions.

Key Responsibilities:

  • Design, build, and optimize scalable data solutions using Databricks and Medallion Architecture

  • Develop and maintain ingestion routines for multi-terabyte datasets across multiple projects and Databricks workspaces

  • Integrate structured and unstructured data to enable actionable business insights

  • Implement data management strategies to ensure integrity, availability, and accessibility

  • Monitor platform performance, manage Spark optimization, cluster stability, and configuration management

  • Collaborate with data science and analytics teams to enable AI-driven workflows

  • Integrate Databricks pipelines with Azure services such as Functions, Storage Services, Data Factory, Log Analytics, and User Management

  • Provision and manage infrastructure using Infrastructure-as-Code (IaC) best practices

  • Apply data governance, security, and compliance standards in line with federal regulations and public trust requirements

  • Gather business requirements and translate them into architecture and interface solutions

  • Document solutions clearly, including architecture diagrams and workflow documentation

Required Qualifications:

  • U.S. Citizen

  • BS in Computer Science or a related field with 3+ years of experience, OR MS in Computer Science or a related field with 2+ years of experience

  • 3+ years of experience developing ingestion flows (structured, streaming, and unstructured) on cloud platforms with data quality controls

  • Databricks Data Engineer certification, with 2+ years of experience maintaining Databricks platforms and developing in Spark

  • Proficient in Python and Spark; R is required; .NET experience is a plus

  • Experience with data governance, metadata management, enterprise data catalogs, and data security

  • Agile methodology experience, CI/CD automation, and cloud-based development (Azure preferred, AWS a plus)

  • Strong client-facing skills; able to document and communicate solutions effectively

Preferred Qualifications:

  • Azure cloud certifications

  • Knowledge of FinOps principles and cost management

  • Additional certifications or advanced degrees are a plus

