Databricks Data Engineer


Job Title: Databricks Data Engineer

Job Type: Contract (W2)

Location: Pleasanton, CA (HYBRID role; onsite 4 days per week)

Start Date: ASAP

Duration: 6 Months (with potential for extension)

Work Hours/Schedule: Monday-Friday, 8 hours per day (1st shift/standard business hours)

Compensation: $43 to $48 per hour

Job Summary: Our client, a Big 4 consulting firm, is seeking a skilled Databricks Data Engineer with hands-on Databricks experience to design, build, and optimize large-scale data pipelines and analytics solutions. You will work with cross-functional teams to enable scalable data processing using the Databricks Lakehouse Platform on Azure.

Key Responsibilities:

  • Design and implement ETL/ELT pipelines using Databricks, Delta Lake, and Apache Spark.

  • Collaborate with data scientists, analysts, and stakeholders to deliver clean, reliable, and well-modeled data.

  • Build and manage data workflows with Databricks Jobs, Notebooks, and Workflows.

  • Optimize Spark jobs for performance, reliability, and cost-efficiency.

  • Maintain and monitor data pipelines, ensuring availability and data quality.

  • Implement CI/CD practices for Databricks notebooks and infrastructure-as-code (e.g., Terraform, Databricks CLI).

  • Document data pipelines, datasets, and operational processes.

  • Ensure compliance with data governance, privacy, and security policies.

Qualifications:

  • Bachelor’s or Master’s in Computer Science, Data Engineering, or a related field.

  • 5+ years of experience in data engineering or a similar role.

  • Strong hands-on experience with Databricks and Apache Spark (Python, Scala, or SQL).

  • Proficiency with Delta Lake, Unity Catalog, and data lake architectures.

  • Experience with cloud platforms (Azure, AWS, or GCP), especially their data services (e.g., S3, ADLS, BigQuery).

  • Familiarity with CI/CD pipelines, version control (Git), and job orchestration tools (Airflow, Databricks Workflows).

  • Strong understanding of data warehousing concepts, performance tuning, and big data processing.

Preferred Skills:

  • Experience with MLflow, Feature Store, or other machine learning tools in Databricks.

  • Knowledge of data governance tools like Unity Catalog or Purview.

  • Experience integrating BI tools (Power BI, Tableau) with Databricks.

  • Databricks certification(s) (e.g., Data Engineer Associate/Professional, Machine Learning).

DETAILS AT A GLANCE

JOB TITLE: Databricks Data Engineer

JOB TYPE: Contract (W2)

HOURS: Monday–Friday, 8 hours per day (1st shift/standard business hours)

LOCATION: Pleasanton, CA (HYBRID role; onsite 4 days per week)

START DATE: ASAP

DURATION: 6 Months (with potential for extension)

COMPENSATION: $43.00 to $48.00 per hour

BNA RECRUITER: Nick Yacobi


HOW TO APPLY: Click on the apply button and send your resume to BNA. Please make sure to reference the job title in the subject line.

This role is a TEMPORARY position through BNA. You would be employed and paid by our company and working directly with our client.

