Databricks Administrator

ReqT
Washington, US
Remote

Job Description

Job Title: Senior Databricks Platform Administrator

Location: Remote

  • U.S. Citizenship Required
  • Active Clearance or Clearance-Eligible

Position Overview:

We are seeking a Senior Databricks Platform Administrator to lead the administration, security, and operational management of enterprise-scale Databricks environments. This role supports multiple Databricks workspaces in a regulated AWS cloud environment and requires strong expertise in Unity Catalog, workspace governance, cluster security, and automation. The ideal candidate combines deep platform administration skills with hands-on Python and Databricks automation experience.

Key Responsibilities:

Databricks Platform Administration

  • Administer and support multiple Databricks workspaces, ensuring consistent governance, security, and operational standards.
  • Configure and manage Unity Catalog, including catalogs, schemas, tables, external locations, storage credentials, and fine-grained access controls.
  • Manage workspace-level objects, users, groups, and service principals across environments using best practices for isolation and least privilege.
  • Implement and maintain role-based access control (RBAC) and object-level ACLs across workspaces.
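The Unity Catalog access-control work described above centers on ANSI-style GRANT statements (e.g. `GRANT SELECT ON TABLE <catalog>.<schema>.<table> TO <principal>`). As a minimal sketch, the helper below only builds such statement strings so the pattern is easy to review; in a notebook they would be executed with `spark.sql()`. All catalog, schema, and principal names shown are placeholders.

```python
# Illustrative sketch: Unity Catalog permissions are granted with ANSI-style
# SQL statements. This helper only composes the statement text; in a real
# workspace the string would be passed to spark.sql() or run in a SQL cell.
# All object and principal names used with it are hypothetical examples.

def grant_statement(privilege: str, securable_type: str, name: str, principal: str) -> str:
    """Build a Unity Catalog GRANT statement for a group or service principal."""
    # Backticks around the principal allow group names containing hyphens.
    return f"GRANT {privilege} ON {securable_type} {name} TO `{principal}`"
```

For example, `grant_statement("SELECT", "TABLE", "main.sales.orders", "data-readers")` yields ``GRANT SELECT ON TABLE main.sales.orders TO `data-readers` ``, which follows the least-privilege pattern of granting object-level access to groups rather than individual users.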

Security, Clusters, and Policies

  • Design, configure, and manage Databricks cluster policies, ensuring cost control, security compliance, and performance optimization.
  • Administer cluster configurations, libraries, init scripts, and runtime versions.
  • Enforce security controls aligned with federal and enterprise compliance requirements.
  • Support auditability and governance requirements in regulated environments.
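The cluster-policy duties above can be sketched with a simplified, local model. The policy dictionary below mirrors the documented Databricks policy definition format (`fixed`, `range`, `allowlist` rules); the `violations` checker is purely illustrative, since real enforcement happens in the Databricks control plane, and the runtime version and instance types are example values.

```python
# Illustrative sketch only: a simplified, local model of how a Databricks
# cluster policy constrains cluster configurations. Real enforcement is done
# by the Databricks control plane; the rule types below ("fixed", "range",
# "allowlist") mirror the documented policy definition format, and the
# specific runtime and instance types are placeholder examples.

EXAMPLE_POLICY = {
    # Pin the runtime so all clusters under this policy use the same version.
    "spark_version": {"type": "fixed", "value": "14.3.x-scala2.12"},
    # Cap autoscaling to control cost.
    "autoscale.max_workers": {"type": "range", "maxValue": 8},
    # Restrict instance types to an approved list.
    "node_type_id": {"type": "allowlist", "values": ["m5.xlarge", "m5.2xlarge"]},
}

def violations(policy: dict, cluster_config: dict) -> list[str]:
    """Return human-readable policy violations for a flat cluster config."""
    problems = []
    for attr, rule in policy.items():
        if attr not in cluster_config:
            continue  # real policies can also require or default attributes
        value = cluster_config[attr]
        if rule["type"] == "fixed" and value != rule["value"]:
            problems.append(f"{attr} must be {rule['value']!r}")
        elif rule["type"] == "range" and value > rule["maxValue"]:
            problems.append(f"{attr} must be <= {rule['maxValue']}")
        elif rule["type"] == "allowlist" and value not in rule["values"]:
            problems.append(f"{attr} must be one of {rule['values']}")
    return problems
```

A config requesting 20 workers under this policy would be flagged with `autoscale.max_workers must be <= 8`, which is how fixed/range/allowlist rules deliver the cost-control and compliance goals listed above.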

Automation and Tooling

  • Use the Databricks CLI to automate workspace provisioning, user/group management, cluster operations, and administrative tasks.
  • Develop automation scripts and workflows to standardize platform operations and reduce manual intervention.
  • Support Infrastructure-as-Code (IaC) and CI/CD patterns for Databricks administration where applicable.
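The CLI-driven automation above can be wrapped in a thin Python layer so commands are composed, logged, and reused consistently. This is a sketch under stated assumptions: the `--profile` flag and the `clusters list` subcommand exist in the Databricks CLI, but exact subcommands vary by CLI version, and the profile name `govcloud-prod` is a placeholder.

```python
import subprocess

# Illustrative sketch: wrapping the Databricks CLI for routine administrative
# tasks. Subcommand names should be checked against the CLI version in use;
# the profile name used in examples ("govcloud-prod") is a placeholder.

def build_cli_command(profile: str, *args: str) -> list[str]:
    """Compose an argv list for the Databricks CLI with an explicit profile."""
    return ["databricks", "--profile", profile, *args]

def run_cli(profile: str, *args: str) -> str:
    """Run a Databricks CLI command and return its stdout (raises on failure)."""
    cmd = build_cli_command(profile, *args)
    result = subprocess.run(cmd, capture_output=True, text=True, check=True)
    return result.stdout
```

A typical call would be `run_cli("govcloud-prod", "clusters", "list", "--output", "json")`; keeping the profile explicit in every invocation helps avoid cross-workspace mistakes when administering multiple environments.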

Data Engineering & Integration

  • Develop and maintain Python-based Databricks notebooks to support data engineering, administrative, and operational workflows.
  • Support data ingestion pipelines, file transfers, and API-based integrations.
  • Work within AWS GovCloud, accounting for its networking, security, and service limitations when designing integrations and data movement processes.

Collaboration & Operations

  • Partner with data engineers, data scientists, security teams, and DevOps teams to support platform usage and onboarding.
  • Troubleshoot platform issues related to access, performance, security, and configuration.
  • Provide guidance and best practices for Databricks usage in a multi-tenant, multi-workspace environment.

Required Qualifications:

  • U.S. Citizenship required
  • Active security clearance or ability to obtain and maintain a clearance
  • Strong hands-on experience administering the Databricks platform in production environments.
  • Deep knowledge of Unity Catalog, workspace management, and Databricks security models.
  • Experience managing multiple Databricks workspaces with shared and isolated resources.
  • Proficiency with Databricks cluster management, policies, and runtime configuration.
  • Strong Python programming skills, particularly for Databricks notebooks and automation.
  • Experience using the Databricks CLI for administrative automation.
  • Familiarity with AWS GovCloud, including nuances around networking, APIs, and data transfers.

Skills & Requirements

Technical Skills

  • Databricks
  • Unity Catalog
  • Workspace governance
  • Cluster security
  • Automation
  • Python
  • Databricks CLI
  • AWS GovCloud
  • Google Vertex AI
  • API gateways
  • Authentication and authorization
  • Application security controls
  • CI/CD
  • Observability

Salary

$90,000+ per year

Employment Type

Full-time

Level

Senior

Posted

4/22/2026
