We are hiring a Databricks Engineer for the Coventry location. Below are the details of the role.

Key Responsibilities:

•Databricks Platform Support:
•Provide daily support for the Databricks platform running on AWS.
•Ensure data security and access controls are set up and maintained.
•Monitor and troubleshoot clusters, workspaces, jobs, and other Databricks resources.
•Familiarity with setting up and using the medallion architecture and Unity Catalog.
•Handle incidents, requests, and changes in Databricks environments.
•Collaborate with engineering and DevOps teams to resolve complex platform issues.
•Familiarity with Databricks MLflow and monitoring.
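
By way of illustration, day-to-day platform support of this kind often includes scripted health checks against the Databricks REST API. The sketch below is a minimal Python example, assuming the workspace URL and a personal access token are supplied via environment variables (placeholder names); it is illustrative only, not part of the role description.

import os
import requests

# Placeholder environment variables; any secrets manager would do in practice.
HOST = os.environ["DATABRICKS_HOST"]    # e.g. https://<workspace>.cloud.databricks.com
TOKEN = os.environ["DATABRICKS_TOKEN"]

# List all clusters in the workspace via the Clusters REST API.
resp = requests.get(
    f"{HOST}/api/2.0/clusters/list",
    headers={"Authorization": f"Bearer {TOKEN}"},
    timeout=30,
)
resp.raise_for_status()

# Flag anything that is neither running nor intentionally terminated,
# as a starting point for incident triage.
for cluster in resp.json().get("clusters", []):
    state = cluster.get("state")
    if state not in ("RUNNING", "TERMINATED"):
        print(f"Check cluster {cluster['cluster_name']} ({cluster['cluster_id']}): state={state}")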

•Performance Tuning & Optimization:
•Analyse workloads and optimize Databricks clusters for performance and cost-efficiency.
•Recommend best practices for efficient Spark job execution and data processing.
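
As an illustrative example of the kind of Spark tuning recommendation in scope here, the following PySpark sketch broadcasts a small dimension table and sets the shuffle partition count explicitly. The table names and partition figure are placeholders, not prescriptions.

from pyspark.sql import SparkSession
from pyspark.sql.functions import broadcast

spark = SparkSession.builder.appName("tuning-sketch").getOrCreate()

# Keep shuffle partitions proportionate to the data volume rather than the
# default of 200 (the value below is only a placeholder).
spark.conf.set("spark.sql.shuffle.partitions", "64")

# Hypothetical tables: a large fact table and a small dimension table.
facts = spark.table("sales_facts")
dims = spark.table("product_dim")

# Broadcasting the small side avoids shuffling the large table.
joined = facts.join(broadcast(dims), "product_id")

joined.write.mode("overwrite").saveAsTable("sales_enriched")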

•AWS Infrastructure Management:
•Manage and configure AWS services (e.g., S3, EC2, IAM, Lambda) that integrate with Databricks.
•Ensure proper security configurations and compliance with AWS standards.
•Manage networking, security groups, and VPCs related to Databricks setup.
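
For a flavour of the AWS side of the role, the sketch below uses boto3 to spot-check that a Databricks root bucket has default encryption and to list security groups tagged for the workspace. The bucket name and tag key are purely hypothetical examples.

import boto3
from botocore.exceptions import ClientError

s3 = boto3.client("s3")
ec2 = boto3.client("ec2")

BUCKET = "example-databricks-root-bucket"   # placeholder name

# Verify the workspace root bucket has default server-side encryption.
try:
    enc = s3.get_bucket_encryption(Bucket=BUCKET)
    print("Encryption rules:", enc["ServerSideEncryptionConfiguration"]["Rules"])
except ClientError as err:
    if err.response["Error"]["Code"] == "ServerSideEncryptionConfigurationNotFoundError":
        print(f"WARNING: {BUCKET} has no default encryption configured")
    else:
        raise

# List security groups tagged for the Databricks workspace (tag key is hypothetical).
groups = ec2.describe_security_groups(
    Filters=[{"Name": "tag:Purpose", "Values": ["databricks"]}]
)
for sg in groups["SecurityGroups"]:
    print(sg["GroupId"], sg["GroupName"])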

•Automation & DevOps:
•Develop and maintain automation scripts for platform maintenance and monitoring.
•Work closely with DevOps teams to automate cluster management, scaling, and infrastructure provisioning.
•Implement CI/CD pipelines for Databricks deployments using tools like Jenkins, Git, or Terraform.
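
As one possible shape for the deployment step of such a pipeline, the sketch below pushes a job definition to the Databricks Jobs API. In practice this would typically be driven from Jenkins or expressed in Terraform; the job name, notebook path, and cluster id are placeholders.

import os
import requests

HOST = os.environ["DATABRICKS_HOST"]
TOKEN = os.environ["DATABRICKS_TOKEN"]
HEADERS = {"Authorization": f"Bearer {TOKEN}"}

# Job definition kept in version control and deployed by the pipeline
# (all identifiers below are placeholders).
job_settings = {
    "name": "nightly-etl",
    "tasks": [
        {
            "task_key": "run_etl",
            "notebook_task": {"notebook_path": "/Repos/data/etl/main"},
            "existing_cluster_id": "0123-456789-abcdefgh",
        }
    ],
}

resp = requests.post(
    f"{HOST}/api/2.1/jobs/create",
    headers=HEADERS,
    json=job_settings,
    timeout=30,
)
resp.raise_for_status()
print("Created job:", resp.json()["job_id"])

Updating an existing job would follow the same pattern against the jobs/reset endpoint rather than creating a new one.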

Qualifications:

•Bachelor's degree in Computer Science, Information Technology, or a related field, and 5+ years of experience supporting Databricks or similar data platforms.
•Strong expertise in Databricks on AWS: cluster management, job execution, Spark optimization.
•Hands-on experience with AWS services like EC2, S3, IAM, Lambda, RDS, VPC.
•Proficiency in Python, Scala, or SQL for data engineering tasks.
•Solid understanding of Apache Spark, ETL pipelines, and big data processing.
•Familiarity with DevOps practices, automation tools (Terraform, Ansible), and CI/CD pipelines.