Job Description:
We are seeking experienced Databricks Developers with strong expertise in designing and implementing scalable data solutions on the Databricks platform. The ideal candidate is proficient in Databricks Workflows and SQL Warehouse, and familiar with newer features such as AI/BI dashboards and AgentBricks, to support modern data engineering and AI-driven analytics initiatives.
Key Responsibilities:
- Design, develop, and optimize ETL/ELT pipelines using Databricks Workflows.
- Build high-performing queries, data models, and dashboards with Databricks SQL Warehouse.
- Leverage Databricks AI/BI dashboards and AgentBricks to support advanced analytics and AI-driven solutions.
- Collaborate with data architects, data scientists, and BI developers to deliver end-to-end data solutions.
- Apply best practices for data governance, quality, and security within the Databricks environment.
- Integrate Databricks with cloud platforms (Azure, AWS, or GCP) and external data sources.
- Monitor and troubleshoot workflows to improve performance and scalability.
- Participate in Agile/Scrum processes and contribute to sprint planning and delivery.
Professional Skills:
- 3+ years of experience with Databricks on data engineering or analytics projects.
- Strong proficiency in SQL and experience with Databricks SQL Warehouse.
- Hands-on experience with Databricks Workflows (job orchestration, scheduling, monitoring).
- Familiarity with AI/BI dashboards and AgentBricks.
- Expertise in PySpark, Python, or Scala for data transformation.
- Solid understanding of data warehousing concepts, ETL/ELT design, and performance tuning.
- Experience with at least one cloud platform (Azure, AWS, or GCP).
- Knowledge of Git and CI/CD pipelines for version control and deployment.
- Strong problem-solving skills and the ability to work in cross-functional teams.
