Job Summary:
We are seeking an experienced Databricks Solution Enablement Engineer to lead the evaluation, enablement, and operational readiness of Databricks solutions within a cloud-native data ecosystem.
This role is not about core development but about hands-on engineering, architecture alignment, and acting as a trusted enabler for internal teams and business users. We are not looking for Snowflake developers, but platform engineers.

- Evaluate Snowflake for data processing, storage, and analytics, considering key factors such as security, scalability, performance, and cost.
- Establish and enforce data engineering best practices, standards, and guidelines to ensure data quality, reliability, and consistency in Snowflake.
- Research, test, benchmark, and assess new Snowflake features, providing recommendations for their integration into the data platform.
- Develop and implement Snowflake-based solutions that align with business strategy, architectural considerations, and both short- and long-term roadmaps, ensuring high scalability and extensibility.
- Optimize Snowflake performance, conduct tuning, and troubleshoot data infrastructure components to maximize efficiency and resource utilization.
- Proactively identify bottlenecks, gaps, and opportunities in Snowflake, driving necessary changes through direct action or by influencing peers and leadership.
- Deploy Snowflake following best practices, ensuring knowledge transfer so engineers can independently extend its capabilities.
- Engage hands-on with customers to demonstrate and communicate Snowflake implementation best practices.
- Support prospects and customers throughout the sales cycle, from demos to proof-of-concept, design, and implementation, effectively showcasing Snowflake's value.
- Collaborate with Product Management, Engineering, and Market teams to continuously enhance Snowflake solutions.
- Apply hands-on expertise with AWS and cloud-based data services such as Snowflake.
- Leverage software engineering and analytical skills to solve large-scale business challenges.
- Utilize modern data pipeline, replication, and processing tools such as Matillion, Fivetran, dbt, Airflow, and Astronomer.
- Ensure compliance with data security and privacy regulations, implementing best practices for data protection.
- Understand the end-to-end data analytics stack and workflow, from ETL processes to data platform design and BI tools.
- Demonstrate expertise in large-scale databases, data warehouses, ETL, and cloud technologies, including Data Lakes, Data Mesh, and Data Fabric.
- Bridge the gap between business challenges and Snowflake's solutions, aligning data architecture with customer needs.
- Conduct deep discovery of customer architecture frameworks and integrate them with Snowflake's data architecture.
- Exhibit strong proficiency

Key Responsibilities:
- Evaluate and prototype new Databricks features and capabilities to improve platform value.
- Collaborate with data architects, engineers, and security teams to integrate Databricks efficiently and securely.
- Identify and automate manual processes in data engineering workflows involving Databricks.
- Manage user roles, perform patching and upgrades, and ensure compliance with security policies.
- Assist in the delivery of scalable data pipelines and analytics frameworks using Databricks.
- Analyze performance issues and optimize Spark jobs, clusters, and workflows.
- Act as a Databricks subject-matter expert, supporting client-facing engagements and enablement activities.
- Participate in architecture reviews and contribute to solution design and governance discussions.
- Provide internal enablement, knowledge transfer, and best practices for Databricks usage.

Soft Skills & Competencies:
- Strong communicator and collaborator with both technical and business stakeholders.
- Ability to assess new technology features and articulate their pros, cons, and impact.
- Experience leading technical discussions, demos, and stakeholder presentations.
- Proactive mindset with a strong sense of ownership and accountability.