Note: The application period for this posting has passed.
Job description
As a Senior Solutions Engineer (data engineering, analytics, AI, big data, public cloud), you will guide the hands-on technical evaluation phase throughout the sales process. You will be a key part of the sales team, acting as a technical advisor and working with the product team as an advocate for your customers in the field. You will help our customers achieve tangible data-driven outcomes with the Databricks lakehouse platform, establishing a clear architectural vision, identifying compelling use cases and helping them realise value. You'll grow as a leader in your field while finding solutions to our customers' biggest challenges in big data, analytics, data engineering and data science. You will report to the Manager, Field Engineering.
The impact you will have:
Act as an expert in big data engineering, architecture and design
Lead your clients through evaluating and adopting the Databricks lakehouse platform
Support your customers by authoring reference architectures, how-tos, and demo applications
Engage with the technical community (both internal and customer) by leading workshops, seminars and meet-ups
Together with your Account Executive, you will build successful relationships with clients throughout your assigned territory to provide technical and business value
What we look for:
Pre- or post-sales experience working with external clients and providing thought leadership
We are currently focussing on our Digital Native Business (DNB), Start-ups, Retail/CPG, Financial Services, Health/Life Sciences and Telco/Media verticals, and experience in any of these is very welcome
As this is a broad and varied role, you should have proven expertise in a data discipline such as data engineering, architecture or data science
Proven experience demonstrating technical concepts, including presenting and whiteboarding
Proven experience designing and implementing architectures within a public cloud (AWS, Azure or GCP)
Experience with big data technologies such as Spark, Hadoop or Cassandra, applied to solutions within data engineering, data science, MLOps and platform migrations
Fluent coding experience in one or more of the following: Python, R, Java or Scala
A bachelor's degree in Computer Science, Information Systems, Engineering or Data Science, or equivalent demonstrable work experience
Open to everyone
We focus on your competence, not your other circumstances. We are open to adapting the role or the workplace to your needs.