Note: The application period for this listing has passed.
Job description
Doktor 24 Group
Healthcare is one of the most important sectors in society, but in many ways it has lagged decades behind other sectors in accessibility and productivity. The Doktor24 Group is driving the change in how healthcare works. We are building "Healthcare 2.0": through smart, easy, and sustainable solutions, we improve healthcare to ensure the best possible care, at the right time, in the most suitable format. Every day we work to make care more accessible, more resource-efficient, and higher in quality for patients, while bridging the gap between physical and digital healthcare.
The Doktor24 Group is fast-growing and consists of two business areas that work tightly together: Doktor24, a digitally integrated caregiver, and Platform24, an innovative health tech company that offers our award-winning platform to a wide range of healthcare providers, including several Swedish regions, insurers, and private care providers. We are fortunate to have Investor AB (publ) and Apoteket AB as two long-term owners.
What we offer
You will have an important role at one of the most interesting health tech startups in Scandinavia, and your work will bring value to the hundreds of thousands of patients who use our solutions. We are a tight-knit team of fun and smart colleagues, with an office in the Stockholm city center.
You will join at a time when you can really leave your mark on the organization, and on healthcare as an industry.
What you will do
In this role, you will be responsible for building robust and scalable data pipelines that enable analysis of production data, while also developing and maintaining the systems and infrastructure behind our machine learning capabilities.
You will have considerable freedom in choosing which technologies to use, helping ensure we use data to take the platform to the next level.
Who we are looking for
- You have at least three years of experience in data engineering or software engineering roles, and a Master's degree in Computer Science or similar.
- You have advanced knowledge of a programming language suited to a Data Engineer role, preferably Python, though Go or Java also works.
- You have worked with at least one ETL workflow engine, such as Airflow, Luigi, or Dagster.
- You have experience with container technologies, such as Docker and Kubernetes.
- You have knowledge of relational databases and SQL.
- You have used messaging technologies like Kafka or RabbitMQ.
It would be nice if you also have experience with
- building and designing REST APIs.
- setting up and configuring non-relational databases.
- building production systems with machine learning frameworks like TensorFlow, Keras, or scikit-learn.