Note: The application period for this job posting has passed.
Job description
Global migration is a 21st-century reality. Whether people are fleeing from something or racing towards an exciting opportunity, following love or seeking new experiences, more and more people are living in different places and in new ways.
We celebrate the power of people coming together. That’s why we connect ambitious people abroad with their families and friends back home so they can support each other, lose less of what was left behind, and lead more enriched lives.
With international calling, mobile top-ups and more, we design products with the needs of modern migrants in mind.
Our ambitious team reflects our international audience. We are a diverse group of people from all over the world who come together every day—and we’re looking for others driven by the same desire: to create meaningful products that bridge cultural and geographic distances.
Location: Stockholm, Sweden
We are now looking for a Data Engineer to help us take the infrastructure and usage of data to the next level. We are launching new products and looking to fuel our growth driven by insights from the vast amounts of data we generate. We are also looking to work on some really exciting initiatives such as building the next generation of fraud prevention and churn prediction models. You will work closely with our stakeholders from departments like finance, operations, product, and commercial.
Current stack: AWS (Kinesis, S3, Lambda), Snowflake, Airflow, Matillion, DBT, Looker, Kubernetes
The team’s responsibility
Design, build, and maintain the data pipelines that power our business intelligence
Build a scalable event streaming infrastructure to enable real-time analytics use cases
Model our data in our Snowflake DWH with consideration for data quality
Collaborate with business stakeholders to build solutions that serve business needs
Experience we seek
Experience in data modeling and building data pipelines
Experience with Data Ops
Experience in ELT techniques and data warehouse modeling
Excellent SQL skills
Some or all of these technologies: DBT, Python, Airflow, AWS (or other cloud data platforms); event-bus technology such as Kinesis Data Streams
Additional appreciated experience
Experience from a fast-paced company with digital products, marketing, and/or machine learning
Experience with data visualization and reporting using tools like Looker or similar
Working knowledge of DevOps and SCM tools and processes
Experience working with Kubernetes and Docker
Ideal candidate
4+ years in data engineering
Prior experience working with data streams and event-based data pipelines
MSc (or equivalent) in Engineering or Computer Science, or relevant professional experience
Great team player
Happy to try out new tools, techniques, and technologies