Note: The application period for this advertisement has passed.
Job description
Data Engineer
Are You Ready to Dive into a Realm of Possibilities?
Seize the opportunity to join a leading tech powerhouse with an exceptional team and a world of diverse opportunities. Nexer Insight invites you to embark on a journey as a Data Engineer, where you'll be part of an extraordinary team, engage in exciting activities, work with fascinating clients, tackle varied and challenging tasks, and enjoy exceptional benefits.
Accelerate your Data Engineer career at Nexer Insight
As a Data Engineer consultant at Nexer Insight, your world revolves around new technology and creating a better tomorrow for our customers. While there's no standard day in the consultancy realm, we're all about agility, teamwork, and pursuing project milestones. You'll be on a journey of problem-solving, strategic planning, architecting innovative solutions, guiding clients, and becoming the best version of yourself. Beyond that, you'll take part in internal and external events and meetings and contribute to our evolving delivery processes.
Why Join Us:
- You will get the opportunity to be part of a growing Nexer Insight, a company that epitomises entrepreneurship and innovation
- You will learn and work with the latest technology in Azure, Databricks, and DBT, all of which we partner with
- We invest in our employees to become champions in their domain, offering professional growth and development
- We work in teams and have a great community.
- We offer a flexible work environment with the possibility to work from home at least part-time
Key Responsibilities:
- Develop and operate scalable and robust analytical models
- Set up data platforms and build data models and pipelines
- Innovate and collaborate with customers to learn and find new business opportunities
- Become a champion in best practices regarding architecture, data quality, reliability, and performance
- Be at the forefront of the latest data engineering technology
Based on our criteria for the role, we expect you to have:
- Hands-on project experience working with cloud solutions (Azure, AWS, or GCP)
- Experience working with ETL tools such as Azure Data Factory, Databricks, and DBT
- Experience with Data Lakes or Lakehouse Architecture
- In-depth knowledge of SQL syntax
- Experience in information and dimensional modelling
- Good knowledge of DevOps processes with git version control and CI/CD methodology
- Fluency in English and Swedish
Meritorious experience:
- Platform modernization
- Tabular cubes and DAX
- Data Vault modeling
- Experience in setting up Azure, AWS, or GCP infrastructure