Note: The application period for this posting has passed.
Job description
Veritaz is a fast-growing IT consulting firm. Our company is made up of insanely bright people from four countries, with offices in Sweden, the UK, the US, and Pakistan. The journey has been incredible so far, but it is only the beginning.
Assignment Description:
We are looking for a Data Engineer with strong SQL skills, programming experience in Python, and knowledge of business intelligence architecture frameworks.
Do you value openness, transparency, and empowerment? Our squad is a high-performing, cross-functional team on a mission to create win-win exchanges for our customers.
What you'll do:
● Participate in documenting, designing, developing, testing, and deploying the data models and data marts built from various source systems.
● Develop and maintain, based on the data flow design, the ETL scripts that acquire data from various source systems and load it into the staging area, the data warehouse, and the data marts.
● Identify opportunities to reuse objects and code where appropriate, reducing development effort and enforcing consistent business rules.
● Work with stakeholders throughout the organization to identify opportunities for leveraging company data to drive business solutions.
● Use predictive modelling to optimize customer experiences, revenue generation, ad targeting, and other business outcomes.
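To give a flavour of the ETL work described above, here is a minimal sketch in Python (one of the languages the role calls for). The source rows, table name, and columns are purely illustrative assumptions, and an in-memory SQLite database stands in for the staging area:

```python
import sqlite3

def run_etl(source_rows):
    """Extract rows from a (hypothetical) source system, transform them,
    and load them into an illustrative staging table."""
    conn = sqlite3.connect(":memory:")  # stand-in for the staging area
    conn.execute("CREATE TABLE staging_orders (order_id INTEGER, amount REAL)")
    # Transform: keep only valid orders and round amounts to 2 decimals.
    cleaned = [(r["id"], round(r["amount"], 2))
               for r in source_rows if r["amount"] > 0]
    conn.executemany("INSERT INTO staging_orders VALUES (?, ?)", cleaned)
    conn.commit()
    # Return how many rows were loaded into staging.
    return conn.execute("SELECT COUNT(*) FROM staging_orders").fetchone()[0]

rows = [{"id": 1, "amount": 100.505},
        {"id": 2, "amount": -5.0},   # invalid row, filtered out
        {"id": 3, "amount": 20.0}]
print(run_etl(rows))  # prints 2
```

In practice the same extract–transform–load pattern would target the data warehouse and data marts rather than an in-memory database.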
Who you are:
● Good knowledge of prototyping, specifying, and implementing analytical ecosystems and/or logical data warehouses, including big data architectures such as data lakes and streaming data analytics, as well as full-stack analytics capabilities.
● Working knowledge of common enterprise data, advanced analytics, and business intelligence architecture frameworks.
● Proven experience with software development best practices: Scrum, Git, CI/CD, and test automation.
● Excellent communication, teamwork, interpersonal, and organizational skills.
● Experience in big data architecture and hands-on experience with Hadoop, Java, Kafka, Spark, and Scala.