Note: The application period for this ad has passed.
Job description
Veritaz is a fast-growing IT consulting firm. Our company is made up of insanely bright people from over four countries, and we are located in Sweden, the UK, the US, and Pakistan. The journey has been incredible so far, but it is only the beginning.
Assignment Description:
We are looking for a Data Hero to create value out of large datasets together with a team of fun, humble, and ambitious people. You will work closely with the function developers to gain a deep understanding of the Pilot Assist functions and the data sources you will be working with. If you would like, this can also involve driving test vehicles in Sweden and abroad. We learn together and create great things by sharing knowledge with each other.
What you'll do:
● Participate in documenting, designing, developing, testing, and deploying the data models and data marts built from various source systems.
● Following the data flow design, develop and maintain the ETL scripts that acquire data from various source systems and load it into the staging area, the data warehouse, and the data marts.
● Identify and develop opportunities to reuse objects and code where appropriate to reduce development effort and enforce consistent business rules.
● Work with stakeholders throughout the organization to identify opportunities to leverage company data for business solutions.
● Use predictive modelling to optimize customer experience, revenue generation, ad targeting, and other business outcomes.
● Work through the entire toolchain, from understanding your team's data needs to visualizing and presenting the results back to them.
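The staging-to-mart flow described above can be sketched in miniature. This is a hypothetical example, not the company's actual pipeline: sqlite3 stands in for the staging area, warehouse, and data mart, and the table and column names (`staging`, `mart_vehicle_speed`, `vehicle_id`, `speed_kmh`) are invented for illustration.

```python
import sqlite3

def run_etl(source_rows):
    """Extract rows from a source system, land them in staging, and build a small data mart."""
    con = sqlite3.connect(":memory:")
    cur = con.cursor()

    # Staging area: land the raw source data unchanged.
    cur.execute("CREATE TABLE staging (vehicle_id TEXT, speed_kmh REAL)")
    cur.executemany("INSERT INTO staging VALUES (?, ?)", source_rows)

    # Data mart: aggregate per vehicle for downstream reporting.
    cur.execute("""
        CREATE TABLE mart_vehicle_speed AS
        SELECT vehicle_id, AVG(speed_kmh) AS avg_speed_kmh, COUNT(*) AS n_samples
        FROM staging
        GROUP BY vehicle_id
    """)
    con.commit()
    return cur.execute(
        "SELECT vehicle_id, avg_speed_kmh, n_samples FROM mart_vehicle_speed ORDER BY vehicle_id"
    ).fetchall()

rows = [("V1", 80.0), ("V1", 100.0), ("V2", 60.0)]
print(run_etl(rows))  # → [('V1', 90.0, 2), ('V2', 60.0, 1)]
```

In a real deployment the same shape applies at scale: raw data lands unmodified in staging, then transformations and aggregations populate the warehouse and marts.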
Who you are:
● Good knowledge of prototyping, specifying, and implementing analytical ecosystems and/or logical data warehouses, including (big) data architectures such as data lakes and streaming data analytics, and full-stack analytics capabilities.
● Working knowledge of common enterprise data, advanced analytics, and business intelligence architecture frameworks.
● Proven technical experience in Python programming, database handling and architecture, parallelization techniques, cloud computing, BI tools such as Power BI/Tableau/Grafana/Kibana, and geographical data handling, such as PostGIS.
● Excellent communication, teamwork, interpersonal, and organizational skills.
● Experience in big data architecture and hands-on experience with Hadoop, Java, Kafka, Spark, and Scala.
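As a tiny illustration of the geographical data handling mentioned above: a common primitive is the great-circle distance between two GPS points, computed here with the haversine formula. This is a standalone sketch with made-up coordinates; in practice such queries would typically be pushed down to PostGIS.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometres between two (lat, lon) points in degrees."""
    r = 6371.0  # mean Earth radius in km
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

# Gothenburg to Stockholm: roughly 400 km as the crow flies.
print(haversine_km(57.71, 11.97, 59.33, 18.07))
```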