Note: The application period for this advertisement has passed.
Job Description
Detailed Job Description/Requirements:
We are seeking an experienced Data Solution Architect to join our dynamic team at HCL Technology Sweden AB. The ideal candidate will have a strong background in Azure Cloud Data Architecture, Snowflake, and Kafka, along with Java and full-stack development skills. This role requires a strategic thinker who can design, architect, and implement robust data solutions while collaborating with Product Owners, Engineering Managers, Enterprise Architects, and Business Stakeholders.
Key Responsibilities:
1. Solution Design & Architecture:
Design and architect scalable and efficient data solutions that meet business needs.
Work closely with Enterprise Architects to align solutions with the overall architectural vision.
Collaborate with Product Owners and Engineering Managers to define data product requirements and deliverables.
2. Data Pipeline Development:
Design, build, and maintain ETL data pipelines using the Azure Cloud Data Engineering stack.
Leverage Azure Data Lake and Lakehouse architectures for effective data storage and processing.
Extract, transform, and load (ETL) data from various sources into data warehouses or data lakes.
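As a rough illustration of the extract–transform–load pattern this role involves, the following is a minimal Python sketch using only the standard library. In practice such pipelines would run on Azure Data Factory or Databricks; the field names (`order_id`, `amount`) and in-memory "warehouse" here are hypothetical stand-ins.

```python
import csv
import io

def extract(csv_text):
    """Extract: parse raw CSV rows from a source system (here, an in-memory string)."""
    return list(csv.DictReader(io.StringIO(csv_text)))

def transform(rows):
    """Transform: normalize types and drop incomplete records."""
    cleaned = []
    for row in rows:
        if not row.get("order_id") or not row.get("amount"):
            continue  # skip rows failing a basic completeness check
        cleaned.append({"order_id": row["order_id"], "amount": float(row["amount"])})
    return cleaned

def load(rows, warehouse):
    """Load: append cleaned rows to the target store (a dict standing in for a lake/warehouse table)."""
    warehouse.setdefault("orders", []).extend(rows)
    return len(rows)

raw = "order_id,amount\nA1,10.50\nA2,\nA3,7.25\n"
warehouse = {}
loaded = load(transform(extract(raw)), warehouse)
```

The same three-stage shape carries over to cloud tooling: only the extract source, transform logic, and load target change.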
3. Data Governance & Quality:
Design, implement, and enforce data governance policies and practices.
Ensure data quality by implementing data validation and quality checks.
Maintain data accuracy and reliability for analytical purposes.
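The validation and quality checks mentioned above can be sketched as named rule functions applied to each record. This is a minimal Python illustration, not a prescribed framework; the rule names and record fields are invented for the example.

```python
def validate(records, rules):
    """Partition records into valid and invalid according to named quality rules."""
    valid, invalid = [], []
    for rec in records:
        failed = [name for name, rule in rules.items() if not rule(rec)]
        if failed:
            invalid.append({"record": rec, "failed_rules": failed})
        else:
            valid.append(rec)
    return valid, invalid

# Hypothetical rules: a completeness check and a range check on a numeric field.
rules = {
    "customer_id_present": lambda r: bool(r.get("customer_id")),
    "amount_non_negative": lambda r: r.get("amount", 0) >= 0,
}
records = [
    {"customer_id": "C1", "amount": 100.0},
    {"customer_id": "", "amount": 50.0},
    {"customer_id": "C3", "amount": -5.0},
]
valid, invalid = validate(records, rules)
```

Keeping the failure report alongside the rejected records is what makes the check auditable, which is the governance point of the exercise.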
4. SQL and Snowflake Expertise:
Develop and optimize complex SQL queries and stored procedures for data transformation and integration.
Utilize Snowflake for efficient data warehousing solutions, including building data products and designing secure data sharing methodologies.
Ensure data security and compliance with industry standards and regulations.
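To illustrate the kind of SQL transformation work described, here is a small sketch in which Python's built-in sqlite3 stands in for Snowflake (whose dialect and features such as secure data sharing are not shown); the table and column names are made up.

```python
import sqlite3

# In-memory database as a stand-in for a warehouse schema.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?)",
    [("north", 100.0), ("north", 50.0), ("south", 75.0)],
)

# A typical aggregation/transformation query: total sales per region.
rows = conn.execute(
    "SELECT region, SUM(amount) AS total FROM sales GROUP BY region ORDER BY region"
).fetchall()
```

The query itself is standard SQL; in Snowflake the same `GROUP BY` aggregation would typically feed a view or data product exposed to consumers.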
5. Kafka Integration:
Design and implement ingestion frameworks to manage real-time and batch data ingestion from Kafka and similar message-queuing systems.
Develop and maintain Kafka data streaming solutions.
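The ingestion responsibility above combines streaming and batch paths. The sketch below simulates a message stream with Python's standard-library queue module; a real implementation would use a Kafka client library instead, and the micro-batch size and event payloads here are hypothetical.

```python
import queue

def ingest_stream(source, sink, batch_size=2):
    """Drain messages from a stream, committing them to the sink in micro-batches."""
    batch = []
    while True:
        try:
            msg = source.get_nowait()
        except queue.Empty:
            break  # stream drained (a real consumer would keep polling)
        batch.append(msg)
        if len(batch) >= batch_size:
            sink.append(list(batch))  # commit a full micro-batch
            batch.clear()
    if batch:
        sink.append(list(batch))  # commit any trailing partial batch
    return sink

stream = queue.Queue()
for event in ["e1", "e2", "e3"]:
    stream.put(event)
committed = ingest_stream(stream, [])
```

Micro-batching like this is one common way to reconcile real-time arrival with batch-oriented downstream loads.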
6. Full-Stack Development:
Apply strong Java experience to build APIs for data sharing and application development.
Apply JavaScript knowledge to full-stack development tasks.
Experience with Angular or React frameworks for front-end development is preferred.
7. Collaboration & Communication:
Work with business stakeholders to understand data requirements and deliver actionable insights.
Collaborate with engineering teams to ensure seamless integration of data solutions.
Communicate effectively with cross-functional teams to drive data initiatives.
8. DevOps and Automation:
Implement DevOps practices to streamline data pipeline deployments and operations.
Automate data processing workflows to enhance efficiency and reliability.
Qualifications (Must Have):
Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
Strong experience in data architecture and engineering.
Proven expertise in Azure Cloud Data Architecture, Azure Data Factory, Databricks, and ETL processes.
Strong proficiency in SQL and Snowflake for data transformation and integration.
Extensive experience with Kafka for real-time data streaming and batch processing.
Knowledge of Java and JavaScript; experience with Angular or React frameworks is preferred.
Demonstrated experience in data governance and data quality management.
Excellent problem-solving skills and ability to design scalable data solutions.
Strong communication skills with the ability to work effectively with technical and non-technical stakeholders.
Preferred Skills:
Experience with DevOps practices and automation tools.
Familiarity with data product design and development.
Knowledge of modern data warehousing concepts and technologies.
Strong analytical and organizational skills.