Big Data Architect IRC245383
Job: IRC245383
Location: Poland
Designation: Consultant
Experience: 10-15 years
Function: Engineering
Skills: Big Data, Cloud (Azure/AWS/GCP), Databases, Databricks, Spark, SQL
Work Model: Remote
Description:
Our Client
Our client is a world-leading manufacturer of premium-quality “smart” beds that use innovative technologies and digitalized solutions to help people who struggle to sleep at night.
Our client is a fast-moving, highly technical team with the ambitious goal of bringing people better health and well-being through the best possible sleep experience, and of becoming the leader in sleep. The product combines established expertise in creating comfortable, adjustable beds with the latest in sleep science, cutting-edge sensor technology, and data-processing algorithms.
The Role:
As a Big Data Architect, you will be responsible for leadership and strategic innovation around our data platform and services, help guide our solution strategies, and develop new technologies and platforms.
We are looking for individuals who have a desire to architect and the ability to rapidly analyze use cases, design technical solutions that meet business needs while adhering to existing standards and postures, and lead multiple technical teams during implementation. Successful candidates will have excellent written and oral communication skills in English, will be comfortable explaining technical concepts to a wide range of audiences, including senior leadership, and will have a deep understanding of modern big data architecture and design practices, patterns, and tools.
Requirements:
A total of 10+ years of development/design/architecture experience, with a minimum of 5 years' experience in Big Data technologies on-prem and in the cloud
Experience architecting, building, implementing, and managing Big Data platforms in the cloud, covering ingestion (batch and real-time), processing (batch and real-time), polyglot storage, data analytics, and data access
Good understanding of data governance, data security, data compliance, data quality, metadata management, master data management, and data catalogs
Proven understanding and demonstrable implementation experience of big data platform technologies in the cloud (AWS and Azure), including surrounding services such as IAM, SSO, cluster monitoring, log analytics, etc.
Experience working with Enterprise Data Warehouse technologies, multi-dimensional data modeling, data architectures, or other work related to the construction of enterprise data assets
Strong experience implementing ETL/ELT processes and building data pipelines, including workflow management, job scheduling, and monitoring
Experience with NiFi, Solr, and HBase is a plus
Experience building stream-processing systems using solutions such as Apache Spark, Databricks, and Kafka
Experience with AWS Kinesis
Experience with Spark technology is a must
Experience with Big Data querying tools
Solid skills in Python
Strong experience with data modeling and schema design
Strong SQL programming background
Excellent interpersonal and teamwork skills
Experience driving solution/enterprise-level architecture and collaborating with other tech leads
Strong problem solving, troubleshooting and analysis skills
Experience working in a geographically distributed team
Experience leading and mentoring other team members
Good knowledge of Agile Scrum
Good communication skills
Job Responsibilities:
Work directly with client teams to understand their requirements and needs, and rapidly prototype data and analytics solutions based on business requirements
Architect, implement, and manage large-scale data platforms/applications, including ingestion, processing, storage, data access, and data governance capabilities and the related infrastructure
Support the design and development of solutions for deploying data analytics notebooks, tools, dashboards, and reports to various stakeholders
Communicate with the Product, DevOps, Development, and QA teams
Architect data pipelines and ETL/ELT processes that connect to various data sources
Design and maintain enterprise data warehouse models
Take part in performance optimization processes
Guide research activities (PoCs) when necessary
Manage the cloud-based data & analytics platform
Establish best practices for CI/CD within the Big Data scope
#remote #LI-IN1
What We Offer
Empowering Projects: With 500+ clients spanning diverse industries and domains, we provide an exciting opportunity to contribute to groundbreaking projects that leverage cutting-edge technologies. As a team, we engineer digital products that positively impact people’s lives.
Empowering Growth: We foster a culture of continuous learning and professional development. Through our dedicated Learning & Development team, we provide timely and comprehensive assistance to every consultant, ensuring their continuous growth and success.
DE&I Matters: At GlobalLogic, we deeply value and embrace diversity. We are dedicated to providing equal opportunities for all individuals, fostering an inclusive and empowering work environment.
Career Development: Our corporate culture places a strong emphasis on career development, offering abundant opportunities for growth. Regular interactions with our teams ensure their engagement, motivation, and recognition. We empower our team members to pursue their career goals with confidence and enthusiasm.
Comprehensive Benefits: In addition to equitable compensation, we provide a comprehensive benefits package that prioritizes the overall well-being of our consultants. We genuinely care about their health and strive to create a positive work environment.
Flexible Opportunities: At GlobalLogic, we prioritize work-life balance by offering flexible opportunities tailored to your lifestyle. Explore relocation and rotation options for diverse cultural and professional experiences in different countries with our company.