We will help you build an exceptional career
Data Architect IRC246138
Job: IRC246138
Location: India - Noida
Designation: Solution Architect
Experience: 10-15 years
Function: Technology
Skills: Architect/Lead, Big Data
Work Model: Hybrid
Description:
Join GlobalLogic and become a valued part of a team working on a large software project for a world-class company that provides M2M / IoT 4G/5G modules to industries such as automotive, healthcare, and logistics. Through our engagement, we help the customer develop the end-user modules' firmware, implement new features, maintain compatibility with the newest telecommunication and industry standards, and perform analysis and estimation of customer requirements.
Requirements:
Big Data Technologies: Proficiency in working with big data tools and frameworks like Apache Hive, Apache Spark (including Spark MLlib), MapReduce, and Apache Flink.
Cloud Computing: Experience with cloud platforms such as Amazon Web Services (AWS), Google Cloud Platform (GCP), and Microsoft Azure, with strong knowledge of services like AWS Redshift, Google BigQuery, Azure Data Warehouse, Azure Data Lake Analytics, and AWS Athena.
Data Integration and Streaming: Knowledge of data integration tools and real-time data streaming platforms like Apache Kafka, AWS Kinesis, and Google Cloud DataFlow.
Machine Learning and AI: Hands-on experience with ML platforms such as Amazon SageMaker, Google Cloud AutoML, and Google Cloud ML Engine.
ETL Development: Strong background in developing and optimizing ETL workflows and data pipelines using both open-source and proprietary tools.
Data Security and Governance: Experience with data governance and security frameworks, such as Apache Ranger and Apache Atlas.
CI/CD Practices: Proficiency with CI/CD tools like Jenkins to support continuous integration and deployment of data solutions.
Data Visualization: Experience with reporting and visualization tools such as MicroStrategy and other Data Integrated Suites.
Problem-Solving: Strong analytical and problem-solving skills with the ability to design and architect efficient data solutions for complex business needs.
Communication Skills: Excellent communication and collaboration skills to work effectively with cross-functional teams and business stakeholders.
Job Responsibilities:
We are seeking a highly skilled Data Architect to join our team. The ideal candidate will have extensive experience in designing, implementing, and optimizing large-scale data processing systems and big data platforms. This individual will work closely with data engineers, data scientists, and stakeholders to build scalable and efficient data solutions using the latest big data technologies across multiple cloud platforms.
Architect and Design Data Solutions: Develop and maintain highly scalable and distributed data architectures that leverage big data technologies such as Apache Hive, Apache Spark, and Hadoop.
Cloud Data Infrastructure Management: Design and deploy data solutions on cloud platforms like AWS, Google Cloud, and Microsoft Azure, utilizing services like AWS Redshift, Google BigQuery, Azure Data Warehouse, and Data Lake Analytics.
Data Integration and ETL Pipelines: Lead the development and maintenance of ETL processes and data integration workflows using various tools and technologies, including Apache Flink, Apache Kafka, AWS Kinesis, and Google Cloud DataFlow.
Big Data Processing and Optimization: Implement and manage big data processing frameworks such as Apache Spark, MapReduce, and Google Cloud DataProc for efficient data processing and transformation.
Data Governance and Security: Implement best practices for data governance and security using tools like Apache Ranger and Apache Atlas.
Advanced Analytics and Machine Learning: Collaborate with data scientists to build and deploy machine learning models using platforms like Amazon SageMaker, Google Cloud AutoML, and Google Cloud ML Engine.
Data Visualization and Reporting: Work with data analysts and business teams to develop and implement advanced data visualization solutions using tools like MicroStrategy and Data Integrated Suite.
Continuous Integration and Deployment: Utilize CI/CD tools such as Jenkins to automate the deployment and management of data pipelines and data applications.
Stakeholder Collaboration: Engage with business stakeholders to understand data requirements, translate them into scalable and maintainable data solutions, and provide technical leadership for data architecture and engineering best practices.
What We Offer
Exciting Projects: We focus on industries like high-tech, communication, media, healthcare, retail, and telecom. Our customer list is full of fantastic global brands and leaders who love what we build for them.
Collaborative Environment: You can expand your skills by collaborating with a diverse team of highly talented people in an open, laid-back environment, or even abroad in one of our global centers or client facilities!
Work-Life Balance: GlobalLogic prioritizes work-life balance, which is why we offer flexible work schedules, opportunities to work from home, and paid time off and holidays.
Professional Development: Our dedicated Learning & Development team regularly organizes communication skills training (GL Vantage, Toast Master), stress management programs, professional certifications, and technical and soft-skill training.
Excellent Benefits: We provide our employees with competitive salaries, family medical insurance, Group Term Life Insurance, Group Personal Accident Insurance, NPS (National Pension Scheme), periodic health awareness programs, extended maternity leave, annual performance bonuses, and referral bonuses.
Fun Perks: We want you to love where you work, which is why we host sports events and cultural activities, offer subsidized food, and throw corporate parties. Our vibrant offices also include dedicated GL Zones, rooftop decks, and the GL Club, where you can drink coffee or tea with your colleagues over a game, and we offer discounts for popular stores and restaurants!