We can help you build an exceptional career
Python Data Engineer with AWS IRC237632
Job: IRC237632
Location: India - Gurgaon, Noida
Designation: Associate Consultant
Experience: 5-10 years
Function: Engineering
Skills: AWS, Python
Work Model: Hybrid
Description:
Key Responsibilities
Data Pipeline Development:
Design, develop, and maintain scalable data pipelines for data ingestion, transformation, and storage; a brief illustrative sketch of such a step follows this list.
Data Integration:
Collaborate with cross-functional teams to integrate data from various sources into centralized storage solutions.
Data Processing:
Perform efficient data manipulation and analysis using Python libraries like Pandas and NumPy.
Database Management:
Write, optimize, and manage complex SQL queries for data extraction and reporting.
Cloud Infrastructure:
Implement and manage AWS-based solutions for data storage, processing, and analytics (e.g., S3, Lambda, RDS, Redshift).
Performance Optimization:
Ensure high performance, scalability, and reliability of data pipelines and workflows.
Documentation:
Create and maintain technical documentation for data pipelines, workflows, and best practices.
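To make the pipeline and data-processing responsibilities above more concrete, the following is a minimal sketch of a single ingestion-and-transformation step using pandas, NumPy, and boto3. It is an illustration only; the bucket names, object keys, and column names are hypothetical placeholders, not details from this posting.

import io

import boto3
import numpy as np
import pandas as pd

# Hypothetical S3 locations -- placeholders for illustration only.
SOURCE_BUCKET = "raw-data-bucket"
SOURCE_KEY = "orders/2024-01-01.csv"
TARGET_BUCKET = "curated-data-bucket"
TARGET_KEY = "orders/2024-01-01.parquet"


def run_pipeline_step() -> None:
    """Read a raw CSV from S3, clean and transform it, and write Parquet back to S3."""
    s3 = boto3.client("s3")

    # Ingest: fetch the raw object and load it into a DataFrame.
    raw = s3.get_object(Bucket=SOURCE_BUCKET, Key=SOURCE_KEY)
    df = pd.read_csv(raw["Body"])

    # Transform: drop incomplete rows and add a derived column with NumPy.
    df = df.dropna(subset=["order_id", "amount"])
    df["amount"] = df["amount"].astype(float)
    df["log_amount"] = np.log1p(df["amount"])

    # Store: write the curated result back to S3 as Parquet (requires pyarrow).
    buffer = io.BytesIO()
    df.to_parquet(buffer, index=False)
    s3.put_object(Bucket=TARGET_BUCKET, Key=TARGET_KEY, Body=buffer.getvalue())


if __name__ == "__main__":
    run_pipeline_step()

In practice, a step like this would typically be triggered by a scheduler, an AWS Lambda function, or a Glue job rather than run by hand.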
Requirements:
Experience with distributed data processing frameworks like Apache Spark or Hadoop (see the sketch after this list).
Knowledge of data warehouse design and dimensional modeling.
Familiarity with version control tools like Git.
Exposure to CI/CD pipelines for data workflows.
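To give a flavor of the distributed data processing mentioned above, here is a minimal PySpark sketch of a batch aggregation job. The input and output paths, column names, and application name are hypothetical.

from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# Hypothetical S3 paths -- placeholders for illustration only.
INPUT_PATH = "s3://raw-data-bucket/events/"
OUTPUT_PATH = "s3://curated-data-bucket/daily_event_counts/"


def main() -> None:
    """Aggregate raw events by day and event type, then write partitioned Parquet."""
    spark = SparkSession.builder.appName("daily-event-counts").getOrCreate()

    events = spark.read.parquet(INPUT_PATH)

    daily_counts = (
        events
        .withColumn("event_date", F.to_date("event_timestamp"))
        .groupBy("event_date", "event_type")
        .agg(F.count("*").alias("event_count"))
    )

    daily_counts.write.mode("overwrite").partitionBy("event_date").parquet(OUTPUT_PATH)
    spark.stop()


if __name__ == "__main__":
    main()

A job like this would normally be submitted with spark-submit (for example on EMR) and wired into a CI/CD pipeline alongside the rest of the data workflow code.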
Preferences:
We are seeking a highly skilled and motivated Python Data Engineer to join our dynamic team. In this role, you will design, develop, and maintain scalable data pipelines, ensuring seamless data integration and analysis. If you are proficient in Python, AWS, and SQL, and have a strong grasp of data manipulation libraries like Pandas and NumPy, we want to hear from you!
Job Responsibilities:
Technical Expertise:
Proficiency in Python and experience with data manipulation libraries like Pandas and NumPy.
Strong SQL skills with experience in writing complex queries and working with relational databases.
Hands-on experience with AWS services such as S3, Lambda, RDS, Redshift, and Glue (see the sketch after this section).
Experience:
5+ years of professional experience in data engineering or a similar role.
Solid understanding of ETL/ELT processes and data pipeline design.
Soft Skills:
Excellent problem-solving skills and attention to detail.
Strong communication skills to collaborate with technical and non-technical stakeholders.
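As a rough illustration of the SQL and AWS database skills listed under Technical Expertise, here is a brief sketch that runs a windowed reporting query against a PostgreSQL-compatible RDS instance using SQLAlchemy and pandas. The connection string, table, and columns are hypothetical.

import pandas as pd
from sqlalchemy import create_engine, text

# Hypothetical RDS connection string -- replace with a real host and credentials.
ENGINE = create_engine(
    "postgresql+psycopg2://user:password@example-rds-host:5432/analytics"
)

# Reporting query with a window function: rank each customer's orders by recency.
QUERY = text("""
    SELECT customer_id,
           order_id,
           amount,
           ROW_NUMBER() OVER (
               PARTITION BY customer_id ORDER BY created_at DESC
           ) AS rn
    FROM orders
""")


def latest_orders() -> pd.DataFrame:
    """Return each customer's most recent order as a DataFrame."""
    with ENGINE.connect() as conn:
        df = pd.read_sql(QUERY, conn)
    return df[df["rn"] == 1].drop(columns="rn")


if __name__ == "__main__":
    print(latest_orders().head())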
What we offer
Exciting Projects: We focus on industries like high-tech, communication, media, healthcare, retail, and telecom. Our customer list is full of fantastic global brands and leaders who love what we build for them.
Collaborative Environment: You can expand your skills by collaborating with a diverse team of highly talented people in an open, laid-back environment, or even work abroad in one of our global centers or client facilities!
Work-Life Balance: GlobalLogic prioritizes work-life balance, which is why we offer flexible work schedules, opportunities to work from home, and paid time off and holidays.
Professional Development: Our dedicated Learning & Development team regularly organizes communication skills training (GL Vantage, Toast Master), stress management programs, professional certifications, and technical and soft-skill trainings.
Excellent Benefits: We provide our employees with competitive salaries, family medical insurance, Group Term Life Insurance, Group Personal Accident Insurance, NPS (National Pension Scheme), periodic health awareness programs, extended maternity leave, annual performance bonuses, and referral bonuses.
Fun Perks: We want you to love where you work, which is why we host sports events and cultural activities, offer food at subsidized rates, and throw corporate parties. Our vibrant offices also include dedicated GL Zones, rooftop decks, and the GL Club, where you can enjoy coffee or tea with your colleagues over a game of table tennis, and we offer discounts at popular stores and restaurants!