Job Search

We'll help you build a remarkable career

1042+ open positions worldwide


India || Sr. Data Engineer || IRC233610

Job: IRC233610
Location: India - Noida
Designation: Associate Consultant
Experience: 5-10 years
Function: Engineering
Skills: Azure, Databricks, ETL, Python, Scala, SQL
Work Model: Hybrid

Description:

Join GlobalLogic to become a vital part of the team working on a large software project for a world-class company that provides M2M/IoT 4G/5G modules to the automotive, healthcare, and logistics industries, among others. Through our engagement, we help our customer develop the end-user modules' firmware, implement new features, maintain compatibility with the latest telecommunication and industry standards, and analyze and estimate customer requirements.

Requirements:

Required Qualifications

Bachelor’s degree in Computer Science, Engineering or related field
5+ years of overall work experience on data-first systems
2+ years of experience on Data Lake/Data Platform projects on Azure
Strong knowledge of SQL for handling relational databases and familiarity with NoSQL databases to manage structured and unstructured data effectively.
Understanding of data warehousing concepts, including data storage, data marts, and ETL processes.
Skilled in using Microsoft Azure services such as Azure Data Lake, Azure Data Factory, Azure Synapse Analytics, and Azure Databricks. These tools are essential for data ingestion, storage, processing, and analytics.
Knowledge of cloud storage solutions provided by Azure, such as Blob Storage and Table Storage, which are integral for data lakes.
Familiarity with ETL tools and frameworks for data extraction, transformation, and loading. Skills in Azure Data Factory or similar tools are particularly valuable.
Ability to perform data cleaning, transformation, and enrichment to ensure data quality and usability (a brief illustrative sketch follows this list).
Proficient in programming languages such as Python or Scala, which are widely used in data engineering for scripting and automation.
Skills in scripting to automate routine data operations and processes, improving efficiency and reducing manual errors.
Understanding of how to develop and maintain APIs, particularly JDBC/ODBC APIs for data querying. Knowledge of RESTful API principles is also beneficial.
Awareness of data security best practices, including data encryption, secure data transfer, and access control within Azure.
Understanding of compliance requirements relevant to data security and privacy, such as GDPR.
Experience with data testing frameworks to ensure the integrity and accuracy of data through unit tests and integration tests.
Proficiency with version control tools like Git to manage changes in data scripts and data models.
Familiarity with DevOps practices related to data operations, including continuous integration and continuous deployment (CI/CD).
Basic skills in data analysis to derive insights and identify data trends, which can help in troubleshooting and improving data processes.
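
The list above calls for ETL on Azure Databricks with Python or Scala. As a rough illustration of that kind of work, here is a minimal PySpark sketch of a cleaning-and-enrichment step that reads from a raw data lake zone and writes a curated, partitioned output. The paths, column names, and storage layout are hypothetical assumptions, not details taken from this posting.

# Minimal sketch of an ETL step: read raw records, clean and enrich them
# with PySpark, and write the result back to a curated data lake path.
# Paths and column names below are illustrative only.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("example-etl").getOrCreate()

# Raw zone -> curated zone; in Azure these would typically be ADLS Gen2 mounts.
raw_path = "/mnt/datalake/raw/orders"          # hypothetical mount point
curated_path = "/mnt/datalake/curated/orders"  # hypothetical mount point

orders = spark.read.format("parquet").load(raw_path)

cleaned = (
    orders
    .dropDuplicates(["order_id"])                      # remove duplicate events
    .filter(F.col("order_amount").isNotNull())         # drop incomplete rows
    .withColumn("order_date", F.to_date("order_ts"))   # derive a partition column
)

# Write partitioned by date so downstream queries can prune efficiently.
cleaned.write.mode("overwrite").partitionBy("order_date").parquet(curated_path)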


Job Responsibilities:

Responsibilities

Data Ingestion & Integration:
Assist in the development and maintenance of data ingestion pipelines that collect data from various business systems, ensuring data is integrated smoothly and efficiently into the data lake.
Implement transformations and cleansing processes under the guidance of senior engineers to prepare data for storage, ensuring it meets quality standards.
Data Management:
Help manage the organization of data within the data lake, applying techniques for efficient data storage and retrieval.
Assist in managing and configuring data schemas based on requirements gathered by senior team members, ensuring consistency and accessibility of data.
Support the maintenance of the data catalog, ensuring metadata is accurate and up-to-date, which facilitates easy data discovery and governance.
API Support & Data Access:
Support the development and maintenance of JDBC/ODBC-based SQL query APIs, ensuring they function correctly to allow end-users to access and query the data lake effectively.
Assist in optimizing and supporting simple analytical queries, ensuring they run efficiently and meet user and business requirements.
Quality Assurance & Testing:
Conduct routine data quality checks as part of the data ingestion and transformation processes to ensure the integrity and accuracy of data in the data lake (see the sketch after this list).
Participate in testing of the data ingestion and API interfaces, identifying bugs and issues for resolution to ensure robustness of the data platform.
Collaboration and Teamwork:
Work closely with the Principal Data Architect and Principal Data Engineer, assisting in various tasks and learning advanced skills and techniques in data management and engineering.
Occasionally interact with other business stakeholders to understand data needs and requirements, facilitating better support and modifications in data processes.
Platform Monitoring & Maintenance:
Help monitor the performance of data processes and the data lake infrastructure, assisting in troubleshooting and resolving issues that may arise.
Learning & Development:
Continuously learn and upgrade skills in data engineering tools and practices, especially those related to Azure cloud services and big data technologies.
Contribute ideas for process improvements and innovations based on day-to-day work experiences and challenges encountered.
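
As a rough illustration of the routine data quality checks mentioned under "Quality Assurance & Testing" above, here is a minimal PySpark sketch that verifies key uniqueness and null-free required columns after an ingestion run. The table name, columns, and failure handling are hypothetical assumptions; a real project would likely use the team's own checks or a dedicated framework.

# Illustrative post-ingestion data quality checks; names and thresholds are
# hypothetical, not taken from this posting.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("dq-checks").getOrCreate()

def run_basic_checks(table_name: str, key_column: str, required_columns: list[str]) -> list[str]:
    """Return a list of human-readable failures; an empty list means all checks passed."""
    df = spark.table(table_name)
    failures = []

    # 1. The primary key must be unique.
    total = df.count()
    distinct_keys = df.select(key_column).distinct().count()
    if distinct_keys != total:
        failures.append(f"{table_name}: {total - distinct_keys} duplicate values in {key_column}")

    # 2. Required columns must not contain nulls.
    for column in required_columns:
        null_count = df.filter(F.col(column).isNull()).count()
        if null_count > 0:
            failures.append(f"{table_name}: {null_count} null values in {column}")

    return failures

# Hypothetical usage after an ingestion run:
issues = run_basic_checks("curated.orders", "order_id", ["order_amount", "order_date"])
if issues:
    raise ValueError("Data quality checks failed:\n" + "\n".join(issues))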

 


What We Offer

Exciting Projects: We focus on industries like high-tech, communication, media, healthcare, retail, and telecom. Our customer list is full of fantastic global brands and leaders who love what we build for them.

Collaborative Environment: You can expand your skills by collaborating with a diverse team of highly talented people in an open, laid-back environment, or even work abroad in one of our global centers or client facilities!

Work-Life Balance: GlobalLogic prioritizes work-life balance, which is why we offer flexible work schedules, opportunities to work from home, and paid time off and holidays.

Professional Development: Our dedicated Learning & Development team regularly organizes communication skills training (GL Vantage, Toastmasters), stress management programs, professional certifications, and technical and soft skills training.

Excellent Benefits: We provide our employees with competitive salaries, family medical insurance, Group Term Life Insurance, Group Personal Accident Insurance, NPS (National Pension Scheme), periodic health awareness programs, extended maternity leave, annual performance bonuses, and referral bonuses.

Fun Perks: We want you to love where you work, which is why we host sports events and cultural activities, offer food at subsidized rates, and throw corporate parties. Our vibrant offices also include dedicated GL Zones, rooftop decks, and a GL Club where you can enjoy coffee or tea with your colleagues over a game, and we offer discounts at popular stores and restaurants!

About GlobalLogic

GlobalLogic is a leader in digital engineering. We help brands across the globe design and build innovative products, platforms, and digital experiences for the modern world. By integrating experience design, complex engineering, and data expertise—we help our clients imagine what’s possible, and accelerate their transition into tomorrow’s digital businesses. Headquartered in Silicon Valley, GlobalLogic operates design studios and engineering centers around the world, extending our deep expertise to customers in the automotive, communications, financial services, healthcare and life sciences, manufacturing, media and entertainment, semiconductor, and technology industries. GlobalLogic is a Hitachi Group Company operating under Hitachi, Ltd. (TSE: 6501) which contributes to a sustainable society with a higher quality of life by driving innovation through data and technology as the Social Innovation Business.

Apply Now
