Archives

Modern enterprises spend significant time and resources building data pipelines from a variety of sources into the data platform, and managing the quality of the data those pipelines transfer. These pipelines can vary in terms of source systems, sink systems, transformations, and validations performed.

A pipeline created for a particular use case may not be reusable for a different one, and changing it will require additional development effort. As a result, there is a need for frameworks that can build new pipelines, or add data sources and sinks, with minimal time and development effort. Ideally, such a framework should also be easy to customize and extend to suit enterprise-specific requirements.

A number of low-code and no-code solutions exist that allow for visually creating data pipelines across a variety of sources and sinks. However, they do not provide the flexibility and modularity typically required to customize the pipelines for a given scenario.

A better approach is a low-code framework of reusable, modular components that can be stitched together to compose the required pipelines.


In this post, you’ll learn about the requirements for such a low-code framework and an approach to designing it.

Requirements for the Framework

Creating and maintaining pipelines to move data in and out of the platform is a major consideration. A data platform framework that allows its users to perform the different operations in a consistent way, irrespective of the underlying technology, will greatly reduce time and effort.

What should you look for in a low-code framework? Here are some suggested requirements.

Modular: The framework should be modular in design. Each component of the framework can be used, managed, and enhanced independently.

Out-of-the-Box Functionality: The framework should integrate with common data sources and sinks and perform common transformations out of the box. Its components should be easy to apply to common use cases.

Flexible: The framework should be able to integrate with different services/systems across clouds or from on-premises.

Extensible: Allow existing components to be extended and customized for specific requirements, and new custom components to be added to implement new functionality.

Code First: Provide a programmable way of defining and managing pipelines. API and/or SDK support should be available to programmatically create and access the pipelines (a minimal sketch of what this could look like follows this list).

Cross Cloud Support: Support data sources, sinks, and services across different cloud providers. You should be able to migrate pipelines built with the framework from one cloud or on-premises environment to another.

Reusable: Provide common reusable templates that make it easy to create jobs.

Scalable: Workers should scale dynamically or by configuration to sustain high performance. The framework should automatically scale the underlying compute in response to changing workloads.

Managed Service: The framework should be deployable on a fully managed cloud service. Provisioning infrastructure capacity and managing, configuring, and scaling the environment should happen automatically. Minor version upgrades and patches should be applied automatically, with support provided for major version upgrades.

GUI-based Definition: An intuitive GUI for creating and maintaining the data pipelines will be useful. The job runs and logs from execution should be accessible through a job monitoring and management portal.

Security: Out-of-the-box integration with an enterprise-level IAM tool for authentication and role-based access control.
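
To make the Code First requirement concrete, here is a minimal sketch of what a programmatic pipeline definition could look like. The `Pipeline` class and its fluent `add_step` interface are hypothetical illustrations of the idea, not part of any existing SDK.

```python
# A minimal sketch of a code-first pipeline definition.
# Pipeline and its fluent add_step interface are hypothetical names
# used for illustration; they are not part of any existing SDK.
from dataclasses import dataclass, field
from typing import Callable, List


@dataclass
class Pipeline:
    """A pipeline is a named, ordered list of processing steps."""
    name: str
    steps: List[Callable] = field(default_factory=list)

    def add_step(self, step: Callable) -> "Pipeline":
        self.steps.append(step)
        return self  # returning self enables fluent chaining

    def run(self, data):
        # Pass the data through each step in order.
        for step in self.steps:
            data = step(data)
        return data


# Compose a pipeline from small, reusable steps.
pipeline = (
    Pipeline(name="orders_daily")
    .add_step(lambda rows: [r for r in rows if r.get("amount", 0) > 0])  # validation
    .add_step(lambda rows: sorted(rows, key=lambda r: r["order_id"]))    # transform
)

print(pipeline.run([{"order_id": 2, "amount": 10}, {"order_id": 1, "amount": 0}]))
```

Because the definition is plain code, pipelines built this way can be versioned, reviewed, and generated programmatically, which is the essence of the code-first requirement.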

A High-Level Overview of the Framework

The data platform framework provides the foundation upon which you can build specific accelerators or tools for data integration and data quality/validation use cases.

Blueprint

While designing the framework, it is important to consider the following points:

  • Technology Choice: We recommend a cloud-first approach when it comes to technology. The core of the framework should be deployable on a cloud-managed service that is extensible, flexible, and programmatically manageable.
  • Data Processing: Data processing should be based on massively parallel processing solutions that can scale on demand to support large data volumes.
  • Orchestration: Scheduling and executing data pipelines requires a scalable and extensible orchestration solution. Go with a managed workflow service that provides a programmable framework with out-of-the-box operators for integration and allows for adding custom operators as required.
  • Component Library: Common data processing functionalities should be made available as components that can be used independently or in addition to other components.
  • Pipeline Configuration: A custom DSL-based configuration definition allows for reusability of pipeline logic and provides a simple interface for defining the required steps for execution (see the sketch after this list).
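
As an illustration of the last point, a DSL-based pipeline definition might look like the following. The configuration schema here (the source, transforms, validations, and sink keys) is a hypothetical example of such a DSL, not a prescribed format.

```python
# A hypothetical DSL-style pipeline configuration expressed as YAML.
# The schema (source/transforms/validations/sink) is illustrative only.
import yaml  # requires the PyYAML package

PIPELINE_CONFIG = """
pipeline: customer_ingest
source:
  type: jdbc
  table: crm.customers
transforms:
  - type: rename_columns
    mapping: {cust_id: customer_id}
  - type: drop_nulls
    columns: [customer_id]
validations:
  - type: unique
    columns: [customer_id]
sink:
  type: parquet
  path: s3://bucket/curated/customers/
"""

config = yaml.safe_load(PIPELINE_CONFIG)

# A generator could walk this structure at design time and emit one
# orchestration task per step, without the user writing any code.
for transform in config["transforms"]:
    print("would create a task for:", transform["type"])
```

Keeping the pipeline logic in configuration like this means the same generator code can produce many pipelines, and non-developers can define new ones by editing a file.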

Building Blocks

Here are the building blocks for such a framework:

  • Pipeline Template: A DAG template that supports pipeline orchestration for different scenarios. The template can be used to generate data pipelines programmatically during design time, based on user requirements.
  • Job Template: A job execution template that supports processing the data using the component library as per user requirements. Common job flow patterns can be supported through built-in templates.
  • Component Library: A suite of reusable processing code supporting different use cases. It consists of components, factories, and utilities.
  • Components: The base processing implementations that perform read/write on various data sources, apply transformations, run data validations, and execute utility tasks.
  • Factory and Generators: Factory and generator code abstracts implementation differences across technologies (sketched below).
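
As a sketch of how a factory could hide technology-specific differences behind a common interface, consider the following. The Reader interface, the concrete reader classes, and the registry are all hypothetical illustrations.

```python
# A minimal sketch of a factory abstracting source-specific readers.
# All class names here are hypothetical illustrations.
from abc import ABC, abstractmethod


class Reader(ABC):
    """Common interface that every source reader implements."""

    @abstractmethod
    def read(self, location: str) -> list:
        ...


class CsvReader(Reader):
    def read(self, location: str) -> list:
        print(f"reading CSV file from {location}")
        return []


class JdbcReader(Reader):
    def read(self, location: str) -> list:
        print(f"reading table {location} over JDBC")
        return []


class ReaderFactory:
    """Maps a source type from the pipeline configuration to an implementation."""
    _registry = {"csv": CsvReader, "jdbc": JdbcReader}

    @classmethod
    def create(cls, source_type: str) -> Reader:
        try:
            return cls._registry[source_type]()
        except KeyError:
            raise ValueError(f"unsupported source type: {source_type}")


# Pipeline code depends only on the Reader interface, not on the technology.
reader = ReaderFactory.create("csv")
reader.read("/data/landing/orders.csv")
```

Adding support for a new source then amounts to registering one more class, leaving existing pipeline logic untouched.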

Accelerate Your Own Data Journey

At GlobalLogic, we are working on a similar approach as part of the Data Platform Accelerator (DPA). Our DPA consists of a suite of micro-accelerators built on top of a platform framework based on cloud PaaS technologies.

We regularly work with our clients to help them with their data journeys. Share your needs with us using the contact form below, and we will be happy to discuss your next steps.

Immersive AR/VR technology has made its way into every aspect of our lives. The COVID-19 pandemic heightened this shift by forcing many businesses to adopt remote working, accelerating technological advancement. From entertainment to medicine to eCommerce, digital content delivered through augmented and virtual reality allows individuals to work, interact, and socialize with others within its complex technological ecosystem.

In this paper, we will explore the different types of AR and VR technology. Learn how this immersive technology has been integrated into different sectors and what role the AR/VR landscape will play in the future.

Why do you need to craft insights?

In our journey to solve problems for people, we begin by understanding the user's needs. This divergent user research process drives actionable insights. Crafting insights thus fuels the entire design phase, leading to products and services that resonate with user needs.

Insights also help establish an emotional connection with the problem faced by end-users. It helps bring all the stakeholders who are part of the problem-solving activity on the same page by articulating a common goal. As insights are based on human needs and desires, they inspire us to innovate and ideate on possible opportunities.

Insights are not only relevant for designers but are also helpful for all functions of your business. They pave the way from information to innovation, enabling you to positively impact lives.


What is not an insight?

It is also important to understand what is not an insight.

We often come across situations where gathered data and interview quotes are presented as insights. Data alone cannot be deemed an insight, as it is still in raw form. It is up to us to analyze and interpret that data to arrive at an insight.

Secondly, an observation cannot be considered an insight until we understand the why behind it. Do not assume you have insights based on customer/user statements alone. Dig deeper into the underlying motivation to get to an actionable insight.

Characteristics of a Great Insight

A good insight effectively translates our raw learnings into actionable outcomes. It should be:

  1. Revealing: It must be non-obvious and novel.
  2. Memorable: It should be crisp and concise so that it can stick with the audience.
  3. Inspiring: It should motivate people to go ahead and take action.

In order to craft insights, you must let go of your assumptions, stretch your thinking, and focus on being curious. These aspects help you understand and give meaning to your raw learnings, which ultimately take shape as insights.

It's time to utilize your creative confidence towards crafting insights that bring teams together, give direction and act as an anchor to innovate and make lives better for people.

For companies starting out as well as established companies on their agile journey, quarterly planning is typically one of their top challenges. But it doesn’t need to be.

Advantages of Quarterly Agile Planning

When done correctly, quarterly planning can help you effectively accomplish all of the objectives below.

1. Management Vision and Priorities Become Known

Every quarter, management informs and explains its updated vision and priorities to employees so they can understand and align with the company ethos.

Prior to an agile transformation, it’s common for teams to say they don’t understand their leaders’ vision or priorities. It is therefore management’s role to communicate priorities clearly. This particular transition is always a delight to see.

2. Communications Between Delivery Teams and Management Become Clear

When everyone on the team attends a meeting and discusses what management wants and why, the team can then come back to management with realistic written targets stating what they can deliver.

This leaves little ambiguity between what management wants (but almost never gets all of) and what the team can deliver.

3. Dependencies are Identified and Tracked

While teams commit with good intentions, they can’t always deliver. The number one reason is dependencies. That’s why it’s important to identify them upfront and document them in a tracking system.

It’s essential that dependencies are monitored; otherwise the quarter will end only for you to discover that certain committed deliverables won’t be delivered because of them.

4. Shared Services are Actively Involved in Commitments

Shared services will often have commitments outside the planning group. They will probably have their own roadmap entailing system upgrades and other commitments that need to be taken into account.

The best way to address this is to include them in your planning and have them determine their own commitments.

5. Inviting Everyone Generates Shared Understanding

When it comes to inviting individual contributors such as developers and QA engineers, it is advisable to push back on concerns about a perceived reduction in productivity.

I have also received consistently positive feedback when individual contributors participate: they come to understand what the larger business unit does, learn what they’ll be working on, and offer suggestions on how to change objectives for the better.

6. Everyone Has a Say in the Commitments

Adding to the above, not only is everyone invited but they also have a say in the quarterly goals for the team.

This sense of ownership is powerful because it cements their commitment far more than if they were simply handed orders.

Doing Prep Work in Advance of Planning Can Make or Break It

Not every company can plan in real time; some have a variety of complex regulations that make it impossible. One previous client, for example, could walk into a room, discuss some ideas from the ideation phase, and have staff iterate on them and come up with feasible MVPs for the quarter.

Yet for another client, I had to develop a customized framework to accommodate their unique situation, making their prep work all the more important.

During a planning session, if you hear a lot of “I don’t know” blockers that could prevent you from moving forward, it’s an indication that more prep work is needed.

Keep reading:
Agile Transformation: Are You Ready?
The Scrum Guide is Dead — Long Live the Scrum Guide!
Tracking and Resolving Software Regressions

The pandemic has forced many schools and training organizations to conduct their activities online, including evaluations. This has created a need for remote testing invigilation and the technology that can facilitate it.

What would a standard invigilation system look like? In this paper, you’ll learn about the tools that need to be developed (including a list of high-level product features), the benefits and types of remote invigilation, and the use cases for this system.

As technology develops rapidly, security and privacy challenges arise as well. A secure infrastructure involves many hardware components, and each piece processes or transmits sensitive data. As many enterprises adopt IoT concepts and edge computing, information security is a top concern for ensuring confidentiality and integrity.

How can businesses provide infrastructure security for edge devices? Learn about reader-enabled access infrastructure, methods to improve hardware security, the infrastructure of edge computing, and some proposed solutions for edge device security, along with their challenges.

Time series technologies and tools are driving innovative new applications for statistical analysis, and many of these are open source. Keeping up with the tech stack required to analyze massive amounts of time series data at scale has become imperative, whether that means engineering a new purpose-built database or upgrading an older one to suit.

In this paper, explore the building blocks of time series analysis, various characteristics of different types of time series, challenges in storing time series data, and different methods of visualization. Learn how analysts are applying time series data analysis to everything from predicting server outages to forecasting growth in GDP for a budding economy.

The insurance sector is facing increased claims, disruption from InsurTech companies, and changing client needs. Customers have come to expect a superior experience in their dealings with insurance companies, and when they get frustrated by delays, they quickly take their policies elsewhere.

Which technologies can insurance companies employ to boost customer retention? Learn the difference between Narrow and General AI, the challenges faced by the insurance industry, and how AI-based deep machine learning can provide solutions. In this whitepaper, you’ll also discover use cases for vehicle damage inspection, drone roof assessments, and acquiring new clients using chatbots.

Data is essential for any organization, but it’s challenging to protect it from unauthorized access, misuse, theft, and other outside threats that are increasing daily. Data loss prevention (DLP) is an approach that helps protect sensitive information and restricts end users from moving it outside a company’s internal network.

Why should data loss prevention matter to your organization? Learn how DLP protects against theft, accidental disclosure, loss, and misuse of sensitive information. You’ll also find the leading DLP software, as ranked by Gartner.

Production & Manufacturing: A Technology-Driven Industry

What branch of industry sets trends and is most likely to embrace new technologies? Not many people would respond to this question by suggesting production and manufacturing. Yet it is this sector that continues to forge closer relationships with the IT industry and invest in research and development.

The industry's digital innovations under the watchful eye of IT companies

For many sectors, the year 2020 brought the greatest challenges in years. At the same time, it highlighted the importance of digital transformation and the use of new technologies in business. This has benefited the IT industry, which has provided much-needed support for many sectors during this complex time by offering the knowledge and solutions they sought.

In today's reality, progress is necessary and inevitable. Only organizations that evolve with the times are able to strengthen their position in the market, fight for the attention of increasingly demanding customers, or maintain their current level of competitiveness. This can be seen in factories, which have changed beyond recognition over the course of several decades. In the 21st century, they are automated, use data efficiently, and deploy digital twins, artificial intelligence, and IoT solutions on a large scale. They often set trends for the whole world, showing, for example, how to skillfully build foundations for cooperation between robots and human specialists.

“Where technologies emerge, enterprises quickly reap significant benefits. Productivity and safety increase, and at the same time costs are reduced,” comments Marek Matysiak, Head of Division - Engineering at GlobalLogic Poland. “Industry automation software unlocks the business potential of companies that can also successfully tackle the most common industry challenges, such as costly downtime and unexpected breakdowns,” he adds.

This corresponds with the needs of customers and business partners who are pushing for change. However, one cannot forget the requirements of contemporary civilization. The idea of sustainable development plays an important role in the strategy of every organization and is part of the general trend of reducing the carbon footprint, caring for the natural environment and using resources more consciously.

The digital world of industry

March 2021 saw industrial production in Poland reach its highest level in history, according to data from Statistics Poland. Its development would not have been possible without effectively implemented technologies. In fact, 30% of companies that took part in the study Smart Industry Poland 2020 see digital technological solutions as an essential ingredient in the company's success over the next three years. It is worth noting, however, that the remaining, more frequently mentioned elements are also closely related to the IT sector and the modernization of organizations through technology. Without them, reducing costs, improving process efficiency or product quality may not be possible or may become much more difficult to achieve.

Building such awareness among decision-makers in industrial companies, and in the long run directly supporting these companies in implementing changes and carrying out digital transformation, is the responsibility of their partners from the IT sector. We've learned this first-hand as the largest provider of IT solutions and services for the industrial, construction, and manufacturing sectors in 2020, according to the "Computerworld TOP200" report.

“Close partnership between IT companies, which have the required competencies and knowledge, and industrial enterprises translates into dynamic development of the entire sector, one which is often the first to implement new technological solutions. It is futile to look for other areas where robotics and automation are equally advanced and where the concept of combining the real and digital worlds has become so tangible. We see it every day while carrying out projects for customers of various sizes,” says Marcin Zając, Director - Engineering at GlobalLogic Poland, which took first place in the ranking for the third time in a row.

The noticeable benefits resulting from the implementation of new technologies are driving the growth of Industry 4.0. The pace of change increases every year, and with it the ties between industry and IT grow stronger.

Research and development in industry

According to Eurostat, Poland is one of the countries with the highest growth rate in research and development spending in recent years. This includes investments influencing the evolution of industry, which, thanks to digital transformation, can better respond to the needs of its business partners and customers in the 21st century while operating in a more ecological and resource-efficient way. The future of industry invariably lies in speed and efficiency, but also in greater employee safety and care for the environment.

“Finding the golden mean is only possible thanks to new technologies, such as digital twin, smart sensors or intelligent robots, as well as progressive digital transformation,” concludes Marek Matysiak, Head of Division - Engineering at GlobalLogic Poland.
