
I was one of the early buyers of the first release of the Apple Vision Pro AR headset early this year. I got up at 5am to place my order on-line at the first moment the device became available for pre-order, and then made an appointment at my local brick-and-mortar Apple Store to pick it up as early as possible on the first Saturday after it shipped. Needless to say, I was excited about this technology (and still am).

I got to the store early and waited in line with other eager Vision Pro buyers also picking up their orders. When the store opened and we all filed in, we were each assigned to a store associate for a demo and familiarization session with this new device. While I was overwhelmed by the experience offered by the device itself, what struck me just as forcefully was the nature of the retail experience. The air of excitement in the store was palpable, not only among the group of buyers, but also among the sales associates and even the store manager. I felt like we were all conspiring to share something truly unique, truly special. And that, I think, is the essence of a great retail experience: It’s a conspiracy between the buyer and seller to share a thing of value.

I’ve had this experience a few other times in my life, generally with the owner of a “Mom & Pop” or family enterprise. I particularly remember an experience in India where I bought a leather wing-back chair. I had seen the chair on display at the old Bangalore airport, and because I couldn’t find a retail outlet, I made a visit to the local factory where it was made. The owner of the factory personally showed me around, and actually took my measurements, like I was buying a suit of clothes. They made the chair “to order” and to fit my dimensions (hip-to-knee, length of torso, etc.)! When his workers brought the finished chair to the company apartment where I was staying at the time, you could feel their pride in the finished product. They waited eagerly until I sat in the chair and pronounced it a perfect fit. Years ago, I brought the chair back to the US with me, and it sits proudly in my study to this day, full of very pleasant memories around how it came into my life.

Why are retail experiences like this rare?

Well, when I’ve had them, I was buying something that was at least relatively expensive, and from someone to whom the sale would be significant in some way. In the Apple Vision Pro example, I assume that—in addition to the salespeople’s genuine excitement around this new product—the store probably had incentives, a contest, or at least recognition in place for the associates who first learned and then presented this new technology. I don’t know that for a fact, but I did get a survey soon afterward, from which I assume (hope) the sales associate was recognized in some way for his great work. In the factory owner example, in addition to their genuine pride in a great product, I think they were hopeful that I’d spread word-of-mouth among my colleagues and fellow expats (which I did). In other words, even beyond the money they received, it was worthwhile for someone to deliver an exceptional product and experience.

Notably, none of my best retail experiences have been entirely on-line. This being the 21st century, they have all involved the Web: I originally ordered the Apple Vision Pro on-line, and I tracked down the manufacturer of the chair through Google. Other “peak” retail experiences have likewise involved the Web in some way—education, awareness and so on. But to date, all have required a personal touch.

Why is this? Thirty years after the Web came to prominence, why is it not delivering such ‘peak’ retail experiences by itself?

I think it is because of the relative lack of personalization. A critical factor in any ‘conspiracy’—which I believe a great retail experience requires—is collusion between two or more people. Sophisticated websites use machine learning and demographic data to present an experience that is in some ways tailored to the buyer—or at least to a similar type of buyer. However, in my own on-line shopping, I have not yet encountered a retail website that directly engages me, individually, the way a good human retailer does. My best experiences to date have all required a “human in the loop”.

Back in the early days of the Web I was inspired by a then-recently-published book, “The One to One Future: Building Relationships One Customer at a Time” (1993), by Don Peppers and Martha Rogers. Peppers and Rogers’ thesis was that businesses can prosper by building strong relationships with their best customers, and by growing their share of those customers’ wallets over time. When my team was developing an early e-commerce platform at Apple in the 1990s, we built in a number of features that enabled automated, individualized shopping experiences—such as intelligent cross-selling. But, while we aimed to achieve it, the technology did not exist at that time to truly enable the kind of “mass personalization” and personalized relationships that Peppers and Rogers envisioned. To this day, thirty years later, developing a deep personal relationship with a customer has required a person—a human in the loop. It’s not something the industry has truly been successful at automating.

Earlier attempts at automating ‘personalized’ customer interaction, such as chatbots and IVR systems, have had limited success. This is largely, I believe, due to the necessarily scripted and programmed nature of these interactions. While some experiences are far better than others, people do not interact naturally on a ‘programmed’ basis—it tends to sound phony unless the system vocabulary is very large and the options are truly flexible, or even dynamic. That’s exactly what we have in GenAI—interactions that are increasingly open-ended and therefore more human-like.

Could a GenAI-based retail system learn to understand me as an individual—what drives me, what decision criteria I use, what I value in a product? And then map that information to identify the products I would love to buy, and help me understand why I’d love and can afford those products? I believe the answer is an emphatic “yes”, going far beyond traditional ‘propensity to buy’ determinations. The main wildcard here is access to accurate customer behavioral and motivational data and, ultimately, the customer’s willingness to be known deeply by a given system. In general, people are more disclosing to a machine than to other people, because the fear of judgement is reduced. People’s main concern when disclosing information to a machine tends to be the privacy of the information, and the manner in which that information will be used. These are solvable problems. In particular, I think shopping ‘agents’ will be developed that understand a customer and the customer’s finances and goals intimately—perhaps presenting an anonymized face to the world, for privacy’s sake.

The other wildcard is how disclosing a retailer is willing to be about the true nature of their products. Mapping an individual customer to a product description only creates loyalty if that customer is truly delighted after receiving and using the product itself. Marketing hype may help sell a product once. But alone, it doesn’t create the kind of “lifetime customer” or increasing “wallet share” that Peppers and Rogers describe in “The One to One Future”. Thankfully, we live in an age of increasing transparency, with customer ratings and reviews now a routine part of the buying experience, and suppliers increasingly honest about sourcing, provenance and other criteria of concern to various individuals. While these systems and data can be—and often are—gamed, they point the way toward a more objective and satisfying retail experience in the future.

I remember Steve Jobs once sent a thoughtful email to the employees at NeXT talking about how retail experiences always involve individual ‘values’—not in a moral sense necessarily, but in terms of the importance buyers give to certain factors. He gave the example of himself and his wife Laurene considering the purchase of a high-quality European washing machine for their home. The washer produced softer clothes and used less water and laundry soap, and was therefore better for the environment. On the other hand, its washing cycle was longer, so they couldn’t clean as many clothes in the same amount of time. Steve and Laurene opted for the European machine, I believe. But you can readily imagine someone with a large family valuing the higher throughput of a cheaper US-made machine above the quality and environmental advantages of the more expensive European option. With its natural language understanding, GenAI is a very good fit for weighing these kinds of value judgements. It’s exciting to think of the opportunities for delightful retail experiences for us all, both as consumers and as sellers, in the one-to-one GenAI future.
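To make the washing-machine example concrete, here is a minimal sketch, in Python, of how individual value weightings could drive a product match. Everything here is invented for illustration (the criteria names, the attribute scores, and the simple weighted-sum scoring); a real GenAI system would elicit a customer's values conversationally rather than read them from a hard-coded dictionary.

```python
# Hypothetical illustration: two washing machines scored on the criteria
# from the example. All numbers are invented for this sketch.
products = {
    "european": {"fabric_care": 0.9, "water_efficiency": 0.9, "throughput": 0.4, "price": 0.3},
    "us_made":  {"fabric_care": 0.6, "water_efficiency": 0.5, "throughput": 0.9, "price": 0.8},
}

def recommend(value_weights: dict) -> str:
    """Return the product whose attributes best match the buyer's weights."""
    def score(attrs: dict) -> float:
        # Weighted sum: criteria the buyer doesn't care about contribute 0.
        return sum(value_weights.get(k, 0.0) * v for k, v in attrs.items())
    return max(products, key=lambda name: score(products[name]))

# A buyer who prizes fabric care and the environment...
print(recommend({"fabric_care": 1.0, "water_efficiency": 1.0}))  # -> european
# ...versus a large family that prizes throughput and price.
print(recommend({"throughput": 1.0, "price": 1.0}))              # -> us_made
```

The point is not the arithmetic but the shape of the problem: two buyers with different weights get different "right" answers from the same catalog.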

Also with GenAI, we have an expanded opportunity to create not only personalized experiences, but personalized products as well. As a simple example, imagine a telco that, on request, analyzes a specific customer’s historic data and voice usage, media viewing habits and other factors—such as payment history—and dynamically creates and prices a product tailored to that specific individual. On a slightly more speculative level—but by no means far-fetched—physical goods can be produced to individual order using GenAI-produced programs and models to drive a combination of industrial robots, 3D printing, CNC milling machines and other computer-controlled manufacturing devices. These technologies exist today and, indeed, are already being used to produce products under GenAI control in isolated pockets and on a small scale. The only speculation is that these approaches will become affordable and widespread.
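As a sketch of what the telco example might look like in code, here is a tiny Python function that turns a usage profile into a tailored plan. All of the field names, rates and adjustment rules are invented for illustration; a real system would derive them from actual usage and billing data, and a GenAI layer would sit on top to explain and negotiate the offer.

```python
# Hypothetical sketch of the telco example: derive a tailored plan from a
# customer's usage profile. Fields and pricing rules are invented.
def tailor_plan(profile: dict) -> dict:
    """Build an individualized data plan from observed usage."""
    data_gb = profile["avg_data_gb"] * 1.2        # 20% headroom over observed usage
    price = 10.0 + 2.0 * data_gb                  # invented base fee + per-GB rate
    if profile.get("on_time_payments", 0) >= 12:  # loyalty/credit-history discount
        price *= 0.9
    return {"data_gb": round(data_gb, 1), "price": round(price, 2)}

plan = tailor_plan({"avg_data_gb": 5.0, "on_time_payments": 18})
# plan == {"data_gb": 6.0, "price": 19.8}
```

The dynamic pricing is trivial here; the interesting part in practice is sourcing accurate usage data and presenting the resulting offer in terms the individual customer actually values.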

While it has taken thirty years and counting, I think the technical basis to realize a true “One to One” future is now in place. I think the next thirty years will transform retail almost beyond recognition, by allowing people and AIs to conspire together to create value and deliver delightful buying experiences. And, not incidentally, it will create tremendous customer loyalty to those who deliver this service.

The hype cycle has little to do with the merits of a particular technology. It simply has to do with the amount of publicity the technology has received. In particular, if the publicity jumps ahead of what the technology can immediately deliver, then the technology quickly gets labeled as “over hyped”. This is not the ‘fault’ of the technology—just of the overinflated expectations for immediate benefits that grow up around it.

A case in point is, believe it or not, the World Wide Web. Back in 1994, my team set up NeXT Software’s (now Apple’s) first website. At the time, there were only something like 10,000 websites on the entire internet (at this writing there are well over a billion). Even at its beginnings, though, it seemed obvious to me—and to a lot of other people—that Web technology was transformational. Yet in the late 1990s, the Web was considered over-hyped.

With the benefit of 25 years of hindsight, it seems almost incredible that the World Wide Web and the internet could possibly be considered overhyped. If there’s a single technology that truly transformed the world, I think most of us would agree that it’s the Web (plus the internet and the ‘personal’ computer, but those are stories for perhaps another day). The Web and the follow-on technologies it spawned have completely transformed our world, and their impact continues to fill our working and personal lives. Web-related and web-motivated technologies include social media, the cloud, smart handheld devices (phones, tablets, etc.), massive multi-player games, on-line dating, dynamic content creation, shopping, connected cars, and many others. In fact, it’s hard to imagine modern life without the Web, the internet, and their various downstream impacts. We simply take for granted instant access to information, ubiquitous connectivity, pervasive communication, remote device monitoring and control, media when and where we want it, and much more. These are now simply built into the fabric of our lives.

Yet the people who claimed the Web was overhyped in the late 1990’s had a point. At that time, connectivity was limited, and complex graphically rich page renderings were slow. Even when user interactivity was introduced, it was—at first—very simple by today’s standards; essentially form-based. E-commerce emerged very early—within two years of the first static website I mentioned—but issues like payment security were still being worked out and trust was low by today’s standards. And indeed, the naysayers were right in one sense: there was a “dot-com bubble” that burst and struck down many web- and internet-centric companies in the early 2000’s. While this downturn had many causes, one of them was that the “hype” had indeed gotten ahead of the technology.

Why do I bring up this ancient history? I think we’re going to see something similar happen to GenAI, probably this year (2024). Like many people, I am confident that GenAI and the downstream technologies it inspires will utterly transform the world—on the scale that the internet, the World Wide Web and their follow-on technologies have done, if not more. Bill Gates is quoted as saying that in the short run GenAI is overhyped, but in the long run it is under-hyped. I don’t know whether Mr. Gates was thinking of the history of the Web when he said this, but the analogy may well have been on his mind. His remark is an excellent description of the Web’s historical adoption curve, and it sums up very neatly what I think is likely to come with GenAI.

Today’s tools and technologies make it easy to create a very compelling demo with GenAI. Today in early 2024, eighteen months after ChatGPT went public in the fall of 2022, many of us, myself included, continue to be stunned by what this technology can do. We are even more excited by what it promises for the future. However, as proofs of concept (POCs) move into enterprise-scale deployments and business-critical applications, the problems and gaps will predictably start to surface.

People will realize that data is harder to gather, prepare, curate and keep relevant than they suppose. Approaches that only a few months ago defined the state of the art for GenAI development will change as new approaches are invented—rendering systems already built obsolete. We’ve seen this already: the Retrieval-Augmented Generation (“RAG”) model that six months ago was so cool is now being termed the “naive RAG model” and has been replaced by the “advanced RAG model”. Probably, in the near future, it will itself be replaced by other approaches that are even better. Lots of work that was done to get around the 4k-token context windows of popular LLMs has become unnecessary because those windows are expanding to 128k tokens and beyond. People are starting to realize that the GPUs needed to power many GenAI systems are expensive and hard to come by, both physically and on the cloud. New security vulnerabilities and threats will be discovered and invented. And, of course, hallucinations, bias, and inconsistent answers will plague suppliers and applications.
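For readers unfamiliar with the terminology, the "naive" RAG pattern is easy to sketch. Here is a toy Python version in which a crude word-overlap retriever stands in for a real embedding model and vector store, and the final LLM call is omitted; only the retrieve-then-augment shape is the point.

```python
# Naive RAG, reduced to its skeleton: retrieve the most relevant document,
# then prepend it to the user's question as context for the LLM.
documents = [
    "The warranty on the chair covers leather defects for five years.",
    "Vision Pro orders can be picked up in store on launch day.",
]

def retrieve(query: str, docs: list[str]) -> str:
    """Toy retriever: rank documents by shared-word overlap with the query.
    A real system would use vector embeddings and a similarity search."""
    q = set(query.lower().split())
    return max(docs, key=lambda d: len(q & set(d.lower().split())))

def build_prompt(query: str) -> str:
    """Augment the user's question with the retrieved context."""
    context = retrieve(query, documents)
    return f"Context: {context}\n\nQuestion: {query}\nAnswer:"

prompt = build_prompt("How long is the warranty on the chair?")
# The prompt now carries the warranty document; a real pipeline would
# send it to an LLM at this point.
```

Advanced RAG variants refine each of these steps (query rewriting, re-ranking of retrieved chunks, and so on), but the retrieve-then-augment skeleton is the same.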

I think it’s pretty much inevitable that there will be a media (social and other) backlash against GenAI in the near future, and that the technology will be labeled as “over-hyped”. I sincerely hope it does not cause the Armageddon among startups that the “dot-com bust” of the early 2000’s did, but some companies will certainly fall victim to the hype cycle plummeting into what Gartner calls the “Trough of Disillusionment” [https://en.wikipedia.org/wiki/Gartner_hype_cycle].

To reframe a famous phrase in a totally different context, though, my experience of the dot-com era tells me that “the end of the peak hype cycle is the beginning of wisdom”. I think it’s a healthy thing for us all to realize that this technology will not, overnight, transform the world. Like all new technologies, GenAI has rough edges that need to be smoothed out, limitations that need to be discovered and overcome, security and other holes that need to be plugged, and infrastructure that has to be built around it before it becomes commonplace. I also believe that this will happen, and that GenAI and its downstream technologies will fulfill the promise that many of us see in it—and probably faster than we think. The important thing, as technologists, is to realize that the “hype cycle” is simply about the hype—it’s not about the technology. Let’s hope our bosses with the money understand the same thing!

Executives, decision-makers, technical experts, and Google Cloud partners converged at Google Cloud Next to explore cutting-edge innovations and industry trends. GlobalLogic was there, speaking about modernization strategy and delivering a Cube talk on Intelligently Engineering the Next GenAI Platform we are building for Hitachi.

Among the buzz at Google Cloud Next 2024, using GenAI for customer success and modernizing processes and platforms with AI stole the spotlight. Innovative ways companies are evolving from proof of concept to proof of value were hot topics, too. However, challenges like data integrity and legacy point systems loom large as enterprises shift towards those proof-of-value AI-driven solutions and efficient monetization strategies. Where should you focus now, and what comes next as you develop your innovation roadmap?

Here are five key trends and takeaways from the event that speak to the essential building blocks innovative companies need to lay the groundwork for successful enterprise-grade AI implementations.

1. Applying GenAI for Customer Success

Enterprise-grade GenAI solutions for customer success are revolutionizing service quality and driving business outcomes. Imagine equipping your frontline staff with GenAI-driven agents, empowering them to ramp up productivity and provide every customer with a personalized, enhanced experience. Built-in multilingual customer support makes GenAI a versatile powerhouse for enterprise teams, catering seamlessly to a global customer base with diverse linguistic preferences.

This transformative approach to customer success merges advanced technology with human expertise, paving the way for exceptional service delivery and business success in the digital age.

2. Modernizing the Tech Stack & Transforming the SDLC

GenAI is reshaping the software development landscape by empowering developers to drive efficiency and elevate code quality to new heights. This transformative approach extends beyond mere updates—it's about modernizing the entire stack, from infrastructure to user interface. 

Innovative approaches include automated code generation, building RAG-based applications, enhanced testing and QA, predictive maintenance, and continuous integration and deployment (CI/CD). By leveraging natural language processing (NLP) for documentation, behavioral analysis, automated performance optimization, and real-time monitoring and alerting, GenAI streamlines development processes, improves code quality, and enables proactive decision-making throughout the SDLC by automating tasks, optimizing performance, and providing actionable insights.

Through comprehensive refactoring of applications, GenAI is leading the charge towards a future-proofed ecosystem. However, this ambitious undertaking isn't without its challenges; it demands time, dedication, and a strategic roadmap for success. 

3. Building a Future-Forward Framework for Success

Enterprises face key challenges in unlocking the value of AI, such as ensuring data privacy and security, protecting intellectual property, and managing legal risks. Flexibility is essential to adapt to evolving models and platforms, while effective change management is crucial for successful integration. 

Embracing a 3-tier architecture with composable components over the core platform emerges as the future-forward approach, fostering flexibility and scalability. Having a robust infrastructure and data stack to underpin the GenAI layer is indispensable, forming the bedrock for successful implementation. We refer to this holistic framework as the "platform of platforms," which not only ensures alignment with business objectives but also facilitates the realization of optimal outcomes in the GenAI journey.

4. Monetizing Applications 

Monetization was a hot topic at Google Cloud Next, and enterprise organizations gravitate towards Google’s own Apigee for several reasons. Apigee’s robust API management platform offers versatile monetization models like pay-per-use and subscriptions, streamlined API productization, customizable developer portals, real-time revenue optimization analytics, seamless billing system integration, and robust security and compliance features. 

For example, we recently designed and built a solution for monetizing an application that uses APIs to access and leverage industry data stored in a cloud-based data lake. This allowed for scalable and serverless architecture, providing reliable and updated information for improved decision-making, identification of new opportunities, and early detection of potential problems. Apigee’s reputation as a trusted and reliable API management platform is backed by Google Cloud's expertise and infrastructure, further solidifying its appeal to enterprise customers.

5. Evolving the Intelligent Enterprise from POC to Proof of Value

Transitioning from Proof of Concept (POC) to Proof of Value (POV) marks a critical phase in adopting AI technologies, particularly in light of recent challenges. Many POCs implemented in the past year have faltered, and the pressure is on to demonstrate a return on AI investments.

Maturing your AI program from POCs to POV calls for a holistic approach that encompasses not only the capabilities of GenAI but also your foundational architecture, data integrity, and input sources. Maintaining data integrity throughout the AI lifecycle is paramount, as the quality and reliability of inputs significantly impact the efficacy of AI-driven solutions. Equally important is the evaluation and refinement of input sources, ensuring that they provide relevant and accurate data for training and inference purposes. 

Successful GenAI implementations are those that are reliable, responsible, and reusable, cultivating positive user experiences and deriving meaningful value for the enterprise. 

Responsibility means delivering accurate, lawful, and compliant responses that align with internal and external security and governance standards. Reliability shifts the focus to maintaining model integrity over time, combating drift, hallucinations, and emerging security threats with dynamic corrective measures. Finally, reusability emerges as a cornerstone, fostering the adoption of shared mechanisms for data ingestion, preparation, and model training. This comprehensive approach not only curtails costs but also mitigates risks by averting redundant efforts, laying a robust foundation for sustainable AI innovation.

How will you propel your AI strategy beyond ideas and concepts to enterprise-grade, production-ready AI and GenAI solutions? 

Let’s talk about it – get in touch for a 30-minute conversation with GlobalLogic’s Generative AI experts.

I think we’d all agree that getting promoted is desirable. Higher wages, better job title, greater impact, and perhaps more prestige. But it’s not without its problems.

Like many engineers, I started out writing code. I was good at it, and customer demand for my services increased. I then started hiring other engineers and promoted myself to what today we’d call an “architect” role (this was back in the days when software engineering roles were not so well-defined). I still coded, but only the critical algorithmic parts of the software.

I hired smart people, but at first, the engineers I hired were not as good at coding as I was. 

Then, over time, they got better. In fact, some of them became better coders than I had been. I realized that my promotion—while a positive thing—also involved at least two losses for me: First, watching someone do something worse than I had done it. And second, seeing them do it better.

I’ve been through the same process for pretty much every job I’ve ever held. Getting promoted is a good thing, but it also poses challenges for the one being promoted. You need to learn to be effective in a new role, and simultaneously, you need to let go of the skills that got you promoted in the first place. 

This is demanding, both work-wise and emotionally. In particular, working for your replacement’s success requires humility. Letting go of the ‘identity’ you built by being good at the previous level can be scary, especially when you’re trying to establish a new track record of success at a higher level.

While it can be hard to let go of your previous role and identity, I’ve learned, over time, that as a boss, I should only intervene when it’s important. It’s demoralizing for an employee to get overridden by their boss, so this should be a very rare occurrence and only happen when it’s truly needed. 


Telling your employees how to do their job when they already know, or can figure it out themselves, denies them a chance to learn. It’s a sign of an immature boss when he or she is not willing to delegate the tasks that made him or her successful in their former role. A boss’s proper function is to monitor their team’s performance and intervene when something is seriously wrong. But when you as a boss are afraid to delegate, you fail to nurture your employees’ growth or, indeed, to grow yourself into your new, higher role after you’ve been promoted.

I’ve learned that when an employee in my team simply does something differently than I would have done it, though equally well or only slightly worse, it’s best not to interfere. 

This gives the employee room to improve—and if they ask for my help, so much the better. And where an employee eventually begins to do my old job better than I would have, I’ve learned not to be jealous (at least I try), but rather to cheer her or him on and to regard their success as a victory for both of us. After all, he or she had the benefit of having a great boss (me), so it’s my victory, too!

I’m finding that this same management approach applies not just to humans but also to AIs. I’ve noticed the same “promotion” emotions in myself with my “AI-driven” Tesla, for example. I was comfortable with some of the Tesla’s self-driving features right from the outset – for example, its adaptive cruise control. It didn’t brake and accelerate exactly like I would, but it handled both as well as another human driver would have. So, just as if I were being driven by someone else, I rapidly learned to be comfortable with it.

Other features took me a while to rely on – lane changes, for example. For a year or so, I remained hyper-vigilant whenever I prompted the car to change lanes, ‘just in case’ it had failed to notice a driver already in the target lane, or one trying to overtake me in the passing lane.

Over time, I came to accept emotionally that the car, with its array of sensors and cameras, was more aware of the other drivers and their acceleration than I was. If the car intelligently used that information (which I came to trust that it did), then I was safer letting the car do the lane changes than I was doing them manually. While I remain alert out of simple caution, I no longer stress about the car making lane changes. I enabled the automatic lane change feature and generally let the car make lane change decisions by itself.

There are still a couple of cases where I override the self-driving features of the Tesla. One is a freeway exit on my way to work. After exiting, I have to cross four lanes of traffic in about 500 feet/150 m to make a left turn into our campus. While the Tesla can do this automatically and safely, it starts and stops so suddenly and repeatedly that I’m uncomfortable with how it handles this maneuver. So I switch to manual.

Another is a freeway intersection where the left two lanes are always congested because of exiting traffic. I know this is the case, but the Tesla apparently does not, because it keeps trying to put me in one of the congested lanes rather than the leftmost lane, which I know will usually be less crowded. I therefore override the automatic lane change feature in this case.


My point is that using the Tesla and its AI-based driving capabilities feels a lot like I gave myself a “promotion” out of my hands-on driving duties and hired the Tesla as my driver. Like a human employee, the Tesla does some things better than I did, some things worse, and other things just differently. And as with a human employee, I need to monitor it and intervene when the Tesla’s departure from the way I want things done is important to me. But where the car is better or just different, I let the car decide.

I suspect this will be the case for all of us with GenAI; we are all giving ourselves a “promotion” when we adopt this new technology. 

Each of us is still discovering the ways GenAI can make us more effective in our jobs—and even, in some cases, where it can automate work we used to do ourselves, allowing us to take on additional roles and sometimes requiring us to shed what we are currently good at. This can be uncomfortable and even feel threatening. Our identities and sense of security as employees are based, in part, on what we see ourselves as being good at, so this is tough.

Your boss might not admit this to you, but when he or she first found themselves in a new position with more power and responsibility, I’ll bet they, too, were uncertain of how best to handle it. If you don’t already feel that way with GenAI, I predict that you will soon. 

GenAI is an empowering technology for “thought workers” and creative people. It’s not something many of us expected to see happen right now, but even in its relative infancy, it gives us all significant leverage to get more done, better and faster.

When you begin to grab hold of GenAI and make it work for you, my advice is to relax and explore the possibilities of the new role now open to you. Be a good boss and treat GenAI like your employee. You’ve just gotten a promotion!

Learn more:

Dive into the dynamic world of Over-the-Top (OTT) media technology with our comprehensive whitepaper, "The State of OTT 2024: The Race to Profitability." Discover the latest insights and trends shaping the OTT landscape, from consumer priorities to key technology advancements and the impact of emerging technologies like GenAI and 5G.

Explore the evolving preferences of OTT consumers, including their demand for superior user interfaces, seamless search and streaming experiences, and personalized content recommendations. Uncover the critical role of AI and machine learning in driving personalized user experiences and optimizing content monetization strategies.

Learn how cloud hosting solutions are revolutionizing content distribution and cost optimization for streamers, and how the advent of 5G is poised to transform OTT accessibility and user experiences worldwide. Gain insights into the globalization of the OTT market and the growing significance of streaming audio platforms in the digital media landscape.

Stay ahead of the curve and position your business for success in the fast-growing OTT industry. Download "The State of OTT 2024" whitepaper today and unlock the insights you need to thrive in the dynamic world of digital media.

While any software development initiative has unique features, some situations recur so often that I feel like I should have a recording that I can play back the next time that same situation comes up. One of these is the “What,” “How,” and “When” of software development.

Projects get into trouble when it’s not clear who owns these critical decisions, and—perhaps more importantly—when the wrong person or function tries to own one or more of them. When the business people try to own the technical “how” of a project, you know you’re headed for trouble. 

Similarly, when the technical people start designing end-user features (the “what”) without input from the users or the business, that often ends in disaster as well. And when either function tries to dictate “when” without regard to “what” or “how,” that spells trouble big-time.

Just the other day, I heard a business person say, “It’s obvious what they need to do—why can’t they just start coding?” Here the business person was saying, essentially, that the “what” is known (at least in their own mind), so the “how” should be obvious—meaning that engineering should just start doing it. 

In such situations, unless the engineers are truly incompetent (rare), it’s very doubtful that the business person speaking actually understands either the “what” or the “how.” The engineers certainly do not, or they would indeed be coding. 

Recommended reading: Software, the Last Handmade Thing

When a business person makes a statement like this, if he or she is in a position of sufficient power that the engineers do indeed “just start coding” even in the absence of clarity around the what or the how, the project rarely ends well. In particular, it rarely, if ever, delivers what the business person had in mind, when and how they wanted it. 

And—you guessed it—it’s the engineers who generally get blamed for the failure, not the person who insisted they go ahead no matter what.

Projects work best when the business says “what,” the engineers say “how,” and the business and technical people negotiate jointly in good faith over “when.” Sometimes the “when” is fixed—for example, a trade show-driven launch date or an investor deadline. In that case, the business and technical people need to negotiate over the “what” and “how.” 

Similarly, either the “how” or the “what” might be fixed—for example, because you are making modifications to an existing system and have limited technical options, or you have committed to deliver a certain feature. In this case, the “when” and the other of the three independent variables (either “what” or “how” respectively) need to be negotiable. Otherwise, a predictable failure—and/or development burnout—will occur.

Perhaps the most frequent issue is when a single person or function tries to own all three—the what, the how, and the when—telling engineering what they need to develop, how they are going to develop it, and when the project is to be delivered. Unless the person doing so is a universal genius—rare—this inevitably leads to problems. 

I worked with Steve Jobs for four years at NeXT, and even he rarely tried to dictate all three. Two out of three he would try for—but rarely, if ever, all three (and then not for long). Steve would generally defer to engineering on the “how” and would often (though sometimes grudgingly) accommodate strong pushback on the “when.” While I’ve never worked with Elon Musk, I get the sense he also listens to a core team of engineers he trusts. Unless you consider yourself smarter than Steve Jobs and Mr. Musk, you should pause to reconsider your own actions when you try to dictate what, how, and when to your engineering team.

Another often-overlooked facet of this puzzle is that all three activities require communication. Even if the “what” seems clear in your own mind, it still needs to be expressed in terms that the engineering team can understand. This process of “backlog elaboration” nearly always reveals gaps in the clarity of the initial vision, even if it seemed “obvious” to you. Similarly, the “how” may be clear to your technical leads, but it still needs to be expressed in architecture diagrams, sequence diagrams, API specs, and other artifacts that communicate the technical vision to the engineering team. 

Only when the “what” and “how” are expressed in sufficient detail can a reliable “when” be produced. The fact that the “what” is clear in our business person’s mind, or the “how” is clear in the mind of the architect, does not mean that the vision can be successfully operationalized without further work. This is why “just start coding” reveals a real gap in understanding of how successful software projects are implemented.

All this can be really fast—even verbally and at the whiteboard in some cases. But in general, the more input and understanding you get from the people actually doing the work, the better your backlog and the more accurate your timeline will be.

A proper appreciation for the value of each ingredient (“what,” “how,” and “when”), combined with due respect for the roles of their proper owners, is the key recipe for successful software development.

More helpful resources:

Despite uncertainty around regulation, millions are already interacting inside the metaverse, a market Ernst & Young expects could contribute over $3 trillion to global GDP by 2031. With the metaverse poised to dramatically change how banks, insurance companies, and other financial institutions engage with customers, IT leaders are focused intently on the challenges and opportunities ahead. 

Banking and financial services IT professionals gathered recently for an immersive, one-hour VR roundtable discussion in the metaverse, co-hosted by GlobalLogic and The CXO Institute. In The Future of Banking: Doing Business in the Metaverse, hosted by GlobalLogic CTO Steven Croke and facilitated by yours truly, participants took a deep dive into innovative next-generation banking and finance solutions on the horizon, and into questions of how banks will meet consumer needs for personalization, interaction, convenience, and security. 

In this article, you’ll find the highlights from our session, including top questions surfacing in banking and finance organizations as each plans its metaverse roadmap – plus your personal invitation to join GlobalLogic’s Monthly Metaverse Meetups for banking leaders and innovators. Let’s begin by exploring the most pressing challenges and opportunities digital leaders face as metaverse and VR adoption gradually increases.

Why Metaverse Planning is on Banking & Financial Service Roadmaps

Changing consumer demographics and rapidly advancing VR technologies drive massive opportunities for forward-thinking brands, and banking is ripe for disruption. A recent GlobalLogic survey revealed that 90% of Gen Z are willing to turn to big tech and nonbanks for better and faster banking services, and most participants in that demographic had “no idea why” they’d go into a branch when most basic things can be done quicker and easier online.

The same survey found that 80% of Gen Z respondents felt there was insufficient advice available about banking and financial products and that they did not understand how things like mortgages were structured. Investing was a key theme across our research interviews, and most participants brought the idea up unprompted. Inflation, skyrocketing housing costs, and increasing volatility in the job market are weighing heavily on consumers’ minds, and freelancing in various forms is becoming more common. 

For all their diverse needs across employment, banking, shopping, and entertainment, people are looking for more immersive, engaging, and personalized experiences. Increasingly, they’re finding those in the metaverse – particularly Gen Z and Millennials (spanning ages 14 to 40), around 40% of whom have already used VR technology in some way. According to Deloitte, close to 50% of this cohort say they spend more time interacting with others on social media than in real life. Further, Gartner predicts that by 2026, 25% of people will spend at least one hour a day in the metaverse for work, shopping, education, social, and/or entertainment.

This state of affairs, in which consumers increasingly seek out financial advice and services online, ought to cause concern for banks, Croke shared with The Future of Banking participants. How will your business respond to an emerging group of consumers who do not feel they need banks, do not understand them properly, and have little or no desire to enter a branch?

Metaverse Presents Opportunities for Education, Support & Customer Experience

The banking sector’s adoption of cryptocurrency and blockchain has increased significantly in recent years; the two are expected to account for 4% and 4.5% of metaverse revenue, respectively, in 2025. But beyond these earliest and best-known DeFi products, how will your bank build trust in the metaverse and make virtual interactions more compelling than the bricks-and-mortar equivalent?

Several banks are setting up lounges or virtual branches as an entry point to the metaverse and using the space to establish a presence and nurture customer relationships. Offering education, support, and advice on financial products in the metaverse can enable financial services brands to engage Gen Z even as VR banking matures.

HSBC, for example, purchased virtual real estate in The Sandbox to engage and connect with sports, e-sports, and gaming enthusiasts. Is this the right idea?

IT leaders attending The Future of Banking event had mixed feelings regarding virtual banking services. They expressed skepticism about the likelihood of adoption without a specific incarnation of virtual offerings that fires the customer’s imagination. Banks will need to give customers compelling reasons to go to the metaverse to complete actions they can already do with mobile banking applications or develop actions they cannot experience with mobile or web interfaces. The next biggest hurdle will be understanding what that will look like across the industry. 

Transitioning to a VR Financial Services Mindset

For one institution, KB Kookmin Bank in South Korea, it meant creating a virtual branch where simple transactions, such as remittances, can be managed at a teller window. 

“We're already seeing several banks now setting up branches… they're essentially providing lounges for users to go into those branches and try and make them, effectively, a place to get a conversation going with customers,” Croke shared. Roundtable participants were asked whether they see replicated real-world experiences as the model for transactions in the Metaverse.

One delegate, a CTO for a large insurance brand, said he felt that HSBC’s approach made more sense. “History is littered with examples of trying to replicate something in a new medium and it not working as well… doing something different in a different medium would probably be a more fruitful direction forward.”

Perhaps a hybrid approach would be easier than a metaverse-native experience? Banks may consider creating products that mimic something in the real world with a VR twin; for example, mortgage applicants could access and explore a digital twin of the property they’re considering. 

IT leaders must also consider how metaverse-native experiences might be handled in the future. “If you buy a ticket to a concert in the Metaverse, why would you not purchase that with a payment product that is Metaverse native?” Croke said.

Exploring Options for Metaverse Finance & Banking Products

Even if product development is still far off on the long-term planning horizon, bank leaders should be thinking today about broadening their ideas of what financial products could look like within the metaverse context. 

“We’re already seeing value items being created in the Metaverse,” Croke said. “We're seeing collectibles being created. We're seeing equities, we're seeing art being created. How are these going to be financed? How can one purchase those products? And if you think about storage, where do we store that value?”

Delegates questioned whether we should expect to see a dual business model, with banks in the Metaverse handling cryptocurrencies transacted through the banks or Metaverse ATMs. We tend to think today that the metaverse will not be able to handle traditional banking products. However, as one delegate pointed out, we may see this change once banking finds a strong use case to drive initial adoption and create demand for more services. “I think it’s about starting with a very niche, single-use case that's killer, and everybody wants to use it, but I think that's yet to be found,” he said. 

Visualizations offer an interesting way to explore the possibilities today. “If I want to have a 3D visualization of risk, rather than today's 2D diagram, for example… in 3D, I can move stuff with my fingers and share that information with other traders,” one delegate shared. “I think that'd be very helpful. So, I visualize value addition. I think that's a pivotal point where Metaverse can start adding value to existing processes.”

The Maturation and Growth of Decentralized Finance

The huge uptake in cryptocurrency and NFTs has led to a new virtual economy, even after the initial buzz died down. This is a borderless, secure, and fast environment in which DeFi enables financial transactions to be performed by entities directly using smart contracts without financial intermediaries. 

Still, outside of Gen Z and gamers, we’ve not yet reached a point of maturity where people feel comfortable undertaking many activities and transactions in the metaverse. We have a community of early adopters who are already quite demanding and discerning in their metaverse experiences, alongside a far larger population still trying to wrap their minds around the possibility. 

Whether adoption and user behavior drive regulation, or increased regulation opens the door to greater adoption, remains to be seen. The rise of central bank digital currencies and the expressed interest of Singapore’s monetary authority (likely the furthest ahead at this point) raise many questions about DeFi and its impact on metaverse adoption and maturation. Even so, it is clear that banks must put their arms around this economy, experimenting and learning now so they can be best positioned to move fast and innovate as opportunities open up.

Final Thoughts & Continuing the Conversation

Internet users rely on multiple apps for authentication. But does the Metaverse require that we now own our digital identity? And if we move across multiple platforms, doesn’t a unique digital identity become a prerequisite? 

How do we combat money laundering and fraud in a virtual environment where a criminal can open a crypto wallet, fund their wallet with cryptocurrency, and buy a parcel on a chosen metaverse platform? Once the parcel is bought, they can build a store to hold their NFTs and sell NFTs as a cover for illicit products in real life. 

These are just a few of the metaverse questions and challenges facing IT leaders in banking worldwide right now. 

GlobalLogic will continue the conversation in our monthly Metaverse Innovation Meetups beginning Friday, November 10, to be held in VR. Join us for ongoing discussions about:

  • current trends in banking in the metaverse 
  • successes in the industry and lessons learned
  • brainstorming prototypes that will help define the app that will ultimately successfully drive VR adoption

We’re growing a community of like-minded innovators and business leaders to talk through ideas and help move banking in the metaverse forward in real ways. Will you join us? 

Click here to email me your request for an invitation to GlobalLogic’s Monthly Metaverse Meetup.

Digital product development can be a game-changer for organizations in the way it facilitates a seamless, software-driven user experience. Taking a user-centric approach to planning and developing digitally driven solutions yields products that delight users, create new lines of revenue, and scale with your growing business. 

Consistently applying a data-driven approach to digital product development helps your organization uncover customer insights, identify market trends, and validate hypotheses that result in products that better meet customer needs and drive business growth. Moreover, continuously iterating based on real-time insights ensures the products you've invested in are sustainable and evolve with your customers' needs.

In today’s world, organizations are accumulating and sitting on large volumes of data from an increasing number of systems and interfaces. However, this comes with its fair share of challenges, including (but not limited to) data quality and reliability, scalability and infrastructure, data privacy and security, and the growing talent and expertise gap. We’ll take a closer look at these key considerations and more, so you can achieve a more data-driven approach to digital product development.

1. Data Quality, Reliability & Governance

While the availability of vast amounts of data offers opportunities for valuable insights, it also introduces the risk of incomplete, inaccurate, or inconsistent data. Ensuring data quality and reliability is essential to leveraging the full potential of a data-driven approach.

Incomplete or missing data can result in incomplete or skewed insights, leading to flawed decision-making. Without reliable data, organizations risk basing their strategies on faulty assumptions or incomplete information.

Overcoming this challenge calls for robust data governance processes. This includes defining data standards, establishing data collection and storage protocols, and implementing quality checks. Data validation techniques, such as data profiling, outlier detection, and consistency checks, are crucial in identifying and rectifying data anomalies. Regular data audits and monitoring processes help maintain data integrity and reliability over time.

Additionally, organizations can employ automated data validation tools and techniques to streamline the process and ensure a higher level of data quality. These tools can flag data inconsistencies, identify missing values, and validate data against predefined rules or business requirements.
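As a rough illustration of what rule-based validation looks like in practice, here is a minimal sketch in plain Python. The field names (`email`, `age`) and the rules themselves are hypothetical examples, not a prescription; production systems would typically use dedicated data-quality tooling.

```python
# Illustrative sketch of automated data validation.
# Field names and rules below are hypothetical examples.

def validate_record(record, rules):
    """Return a list of validation errors for one record."""
    errors = []
    for field, check in rules.items():
        value = record.get(field)
        if value is None:
            errors.append(f"{field}: missing value")  # flag missing values
        elif not check(value):
            errors.append(f"{field}: failed validation ({value!r})")
    return errors

# Each rule maps a field to a predicate the value must satisfy.
RULES = {
    "email": lambda v: isinstance(v, str) and "@" in v,
    "age": lambda v: isinstance(v, int) and 0 < v < 130,
}

records = [
    {"email": "a@example.com", "age": 34},   # clean
    {"email": "not-an-email", "age": 34},    # inconsistent value
    {"age": 250},                            # missing field + out-of-range value
]

for rec in records:
    errs = validate_record(rec, RULES)
    if errs:
        print("invalid:", errs)
```

The same shape extends naturally to consistency checks across fields or against reference data.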

2. Scalability and Infrastructure

The ability to process and analyze large volumes of data is essential for effective digital product development. As organizations gather increasing amounts of data from diverse sources, scalability and infrastructure become critical factors in harnessing the full potential of this data.

Traditional systems and infrastructure may not be equipped to handle the velocity, variety, and volume of data that digital product development demands. Processing and analyzing massive datasets require powerful computing resources, storage capacity, and efficient data processing frameworks.

Investing in scalable infrastructure ensures organizations can handle ever-growing data volumes without compromising performance. Cloud-based solutions, such as scalable cloud computing platforms and storage services, offer the flexibility to scale resources up or down based on demand. This elasticity allows organizations to handle peak workloads during intense data processing and analysis periods while avoiding excessive costs during periods of lower activity.

Modern technologies like distributed computing frameworks, such as Apache Hadoop and Apache Spark, provide the ability to parallelize data processing across clusters of machines, improving processing speed and efficiency. These frameworks enable organizations to leverage distributed computing power to tackle complex data analytics tasks effectively.

Recommended reading: The Evolution of Data & Analytics Technologies

3. Data Privacy and Security

A strong focus on data privacy and security in digital product development helps organizations maintain compliance, protect sensitive data, and foster customer trust. This, in turn, allows for more effective data-driven decision-making and enables organizations to leverage the full potential of their data assets while mitigating the inherent risks.

It's not a matter of if a breach will happen but when: IBM reports that 83% of organizations studied have experienced more than one data breach. Those using security AI and automation had a 74-day shorter breach lifecycle and saved an average of USD 3 million more than those without.

Safeguarding customer information and maintaining trust is crucial in a data-driven approach. This data often includes sensitive and personal information about individuals, such as personally identifiable information (PII) or financial data. Protecting this data from unauthorized access, breaches, or misuse is of paramount importance.

Organizations must comply with data privacy regulations, such as the General Data Protection Regulation (GDPR) or the California Consumer Privacy Act (CCPA). These regulations outline guidelines and requirements for the collection, storage, processing, and sharing of personal data. Adhering to these regulations ensures that organizations handle customer data responsibly and legally.

Companies can implement encryption techniques to protect data at rest and in transit, access controls, and user authentication mechanisms. Conducting regular security audits and vulnerability assessments is also best practice. Supporting these initiatives requires a culture of data privacy and security awareness among employees. Training programs and clear communication channels can help employees understand their roles and responsibilities in protecting data and recognizing potential security risks.
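One concrete technique from that toolbox is pseudonymization: replacing a PII field with a keyed hash so analysts can still join and aggregate on it without seeing the raw value. The sketch below uses the standard library's HMAC; the key shown is a placeholder, and real key management (rotation, vault storage) is out of scope here.

```python
# Hedged sketch: pseudonymizing a PII field with a keyed hash (HMAC-SHA256).
import hashlib
import hmac

SECRET_KEY = b"replace-with-a-managed-secret"  # hypothetical placeholder key

def pseudonymize(value: str) -> str:
    """Deterministic token for `value`; unrecoverable without the key."""
    return hmac.new(SECRET_KEY, value.encode(), hashlib.sha256).hexdigest()

token = pseudonymize("jane.doe@example.com")
# The same input always yields the same token, so joins across datasets
# still work, but the original email is not exposed to analysts.
assert token == pseudonymize("jane.doe@example.com")
```

A keyed hash (rather than a plain hash) matters: without the key, an attacker cannot rebuild the mapping by hashing guessed emails.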

4. Interpreting and Extracting Insights

Extracting meaningful insights from complex and diverse datasets is crucial for driving product innovation and success. However, this task can be challenging without the expertise of skilled data scientists and analysts to apply advanced analytical techniques and statistical models. These professionals possess the skills to navigate vast amounts of data, identify relevant patterns, and extract actionable insights that inform product development strategies.

Data scientists and analysts involved in digital product development must have a deep understanding of statistical analysis, data mining, machine learning, and visualization techniques. They should also possess domain-specific knowledge to contextualize the data and derive meaningful insights relevant to the product and its target audience.

These professionals leverage analytical tools and programming languages to manipulate and analyze data, such as Python, R, SQL, and data visualization tools like Tableau or Power BI. They employ exploratory data analysis techniques, statistical modeling, predictive analytics, and other advanced analytical methods to uncover patterns, correlations, and trends within the data.

They can identify user behavior patterns, preferences, and pain points, allowing organizations to make data-driven decisions about feature enhancements, user experience improvements, and product roadmaps. Collaboration between data scientists, analysts, and product development teams is crucial for the successful interpretation and application of data insights. 

And, of course, this leads us to...

5. Talent and Expertise Gap

Successfully blending software engineering and data analytics expertise enables organizations to build data-driven products that offer exceptional user experiences. However, bridging the talent and expertise gap by finding skilled professionals with a strong understanding of both disciplines can be a significant challenge.

Software engineers possess the technical prowess to design and build robust and scalable applications, while data analytics professionals can extract meaningful insights from data and apply them to inform product development strategies. The intersection of these skill sets is relatively new, and the demand for professionals who can bridge the gap is high. This creates a talent shortage and a competitive job market for individuals with software engineering and data analytics expertise.

To address this challenge, organizations must invest in talent acquisition strategies that attract individuals with hybrid skill sets. They can collaborate with educational institutions to develop specialized programs that equip students with the necessary knowledge and skills in both domains. Providing internships, training programs, and mentorship opportunities can also help nurture talent and bridge the expertise gap.

Organizations can foster cross-functional collaboration to encourage knowledge sharing between software engineering and data analytics teams. This allows professionals from different disciplines to learn from each other and leverage their collective expertise to drive innovation in digital product development.

Additionally, promoting a culture of continuous learning and professional development is crucial. According to McKinsey, which takes regular pulse checks of product-development senior executives, 53% of decision-makers believe skill building is the most useful way to address capability gaps, ahead of hiring, talent redeployment, and contracting in skilled workers. Encouraging employees to enhance their skills through training programs, industry certifications, and participation in conferences and workshops helps keep them updated with the latest advancements in software engineering and data analytics.

Recommended reading: A Digital Product Engineering Guide for Businesses

6. Data Integration and Compatibility

Integration and compatibility between disparate data sources and systems are a major challenge for organizations. Establishing seamless data integration pipelines and ensuring system compatibility are crucial for successful data-driven digital product development.

Organizations often have many data sources, including internal databases, third-party APIs, customer feedback platforms, social media platforms, and more. These sources generate data in various formats, structures, and locations, making it complex to integrate and harmonize the data effectively.

Legacy systems further compound the challenge. Older systems may have limited compatibility with modern data analytics tools and techniques. Extracting, transforming, and loading data from legacy systems for analysis can be cumbersome and time-consuming.

To address these challenges, organizations need to adopt a strategic approach to data integration, including:

  • Data architecture and planning to develop a robust data architecture that outlines data flows, integration points, and data transformation processes. This architecture should account for different data sources, formats, and systems in the product development lifecycle.
  • Data integration tools and technologies to simplify the integration of disparate data sources. These tools can help automate data extraction, transformation, and loading (ETL) processes, ensuring smooth data flow across systems.
  • API and middleware integration, which can facilitate seamless integration between systems and data sources. APIs provide standardized interfaces for data exchange, allowing different systems to communicate and share data effectively.
  • Data transformation and standardization. Data transformation techniques play a vital role in harmonizing data from different sources. Standardizing data formats, resolving inconsistencies, and ensuring data quality during the transformation process enables more accurate and reliable analysis.
  • Modernization efforts to improve compatibility with data analytics tools and techniques. This digital transformation could involve system upgrades, adopting cloud-based solutions, or implementing data virtualization approaches.
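The transformation-and-standardization step above can be sketched in a few lines. The two "sources" and their field names below are invented for illustration; real pipelines would use dedicated ETL tooling, but the core idea is the same: map each source's shape onto one canonical record format.

```python
# Minimal ETL-style transform: standardize records from two hypothetical
# sources (a CRM and a web-form system) into one canonical shape.

def from_crm(rec):
    # Source A stores "Last, First" names and ISO dates.
    last, first = [p.strip() for p in rec["name"].split(",")]
    return {"first": first, "last": last, "signup": rec["date"]}

def from_webforms(rec):
    # Source B stores first/last separately and US-style M/D/YYYY dates.
    m, d, y = rec["signup_date"].split("/")
    return {"first": rec["first_name"], "last": rec["last_name"],
            "signup": f"{y}-{m.zfill(2)}-{d.zfill(2)}"}

crm_record = {"name": "Doe, Jane", "date": "2024-03-01"}
web_record = {"first_name": "Jane", "last_name": "Doe",
              "signup_date": "3/1/2024"}

# Both sources now resolve to the same canonical record.
assert from_crm(crm_record) == from_webforms(web_record)
```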

7. Data Visualization and Communication

Data visualization is pivotal in communicating complex data insights to non-technical stakeholders. By using data to tell a visual story through charts, graphs, dashboards, and other interactive elements, organizations can distill complex information into intuitive, easy-to-digest formats. 

In its raw form, data can be overwhelming and difficult to comprehend for individuals without a technical background. Complex datasets, statistical analyses, and intricate patterns can easily get lost in rows of numbers or dense spreadsheets. This is where data visualization comes into play, allowing stakeholders to grasp the key insights and trends at a glance.

Effective data visualization relies on understanding the audience and tailoring the visual representations accordingly. Different stakeholders have varying levels of familiarity with data and different areas of interest. The visualizations should be designed to align with their needs, ensuring the right information is conveyed clearly and concisely.

There are several key principles to consider when designing data visualizations for effective communication, including simplifying complex data, a visual hierarchy that highlights important information, contextualization and relevant comparisons, interactivity, and compelling storytelling.
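As a toy illustration of the "distill numbers into a visual" principle, the sketch below renders a plain-text bar chart. In practice you would use Tableau, Power BI, or a charting library; the quarterly figures here are hypothetical.

```python
# Toy text-based bar chart: turns a dict of KPIs into proportional bars
# so relative magnitudes can be read at a glance.

def text_bar_chart(data, width=20):
    top = max(data.values())
    lines = []
    for label, value in data.items():
        bar = "#" * round(width * value / top)  # scale bar to the max value
        lines.append(f"{label:<10}{bar} {value}")
    return "\n".join(lines)

signups = {"Q1": 120, "Q2": 180, "Q3": 90, "Q4": 240}  # hypothetical KPIs
print(text_bar_chart(signups))
```

Even in this crude form, the visual hierarchy (longest bar = best quarter) is immediate in a way a raw table of numbers is not.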

Recommended reading: 4 Best Practices to Guide IoT and Dashboarding Projects

8. Ethical Use of Data

The collection and analysis of vast amounts of data give rise to ethical considerations. As organizations harness the power of data to drive product development strategies, it is essential to uphold the highest standards of ethical conduct. This includes respecting user privacy, protecting sensitive information, and ensuring data usage complies with applicable laws and regulations.

Obtaining informed consent from users is essential. Organizations must be transparent about the data they collect, how it is used, and the measures in place to protect it. 

Fairness is another crucial aspect of ethical data use, ensuring that the organization is using unbiased algorithms, models, and analytical techniques that do not discriminate against individuals or perpetuate societal biases. Proactively assess and mitigate potential biases in data collection, analysis, and decision-making processes to ensure fairness and equity.

Social responsibility is another guiding principle in data-driven product development. Advocate for the ethical use of data to address societal challenges, foster positive social impact, and avoid harm to individuals or communities. Consider the broader implications of data practices and determine how your organization can actively contribute to creating a responsible and inclusive digital ecosystem.

Implementing ethical data practices requires a comprehensive approach that includes clear policies, regular audits, and ongoing training for employees. It's well worth getting right. Ethical data practices contribute to the long-term sustainability and reputation of organizations, while also aligning with broader societal expectations and regulatory requirements.

9. Cost and ROI

Implementing big data and analytics solutions in digital product development comes with significant upfront costs, including investments in infrastructure, tools, and talent acquisition. Organizations must carefully evaluate the return on investment (ROI) to ensure that the benefits derived from analytics initiatives outweigh the associated expenses.

While the costs of implementing big data and analytics solutions can be substantial, the potential benefits are equally significant. Leveraging data efficiently allows organizations to gain valuable insights, make informed decisions, and drive business growth. Research from The Business Application Research Center (BARC) shows that companies leveraging their data efficiently see an average increase in profitability of 8% and a 10% reduction in costs.

Begin by clearly defining the specific business objectives and key performance indicators (KPIs) your big data and analytics initiatives aim to address. This provides a basis for evaluating the impact and effectiveness of the investments made.

Conduct a thorough cost-benefit analysis to assess the potential returns and associated costs of implementing big data and analytics solutions. Consider both tangible and intangible benefits, such as improved decision-making, enhanced customer experience, and increased operational efficiency.
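A cost-benefit analysis ultimately reduces to comparing total benefits against total costs. The sketch below applies the standard ROI formula, ROI = (benefits − costs) / costs, to hypothetical first-year figures; every line item and amount is an illustrative assumption:

```python
# Minimal sketch of a cost-benefit / ROI calculation for an analytics
# initiative. All figures are illustrative, not real benchmarks.

def roi(total_benefits: float, total_costs: float) -> float:
    """Simple ROI: (benefits - costs) / costs."""
    return (total_benefits - total_costs) / total_costs

# Hypothetical first-year costs (USD)
costs = {
    "cloud_infrastructure": 250_000,
    "analytics_tooling": 120_000,
    "talent": 400_000,
}
# Hypothetical first-year tangible benefits (USD)
benefits = {
    "revenue_growth": 600_000,
    "cost_savings": 350_000,
}

total_costs = sum(costs.values())        # 770,000
total_benefits = sum(benefits.values())  # 950,000
print(f"ROI: {roi(total_benefits, total_costs):.1%}")  # ROI: 23.4%
```

Intangible benefits such as better decision-making or customer experience resist this kind of quantification, which is why they belong in the qualitative half of the analysis rather than the spreadsheet.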

When investing in infrastructure, consider scalability to accommodate future growth and increasing data volumes. Cloud-based solutions offer the flexibility to scale resources based on demand, minimizing upfront infrastructure costs while providing the necessary capabilities to handle growing data requirements.

Establish mechanisms to measure and track the ROI of big data and analytics initiatives. You'll need to regularly assess the impact on key business metrics, such as revenue growth, cost savings, customer satisfaction, and operational efficiency.

10. Continuous Learning and Adaptation

Staying current with best practices and industry trends is vital in digital product development, where technological advancements, new methodologies, and emerging opportunities drive constant evolution. To remain competitive and harness the full potential of data, thought leaders must foster a culture of continuous learning and adaptability within their organizations.

Encourage teams to pursue professional development opportunities. It's important to allocate time and resources for training and learning activities, and to provide access to relevant educational resources that support these programs. Give employees space and time to establish knowledge-sharing platforms and communities of practice that encourage the exchange of ideas and collaboration, as well.

Agile methodologies, such as Scrum or Kanban, are great for promoting iterative development and continuous improvement. Apply these methodologies to data analytics projects to enable teams to adapt quickly to changing requirements, incorporate feedback, and continuously learn from data insights and even failures.

Continuous learning should extend beyond the boundaries of data and analytics, as cross-disciplinary collaboration and combining data-driven insights with domain expertise can lead to more innovative approaches in digital product development. Developing data literacy across the organization is crucial; it empowers individuals to make informed decisions, contribute to data-driven discussions, and effectively communicate insights that drive organizational success. Advocate for data understanding and interpretation among all stakeholders, regardless of their roles or technical backgrounds.

Conclusion

Applying a big data and analytics lens to digital product development means taking a strategic, data-driven approach encompassing technical solutions, organizational cultural shifts, investment in talent and infrastructure, adherence to ethical principles, and a culture of continuous learning.

Yes, it's a tall order. Working alongside an experienced digital engineering partner like GlobalLogic through ideation, design, development, testing, deployment, and ongoing maintenance can help. We help organizations unlock the true potential of their data and get to market faster with innovative, compliant digital products that drive business success.

Want to learn more? Contact the GlobalLogic team today and see what we can do for you.