This blog post on software development and AI was originally published in 2009.
“Collector” is probably too strong a word, but I am definitely an aficionado of handcrafted items. While true hand craftsmanship has become rare and generally prohibitively expensive in the West, I’m fortunate to travel to places where handmade items are still relatively affordable.
Over the years, I’ve managed to accumulate artwork, metalcraft, furniture, embroidered tablecloths, handwoven shawls, silks and other items that my family and I really love. I’ve been luckier in finding great women’s clothing items than men’s, but at least that has made me a popular gift-giver in my family!
Having beautiful, handmade things in my life is a source of real satisfaction. I like the thought that an actual human being made something I own or use, and that perhaps they cared about what they made and took pride in making it really well. Maybe they even hoped that someone like me would come along who would appreciate what they were making.
A handcrafted item puts me in touch with a different way of life — that of the craftsperson. And the best of these items have an elegance and, well, “soul” to them that machine-made items just don’t seem to have.
When I’m looking at these items, it strikes me that my work is not so different.
Software is the last handmade thing in common use in the developed world.
For those of us in the software industry, that software is “handcrafted” is no great revelation. To us, it’s clear that beyond the technology (which, though sophisticated, is at least largely amenable to human control), the true difficulties in producing a software product are the human factors and the imponderables that human beings and human interactions introduce.
Humans misunderstand directions, give unclear requirements, make mistakes and wrong assumptions, have prejudices and divergent career goals, are good at some things and bad at others, can act lazy or be overly aggressive, and generally are very human.
Though the technology is compelling, to me the people aspect of the business is the bigger challenge. How do you take a collection of imperfect human beings and get them to work together to quickly produce a quality product that does what you and your customers really want it to do? That’s the challenge, and why software product development is so much a human activity.
A few years ago, I acquired a small carpet I really love. It is a fairly old-looking, tribal-pattern carpet, and the seller made extravagant claims about its origins and history. After a series of visits and hard-bargaining sessions, I finally bought it. But I still had doubts about whether I had really purchased a handmade masterpiece or an artificially aged, cheap, machine-made knock-off.
After living with the carpet a while, I began to notice a lot of small asymmetries in the intricate patterns. A series of “gulls” or “C” shapes make up part of the border, for example. I began to notice that some of the “C” shapes opened to the right, and others to the left, and that while the two sides of the carpet were close to being mirror images of each other, they actually were not.
There were a number of other such small asymmetries here and there throughout the carpet. After a few weeks of observing these imperfections, I became completely convinced that this was indeed a laboriously handmade carpet. It would be prohibitively expensive or even impossible to achieve this degree of asymmetry and imperfection by machine; the only way it could be done was by hand, knot-by-knot.
I’ve also read that in more than one culture, such asymmetries are inserted as deliberate flaws, to avoid tempting fate by aspiring to make something perfect.
While I think this philosophy is probably a good example of making a virtue out of necessity, I nonetheless appreciate the sentiment. It is, unfortunately, not for us humans to produce perfect work — at least, I’ve never seen it. Machines, maybe; but a human endeavor, no.
Learning to appreciate the rough edges.
As another case in point, I once tried to get a business suit made for myself in India, figuring the labor cost would be low — which it was. I had read that one of the hallmarks of a good tailor-made suit was buttonholes made by hand. I had also read that you can tell whether a buttonhole is handmade or not by looking at the side of the buttonhole on the inside of the suit, facing the wearer. If the buttonhole is imperfect on the inside, it means it has been hand-embroidered; if it’s perfect on both sides, it’s machine-made. In the West, you will pay a lot more for a suit with “imperfect” buttonholes than for a suit with “perfect” ones, because hand embroidery is a sign of extra, skilled effort.
However, when I asked the Bangalore tailor, “Do you put handmade buttonholes on your suits?” he looked embarrassed and responded, “Yes. We’ve been trying to save up for a machine but so far, we can’t afford one.”
The tailor’s perspective regarding handwork and my own were quite different in this situation. To me, the value of the suit was increased by skilled handwork, even if the result was in some sense imperfect. To the tailor, the imperfections (and the extra time that came from the required handwork) were a negative.
It is imperfection that is the hallmark of a handmade item, not perfection. Both the Bangalore tailor and I agreed on that point. But where I valued the “imperfection” in this case, he was embarrassed about it.
As consumers, we really do hold software products to a standard of perfection.
We like our buttonholes perfect on both sides. Like that tailor in India, though, we software developers have no tools available that will rapidly produce a perfect product. In large part, such tools are not even theoretically possible, because the goals or requirements for a software product are invariably “fuzzy” to a greater or lesser degree when we begin a project.
Years ago, in the early 1990s, I was on the team that developed Rational Rose 1.0. I believed — as did a number of my colleagues at the time — that we were helping to create a new way of developing software. We felt that in the future, people would generate code directly from a graphical version of their software architecture, and would then be able to use the same tool to reverse engineer hand-modified code back into an updated graphical representation.
Alternatively, you could start with an implementation, semi-automatically extract the architecture from it, and proceed from there. The round-trip nature of this process would overcome what we then saw as one of the major obstacles to good software architecture, which was keeping the architecture diagrams up to date with the actual implementation.
The reverse engineering piece, once fully in place, would allow people to effortlessly toggle between a high-level architectural view of their code and the detailed implementation itself, we thought.
Now, we’re fifteen years down the road.
Why isn’t a rigorous architectural approach to software development widely used?
Why don’t people architect their product and then just mechanically generate the implementation code directly from the detailed architecture?
Surely enough time has passed since Rose 1.0 that its descendants and competitors could have made this approach completely practical if that’s what the market wanted. There are probably many reasons why this is not the approach companies actually tend to take today, but I would argue that a key factor is that people generally do not know exactly what their product is supposed to do when they start work on it.
I would also argue that in most cases, they can’t know. Even in the case of a relatively mechanical port, the urge to add features, fix problems, act on mid-course learning and/or exploit the possibilities of a new technology generally proves irresistible. And in real life, the resulting product will invariably be different from the original concept.
There is no tool yet devised that will mechanically turn a “fuzzy” or currently unknown requirement into one that is clear and unambiguous. There are definitely techniques that can help the product owner refine his or her vision into something less ambiguous: Innovation Games, Agile kick-off meetings, requirements as test cases, rapid prototyping and the “fail fast” approach, for example.
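To make “requirements as test cases” concrete, here is a minimal sketch in Python. The loyalty-discount rule, the figures and the function names are all invented for illustration; the point is that each assertion turns a fuzzy statement of intent into something unambiguous and executable.

```python
# Hypothetical example: a requirement expressed as an executable test.
# The loyalty-discount rule and apply_discount are invented for illustration.

def apply_discount(order_total, customer_years):
    """Loyalty discount: 10% off after 2 years, 15% off after 5."""
    if customer_years >= 5:
        return round(order_total * 0.85, 2)
    if customer_years >= 2:
        return round(order_total * 0.90, 2)
    return order_total

def test_loyalty_discount_requirement():
    assert apply_discount(100.0, 0) == 100.0  # new customers pay full price
    assert apply_discount(100.0, 2) == 90.0   # 10% off after two years
    assert apply_discount(100.0, 5) == 85.0   # 15% off after five years
```

Run under a test framework such as pytest, these assertions fail until the product actually behaves as specified, which is exactly the property we want from a requirement.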
And I can still imagine a world where completely unambiguous requirements are fed into a machine and something perfect pops out of the other end — like code from a Rose diagram or, to return to our tailoring analogy, a buttonhole from an automatic sewing machine. What I cannot imagine, though, is human beings producing specifications so perfect that what comes out is actually what is desired, perhaps even for a buttonhole.
Until humans are out of the requirements and specification loop (which I can’t imagine if the software is to be used by human beings), I think we need to live with imperfection in this sense.
To be sure, our ability to implement requirements mechanically has increased steadily and will continue to do so. Programming paradigms like convention over configuration (Ruby on Rails) and aspect-oriented programming (Spring) are reducing the amount of code required to implement a new feature, shortening development times and eliminating systematic sources of error. Tools, languages and reusable components and frameworks have vastly increased programmer productivity over the last few decades, and I am sure this trend will continue. But to date, people are very much part of the process of creating software as well as specifying it.
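As a rough illustration of the convention-over-configuration idea, here is a toy Python sketch (the Model base class and its naming rule are invented for this post, not any real framework’s API). Because the framework derives the table name from the class name, the developer writes no mapping configuration at all:

```python
import re

# Toy sketch of convention over configuration (invented, not a real framework):
# the base class derives the database table name from the class name.

class Model:
    @classmethod
    def table_name(cls):
        # Convention: a class named "OrderItem" maps to the table "order_items".
        snake = re.sub(r"(?<!^)(?=[A-Z])", "_", cls.__name__).lower()
        return snake + "s"

class OrderItem(Model):
    pass  # no configuration needed; the convention supplies the mapping

print(OrderItem.table_name())  # -> "order_items"
```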
Should we hope that human beings are eventually eliminated from the software development process?
Right now, a large part of the work of a professional software engineer could arguably be characterized as identifying and eliminating ambiguities in the specifications, to the point where a machine can mechanically carry them out. A software engineer often makes dozens of decisions each day about what the software should do in a given situation not specified or anticipated by the product owner or requirements, often because the situations are considered “edge cases” or “too low level.”
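Here is a small, invented example of the kind of decision that never appears in the requirements. The spec says “show the average rating,” but is silent about a product that has no ratings yet; the engineer has to decide:

```python
# Invented example: the spec says "show the average rating" but never says
# what to do when there are no ratings yet. Someone has to decide.

def average_rating(ratings):
    if not ratings:   # edge case the requirements never mentioned
        return None   # the engineer's call; 0.0 or raising an error were also options
    return sum(ratings) / len(ratings)
```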
A theoretical device that completely eliminated these ambiguities would have to first identify them, and then surface them for resolution by the product owner at requirements-generation time. But would the product owner be able to cope with a huge volume of decisions about, say, out-of-memory conditions in a very specific situation many levels deep?
My guess is that while in the future “programming” will be done at a higher level, with better tools, more reusable frameworks and even perhaps artificially intelligent assistance, it will remain a human activity at core for years to come.
As long as humans are better at understanding the intent of other humans than machines are, I think this must be the case.
At some point, machines will probably become so smart, and the collection of reusable frameworks so deep, that AI systems can assemble better software from vague requirements than people can. Until then, however, I think we will have to learn to appreciate our buttonholes with one rough side, and use approaches like Agile that acknowledge that software development is a human activity.