Killing the Yoda

In technology there are never-ending aspirations to be better, and one thing the technology community has in common is Yoda. Well known he is.

There's a Buddhist koan from which I've derived this post. It states that if you meet the Buddha on the road, you should kill him (attributed to Linji). It's not a literal saying; it's about people being on their own path and aspiring to be more.

It's a good thought, and worth reading more about. The same thought, adapted, is to kill the Yoda.

The idea here is that any predominant technology, best practice, software architecture, software process... even education related to any of that (or anything else) is a stepping stone.

How does this apply to software? Let's say that you write the best possible application for real-time widget updates. It's your Yoda. Someone else with either a lot of time or a lot of money is going to kill that Yoda unless you kill it yourself and replace it with another Yoda. In fact, that's partially what Software Development Plans are for: defining the End of Life for a project, and every project should have one. Otherwise, Yoda++ will take your Yoda down.

That process you think is perfect? It won't always be.

That architecture you think is perfect?

Those 'best practices'?

Yoda's your pal now. Be ready to take him out.

(Throw Jar Jar Binks under the bus while you're at it.)

Best Practices, Software Process and Architecture: Stuck on the Tracks

Once upon a time, I was a member of the Software Engineering Process Group at Honeywell. I already had access to what had been done so far for our division, so I got a copy of The Capability Maturity Model: Guidelines for Improving the Software Process (since largely superseded by the CMMI). I got sent off to class and learned more than I expected.

In fact, one of the key things that always got pressed home was that the CMM was a guideline, and that many companies were using the book as The Law. It was never intended to be. It was intended to be adapted, to fit the business need, and to add value.

Therein lies the rub. Fairly frequently in software development and engineering circles, some new 'best practice' pops up, some new way of looking at software processes emerges, or some new software architecture gets evangelized. Generally, it's expected that whatever is newer is better, and that software teams - of whatever structure - should be getting better. Sometimes that happens; sometimes it does not. Petronius is often cited for this particular quote (I attribute it properly):

We trained hard ... but it seemed that every time we were beginning to form up into teams we would be reorganized. I was to learn later in life that we tend to meet any new situation by reorganizing; and a wonderful method it can be for creating the illusion of progress while producing confusion, inefficiency, and demoralization.

- Charlton Ogburn (1957)

Technology is rife with this. End users line up for it; software developers and engineers are swayed by new ways of looking at old things, as is management. Things done right for years can be turned on their head because of something 'newer' and allegedly 'better'. Entire programming languages and operating systems have risen and fallen.

Sometimes some good comes of it, such as moving away from measuring a programmer's productivity by Source Lines of Code - though these days productivity does seem to be commonly measured by how long someone sits in a chair or how many hours they spend staring at a monitor (we'll fix that eventually).

Can you fully test that legacy application, or do you need to grandfather it in and only test modifications (in the hope that undocumented features aren't broken)? Do you patch around the problem with human process, or do you patch the software itself? Do you stick with an n-tier architecture, or do you adapt it so that your business needs are better met? Do you code review every change, or only release versions, to assure the combined changes don't create new issues? Do you abandon a code base because it won't work with Windows 10, or do you support it as little as possible until it is no longer relevant?

Everything, from architecture to software process to best practices needs to be adapted to the business need. Why? Because without the business need, no one gets paid.

Sometimes you need to jump the tracks. Or adjust them. And if you're spending more time adjusting them than getting things done, you have a big problem.

Designed To Fail

I've been reading Don Norman's 'The Design of Everyday Things' (Revised and Expanded Edition) and have been enjoying it. As someone who has seen quite a few things fail because of poor design over the years - fortunately none of them were mine (yet?) - it's an entertaining read. The 'Norman Doors' we have all encountered, where how to use the door is either misleading or not apparent, were named after the author of the book.

In fact, the image with this entry is one of the keys of a typewriter. The QWERTY keyboard you likely use every day was designed to keep the keys from sticking together. It was designed to slow typists.

But now that the mechanical keys no longer exist in this way, decades later, we still use QWERTY keyboards. Don't worry, so do I, but the point is that we are limited by a design that need no longer apply.

Some design issues have much more severe impacts. From the book (emphasis mine):

I was called upon to analyze the American nuclear power plant accident at Three Mile Island (the island name comes from the fact that it is located on a river, three miles south of Middletown in the state of Pennsylvania). In this incident, a rather simple mechanical failure was misdiagnosed. This led to several days of difficulties and confusion, total destruction of the reactor, and a very close call to a severe radiation release, all of which brought the American nuclear power industry to a complete halt. The operators were blamed for these failures: "human error" was the immediate analysis. But the committee I was on discovered that the plant's control rooms were so poorly designed that the error was inevitable: design was at fault, not the operators. The moral was simple: we were designing things for people, so we needed to understand both technology and people. But that's the difficult step for many engineers: machines are so logical, so orderly. If we didn't have people, everything would work so much better...

So far in reading I've noted that the designs described in the book are tangible. I deal with intangibles. Over the years, I've dealt with software, processes and other aspects of organizing data into information - and over time some of those become things that are 'designed to fail' as the world evolves around them. These deserve attention as well.

A piece of software that spams users with alerts is a piece of software that will annoy people. A software process that annoys developers will eventually be bypassed by the developers. A feature that was designed in can quickly become a bug. A bureaucracy designed not to change (read: Faster: The Acceleration of Just About Everything) can quickly become an impediment to progress and cause the level of frustration that almost any government office has been accused of.

Things change. Systems need to change with the times or they are designed to fail. Or, as you've probably heard someone say, 'Adapt or die.'

See also: Is your technology solving a problem or creating new ones?