Wednesday, May 30, 2012

Often In Error, Never In Doubt

This week I have a personal rant, pure and simple: I am sick of people acting as though all of their choices are perfect, without alternative and obvious.

Long ago and far away, I was introduced to rigorous decision making, a concept invented by the ancient Greeks. As part of this tradition, I was taught to carefully define my premises, apply logic to determine my alternatives and then make value judgments as to which of those alternatives I would support.

In such a methodology, there are facts, which are special because they are objective (universal), and there are observations, suggestions and preferences, all of which are subjective (personal).

In attempting to make a decision, especially a decision involving a group of peers or near-peers, it is therefore important to establish the premises and agree upon the facts. Then alternatives can be generated which are both realistic and which tend to achieve the goal. Every member of the group can consider the alternatives, weigh them as they see fit, and discuss the relative merits of the alternatives and perhaps generate hybrid alternatives which have broad appeal.
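The process above, agreeing on criteria while weighing them individually, can be sketched as a simple weighted-scoring exercise. The alternatives, criteria and weights below are hypothetical examples, not a recommendation:

```python
# A minimal sketch of group decision making by weighted scoring.
# The alternatives, criteria, and weights below are hypothetical examples.

alternatives = {
    "rewrite the module": {"cost": 2, "risk": 3, "payoff": 9},
    "patch the module":   {"cost": 8, "risk": 7, "payoff": 4},
    "buy a replacement":  {"cost": 4, "risk": 6, "payoff": 7},
}

# Each group member weights the shared criteria as they see fit
# (a higher score on a criterion means the alternative does better there).
weights = {"cost": 0.3, "risk": 0.3, "payoff": 0.4}

def score(option):
    """Weighted sum of the agreed-upon criterion scores."""
    return sum(weights[c] * v for c, v in option.items())

ranked = sorted(alternatives, key=lambda name: score(alternatives[name]),
                reverse=True)
for name in ranked:
    print(f"{name}: {score(alternatives[name]):.2f}")
```

The point of the exercise is not the arithmetic; it is that the facts (the criterion scores) are shared while the weights remain personal, which is exactly where honest disagreement belongs.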

Instead, I am constantly finding myself on the wrong side of this cycle:

  1. There is an issue, usually not a critical one from my perspective
  2. Someone senior announces a solution
  3. Many someones junior voice objections
  4. Objections are countered with simple denial
    1. "You are wrong, that is a not a problem"
    2. "Your problem exists, but it will be handled by the solution"
  5. The solution fails along the expected dimensions
  6. Juniors are trained to pretend that the seniors were right
  7. I am asked to join in the pretense while providing a non-pretend remedy

I am unclear as to why anyone would want to pretend that all problems have a clear, simple, perfect solution. This pretense is obviously absurd and continually contradicted by everyone's life experience. One of the primary skills of adulthood is the acceptance of imperfection and the pursuit of the as-good-as-possible. Why pretend that when at work, suddenly perfections are possible and always achieved by the right process?

This is particularly frustrating for someone in my position, the position of actually having to make systems work in the real world. Unlike senior management, I do not have the option of simply insisting that my work is perfect and that all criticism of it is invalid.

Now that I write that out, I can see that appeal: how awesome that would be! Except for the constantly failing part, which I would hate, even if I could browbeat people into never mentioning it.

My current working hypothesis is this:

  1. We live in an age of astounding possibilities and bafflingly large numbers of choices
  2. Our culture prizes confidence, at least in men, and rewards the confident whether or not they are correct (Often in error, never in doubt)
  3. In order to be determined confident in complex, uncertain situations, one must be simplistic; keeping it simple is not enough
  4. In order to be simplistic, one has to ignore complicating realities
  5. Voilà! We end up where I find myself: listening to falsehood asserted with authority

Of course, I could be wrong: that is part of my point. One can just about always be wrong, which is why one should just about always be willing to engage dissenters and answer their objections. Alas, "Are not!" "Are too!" does not constitute discussion, however often or loudly this interaction is repeated.

Wednesday, May 23, 2012

Mobile Meditations

We are feeling pressure to enter the mobile space. Everyone else is doing it, so we should too.

As a parent, I tell my daughter that "everyone else is doing it" is not a good enough reason to do something. But as an IT business, we find that our customers sometimes cannot evaluate us on our own merits. Instead, they need proxies to help them decide if we are good at what we do.

Proxy evaluation is sometimes hard to avoid: I don't go into the kitchen to assess a restaurant; I look at the menu, I assess the ambiance, I rely on word-of-mouth.

So I understand the well-meaning advice that we have to enter the mobile space, to make some of our customers comfortable that we are keeping current and that we are still good at what we do. I understand that more and more people seem to rate a technology company by acquiring that company's free smart phone app and trying it out.

I have even decided to take this piece of advice and to put a mobile app or two on the docket for calendar 2012. I have also decided that discussing this decision will give insight into how we make decisions about what we do and how we do it. This will also serve as an example of an issue that is common in our practice: when something seems obvious at a high-level (create a mobile app!) but is actually quite unclear at the level of action (what kind? how fast? how expensive?)

Specifically, entering this arena raises a number of questions:

  • What is the goal?
    • marketing
    • remove a negative (reassure customers)
    • revenue
    • demonstrate competence in this field
  • What kind of app?
    • client for a server? eg Facebook app is a UI to their server
    • native app? eg native camera app
    • combination? eg some games with a multi-player option
  • What business model?
    • free? (loss leader, eg banking app)
    • freemium? (give away something, charge for more)
    • for-profit? (straight up product)
  • For which platforms?
    • iOS only?
    • Android only?
    • Both iOS and Android?
    • stand-alone, Wifi, 3G/4G, some combination?
    • smart phone, tablet, both?
  • What market?
    • extend our product line to mobile?
    • create a separate mobile app, perhaps an add-on to someone else's product?
    • both?
  • What development environment?
    • native to target platform
    • use a virtualization that runs on multiple environments

As our thought process matures, I will return to this topic to record how we decide and what we decide.
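One way to see why "create a mobile app!" is unclear at the level of action is to treat the questions above as axes of a decision space and count the combinations. The axis values are drawn from the list above; the pruning rule at the end is a purely hypothetical illustration:

```python
# A hedged sketch: the mobile-app questions as axes of a decision space.
# Axis values come from the list in the post; the filter rule is invented
# purely for illustration.
from itertools import product

axes = {
    "goal":     ["marketing", "reassurance", "revenue", "competence"],
    "app kind": ["client for a server", "native", "combination"],
    "model":    ["free", "freemium", "for-profit"],
    "platform": ["iOS", "Android", "both"],
}

combinations = [dict(zip(axes, values)) for values in product(*axes.values())]
print(len(combinations))  # 4 * 3 * 3 * 3 = 108 candidate strategies

# Even one coarse constraint prunes the space: suppose (hypothetically)
# a revenue goal rules out a purely free model.
viable = [c for c in combinations
          if not (c["goal"] == "revenue" and c["model"] == "free")]
print(len(viable))  # 108 - 9 = 99
```

A hundred-odd candidate strategies from four innocent-looking questions: that is the gap between "obvious at a high level" and "unclear at the level of action."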

Wednesday, May 16, 2012

In Praise of "Throwaway" Software

A technical person who worked for one of our clients once dismissed my work as "throwaway software." I think that he meant the comment to be dismissive, but I was not insulted or perturbed; I knew exactly what he meant and I was flattered. His comment meant that I had met my goal of providing just what the users needed just when they needed it.

Like me, he had learned to program in the bad old days when development was largely unassisted and laborious. Computer time was the bottleneck, so we conserved it. We labored over "large" chunks of code, trying to find all the bugs and typographical errors and silly mental mistakes before we dared waste precious processor time to compile our code.

When code is hard to create, hard to change and likely to be in service for years, it makes sense to move slowly and carefully, to write for the ages. "Nothing ever gets thrown away," we were told. "Anything you write is likely to be recycled and to live forever, so every line should be immaculate and immortal."

If you don't remember assembling programs into punch card stacks and then putting those stacks into wooden trays so sys admins could load them up, then you are going to have to take my word for it: coding used to be a cottage industry: we lovingly hand-crafted every line. We were thoughtful and careful and only did Important Things with the Mighty Mainframe. We were very aware of our High Priest status and we did not want to anger our computer/deity or waste our precious opportunities to interact with it.

These days, thank God, the balance has shifted dramatically: computing power and disk space are cheap, so development is less about writing fantastic code and more about refining your software so it does exactly what it needs to do. The computer helps with syntax-aware editors, chromacoding, ease of compilation or interpretation, on-line help, etc.

With the advent of "wrong sizing," there are usually too few workers doing too much work. As mentioned before, human attention is at a premium. This includes both the humans writing the code and the humans using the code. This means that developers need to do what makes them productive in terms of writing code, and software needs to do what makes users productive at their jobs.

Often, this means writing special-purpose, "throwaway" tools to help specific users do specific jobs. If you are one of the users, struggling to do an important job, you are not going to like the description of your urgent need as "throwaway." If you need support for "only" a few months, after which the tool will never be used again, so what? You still have the need and computers can still fill it.

(To me, an "app" does a fairly high-level job in a fairly complete way, while a "tool" does a very limited job in a possibly unsophisticated or unpolished way.)

I am struck by the fact that builders don't have this hang-up: special-purpose scaffolding is not "throwaway building." No one says "what is the point of building concrete formers when you are just going to tear them down later?"

I watched an expert put up a shelving unit for me using a temporary scaffold, which was eye-opening. He was slight and elderly. I was bigger, stronger and younger. I offered to help him put up the unit. He was polite in his refusal of my help. I stuck around to watch because I was baffled. He eyed the unit and wall space. He threw together a scrap-wood frame. He and I put the unit on the frame, at which point the unit was right where it needed to be. He screwed it to the studs in the wall with two cordless drills: the left hand drilled pilot holes and the right hand used a screwdriver drill bit to sink the screws. With astonishing efficiency, the unit was up. He then removed the frame and, with the screwdriver drill running in reverse, disassembled the frame. Voilà! A throwaway frame made the job almost trivial--assuming that one is skilled enough to throw together such a thing, which I am not.

I recently created such a tool to help one of our clients debug an interface. I knew going in that once the interface was working properly, the tool would be of limited use or interest. I also knew that without the tool, without the window into what was going on, the cycle of "we never got it!" being answered with "well, we sent it!" would go on forever.

As is so often the case, in the course of refining the tool and chasing down what the tool revealed, we found that there were many minor issues which appeared (falsely) to be a single, serious issue. Everyone was mostly right. No one was evil or stupid or uncaring. Once the tool gave us a shared context in which to have conversations, months of bickering were settled in days of rational, shared-reality-based email exchanges.
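The essence of such a tool, a shared window into "we sent it" versus "we got it", can be sketched in a few lines. Everything here (the message IDs, the reconcile logic) is a hypothetical illustration of the general shape, not the actual tool I built:

```python
# A hedged sketch of a "throwaway" interface-debugging tool: log what one
# side sends and what the other side receives, then report the gaps.
# All names and messages here are hypothetical.
import json

def reconcile(sent, received):
    """Compare sent vs. received message IDs and report each side's gaps."""
    sent_ids, received_ids = set(sent), set(received)
    return {
        "sent_but_not_received": sorted(sent_ids - received_ids),
        "received_but_not_sent": sorted(received_ids - sent_ids),
    }

# In practice these lists would be read from each system's log files.
sent = ["msg-001", "msg-002", "msg-003", "msg-004"]
received = ["msg-001", "msg-003", "msg-004", "msg-005"]

print(json.dumps(reconcile(sent, received), indent=2))
```

A report like this turns "we never got it!" versus "well, we sent it!" into a finite list of specific discrepancies that both sides can investigate, which is precisely the shared context that ends the bickering.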

Is the tool "throwaway" software? Perhaps. Was it quick and easy to create? Yes. Does it provide excellent value for money? Absolutely. In this day and age, is "throwaway software" an insult? I would argue only for those living in the past.

(Note that I am not advocating writing buggy, sloppy code. Quite the contrary: weighting the effort toward refinement and debugging, in my experience, produces a lower bug count and a better user experience.)

My advice is to get out there and start whipping off solutions to small- and moderate-sized problems. Your users will thank you for it, even if your software only lives as long as it has to.

Wednesday, May 9, 2012

Painless Tech Transition

I just noticed my Palm Tungsten E gathering dust on my desk. For years, it was my trusty and trusted PDA, worthy companion to my various non-smart phones.

I have been trying to tidy up my workspace and overcome my tendency to hang on to technology long past its expiration date--which reminds me, anyone need serial cables or an i-Opener?

The fact of the matter is that I haven't used my Palm in ages. Our consultancy made an utterly painless transition off of the Palm OS platform and onto the iOS platform. In other words, we swapped out our various Palms and swapped in iPhones and even an iPod Touch.

I am struck by how well that tech transition went, especially given that so many of our clients experience tech transition as slow, expensive and painful. So why did ours go so well?

I argue that our transition was painless because it was also patient and precise. We made a list of what we needed, what we had in the Palm + phone era and therefore what we expected to have in the iPhone era. Then we waited until we could get everything that we used to have. Simply put, we wanted to stop carrying three devices and start carrying one.

(Some of us had PDA + phone + pager; others had PDA + phone + MP3 player. Either way, it was two devices too many.)

Our requirements were not very exotic:

  • ability to make and receive phone calls
  • ability to send and receive text messages
  • support for time-and-materials billing
  • contact support
  • ability to send and receive emails
  • ability to make and share notes
  • access to a shared calendar as well as a personal calendar

We wanted to cut over without leaping into the void, so we didn't wait until our Palms died, although they were clearly wearing out. We started by having one of us get an iPod Touch and use it for a while. It took over a year before all the pieces we wanted were there and validated.

Once the iPod Touch user was up and running, moving to the iPhone was a no-brainer: the cost was significant, but well within reason. Our transition was painless in part because we had had a high degree of uniformity to start with. Her experience moving from the Palm to the iPhone paved the way.

Originally, I had wanted to make sure that we had at least one Android user, one Blackberry user and one iPhone user. But the lengthy discovery process (largely me striking up conversations with fellow business travelers on trains and air planes) indicated that supporting multiple platforms internally is more resource intensive than I expected or could tolerate. To my surprise, the consensus was that the iPhone ecosystem was more mature and more business-oriented.

While the recent Windows phones are well-reviewed, at the time the Windows phones were so bad that I did not consider them for long.

I gather that Android is catching up and might be more of a contender now, but I don't really know much about it.

What I do know is that the iOS platform has met and exceeded our expectations. The range of applications available is stunning and many of them are actually useful, instead of being merely impressive or amusing. I am shocked that I use the web browsing, mapping, on-line banking, built-in camera, music player and e-book reader as much as I do. I am not much of a gamer, but I even play games on it occasionally, which beats staring into space when I am too tired to read or work.

So a large part of the painless transition is the superior technology to which we transitioned, although I am constantly reminded of a warning given to me by an Android-oriented colleague: the iPhone is a lifestyle choice and one is likely happier if one embraces that. Although I am a Linux guy, I run a VirtualBox virtual machine with Windows in it mostly to support iTunes and to interface with my phone. So far, this has been a small price to pay.

But more than the quality of the target technology, I attribute our success to the lack of artificial deadlines and to existence of clear, precise definitions of success. The only drawback I see so far is the assumption by many of our clients that we got iPhone so we could be part of techno-cool kid herd. That way lies madness. Instead, shoot for making transitions that land you in a place at least as good as where you started. You would be surprised by how many people settle for less.

Wednesday, May 2, 2012

Rigid Engineers, Obsessive MBAs

I feel a good, old-fashioned rant coming on. Just for balance, I will combine two rants: Rigid engineers and obsessive MBAs.


Engineering is a great discipline. I like it. I try to adhere to its basic tenets. I feel that engineering has the following strengths: it is rigorous, it is regular and it focuses on the relevant.

Bridges tend to stay up. Planes tend to fly. Ships tend to float. If you have a house or a car or a smart phone, you owe the engineering discipline thanks.

Software engineering is a little bit different: our building materials are abstract and our rules are a bit fuzzy and our adherence to our rules is not guaranteed.

Weinberg's Second Law: If builders built buildings the way programmers wrote programs, then the first woodpecker that came along would destroy civilization.

However, my rant today is not about loosey-goosey software engineering, but rather its opposite: inflexible software engineering. The discipline of engineering serves us poorly when there is no good rule, and some software usage requirements create a very sloppy execution path. Instead of a measured, graduated response to the complex set of requirements, instead of a long chain of events from messy reality to a complex model of that messy reality, the rigid engineer gives us a clean abstraction. Even if the clean abstraction does not, in fact, support the actual business process to be automated.

I have had to deal with engineers who actually raised the following objections to proposed designs or changes to their beautiful cathedrals of structured code:

  • "That would break my parser"
  • "That would require a goto"
  • "That would require importing a large, messy library"
  • "My structure can't do that"

The subtext is always the same: "I want reality to be changed to suit my model." This is often expressed as "the user can do {long complex sequence of instructions} to get the same result. There is no need to change my code."

To me, this is the tail wagging the dog: software should do what it needs to do, first and foremost. It is the job of the engineer to make that happen. It is not the job of the engineer to change the scope of the task to make the software simpler, unless the scope is beyond the ability of the implementor. In that case, the implementation folks should say that and try to work with the users to find a compromise. Decreeing that some functions are a bad idea is not what I mean by "compromise."

This belief, that if your implementation cannot support it then the request is bad and the implementation is good, is related to the "great code is not great software" idea I touched on before, here.


A colleague drew my attention to this great article:

This article is a study in how classical MBA management has hurt Sony. The article describes a situation that I find increasingly common not just in consumer electronics, but also in IT. (I have ranted about other manifestations of this, notably here.)

This article describes the typical MBA focus on operating efficiency, à la Deming. This focus is great, short-term: assuming that what you are doing today is what you need to be doing tomorrow, then finding better ways to do that is a great idea.

But in technology, figuring out what you should be doing tomorrow is often very important. Operating efficiencies gained with obsolete technology are often smaller than operating efficiencies to be gained by using current or even leading-edge technology.

Taking chances and taking risks as you figure out what it all means to you pays real rewards.

But classical MBA management teaches managers to be wary of the new and to concentrate on the current. In an industry where 18 months is a typical development cycle, a three year plan to increase efficiency will be two cycles out of date by the time it is finished.

I understand that operating efficiency makes sense. I realize that most organizations cannot handle constant flux. I appreciate the need to plan and to budget. But I am horrified at the ever-increasing tendency of the managers we meet to assume that all technology is pretty much the same and that three-to-five year roll-out plans make sense for software, with "freezes" while the implementation is done and evaluated. No wonder so many companies are out of date and out of touch with IT.