Recently we shifted our company from Plain Old Telephone Service (POTS) to Voice over IP (VoIP). To my surprise, even though designing, creating and deploying new technology is our business, I was a bit apprehensive. I am taking this as a valuable reminder of what it is like to be on the other side of the table.
POTS is what telephone service has been since it was conceived: copper wires carrying signals from one telephone handset to another, with lots of switching and amplification in between.
I was comfortable with POTS, with faxes, with modems, with copper wires and splitters and the like. When we built new offices some 15 years ago, I had made sure that there were plenty of phone lines coming into our office: some for our internal PBX (that thing known to most as "the phone system"), some for our fax machines, some for our modems.
I was so comfortable with POTS that this transition was put off for a really long time and much of it happened when I wasn't looking.
Over the years, faxing migrated from special purpose fax machine hardware to fax servers and the like (Open Source fans that we are, we used HylaFax). We still needed a fax line, but not so many.
Broadband Internet access and the rise of Virtual Private Networks (VPNs) pretty much killed the need for modems as a way to access client resources from afar, so we stopped using them, even from the road.
The jump to smart phones had been carefully planned: we waited until we felt that they were mature enough to allow us to abandon our Personal Digital Assistants or PDAs (we were using Palm devices) to hold our contacts, handle our calendars, keep our secrets and track our billable time. Once smart phones could do all that, we went to smart phones instead of mobile phone + PDA.
When we finally adopted smart phones (we decided to go with iPhones for a variety of reasons), we suddenly found our desk-bound phones to be kind of a drag. We wanted to deal with only one phone system, but we didn't want to give out our mobile phone numbers, nor did we want to lose the basic business functions of a central phone system.
So we decided to go with a virtual PBX from RingCentral. This gives us the PBX features, and their smart phone app lets us use our smart phones as business handsets as well. So our business calls follow us around (during business hours) and I have reclaimed some precious desk space from the hulking handset. Our faxing is also handled through our virtual PBX.
It has been a couple of months now and we are very happy with the lower overhead, the feature set, the convenience and the greater access to our voice mail and our faxes. The computer room is much tidier without all those phone connections, the old hardware PBX, the line conditioners and so on.
So why was I apprehensive? It was not the change to VoIP: I know that technology inside and out. In a previous incarnation, over ten years ago, I got Net2Phone's protocol up and running on a VoIP phone we were developing; the tech folks to whom I demonstrated it were very impressed, as I was the first external person to get their protocol working on a non-computer. I spent hours making phone calls using VoIP and writing software to support VoIP. I am confident that the technology is even better now.
It was not the cost-benefit analysis: our operations person did one, and it clearly showed that it was time.
Upon reflection, I think that it was inertia and simple unreasoning fear of change. Exactly the sort of thing that I have to fight in my clients when I create new technology for them to streamline their processes and better integrate their systems.
I respect fear of change: careful consideration is a good idea before every big decision. I understand better than most that not all change is good and that tearing apart the present does not guarantee that you will assemble a better future.
But I don't respect procrastination or paralysis by analysis: may God grant me the wisdom to know the difference.
Wednesday, November 23, 2011
Doing Without For Now
In these fiscally strapped times I am seeing a resurgence of an old financial misunderstanding: that not spending money is the same thing as saving money.
I understand that one should not spend money that one does not have. I am not advocating deficit spending, accounting magic or blowing your budget. Instead, I am advocating responsible investment.
Too many managers I encounter are rewarded for simply failing to spend money, or fear being penalized for spending money. Being "under budget" implies that you met the organization's goals by spending less than expected, which means either that you are good at getting results or bad at making budgets. "Failing to spend money" is not the same as being under budget: if you don't accomplish your goals, then you are a failure who at least didn't waste money while failing.
But simply avoiding investment for the sake of not spending money is not fiscally responsible: it is the opposite of fiscally responsible. To make decisions without regard to future benefit is a mistake. If you want to go down this path, I can save you some time: this analysis leads to paralysis. Using this philosophy, you should never spend any money or take any action: not spending money will leave you with more immediate cash and not taking action will avoid mistakes. Just ignore the fact that not spending money can lead to lack of future money and not taking action can lead to not having a job.
To take a rather tired example from home ownership, not repairing a leaky roof gives you a short-term benefit (your bank account retains the money that you would have spent on the roof) and a long-term liability (your bank account will be hit much harder when the roof and collateral damage become so great you can no longer ignore them.) By the same token, if you don't choose a contractor, you avoid choosing the wrong contractor. So you have that consolation as your roof falls in.
I run into this same behavior in business fairly frequently, in the guise of the following sometimes reasonable statement: "we know we need X, but we’ll get it in the next release / next version / next purchase, so we will do without it right now."
The useful life of most of the systems I encounter is between three and five years. If you put off the decision for a year, you have lost much of the benefit the system can be expected to provide. If you put off the decision for two years, your potential loss is that much greater.
If your investment is a reasonable investment, you are missing the return on that investment every year you defer. In real terms, not spending money that would have been spent wisely is actually costing you money.
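To put rough numbers on that, here is a minimal back-of-the-envelope sketch in Python of the deferral arithmetic; the figures and the function are invented for illustration, not drawn from any particular client, and the model is deliberately crude:

    # A deliberately crude model: the numbers below are hypothetical.
    def cost_of_waiting(annual_net_benefit, years_deferred):
        """Benefit forgone by deferring a worthwhile investment.
        Assumes the benefit starts accruing as soon as the system goes in
        and that deferral does not extend the system's useful life."""
        return annual_net_benefit * years_deferred

    # Hypothetical example: a system costing 50,000 that returns a net
    # 30,000 a year over a four-year useful life. Waiting two years
    # forgoes 60,000 of benefit, more than the purchase price, and the
    # purchase still has to be made eventually.
    print(cost_of_waiting(annual_net_benefit=30_000, years_deferred=2))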
When we speak of IT investments, we speak of more than dollars: IT can provide automation which makes your personnel more productive and less harried. With the time not spent in drudgery that should be done by a machine, your people can actually think about their jobs and improve their situation.
There is also the experience factor: if you make investments early and often, you can often make smaller, incremental investments and be guided by actual experience as you move toward your goal, instead of being guided by marketing literature as you wait until the last possible second and then leap onto a bandwagon.
I have heard a cogent counter-argument from reasonable people, which runs something like this:
- IT transitions are risky, even if done well
- There will be less risk if there are fewer transitions
- Fewer transitions means bigger ones farther apart
- The individual transitions might be more painful, but the ultimate timeline is shorter and the new systems end up with a bigger footprint
I sympathize with the claim that larger organizations have such deep tendencies toward inertia that multi-stage plans are scary. But leadership requires actually taking reasonable risks instead of simply avoiding blame.
Wednesday, November 16, 2011
Hard vs Soft Mastery
Some years ago, as part of a discussion of computer programming styles, a colleague introduced me to the concept of "hard master" versus "soft master." Be warned: I am emphatically not making any value judgments here: I do not believe that hard masters are smarter, purer, better or more detail-oriented. I do not believe that soft masters are failing at being hard masters; I do not believe that a good hard master can do anything a soft master can do, or vice-versa.
Enough about what I don't mean: what do I mean? A little Google magic got me here, the horse's mouth: Feminism Confronts Technology by Judy Wajcman. Read the selected section for details, but the summary is this: the hard master feels he has to know all the details when using technology while the soft master knows how to get what she wants.
The use of gender-specific pronouns is not accidental: frequently hard masters are male and soft masters are female. Note that while there is a significant pro-male, anti-female bias in computer programming, this is not a bias from which I happen to suffer. I have worked with women for my entire technical career, and have found female technology interaction different but not inferior. In fact, I have found that this discipline has room for, and needs both, kinds of interaction. As a general rule, I want my bit-bashing systems software from an anti-social grumpy guy and I want my application software from an empathetic, flexible and cheerful gal.
I understand that these generalizations are not universal truths, that there are grumpy women who are antisocial and mathematical while there are cheerful men who are sensitive to how technology feels to use, and all possible permutations in between.
But I do not believe that all permutations are equally probable. I find distinct trends: men tend to define technological success and men tend to be hard masters. Women have their own technological approaches and these approaches tend to be deemed inferior by men, mostly men who cannot perform these "soft" tasks themselves.
Given this observation, I was not surprised at a recent recommendation of related reading from another colleague: an article from the Harvard Business Review that considers the assertion that women make your technical team smarter. I am sure that is true, at least for my team.
In the interests of full disclosure, I will admit to being a middle-aged man who thinks of himself as a designer, as something of a non-combatant in this fight. My attitude is that I can code anything I have to, but I am not a coder. Opinions of my coding by coders vary, but mostly I think that both hard and soft masters have reason to despair at my code. So I claim to be of neither camp: I need them both to get my designs realized, unless I end up having to code one or the other kind of task myself. Which happens all too often in these recessionary times.
I find the hard versus soft dichotomy in server software versus client software. For example, configuring a web server is a heavily technical task which requires a deep understanding of networking, process management and security. By contrast, creating a web page is also web-related, but has utterly different requirements, including a flair for graphic art and an understanding of usability.
I also see this division in the back end (data storage & retrieval) area as opposed to the front end (data acquisition through user interaction) area. Figuring out how to store data effectively is a job for a hard master who wants to think about file systems, database formats and the interaction of caching and performance. Figuring out how to get users to enter data quickly and accurately is about understanding how human beings use software.
From my perspective, there is trouble brewing for the hard masters. It is my observation that as abstraction rises mastery softens. When there was only simple hardware, Unix and ANSI C, someone could be a hardware expert, a firmware expert, a Unix system software expert and complete master of their own code base. Now, with server clusters and interpreted languages and database interfaces and web servers and programming environments embedded in web servers, I just don't see that as a viable or desirable goal. How do hard masters get anything done these days? They must have to restrict themselves to rather small areas of expertise.
Not only is abstraction on the rise, but abstraction provides a very real trade-off between productivity and mastery. Women who are happy to cruise along at the GUI-to-create-GUI level, such as Visual BASIC, kick my ass in terms of how fast and how custom they can make apps. Who cares how deeply they understand what underpins that abstraction? (Until that abstraction bites them in the butt; more on that anon.)
There has been a shift in the balance of power between apps creators (soft masters) and systems engineers (hard masters) as time has gone by. So far as I can tell, app programmers now rule: users understand what they do (create apps) and economic buyers are willing to pay for what they do. As for systems engineers, I find that users have no idea what they do, economic buyers don't relate to them or their work, and organizations feel that they get the behind-the-scenes stuff for free, as part of buying apps, or smart network hardware, or whatever.
When I started out in the early 1980s, app programmers were the bottom of the ladder, while systems programmers did the real work, got paid the real money, and wrote apps whenever they felt like it. As a designer, I find that I no longer have to start by getting hard masters to buy into my design: in fact, I find that most of my gigs come from soft masters coming to me so that I can create a framework in which they can create the apps that people want. No one is interested in the lowest levels: is this MS-Access to a local database, or MS-Access as a client for a database server? Who cares?
Of course, users should care: bad infrastructure makes for a bad user experience. But users don't care about boring old infrastructure and systems engineering because most of the time they can afford not to.
All this does not bode well for the hard master. These days, I see hard mastery in demand only in those relatively rare instances when the abstraction fails or is incomplete. When you need a DLL to extend the Microsoft desktop, or a shared object to extend the Perl interpreter, you really need that. But how often does that come up? When I started out, we had five or six programmers, all male, two of whom were apps specialists. Over time, we have tended to add apps programmers, who tend to be women, and we have moved to consuming our hard mastery as a pay-as-you-go consulting commodity: we don't need full-time hard masters anymore.
This makes me a bit nervous: will the hard master be there in the future when I need him? But for now, I am busy being part of the problem: I need more soft masters and fewer hard masters, and I don't see that trend reversing anytime soon.
Wednesday, November 9, 2011
Too Good to Fire, Too Old to Hire, Too Young to Retire
I find a disturbingly common situation amongst my cohort in the Information Technology realm: they feel stuck in their current job. As is typical with people who feel stuck in their job, they are not as productive as they should be--or as pleasant to be around as they used to be.
Part of this feeling comes from the current economic downturn, but a large part of it was present even during the heady boom times. That large part seems to be the professional equivalent of the French idea of "a woman of a certain age." Women of a certain age are desirable, but with an expiration date. These professionals feel that they are required in their jobs, but only for the present.
Why are so many of the good and very good programmers and sys admins and db admins I know languishing in limbo, unmotivated by their current job but unable or unwilling to find another job? I would characterize their plight this way:
- I'm too good to fire
- I'm too old to hire
- I'm too young to retire
Too Good To Fire
From the dawn of the business computer era, non-IT people have lived in fear of what would happen if the computer guru quit, taking all his (it was always a man) experience and special knowledge with him.
In or around 1968, I was a young lad interested in computers. Soon after the dawn of computers, I was saving my first programs onto paper tape in a closet at school, dreaming of some day saving my programs to mylar tape. Trying to be supportive of my unfortunate interest (wouldn't I be happier as a doctor or a lawyer?), my mother brought me into her place of work to meet "the computer guy" (TCG).
It was obvious that at least that TCG in that organization was regarded as strange and as a necessary evil. They would have liked to fire him, but they could not afford to lose him. My mother recounted in tones of awe that TCG had a light next to his bed in his apartment across town that alerted him to problems with The Machine, the mighty mainframe computer. She was impressed with his dedication but also repelled by his lack of boundaries. This was and is a typical response to TCGs the world over. Hence "too good to fire," because it captures both the "we need him" and "we wish he were not here" aspects of many information technology jobs.
Over time, I have come to see that TCG as an archetype: a middle-aged man who is one or two technology waves behind the times, who is still critical to current operations but not part of future planning. He can see only an endless treadmill of doing exactly what he is doing now until he either drops dead in his office, makes it to retirement, or is made obsolete by some technology shift. What a waste: experience and talent turning into sullen bitterness.
Too Old To Hire
Why doesn't TCG just go find another job, in a place more congenial to tech types in general, or to him in particular? That is a good question and one that I have asked various TCGs over the years. The answer is usually "no one will hire me. I've looked."
Is this self-pitying drivel, a reflection of TCG's personality issues, or a prevailing prejudice? I suspect that it is the last one. If you are TCG and you are looking for a new job at a new company, I believe that you have two choices: either the tech-oriented nirvanas such as Google, Apple, Amazon or Microsoft, or a tech-oriented division of a non-tech company.
(In theory, TCG could start or join a start-up, but that is a rather rarefied niche and requires many more personal resources than being a computer guy requires. However, these days, every TCG seems to be an embittered potential entrepreneur: "I could have started Twitter/Facebook/YouTube.")
The first category is only open to the best techs, since there are more applicants than there are jobs. TCG may be the best {fill-in-the-blank} you have ever met, but he might not be great compared to the entire industry.
The second category really does seem to have a barrier to entry, a distinct ageism. A tragically common theme in our business is that young hires are best because:
- they know the new and/or current technology
- they don't cost as much
- they are more adaptable
Too Young to Retire
Sadly, TCG is often too young to retire, and that seems to be his only option. The commonly expected arc of a career seems to be this:
- get hired as a bright young current IT footsoldier
- get even better with real-world experience
- consider management
- if "no" to managemet likely stall as you are pigeon holed in what used to be current tech
- if yes to management, get promoted to team leader
- possibly get promoted to area supervisor
- possibly get promoted to manager
- possibly get promoted to VP
Note that if you were a "people person" you quite likely would not have started slinging code in the first place. So you are stuck with some unappetizing choices:
- Limbo
- Being a manager even if you don't like dealing with people
What Does This Mean To Me?
So now that you have read this far, you may be wondering what the point of this diatribe is. Here is the context-specific point:
To the co-workers of poor old TCG, I say this: remember that his sour puss may have more to do with being stuck than with being a misanthrope. More importantly, you are going to be hard-pressed to find a stick that will motivate TCG: his life already sucks. Try to find carrots instead, such as interesting small projects or chocolate or whatever it is that is safe, legal and appealing to TCG.
To the managers of TCG, I say this: you are probably kidding yourself if you think that TCG doesn't know that you plan to jettison him as soon as you can. You might find that sending him to training and showing other interest in his future is a better way to motivate him than pretending that you value him while counting the days until you can retire the system and fire TCG.
To TCG I say: find an interest in the future, even if you feel stuck. Without an interest in the future, you will end up bitter, hard to get along with and unhappy. Either embrace what you see coming or find another track or work your network to see if there is anything out there for you. Even false hope is better than despair.
Wednesday, November 2, 2011
Impure Implementations
I am a big fan of using the right tool for the job. When it comes to carpentry or surgery, this concept seems obvious, well-understood and usually followed. But when it comes to large IT projects, I find a real tendency in large organizations toward trying to solve all problems with whatever single tool they have blessed. Let us call this tendency the Silver Bullet Assumption.
Many organizations are Visual BASIC shops, or Visual C++ shops, or Python shops, etc. This baffles me: there are many excellent tools out there, but there is no tool that is great for every aspect of a large project.
I find that large projects usually have most or all of these aspects:
- an inbound interface for acquiring data
- a collection of data processing functions
- a way to store the data
- a user interface (UI) to view the data
- reports and exports to send the processed data down the line
- an outbound interface for sharing data
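To make that list concrete, here is a minimal sketch in Python of how those aspects tend to line up; every function, file name and field in it is invented, and a real project would implement each stage with whatever tool suits that stage best:

    # A deliberately tiny skeleton of the aspects listed above; the UI aspect
    # would sit on top of the stored data (a spreadsheet, a web page, etc.).
    import csv
    import json
    from pathlib import Path

    def acquire(path):            # inbound interface for acquiring data
        with open(path, newline="") as f:
            return list(csv.DictReader(f))

    def process(rows):            # data processing functions
        return [r for r in rows if r.get("status") == "active"]

    def store(rows, path):        # a way to store the data
        Path(path).write_text(json.dumps(rows, indent=2))

    def report(rows):             # reports and exports
        print(f"{len(rows)} active records")

    def share(rows, path):        # outbound interface for sharing data
        Path(path).write_text(json.dumps(rows))

    if __name__ == "__main__":
        rows = process(acquire("incoming.csv"))   # hypothetical file names
        store(rows, "store.json")
        report(rows)
        share(rows, "outbound.json")

The point is not the code but the seams: each stage has different requirements and can reasonably live in a different environment.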
Even if such a tool existed, who would use it? Someone who understood all those different domains?
In our consultancy, we have a breakdown that I think is pretty common, or used to be: we have systems people (who work mostly under Unix) and apps people (who work mostly under Windows), and Web work lies somewhere in between.
We use MS-Office or web pages to provide UIs on the desktop, web pages and thin clients to provide data entry on the floor, Unix servers to provide print service, file service and web service. It is hard to recall a project of any scope that did not cross these boundaries.
We are constantly asked questions about implementations which assume that everyone does everything: the Unix systems programmer who is supposed to know about MS-Access apps, and vice versa. When we push back, we find that many organizations have the notion of "programmer," or even "systems programmer" versus "applications programmer" versus "server guy," but all these programmers are using the same environment: Windows or Unix, and it is mostly Windows.
Clear as we are about our design philosophy, even we occasionally have requests for "pure" implementations, with the hope that if the technology under a large project is consistent, that large project will be easier for local IT to understand and support.
But this is often a forlorn hope: if your people do not understand bar code grokking or TCP/IP-based protocols, it very likely won't help if the thing they don't understand is implemented in a familiar technology. At worst, they will have a false confidence that will lead them to fiddle where they should not fiddle.
(I speak from bitter experience, of course. Ah, the support phone calls which start by saying that some of our technology does not work and end with them admitting that they "fixed" something else right before the mysterious new failure began.)
I just don't buy the premise, that being fluent in systems, apps, networking, infrastructure and databases is a reasonable expectation, let alone the usual case. You know that you need network people, desktop support people, server people, etc. Why do you think that they all should be working in the same environment? What does that even mean, when you think about it: how is a desktop app like a print server?
This illusion of the benefits of purity is encouraged by vendors, so I suppose the customers are not really to blame. The first time I laid hands on Oracle, lo! these many moons ago, I was stunned at all the special-purpose configuration file formats and languages I needed to learn in order to tune the installation. But the client thought of themselves as a pure Oracle shop. This is like saying that all of humanity is part of a pure human language shop--we just use different flavors of language.
Very recently, I worked with a system that was billed as all Windows, all the time. Except that when push came to shove and I needed to debug some of its behavior, I came to find out that the core was a Unix server running a ported COBOL app. Egad! Knowing that it was COBOL through which the data was passing made debugging that systems interface much easier, by the way.
Why tell the customer that they are buying a Windows app running on Windows servers, with some kind of remote back end? I don't know: it must be comforting to someone somewhere.
I prefer to be more upfront with my clients: I will use whatever technology will get the job done, with an eye to accuracy and speed. I want to save my time and their money. I try to use technology that they already own, but I cannot guarantee that--unless they want to pay extra; often LOTS extra.
If I have to use MS-Access at the front, FTP in the middle and Oracle on the back end, then so be it. I find the choice between requiring minimal maintenance, but making local IT uncomfortable, and requiring lots of maintenance, but making local IT (probably falsely) confident, an easy one to make.
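For what that middle leg might look like, here is a minimal, hypothetical sketch of the FTP hand-off in Python; the host, credentials, directory and file name are all invented, and the Access export on one side and the Oracle load on the other are simply assumed to exist:

    # Push the front end's export to the server's drop directory, where the
    # back end is assumed to pick it up; everything named here is hypothetical.
    from ftplib import FTP
    from pathlib import Path

    export = Path("daily_orders.csv")        # hypothetical export from the front end

    with FTP("backend.example.com") as ftp:  # hypothetical server
        ftp.login(user="transfer", passwd="secret")  # hypothetical credentials
        ftp.cwd("/incoming")                         # hypothetical drop directory
        with export.open("rb") as f:
            ftp.storbinary(f"STOR {export.name}", f)

A dozen lines of glue like this is often all the "impurity" amounts to.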
Just last month, we shut down a system of ours that had been in continuous operation since early 1984. That's 27 years of service, with an average of under 10 hours of attention per year. This system's impure implementation made local IT nervous, but it also allowed us to adapt to the dramatic infrastructure changes over that time. In the end, it was time to retire it: a 16-bit runtime environment under a 32-bit operating system running on a 64-bit architecture is a bit baroque even for me.
So while nothing lasts forever, I claim that the concept is sound: until there is a single, simple, all-encompassing technology, use what makes sense, even if the final product has multiple technology environments under the hood. There is no silver bullet and there never was.