What's the worst mistake you ever made during your IT career? Was it as bad as deleting all the VP's files... irretrievably? How about modifying a payroll program so that nobody received any overtime pay? IT pro Becky Roberts has spent the past 16 years building a solid tech career, but that doesn't mean there haven't been a few bumps in the road. Here's a look at what she remembers--with no small amount of embarrassment--as her most appalling professional mistakes, along with the painful but invaluable lessons learned.
We've all had at least one or two embarrassing moments on the job, whether they involved inadvertently wreaking havoc on a system, making a social gaffe, or mishandling a project. IT pro Becky Roberts decided to come clean and share her worst career moments--along with the lessons she took away from each experience.
Over the past 16 years of being paid to make computers and people work together in perfect harmony, I have collected a number of incidents that make me wince and blush in embarrassment when I think of them. The mistakes I've made fall roughly into three categories: technical, political, and career management. Here, in no particular order, are my most outstanding screw-ups and the lessons I have, I hope, learned.
#1: Accidentally deleting the VP's files without having a backup. I don't even remember how I did this. Not only did I delete the files, but it wasn't until the format was in process that I realized my mistake. I spent a nervous 30 minutes deciding how to deal with the situation. Should I lie and try to shift the blame? I couldn't blame anyone else, as I was the whole IT department. Should I just return his computer and act dumb? "Well, the files were there when I gave the computer to you." Nothing I could think up felt right. In the end, I simply walked into his office, handed him his computer, and confessed, "I have screwed up. I deleted all your files and have no means of getting them back. It was completely my fault." Silence. Then: "Okay. Please be more careful in future." That was it. That was all he said. I could've kissed his feet, my feeling of relief matched only by my feeling of abject stupidity and incompetence.
Lesson learned? BACK UP, BACK UP, BACK UP. Never delete, move, modify, upgrade, update, patch, flash, or format without making at least one backup. I have never knowingly lost a file since.
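In that spirit, here's a minimal sketch of the habit I adopted: take a timestamped copy of a folder before doing anything destructive to it. This is an illustration in Python, not anything I was using at the time, and the paths are invented for the example.

```python
import shutil
from datetime import datetime
from pathlib import Path

def back_up_before_touching(target: Path, backup_root: Path) -> Path:
    """Copy a folder to a timestamped location before deleting/modifying it."""
    stamp = datetime.now().strftime("%Y%m%d-%H%M%S")
    destination = backup_root / f"{target.name}-{stamp}"
    shutil.copytree(target, destination)  # fails loudly if the copy can't be made
    return destination

# Hypothetical usage: copy the documents folder somewhere safe before a reformat.
# backup = back_up_before_touching(Path("C:/Users/vp/Documents"), Path("D:/backups"))
# print(f"Safe copy at {backup}; only now is it OK to wipe the original.")
```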
#2: Modifying the payroll program so no one received OT. This was at a ceramics factory in the Midwest. Payroll consisted of a BASIC program on a minicomputer. A new rule for calculating overtime was to go into effect, and I was given the assignment of making the appropriate modification. I made the required change on a copy of the live program and did a walkthrough. The logic seemed flawless. I showed the program to my boss and he gave it his blessing. He said that he would put it into production at the start of the next pay period. Alarmed, I asked if we had a test system. I had been working for the company for just two months and was not familiar with the infrastructure. He grinned and said we didn't have one.
Two weeks later, a virtual riot broke out as the employees opened their checks to discover the awful truth: no overtime, none, nil, nothing, nada. My boss said I could go home early as I was looking horrifically pale.
Lesson learned? This is a tough one. Obviously, I had made a programming error and needed to improve my skills. But should I have realized that my abilities as a programmer weren't up to the task and tried to refuse the assignment? I did express my qualms about putting an untested program into production, but perhaps I should've done so more forcefully. Probably the most important lesson I learned from this incident is to ask very detailed questions about the infrastructure when interviewing for a new position and try to identify and avoid companies that don't support best practices.
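The original BASIC code is long gone, but a hypothetical sketch in Python shows the kind of calculation involved and how even a couple of quick test cases would have caught a "nobody gets overtime" bug before the checks went out. The threshold and multiplier below are assumptions for the example, not the factory's actual rules.

```python
def gross_pay(hours: float, rate: float, ot_threshold: float = 40.0,
              ot_multiplier: float = 1.5) -> float:
    """Straight time up to the threshold, overtime premium above it (assumed rules)."""
    regular_hours = min(hours, ot_threshold)
    overtime_hours = max(hours - ot_threshold, 0.0)
    return regular_hours * rate + overtime_hours * rate * ot_multiplier

# The kind of minimal checks that would have flagged the bug before payday:
assert gross_pay(40, 10.0) == 400.0   # no overtime at exactly 40 hours
assert gross_pay(45, 10.0) == 475.0   # 5 overtime hours at time-and-a-half
assert gross_pay(0, 10.0) == 0.0      # no hours, no pay
```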
#3: Going live on a demo copy of Exchange. I was a new hire into a two-person IT department and given the project of installing MS Exchange. The company was using a text-based freeware e-mail system, but it was so difficult to use that there was no mail to migrate, so this could be a simple install.
I obtained a demo copy of Exchange, installed and configured it, and selected a small group of test users. I set up training for all users who were interested and soon expanded the "test" group to include all the employees. As days turned into weeks, the amount of data being stored grew rapidly. Users created folders and archives and entered their contacts. I ordered the appropriate Exchange licenses, assuming that they could be applied to the demo version. When I couldn't find a way to apply them, I called Microsoft, only to learn the heart-stopping truth: the demo installation couldn't be licensed at all, and worse, after 90 days it would simply cease to function. This was day 88. Needless to say, I spent the next 48 hours on the phone with Microsoft tech support setting up a new server to replace the demo one, migrating data, and changing client configurations. It was one big, panicky mess. The only good to come out of this situation was that it turned into a great opportunity for learning.
Most salient of the many lessons learned?
* Don't go live on a demo application without first checking that it doesn't self-destruct on its expiration date (a small date-check sketch follows this list).
* Make a project plan and stick to it. I should not have expanded the test group to the whole company.
* Improve communications with the users. They should not have been allowed to use the test system to the extent that they became dependent on it.
* Insist on being allowed to attend training before assuming responsibility for a new system.
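For what it's worth, the check itself is trivial once you know the install date and the trial length. This minimal sketch uses the 90-day figure from my Exchange demo, but the install date is invented for illustration; it's the sort of reminder that would have saved me a two-day scramble.

```python
from datetime import date, timedelta

def days_until_trial_expires(install_date: date, trial_days: int = 90) -> int:
    """Days left before a time-limited demo stops working."""
    return (install_date + timedelta(days=trial_days) - date.today()).days

# Invented install date for illustration; warn well before the cliff.
remaining = days_until_trial_expires(date(2006, 3, 1))
if remaining <= 14:
    print(f"Demo expires in {remaining} days -- license or migrate now!")
```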
#4: Taking backups for granted. Server backups were my sole responsibility. I set up the backup server and religiously changed tapes every day. The very first time a user needed a file restored, I discovered that the folder containing the file had not been successfully backed up for more than three months. Worse, the particular file in question had never been backed up. Suppressing the urge to blame the failure on a software fault rather than my own negligence, I went to the user and apologized. He was not at all impressed.
Lessons learned?
* Never take backups for granted.
* Check backup logs in detail daily (a minimal log-check sketch follows this list).
* Practice restores on a routine basis.
* Set up a schedule for reviewing the backup strategy.
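In the spirit of the first two bullets, here is a minimal sketch of the kind of daily check I do now. The log format, failure markers, and folder list are invented for the example, since every backup product writes its logs differently.

```python
from pathlib import Path

# Folders that must appear in every nightly backup log (invented examples).
REQUIRED_FOLDERS = ["/home", "/var/mail", "/srv/shares"]
FAILURE_MARKERS = ("ERROR", "FAILED", "SKIPPED")

def review_backup_log(log_path: Path) -> list[str]:
    """Return a list of problems found in one night's backup log."""
    text = log_path.read_text(errors="replace")
    problems = [line for line in text.splitlines()
                if any(marker in line.upper() for marker in FAILURE_MARKERS)]
    problems += [f"{folder} never mentioned in log"
                 for folder in REQUIRED_FOLDERS if folder not in text]
    return problems

# Hypothetical usage: run every morning and actually read the output.
# for problem in review_backup_log(Path("/var/log/backup/last_night.log")):
#     print(problem)
```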
#5: Possessing unique knowledge. Although it may seem that possessing unique knowledge and skills within a company should offer some job security, not only is this a delusion but the possession of such knowledge can also become a burden that negatively affects your personal life. On several occasions throughout my career, I have allowed myself to be in just such a situation, where my failure to share my knowledge led to interrupted weekends with the family, phone calls at all hours of the night, phone calls while on vacation, and my favorite, a phone call to the ICU less than six hours after brain surgery.
Lesson learned? Document, share, and train. Sometimes, in a small company or during an implementation, it can be impossible not to possess unique knowledge. But staying aware and alert to the danger can minimize this risk. When provided with the appropriate documentation, it is surprising what even a relatively unskilled alternate can achieve in a crisis.
#6: Creating inadequate self-documentation. In addition to failing to document procedures for the purpose of sharing knowledge, I have shot myself in the foot on more than one occasion by failing to document a procedure or configuration I was sure I would remember. While working under pressure, with users breathing down my neck, it's all too easy to take shortcuts and make worthless self-promises to document later, when the crisis is over. Unfortunately, the next crisis hits, and then the next, and soon the documentation is forgotten until it's needed in the midst of yet another crisis.
Lesson learned? This lesson hasn't been fully learned yet. I've started taking screen shots and brief notes while working under pressure, but I still procrastinate in doing a full write-up after the crisis is over.
#7: Failing to establish the extent of my authority at the start of projects. Over the years, I have been the project manager for a variety of implementations, upgrades, and migrations. With one exception, each project was successful, in that the defined objectives were met by the deadline. But the process by which this was achieved was not necessarily the most efficient or the least stressful.
I hadn't seen the need to establish my authority at the start of a project as, until recently, all the members of each team had respected it. On a more recent project, in a very hierarchically structured company, my team consisted of a few peers, my boss, a couple of managers, and a VP. A more experienced PM suggested that I call a meeting specifically to determine exactly how much authority I had over the members of my team. I thought this was unnecessary and soon began to suffer as a result.
The project had various critical path items that had to be complete by specific dates, but despite my fervid attempts to communicate this to the team members responsible for the items, they would frequently thwart me: "I'm taking Friday off. My boss approved it." "I can't do that today; I need to do this." I had all the responsibility for the success of the project but none of the authority necessary to ensure its success. As a result, it was my first project not to meet the deadline.
Lessons learned? The first step of any project I am managing will be to establish what authority I am to be accorded. And if it's not sufficient to guarantee the success of the project, I'll have the temerity to either ask for more or suggest that a more senior project manager be appointed.
#8: Sending an insensitive e-mail to an employee. I received an e-mail from an employee detailing an extensive list of problems she was experiencing with her computer. Without any malicious intention, I shot off a reply saying that it sounded like new computer time. Thinking no more about it, I added her problems to my list of tasks for the day. A few minutes later, I was summoned into an emergency meeting with my supervisor and his boss. As I walked in the door, I was handed a printout of my e-mail reply to the employee and told to explain myself.
Thoroughly confused, I stated that I didn't understand what was going on. My boss explained that the employee was extremely offended by my e-mail as she had interpreted my levity as a refusal to take her problems seriously or help her. I was shocked that a few innocent words, an attempt at a lame joke, had been so drastically misunderstood. I was instructed to apologize to the user and fix her problems immediately.
Lessons learned?
* Do not attempt to be funny or clever in e-mail; play it completely straight.
* Have a formalized procedure for handling requests for help.
* When a request for help is received, always let the user know when they can expect the problem to be addressed.
#9: Not taking advantage of free training and certification opportunities. Each time I have updated my resume in preparation for seeking a new job, I've regretted not having formal certifications to accompany my experience. This has been particularly irritating when the company I am trying to leave has a policy of paying for any classes the employees want to take, whether they're relevant to the business or not. Not having those certifications has kept me from applying to several jobs I was otherwise qualified for, simply because they required them.
Lesson learned? Take advantage of all free training opportunities, even if they have to be pursued outside business hours.