Saturday, April 14, 2007

Working for The Man? Advice to a young programmer

Working for Google is full of surprises. When I first arrived I started to get to know my office-mate. He's a laid-back, rather cool but studious-looking guy with longish hair. I asked him what he did and learned a lot about how students were taught parallel processing in a cluster environment. Politely he responded with the same question and I started to tell him about Samba and what I was currently working on. "You remember around 1988 when AT&T came out with a file-sharing protocol called RFS (Remote File System) to compete with NFS (Sun's Network File System)..." I continued.

"I was eight years old in 1988," he replied.

After I'd finished checking for obvious facial wrinkles in the bathroom, I decided to go on a quest to find other engineers in the building who were at least as old as I was, and felt much better when I found some. But it set me thinking about what kind of advice I would give if I could meet myself at his age, in order to guide the young Allison into a promising engineering career. So, in the best spirit of "The Screwtape Letters," here is some of what I've learned so far about making yourself a career in writing software.
If it's not what you love, don't do it

I've worked with many programmers during my career. Without a doubt, the only ones who are any good at it are those who see writing code as art, a creative process. I know it's an obvious lesson, but it's really important. If you want to make lots of money and retire early, don't start by writing software; learn about business and start a company instead. I've run into so many poor programmers, in both senses of the word, who got into the field because they "wanted to be the next Bill Gates." Bill Gates didn't get rich by programming, he got very rich by being very good at running a company. I've had to fix code created by these people and it isn't pretty. Eventually they usually move into management where they might have a chance to find their true calling.
Learn the architecture of the machine

Many programmers, especially those who write for virtual machines such as Java or the .NET CLI, think that low-level machine architecture and processor instructions don't matter anymore. That's still not true, and I don't believe it ever will be. Someone who understands what the machine is really doing underneath all the modern layers of glop such as virtual machines, garbage collection algorithms, network and threading abstractions, will always be able to solve problems better than someone who lets the compiler or the "execution environment" they're using make all the decisions for them. These days the effects of processor caches and memory bandwidth mean that it's even more important to understand the lower levels of computer architecture than it used to be in order to be a good programmer. The good news is that modern tools like the amazing free software tool "valgrind" can emulate an entire processor in software and make understanding what is going on at each line of code as simple as looking at a visualization of execution time. Using resources efficiently matters when you're dealing with modern clusters containing thousands of machines.
Reputation is important

The days of starting at IBM after college and working there in obscurity until you retire are long gone. Any modern programmer will move between many companies in his or her career. It is very important to be able to show your next employer what you have done, and what you are able to do in a team. Free software/open source is the ideal way of doing this. It's not just a better way of producing software, it's actually better for the reputation of the people creating it. One of the first things I do when evaluating someone is to look for samples of their code out there on the Internet. If you work on proprietary software you can't show anyone anything, and real code speaks louder than any list of projects you claim to have worked on.
Proprietary environments are a trap

I used to be a Microsoft Windows programmer, as well as a UNIX/POSIX programmer. The knowledge I've gained about programming POSIX is still useful, even though I learned a lot of it over 20 years ago. My Windows knowledge is now rather out of date, and getting more so over the years. It isn't worth my time anymore to keep up with each increasingly baroque change to the Windows environment. Just as an example, over this time the latest "hot" communication paradigm that Microsoft recommended developers use in Windows changed from NetBEUI, to NetDDE, then OLE, followed by OLE2, then COM, DCE/RPC, DCOM, and now currently seems to be Web Services (SOAP and the like). Meanwhile, in the UNIX world the Berkeley socket API was useful in the 1980's and is still the core of all communications frameworks in the open standards world. All the UNIX RPC, object and Web Service environments are built on that stable base.

Learn as much about the proprietary environments as you need to be able to help people port programs over to open standards. You'll never be lost for work. The same is true of any proprietary environment, not just Windows. Windows just happens to be the one I know best. What will get very interesting in the future is the effect a fully open Java platform will have on the software environment in the next 10 years. After initially ignoring Java due to its proprietary restrictions, I now believe Java and its associated libraries have the potential to be the next POSIX.
The network really is the computer

There are now no interesting non-networked applications. Standalone computers are devices for watching stored video or listening to music, usually on airplanes. People doing offline email are simply working in an extreme case of a network disconnect, a rather large network latency if you will. The Internet has become the real computing environment of the next century and all programming will become network programming. This is a more challenging environment than programmers have been used to, with connection, latency and concurrency problems making our work much more interesting than it used to be on the standalone DOS box. All entertainment and communications such as television, radio and the telephone network will move onto the Internet. Poor Sun Microsystems were 20 years too early with their "the network is the computer" slogan, but they will eventually be proven right.
The community is more important than your employer

Are corporations fundamentally amoral? If they can make more money by outsourcing your job to India or China, or recycling employees into fertilizer for the rose garden at corporate headquarters, will they do it? I once had to listen to several high-level executives (for a previous company that shall remain nameless) waiting for the private corporate jet complain how inefficient it was that the country was run by democratically elected politicians as "they just didn't understand business."

Corporations are great places to work when things are going well, and I enjoy the perks as well as the next employee, but I'm very careful even in my optimism to not make the mistake of thinking this is the way things will always stay. In the free software/open source community, the people you're collaborating with and creating code with are the people you can really depend on. While you might not get on with all of them personally, they share your common goal of making sure that the code is the greatest, most beautiful work of art that all of you can create together. Smart corporations, at least the ones you'd want to work for, hire from that pool of people, and even though individual corporations may stumble and fall, if you're part of our community you should be able to successfully manage your career between the occasional stormy periods of corporate upheaval.

If you come from a coal mining area as I do, you can't finish a piece like this without paying homage to Merle Travis's wonderful song about really having to work for a living. No matter how much complaining we do, at least we're not "workin' for the man" :-).

You load sixteen tons, what do you get?
Another day older and deeper in debt
Saint Peter don't you call me 'cause I can't go
I owe my soul to the company store
-- Merle Travis

Good luck young Mr. Allison, and let me know if you have any more advice for him. I'd love to hear it.

Is Flash better than Java?

For various reasons, Java applets have never worked all that well, but some folks use them anyway, for example in business charting and other graphic visualizations. Over the past few years alternatives to Java in and around the browser have been gaining traction, especially Flash. In this article I will argue that it's time for Java to take a lesson from its competitors (on the client side) and address some of the shortcomings that are allowing them to flourish.

Flash
The most serious competition for webtop Java is Macromedia Flash. Flash started life as a very small footprint (~180K) media plug-in for Internet Explorer and Netscape. Adobe acquired Macromedia a few years ago and has been pouring resources and marketing into Flash. Now, Flash 9 is an advanced development platform. Through Flex and ActionScript, you can write sophisticated browser-hosted interactive business applications. The run-time is still relatively small (about 1MB, much smaller than Sun's Java), it's frequently updated with "frictionless" installs, and it supports vector graphics, streaming video, Ajax-style back-end interaction, and many other features that users find appealing.

Flash Lite is extending the Flash ecosystem to mobile devices. What was previously the domain of J2ME (now called Java Micro Edition, or Java ME) is now being Flash-enabled. Because Flash is vector-based, applications are easier to port from one screen size to another, and users don't see jaggies because Flash has always done anti-aliasing. When I read this article from Adobe it really made me think. Even if you don't do anything with J2ME, Flash Lite is draining mind-share away from Java.

Adobe Apollo brings Flash to desktop applications, in direct competition with Java Swing and Rich Client Platforms. It holds out the attractive promise of sharing code between browser-based, mobile-based, and desktop-based applications. Well, hasn't Java been doing the same thing for 10 years? Yes, but its execution has been flawed in many ways.

Analysis
OK, so what does all this mean for developers? The software industry has a tremendous investment in Java code, millions of lines and billions of dollars. We don't want to move to another language like ActionScript just for the fun of it, right? If there were a good enough reason, then it could be justified. I'm all in favor of new ideas and moving to better, more efficient technologies. In fact I've used that argument more than once to move C code to Java or C#. But recoding from Java to ActionScript? There's no way that qualifies. IMHO, there's no technical reason that Java can't do everything that Flash can do, with a little work.

The most serious problem that has allowed alternatives to flourish is the poor Java applet user experience. What if we could toss that out and replace it with something better?

I'm talking about work to make Flash-like small footprint installs of a Java-based run-time. Instant-on and streaming compressed content (no more Java progress bars). Frequent updates, fully backward compatible (no content left behind). Viewer-like security with no scary prompts for the user but no complicated signing either. And so forth. I'm sure you all have lots of ideas about this too.

What about F3?
Some have suggested that the new F3 language is Java's "Flash killer". F3 is an experimental declarative scripting language with static typing, type inference, declarative syntax, data binding, 2D graphics, declarative animation, and more. It demonstrates that the Java platform can compete with platforms such as Flash. But F3 is a different language from Java, and I'm talking about using Java code that already exists. Also, I haven't seen any demos of F3 running inside the browser, only through Java Web Start (.jnlp). F3 doesn't address the underlying problems with applets and Java installs that Flash addresses. It can't be a Flash killer until its user experience is as good as or better than Flash's.
Conclusion
For years the industry has waited for Sun to solve this problem and they haven't. Now, with Java 7 being open source, we could do something about it ourselves. Java experts could work together to build an open community around a new Java-based viewer (which we might not even want to call "Java"), and reverse the brain-drain away from Java technologies, preserving our investment in Java code and expertise.

Is Flash better than Java? No, just far better executed for some important use cases. Failing to address this, and allowing Flash and other alternatives to continue to grow and improve unchecked, is going to lead to competitive disadvantages or needless conversions down the road for Java developers. Instead of participating in endless arguments about esoteric subjects like closures and native code, competition from Flash should light a fire under the Java community and force it to respond with radically better user-facing solutions.

ITIL in a Nutshell

If you do a quick search on IT best-practice frameworks, you will find no shortage of foundations and architectures on which to build your IT organization. COBIT (Control Objectives for Information and Related Technology), the ISO 9000 family of quality standards, CMM (the Capability Maturity Model), and Six Sigma are just a few. Another framework that is gaining interest in the United States is ITIL.

ITIL stands for the Information Technology Infrastructure Library, a framework developed in Britain in the 1980s that addresses service delivery and support of IT services. Widely accepted internationally, it is just beginning to make significant inroads in the US. In the private sector, IBM, EDS, HP, Mead, and GM have adopted the framework. In the public sector, the states of Virginia and Wisconsin and the city of Oklahoma City, among others, have embraced it.

Why ITIL?

So what makes ITIL so special? Combing through the literature, the consensus opinion seems to be that ITIL is unique because of its strict focus on service delivery and IT operations, as opposed to general techniques involving quality management or the implementation of standards. Many who have become involved with ITIL see it becoming the de facto standard for all IT shops in the US, as it has internationally. I have read in one source that the US and Canadian governments will soon require IT contractors to use ITIL, but I have not been able to confirm that through any other sources at this time.

Specifically, what is ITIL and how do I access the library?

ITIL, as mentioned above, is a collection of best practices that has been developed into a series of eight books that run about $114 each. They are:


Service Support: Covers the basic processes involved with supporting the enterprise, such as Service Desk, Incident Management, Problem Management, Configuration Management, Change Management, and Release Management.

Service Delivery: This book focuses on the planning and delivery of services and includes topics such as Capacity Management, Financial Management for IT Services, Availability Management, Service Level Management, and IT Service Continuity Management.

Planning to Implement Service Management: “This book answers the question ‘Where do I start with ITIL?’. It explains the steps necessary to identify how an organization might expect to benefit from ITIL and how to set about reaping those benefits.”

Infrastructure Management: As the title implies, this book covers everything about managing your telecommunications infrastructure, including Design and Planning, Deployment, Operations, and Technical Support.

Application Management: Covers the management of applications from inception to retirement and everything in between.

Software Asset Management: Seeks to explain what software asset management is, why it is important, and how to manage software assets.

Security Management: “This guide focuses on the process of implementing security requirements identified in the IT Service Level Agreement, rather than considering business issues of security policy.”

The Business Perspective: “This book is concerned with helping business managers to understand IT service provision. Issues covered include Business Relationship Management, Partnerships and Outsourcing, and continuous improvement and exploitation of Information, Communication and Technology (ICT) for business advantage.”

You can obtain these books here: http://www.itil.co.uk/publications.htm

Do you need another framework?

I have to be honest with you: every time I read about another framework, my first reaction is to roll my eyes. I have been around long enough to experience many “better than sliced bread” phenomena that, if only implemented, will supposedly make my organization a superstar. And, of course, there is always an army of consultants to be hired, classes that need to be taken, and certifications to achieve in order to “realize the potential” of the framework.

And in that sense ITIL is no different. You can invest the time and resources in understanding the framework and building up the expertise to implement it (which several organizations have), or you can hire someone to help you along.

Additionally, like many of the frameworks that have come into vogue before it, integrating the processes involved with ITIL takes time, usually measured in years.

However, with all that said, frameworks can prove beneficial. I think there are very few IT organizations in existence, if any, that can claim to be perfect and have no need for improvement. Most can stand some enhancements to their operations. The tough questions are: where can we improve, and how do we go about doing it? That’s where frameworks are beneficial.

Personally, this framework intrigues me, partly, I guess, because it was originally written by government workers, and I intend to research it further. If you are similarly intrigued, here are some places, besides the books, where you can get information to see if ITIL is right for you and your organization:

ITILPeople.com
www.itilpeople.com

Six reasons IT managers miss their budgets--and 11 tips to come in on target

Learn why you may be having trouble hitting your budget and see what items are essential to developing a realistic and achievable plan.

If you're scrambling to save your budget halfway through the year, you're not alone. IT managers are generally not taught how to budget and fall prey to certain tendencies, like allowing the CEO to reduce expenses without adjusting expectations on services and deliverables. IT management expert Mike Sisco looks at the various reasons IT managers have budgeting problems and then offers some suggestions for developing aggressive but achievable plans. Items of key importance include budgeting for salary and benefits, telecommunications costs, and hardware and software maintenance, as well as paying close attention to upcoming initiatives and considering past expenses, which can indicate future trends.

Here is the PDF.

10 things you can do to organize and lead effective meetings

It's easy to disparage those tedious meetings that are run by someone else--but are your own meetings any more useful and productive? These pointers will help ensure that your colleagues don't cringe every time they receive a meeting notice with your name on it.

Every so often, you may find yourself walking out of a meeting feeling hopeful and energized by the ground that was covered, the ideas that emerged, and the issues that were resolved. A meeting like that doesn't happen by accident. Someone took the time to consider and define its purpose and to manage the process effectively.

Here is the PDF file.

IT consultant Shannon Kalvar put together a list of 10 suggestions for planning, organizing, and conducting successful meetings. For example:

* Know what action you expect from the meeting. Meetings draw people away from their daily tasks and into a closed, influenced environment. As the organizer, you have the attendees' attention. It's up to you to use it wisely. The moment you squander it, the meeting grinds to a halt. Spend a few minutes before the meeting trying to answer the following question: 'What do I expect the attendees to DO at the end of this meeting?'

* Never send a meeting to do a conversation's work. Electronic messaging systems give us the power to invite everyone and everything in the organization to our meetings. But the power to do something doesn't make it a wise choice. If you need to speak to only one or two of the meeting's attendees, just go to their cubes and have a conversation. It takes less time and communicates more information.

* Maintain focus. In every meeting, someone derails the discussion with a host of tangents that detract from the meeting's real goal. Do not let this happen to your meeting. Cut off speakers who want to ramble on about related but unimportant issues. Develop and maintain a reputation as a firm, organized meeting leader so that people don't challenge your authority during the meeting itself.

Eleven qualities of successful IT managers

Developing these core competencies will help you achieve higher success levels and will position you for greater career responsibility.

Your career advancement hinges on a long list of capabilities, including working proactively, having strong follow-up skills, and negotiating effectively with vendors. But according to Mike Sisco, when you dig down into what makes an IT management career successful, 11 traits top the list. Among the items are:

* The ability to assess needs
* The ability to build the team
* The ability to implement change management processes
* The ability to implement a client service mindset

Here is the PDF file.

A look at CrAzY bosses, past and present

Michael Kanellos, an editor for News.com, provides his perspective about a topic that most people can relate to: crazy bosses. According to Kanellos, "To be in the Hall of Fame, they can't just be crazy. Like Highlights magazine, they have to be crazy 'with a purpose.' So with that in mind, here's a list of touched leaders, in no particular order, who I wish to salute for their zany personal behavior and the dreams they accomplished."

Read the entire article, "Perspective: The crazy boss Hall of Fame," to see which bosses made it into the CrAzY Hall of Fame.

Have you ever worked for a crazy boss? Share your crazy boss story in this discussion thread, and the craziest one (let's keep it real, please) will win a TechRepublic mug.

Geek Trivia: A tax to grind

Which company filed the largest U.S. tax return in history?

It's spring in the northern hemisphere, when young men's fancies turn to thoughts of love. But if you live in the United States, thoughts of love will have to wait -- at least for another week. That's right, the Internal Revenue Service (IRS) tax deadline looms, and many are just getting around to gathering up their receipts and W-2 forms.

Even outside U.S. borders, this date has some particular importance, as the IRS is willing to trawl the ends of the Earth to get its due share. Such has been the case for the last 90 or so years.

Most American schoolchildren learn that U.S. income tax originated in 1913 with the ratification of the 16th Amendment to the U.S. Constitution. In truth, only the current version of U.S. income tax exists because of that legislation. The real origin of U.S. income tax goes back to the American Civil War.

The Revenue Act of 1861 actually imposed the first U.S. income tax -- 3 percent of all annual incomes above $800. Intended to finance Union forces during the war, the law outlasted the conflict by seven years.

Why go to the extraordinary trouble of passing a Constitutional Amendment to establish a federal income tax if Congress already had the power to levy income taxes? In 1895, the U.S. Supreme Court ruled that the Constitution essentially prohibited an "unapportioned" federal tax on income derived from property.

Apportioning taxes (i.e., distributing them based on each state's portion of the U.S. population), as well as splitting the hair of determining which income comes from wages as opposed to property, basically made federal income tax an impossible technical quagmire even though it was technically legal.

The 16th Amendment untied these knots: "The Congress shall have power to lay and collect taxes on incomes, from whatever source derived, without apportionment among the several States, and without regard to any census or enumeration."

Despite this effort at simplification, however, U.S. tax law has only grown exponentially more complex in the intervening decades -- so much so that U.S. corporations with assets over $50 million must file electronically.

That's probably a good idea. Last year, a U.S. company filed the largest federal tax return in U.S. history -- one that would have required tens of thousands of pages to print.

WHICH COMPANY FILED THE LARGEST U.S. TAX RETURN IN HISTORY?
Which company's 2006 U.S. tax return was the largest return ever filed -- one so massive that it would have required tens of thousands of pages to print if the firm had not filed electronically?

The corporation in question is none other than General Electric, which submitted a 237-MB e-filed tax return to the IRS in May 2006. (Like other large corporations, GE received an extension, meaning its tax deadline was September 15 -- not April 15.)

By most estimates, a paper version of the tax return would have been approximately 24,000 pages long. Printing the return on 8.5"x11" paper and laying the pages end-to-end would have traced a line more than four miles in length.

Compare that to the entirety of the infamously massive U.S. Internal Revenue Code -- the formal title for all federal tax law in the United States -- which is roughly 24 MB. To print it, you would only need 7,500 pages -- and it would only stretch to roughly 1.3 miles.

Put another way, GE's 2006 tax return was almost 10 times larger than the entirety of the laws that dictate how to file the return. (The IRS regulations that implement the tax code are another -- vastly longer -- story.)

Why the long return? GE is a multinational conglomerate, with ties to dozens of industries in dozens of countries. While perhaps best known as a maker of light bulbs and household appliances, GE actually has six major divisions that delve into commercial and consumer finance, healthcare, entertainment, and industrial development.

Each division has major subdivisions, and each subdivision has its own complicated tax profile. For example, Universal Studios -- which takes in hundreds of millions of dollars in motion picture revenue (and tax liability) every year -- is just one branch of NBC Universal, GE's entertainment arm.

Are you a technologist or a magician, or both?

"Any sufficiently advanced technology is indistinguishable from magic."

– Arthur C. Clarke

I recently discovered this quote from Clarke, and it has been haunting my thoughts ever since. It reminded me of a time a couple years ago when I took my son to the bathroom at a brand new Home Depot. The lights turned on automatically when we entered, the water automatically poured out when he put his hands under the faucet, and when he reached up toward the paper towel dispenser, it quickly spit a towel at him, which made his eyebrows shoot up and his mouth drop open. When we walked out, he told my wife, "Mom! They've got a magic bathroom!"

Of course, I wasn't quite as impressed. I knew that the bathroom was powered by some cheap and simple motion-sensing technologies and that Home Depot used them to save money by cutting down on wasted resources. The bathroom was magical to my son because he didn't understand why these things worked the way they did or the behind-the-scenes technologies that made it happen. To me, it drove home the point that what we view as "magic" almost always involves mystery. Once you understand how something works, you take away the mystery, and usually the sense of magic as well.

As a result, those of us who understand the hows, the whys, and the science behind technology can lose our sense of wonder and magic. That's especially true for IT pros, who have to make the magic work — and keep it working — every day. However, most of the end users that IT pros support still occasionally have a sense of magic about technology, especially when it involves making their jobs easier, faster, or better.

Making magic

I can think of several times when I implemented solutions for users that felt like magic to them:

* The first time I set up a VPN connection for a user that needed to work remotely, the user was slack-jawed. She had just gotten a cable Internet connection at home. I had already set up a VPN server at the main office, so I simply put a shortcut on the user's home desktop for the VPN client connection. The user double-clicked, authenticated, and then was able to browse the company's file shares, access her e-mail via Outlook, and connect to line-of-business apps. "It's just like being in the office!" she exclaimed.
* Similarly, when I first demonstrated a Remote Desktop connection for someone running Windows XP, it drew excitement and wonder. The person had already been using a VPN connection on his laptop for a couple years but had some latency issues when connecting to some of the resources at the main office. Since both his office PC and his laptop were running XP, I suggested that he just use his laptop and VPN to make a Remote Desktop connection to his office PC. I set up a Remote Desktop shortcut on his desktop. He double-clicked, authenticated, and maximized the screen on the remote connection. "This looks just like sitting at my desk!" he said.

More recently, there have been a few times when software engineers here at CNET Networks have used some new techniques and technologies to overcome some long-standing problems with backend tools and user interfaces. I won't bore you with the details, but the bottom line is that they have significantly reduced the production time of some important tasks and enabled us to finally have the ability to implement some good ideas that we've wanted to try for years. In retrospect, it's pretty magical stuff.

I think what Clarke is saying in his quote is that even if you understand how it works, that doesn't mean that it's not magic. Thus, when you work in IT, you shouldn't forget that in your daily work you are not always just a technologist. You are not always an IT professional. Sometimes you are a wizard. Sometimes you are a magician. I'd like to tip my hat to all the wizards and magicians out there. Of course, there are times when end users have other names for you, especially when the magic isn't working for them, but don't let that obscure the times when you do make the magic happen. That's one of the things that makes the job great.

Share your magic

Can you think of times when you've designed, deployed, or implemented something that has astonished your users and made you look like a magician?

Dinosaur Sightings: Windows splash screens from 1.01 to Vista

Ever since the dawn of the Windows operating system, Microsoft has used splash screens as a means of distracting us while the operating system loads. In addition to their entertainment value, the Windows splash screens typically provide some sort of feedback on the progress of the sometimes lengthy startup process.

In this gallery, you’ll be able to browse through all the splash screens from Windows 1.01 to Windows Vista.

Here is the link: just click Next on each screen to browse through them.

Microsoft confirms Vista OEM hack

Hack may allow users to bypass antipiracy feature, but no action is planned yet, according to a post on the Windows Genuine Advantage blog.

In response to widespread chatter on blogs and forums, Microsoft has acknowledged the presence of hacks that may allow pirates to bypass the product activation security feature in its Windows Vista operating system.

According to a post by Microsoft Senior Product Manager Alex Kochis on the Windows Genuine Advantage developers' blog, Microsoft has identified two ways in which hackers have broken the product activation security feature on original equipment manufacturer PCs that come bundled with Vista. But the Redmond, Wash.-based tech giant does not yet have plans to snuff out this threat.

"We focus on hacks that pose threats to our customers, partners and products," Kochis wrote. "Our goal isn't to stop every 'mad scientist' that's on a mission to hack Windows. Our first goal is to disrupt the business model of organized counterfeiters and protect users from becoming unknowing victims."

Microsoft first introduced product activation as a security feature with its Windows XP operating system, which launched in 2001.

Reports of a vulnerability in Vista's product activation began to surface last month with word of a crack called "Vista Loader 2.0," an enhanced version of the "Vista Loader 1.0" that was devised by Chinese hackers, according to a March 10 post on the My Digital Life blog. Vista Loader, the post explained, simulates an OEM motherboard's basic input-output system, software that is responsible for communication between the machine's hardware and the operating system. Consequently, with a BIOS simulator, the registration process that would normally lock out an unauthorized copy of Windows Vista could be bypassed.

While Microsoft is not immediately taking action, Kochis did acknowledge on the Windows Genuine Advantage blog that this could be a problem. "Because Windows Vista can't be pirated as easily as Windows XP, it's possible that the increased pressure will result in more interest in efforts to attack the OEM Activation 2.0 implementation," Kochis wrote.

Last month, it was believed that hackers had found a loophole in Vista's product registration, but Microsoft refuted the claim shortly afterward. Another alleged hack, this one involving a random product key generator, was also debunked in March.

Tuesday, April 10, 2007

Where Java EE goes horribly wrong

Sure, Java EE covers everything but the corner cases... but why, oh why are there so many corner cases?!

Java EE rocks, it really does. I'm usually quite happy working with it. But it strikes me -- while it handles nearly everything but the corner cases, it leaves so many corner cases as to make itself less useful than it could or should be. It's the Achilles' heel of Java.

For example, look at my recent travails with getting a JMS destination populated for an EIS component. This shouldn't be that big of a deal, really, considering that I've an inbound EIS component that communicates with a resource in JNDI.

Here's the backstory: I'm writing an EIS component to receive mail. It needs to falsify email addresses (i.e., all addresses are dynamic) and my intent was, as email came in, to forward each email to a JMS queue. The JMS queue would be listened to by a message-driven bean, which would ignore the email or ... well... not ignore it.

I wanted this in an EIS component because that way I don't have to remember to start up all of my application components: I'd start my database and my application server, and be done with it.

Thus, my EIS component wants to have JMS ready to go when it starts, because otherwise it'll bind to a port and have nowhere to go with it. That's where my "problem" comes in.

It turns out that in Glassfish, JNDI isn't populated until after server startup finishes. Thus, my EIS component initializes, and it will never have JNDI available to it when it starts. That's by design, apparently, and is the source of a bug I filed with Glassfish already, if I'm not mistaken. (Bug #1950, which happens to be the year I wasn't born.)

I need a Glassfish lifecycle module to watch for when the server is ready; then I can get the JNDI context from the lifecycle process. I can start threads there (as long as I shut them down) and do whatever I need. This is in some ways a cleaner solution than what I'd done, because instead of balkanizing SubEthaSMTP, I can just start SubEthaSMTP in a lifecycle module and have done.
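The lifecycle-module workaround boils down to a simple pattern: register a listener and defer all JNDI-dependent work until the server signals it is ready. The sketch below is framework-free Java that mimics the shape of that pattern; the class and method names here are illustrative inventions, not the actual Glassfish lifecycle interfaces (those live in Glassfish's own packages and dispatch typed events such as a "ready" event), so treat this as a shape to follow rather than drop-in code.

```java
import java.util.ArrayList;
import java.util.List;

// Framework-free sketch of the "wait until the server is ready" pattern.
// In Glassfish proper you would implement its lifecycle-listener interface
// and react to its ready event; everything below is hypothetical.
public class LifecycleDemo {

    interface ReadyListener {
        void onReady();  // called once startup finishes and JNDI is populated
    }

    static class DemoServer {
        private final List<ReadyListener> listeners = new ArrayList<>();
        private boolean ready = false;

        void register(ReadyListener l) {
            listeners.add(l);
        }

        // Simulates the end of server startup: only now is it safe for
        // components to look up JNDI resources such as a JMS queue.
        void finishStartup() {
            ready = true;
            for (ReadyListener l : listeners) {
                l.onReady();
            }
        }

        boolean isReady() {
            return ready;
        }
    }

    public static void main(String[] args) {
        DemoServer server = new DemoServer();
        // The EIS component defers binding its SMTP port and looking up the
        // JMS queue until this callback fires.
        server.register(() ->
            System.out.println("server ready: start SMTP listener, look up JMS queue"));
        server.finishStartup();
    }
}
```

The point of the pattern is that the component never races server startup: it does nothing in its constructor and lets the container tell it when the world is ready.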

But... what happens if I want to move to, say, JBoss? Or OC4J? Or WebLogic? Or WebSphere? Geronimo? JONAS? RESIN? I'm using Glassfish for the moment, for development - primarily because it's the reference implementation of the specification, not for any magical qualities it has as an application server. Writing a lifecycle event is fine if I'm staying on Glassfish, but I have no desire to be tied to a specific server if I can help it. These lifecycles are specific to Glassfish - other servers aren't likely to follow its lead.

Java EE has, in this case, failed me. This isn't that big of a deal, except for this is the kind of corner case that you run into over and over and over and over again with Java EE.

You want to know why people look at Spring as if it's the Messiah? You want to know why TheServerSide.com keeps yakking about OSGi and grids and Ruby and AJAX and other stuff? THIS IS WHY. They provide ways around the corner cases. Not always - I'm still waiting for an OSGi or Spring template for an application server (i.e., "inject an app server into this bean and start it...") - but they let us treat the corner cases as what they are: corner cases, rather than barriers to be confronted constantly.

Java EE doesn't suck - not really - but let's be real, it DOES suck in ways it shouldn't. The spec has to have ways to address these corner cases, provide things that developers like me can rely on. I know I have options here, I'm just dissatisfied with every cotton-pickin' one of them.

Some people are going to argue that providing injection points for these problems will increase complexity - witness people complaining about all the phaselistener things in JSF 2.0. Hey, I get that. But let's be realistic: you don't have to use the capabilities if you don't need them. It's just that if you do need them, right now, you're left adrift, screwed over by spec authors who feel that it's more important to address common capabilities instead of common NEEDS.

This has got to change, or else Bruce Tate's proclamation that Java is "dead like COBOL" - meaning quite alive, but not extremely vibrant - will come true after all.

Are you more I or T?

At some point in your management career, you have come across, or will come across, a personality test/inventory known as the Myers-Briggs. It is usually associated with career/management training/counseling as a tool to better understand yourself and your tendencies. Without going into a great amount of detail, the test breaks your personality down into four dichotomies: Extraversion/Introversion, Sensing/Intuition, Thinking/Feeling, and Judging/Perceiving. So after taking the test, you might be ranked as an ISTJ or an ENTP. There are 16 possible combinations.

I have been doing some reading that suggests, for people in the IT field, that it may be more beneficial to think in terms of whether you are more I or more T when it comes to career counseling.

The I and T obviously come from Information Technology (IT), and we have coupled the two words for many years. In fact, if asked their profession, many programmers, network engineers, etc. might reflexively say "IT."

However, new thinking among IT professionals points to a future when there will be a split between I and T — and people will need to choose between the two career categories.

Previously, the accepted thinking in the industry has been that the best IT professionals are the perfect blend of the two components: understanding the needs and goals of the business side (I), while still having a good handle on the nitty-gritty of technology (T). And to a limited extent, you can have this.

However, as you grow in your career and take on more management responsibility, it is not unusual to start losing some T while growing stronger in the I. There is a perfectly good explanation for this: increasing management responsibilities take you away from the hands-on aspect of technology and force you to maintain your T through reading, not doing.

Yet, as many of you point out in your blog comments, you don’t want to be I at all. You entered the field because of your interest in technology, and you want to remain technically competent. This is very similar to the concept of Applied Science vs. Pure Science.

I gather from my readings (look up versatilists for more info or see: http://www.gartner.com/press_releases/asset_139314_11.html) that in the near future, IT professionals will need to choose the area of expertise that best suits them, and they will be choosing between I and T. According to my research, if you choose T, you had better be EXTREMELY competent because much of that kind of work is being off-shored or given to employees with H-1B visas who will work for less. After all, "The World is Flat", and we live in a global economy.

Conversely, it is predicted that those that focus on the I (information design and management, process design and management, and relationship management) have much rosier futures.

I am condensing a great deal of literature in the statement above and it is probably a gross simplification; however, I think it gets the gist across.

The question then is, do you buy into these predictions and if so, to what degree? There is no doubt we have seen a movement towards making technology, infrastructure, and support — as well as application development — into more of a commodity that can be bought or outsourced. Are we really headed down a road where all T-people will work for a handful of vendors and everyone that is not more I will be out of a job? Is this really where we want to go, and even if it isn’t, is there anything that can be done about it? Or will there be a point where there is pushback and companies change their tune about outsourcing and bring expertise back in house? We have seen some of this, as well, in the Fortune 500.

Personally, I’m not sure about our industry track record when it comes to predicting the future. Having said that, it's never too late to begin assessing your skill set and deciding where you want to be in the next three, five, and 10 years. If you clearly want to focus on technology, make sure you stay current and can show your worth over your competition both here and abroad. If you want to focus on information, then make sure you have the communication and management skills to be a player in that arena. Because of my background, I find it hard to think about doing I without ever having gotten my hands dirty in the T — but I’m sure it is possible. What do you think? Will you have to make this career choice and if so, which will you choose? Is this a reality we want or can it be avoided? Let me know what you think!

Know thy hacker: Investigate what hackers do, how they do it, and how you can protect yourself

Follow an offbeat, non-technical investigation of what hackers do, how they do it, and how you can protect yourself in this sample chapter from Steal This Computer Book 4.0: What They Won't Tell You About the Internet. Learn where the hackers are, how they probe a target and sneak into a computer, and what they do once they get inside. Begin by getting into the mind of a hacker to explore why hackers choose their targets, then closely examine how hackers find a target. Chapter coverage includes: war dialing, port scanning, war driving, accessing a WiFi network, and probing sites by Google hacking, as well as steps you can take to defend your system against these methods.

Here is the PDF.

Monday, April 9, 2007

Are people in positions of authority really smarter than the rest of us?

Read this article last week and it really hit a note with me. Are all the people I think are more knowledgeable than me really so? OK, granted they've more experience than me, and when it comes to a doctor, fair enough... but let's say teachers, right? I recently overheard a guy who does a bit of teaching say he just rips stuff off the net and hands it out in class; that's how he "keeps up with the new stuff". Anyone can do that.

Here's the article, any thoughts?

One of the most terrifying lessons I have learned is that, by and large, grown-ups don't really know what they are doing. As a schoolkid, I mistook my teachers for all-knowing, infallible beings protected by an invisible forcefield of adulthood. Even as I grew older, left school, became a student, left polytechnic and became a fledgling adult myself, I laboured under the delusion that people in positions of authority were inherently more "adult" than I was - that they possessed some kind of on-board mental computer that guided them towards making the right decision, even if I didn't always agree with it.
My overdue epiphany finally arrived in my mid-20s, at a barbecue, when I found myself talking to a girl the same age as me who was a schoolteacher, and she described how, much of the time, she was teaching the kids things she had only read the week before in the textbook. As long as she stayed one chapter ahead, she was fine. At first I was genuinely surprised; I had thought all that knowledge was stored in their heads. Then it got worse.
I met a doctor, not much older than myself, who was a) drunk and b) pretty stupid. I realised that in terms of age, I had caught up with the "adults", and was horrified to learn they were all just as ham-fisted as me. At least the young ones were. The older generation surely had a better handle on things, I reasoned. They had to, or the world would slide into chaos. Then I passed 30 and realised I still didn't have a clue what was going on. Now I'm 36, and if there is one thing I do know, it's that I still don't know that much. No one does. Everybody's winging it. Everything is improvised.
And the world never "slides into chaos" - it's perpetually chaotic because all of us, from beggars to emperors, are crashing around trying to make the best of an unpredictable universe. We are little more than walking mistake generators. Dumb animals, essentially. Things would be just as messy if hens ruled the world. This is true, and it's scary. But also sort of glorious.
Consider that an extended caveat for the following humiliating confession: I don't understand the news. Not entirely. Let me explain: I watch and read the news, not obsessively, but probably often enough to be doing my bit as a concerned citizen. But I can't keep up with it. I follow it, but I don't always truly follow it, if you see what I mean.
Entertainment news aside, every story comes with a complex back story consisting of a million tiny events, of countless shades of right and wrong, of mistake piled upon mistake, successes and failures, injustices and struggles. It's like trying to follow the plot of the most complicated and detailed soap opera ever made, one that was running for centuries before you started tuning in. To truly understand a major news story often requires real effort - more than many people are willing to give - which is why most of us know more about celebrities than, say, the Israel-Palestine situation.
I think people who work in hard news often forget this. They are submerged in it. They know the cast, they have followed the storylines and they can't help assuming their readers or viewers have similar knowledge. In reality, most people probably missed the crucial, earlier episodes, and subsequently can't quite relate to the story. We can see it's important - it's the news! - but we don't always feel its importance. If more of us did, there would probably be open revolt - or at least more revolt, more often.
In my mid-20s I wrote for videogames magazines. I was proud of my work. It was just an excuse to write jokes really, and it was great fun. But while videogame fans seemed to like what I did, it was baffling to the average Joe: peppered with terminology about polygon counts and frame rates, and gags that referenced other, older games. To the casual observer, it was a minefield of unfamiliar acronyms.
This is fine for specialist writing but it alienates the outsider. A lot of news coverage is specialist writing. It's news written for news fans. And the stuff that isn't seems to consist of stories about Sienna Miller's arse, which is easy to follow because, well, there's not much to it. Because she is so thin.
I can't help thinking that what we need now, perhaps more than ever, is a populist and accessible Dummies' Guide to Now. The BBC News website does this brilliantly, with regular bite-sized primers attached to major stories, which attempt to explain the back story to newcomers clearly and concisely, without being patronising or stupid. It has simple titles such as "Who is Scooter Libby?", and is a rare oasis of clarity. I would like to see it launch some kind of 24-hour "news companion" channel, or red-button service, that does the same thing on TV: a rolling fill-in-the-blanks service that helps you get up to speed. A catch-up service for reality, if you like. Not dumbed-down news, but clear information - something that often gets lost in the 24-hour scramble of breaking developments and updated headlines.
Maybe it's just me who craves that. Maybe I'm thick. Maybe the rest of you understand everything and I'm alone in my ignorance. But I doubt it. I think the vast majority of us are winging it, at least 18 chapters behind in the textbook and secretly praying no one else will notice. If we all knew more, we would do more to lend a hand, instead of shrugging and hoping the news might some day go away or submerging ourselves in comforting trivia. Don't just tell us what is important. We might not have paid attention earlier. Toss us a bone. Tell us why.

Customize Linux with these 10 boot prompt options

One of the reasons Linux-based systems are so popular is their configurability -- almost every aspect of the operating system's behavior can be customized, making it a wonderful platform for new application development or for deployment into highly customized networked environments. What most users don't realize, however, is that this customization can begin even before the system starts, at the boot prompt itself.

Ten of the most useful Linux prompt options available to you at boot time are explored in Table A. These options can be activated by typing them in at the LILO or GRUB boot prompt, with appropriate modifiers as necessary.

Note: Because kernel versions change quite rapidly, you should always consult your kernel's documentation before using a particular boot option.
Table A

Options
root=

Description
This option defines the Linux root device. It tells the kernel which filesystem should be mounted as the root filesystem (/).

Usage
Use this option when you need to force the kernel to mount a different filesystem as root, typically when performing a rescue operation or setting up a multi-boot system.

Options
ro, rw

Description
These options control how the root filesystem is mounted, whether read-only or read-write.

Usage
Use these options to control whether programs can write to the root file system. Typically, the ro option is used when the file system is damaged and needs to be examined and repaired.

Options
init=

Description
The init process is the system initialization process -- the first process the system runs. This option controls the path to this initialization process.

Usage
Use this option to alter the binary used as the initialization process, often used to boot the system without a password or run a custom startup sequence.

Options
initrd=

Description
The Linux boot loader has the capability to load a bare-bones system from a RAM disk image. This option tells the boot loader where to find this disk image.

Usage
Use this option for dual-phase startup, first using a generic kernel configuration and then loading a customized configuration on top of it. This is most useful when working in environments with wildly different hardware configurations, or when developing a modular kernel, perhaps on low-resource systems.

Options
single

Description
This option forces the system to boot in single-user mode.

Usage
Use this option to activate your Linux system without multi-user support, typically for rescue operations or when performing administrative tasks that require exclusive access to system resources.

Options
mem=

Description
This option tells the kernel to use only the specified amount of memory. The amount may be specified in kilobytes (K), megabytes (M) or gigabytes (G).

Usage
Use this option to restrict the amount of memory available to the kernel, usually for simulating kernel performance on low-resource systems with limited memory.

Options
debug

Description
This option tells the kernel to activate its built-in debugging features. Debugging messages are sent to the appropriate event log.

Usage
Use this option when developing new kernel-level code, to test your changes or to view internal status messages sent by the kernel.

Options
vga=

Description
This option controls the VGA mode the kernel will use.

Usage
Use this option to alter the kernel video mode, or to present the user with a list of available video modes to choose from.

Options
panic=

Description
Typically, if a critical error occurs, the kernel "panics" and halts the system completely. A manual reboot is required to restart the system. This option alters this behavior by specifying the number of seconds the kernel should wait before automatically rebooting the system.

Usage
Use this option to automatically reboot the system in the event of a critical error, particularly useful in the case of unattended server systems.

Options
profile=

Description
This option activates kernel profiling, which is a simple way to analyze overhead and understand which kernel functions are consuming the most resources. Profiling information is stored in /proc/profile.

Usage
Use this option to analyze kernel internals and performance, typically when developing kernel-level code.
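To make the table concrete, here is a single kernel line as it might be typed at a GRUB boot prompt. The kernel image name and root device below are hypothetical examples; they must match your own system:

```
kernel /boot/vmlinuz-2.6.20 root=/dev/sda1 ro single mem=512M panic=30
```

This boots the kernel image, mounts /dev/sda1 read-only as the root filesystem, enters single-user mode, limits the kernel to 512 MB of memory, and reboots automatically 30 seconds after any kernel panic.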

Why Microsoft is under assault from all corners

For those keeping an antitrust scorecard in the IT industry, it is increasingly difficult to keep track of all the players.

Intel was sued in the United States, and it has faced antitrust investigations in Japan, Korea and Europe. Sony leads a list of memory chipmakers under antitrust investigation in the United States. Apple's iTunes pricing and interoperability formats have been subject to regulatory scrutiny on antitrust grounds in Europe.

Why is it that very few large IT players are immune from antitrust attack? Are they simply unable to comport themselves with the law? Or is this regulatory trend indicative of governmental lack of faith in the very engine that has created sustained economic growth and innovation in the IT sector: the free market?

One thing never seems to change: Microsoft is always enduring some antitrust challenge--even when it is working with other industry players to create better products. Take, for example, Microsoft's recent agreement with Novell to make Windows server software interoperate better with the Linux server products of Novell.

Last month, oblivious to this agreement, the European Commission issued another statement of objections alleging that Microsoft engaged in bad faith to thwart interoperability in the server market. The Commission's proposed remedy would require Microsoft to make its valuable intellectual property available to its competitors--for free.

While it is difficult to understand the European Commission's pursuit of Microsoft in a highly competitive server market, the Microsoft-Novell agreement was also attacked by the open-source community's Free Software Foundation. The FSF objects to any cooperation between proprietary vendors and open-source vendors, and it vowed to prevent similar deals via its update of the General Public License.

At first glance, the FSF and the Commission attacks on Microsoft appear to be unrelated. But the common thread is this: the attacks are based on a lack of faith that consumer demand will lead a market to where consumers want it to be. It is based on a faulty assumption that a company can use its intellectual property to harm competition rather than fuel it.

The FSF's assault on the Microsoft--Novell deal demonstrates its open hostility to Microsoft's--or any other company's--use of its intellectual-property rights to protect its innovations and inventions. This position is directly contrary to a central premise of free-market economics: IP protections will encourage investment and result in a wider breadth and depth of innovation.

But the inexplicable actions of the European Commission would take us in the same direction as the FSF. The Directorate General for Competition is the regulatory enforcement agent of Europe. Clear European law provides explicit protection to intellectual property through the Parliament's "Software Directive" and many published court decisions.

Yet the Commission alleges that Microsoft has established "unreasonable" prices for its protocol licensing of its server technology in Europe. The Commission characterizes Microsoft's proprietary server software protocols, which are protected by patent, copyright and trade secret law, as containing "virtually no innovation."

The Commission then remarkably concludes that everyone in the industry, nonetheless, "needs" Microsoft's protocols, and that Microsoft should provide them "royalty-free." What the Commission demands in the end is that Microsoft make its intellectual property available to its competitors for free.

Attempting to "outlaw" the Microsoft-Novell deal through changes to the GPL and trying to force Microsoft to disclose its software protocols through regulation and litigation both suffer from the same erroneous foundational assumption: that there is something wrong with the operation and functioning of the free market in general, and with the IP protections that underlie it.

Microsoft and Novell recognized the basic fact that the consumer is truly in charge of software markets--not regulators, nor free-software advocates like the FSF. The impetus for their groundbreaking agreement was consumer demand. Enterprise customers operating Windows and Linux software wanted better performance.

Both companies realized that the fortunes of both would improve through the agreement. Market forces provide the driving incentive for real solutions. Those wishing to "control" markets should take note.

Yahoo IM, Kerberos, Firefox, and Kaspersky AV vulnerabilities

This week will see five or more Microsoft Security Bulletins which I will cover in my monthly Locksmith column and newsletter.

There is no real word yet as to the content except that there will be one or more security patches and some non-security patches.

But, while we are waiting for those to be released on Tuesday, we have several other things to worry about, starting off with a new Kerberos vulnerability for *nix systems. (Microsoft uses a proprietary version of Kerberos.)

Kerberos

The MIT krb5 telnet daemon reportedly has a vulnerability that would allow a remote attacker to gain root access without a password; see CVE-2007-0956. The details had not yet been posted when I last checked.

FrSIRT’s list of advisories connected with CVE-2007-0956 (http://www.frsirt.com/english/CVE-2007-0956.php) includes:

Mandriva

Turbolinux

Ubuntu

Redhat

Fedora

Debian

Gentoo

And more.



Kaspersky AV product threats


A number of vulnerabilities have been discovered in Kaspersky products including:

Anti-Virus for Windows Workstation version 6.0 and earlier

Anti-Virus for Windows Server version 6.0 and earlier

Internet Security version 6.0 and earlier

Anti-Virus version 6.0 and earlier



Those using Kaspersky products should note that the worst of the four newly reported vulnerabilities are remote code execution threats; users should update to the latest version (6.0.2.614).

http://www.kaspersky.com/productupdates

Also, see:

http://www.kaspersky.com/technews?id=203038694
http://www.kaspersky.com/technews?id=203038693





Firefox

There is a remote code execution vulnerability in versions of the Firebug extension prior to 1.01.

The fix is to update to Firebug version 1.02:

https://addons.mozilla.org/en-US/firefox/addon/1843



Yahoo! Messenger

The popular IM service has a buffer overflow vulnerability in an ActiveX control used in versions 5.x through 8.x of Yahoo! Messenger, which can let an attacker run arbitrary code on users' systems if they innocently surf past malicious HTML code on a web site while Messenger is loaded.

See:
http://messenger.yahoo.com/security_update.php?id=031207

This affects ANY Yahoo! Messenger version installed prior to March 13, 2007, and users must update their program to protect against this critical threat in the ActiveX audio component.



So, I guess it’s all quiet while we await the big bombs this month from Microsoft (AHH sarcasm).

Sunday, April 8, 2007

India High-Tech Industry Out of Workers

The Trouble With Success: India's High-Tech World Searches Frantically for More Workers

MYSORE, India (AP) -- At the heart of the sprawling corporate campus, in a hilltop building overlooking the immaculately shorn lawns, the sports fields and the hypermodern theater complex, young engineers crowd into a classroom. They are India's best and brightest, with stellar grades that launched them into a high-tech industry growing at more than 25 percent annually. And their topic of the day? Basic telephone skills.
"Hello?" one young man says nervously, holding his hand to his ear like a phone. "Hello? I'd like to leave a message for Number 17. Can I do that?"

Nearly two decades into India's phenomenal growth as an international center for high technology, the industry has a problem: It's running out of workers.

There may be a lot of potential -- Indian schools churn out 400,000 new engineers, the core of the high-tech industry, every year -- but as few as 100,000 are actually ready to join the job world, experts say.

Instead, graduates are leaving universities that are mired in theory classes, and sometimes so poorly funded they don't have computer labs. Even students from the best colleges can be dulled by cram schools and left without the most basic communication skills, according to industry leaders.

So the country's voracious high-tech companies, desperate for ever-increasing numbers of staffers to fill their ranks, have to go hunting.

"The problem is not a shortage of people," said Mohandas Pai, human resources chief for Infosys Technologies, the software giant that built and runs the Mysore campus for its new employees. "It's a shortage of trained people."

From the outside, this nation of 1.03 billion, with its immense English-speaking population, may appear to have a bottomless supply of cheap workers with enough education to claim more outsourced Western jobs.

But things look far different in India, where technology companies are spending hundreds of millions of dollars in a frantic attempt to ensure their profit-making machine keeps producing.

"This is really the Achilles heel of the industry," said James Friedman, an analyst with Susquehanna Financial Group, an investment firm based in Bala Cynwyd, Pa., who has studied the issue.

"When we first started covering the industry, in 2000, there were maybe 50,000 jobs and 500,000 applicants," he said. Now there are perhaps 180,000 annual openings, but only between 100,000 and 200,000 qualified candidates.

For now, industry is keeping up, but only barely. A powerful trade group, the National Association of Software Services Companies, or NASSCOM, estimates a potential shortfall of 500,000 technology professionals by 2010.

On the most basic level, it's a problem of success. The high-tech industry is expanding so fast that the population can't keep up with the demand for high-end workers.

Tata Consultancy Services, for instance, India's largest software company, hires around 3,000 people a month. The consulting firm Accenture plans to hire 8,000 in the next six months and IBM says it will bring on more than 50,000 additional people in India by 2010.

A shortage means something feared here: higher wages.

Much of India's success rests on the fact that its legions of software programmers work for far less than those in the West -- often for one-fourth the salary. If industry can't find enough workers to keep wages low, the companies that look to India for things like software development will turn to competitors, from Poland to the Philippines, and the entire industry could stumble.

The responses range from private "finishing schools" polishing the computer skills of new graduates to multimillion-dollar partnerships spanning business, government and higher education. The biggest companies have built elaborate training centers. The Mysore campus, for instance, was little more than scrub-filled fields when Infosys, India's second-largest software firm, based in the nearby technology hub of Bangalore, began building here in earnest three years ago.

In America, the campus would be nothing unusual. But in India -- with its electricity outages, poverty and mountains of garbage -- the walled-in corporate fantasyland, watched over by armed guards, is anything but normal.

It has 120 faculty members, more than 80 buildings, 2,350 hostel rooms and a 500,000-square-foot education complex. There's a movie complex built inside a geodesic dome. An army of workers sweeps the already-spotless streets and trims the already-perfect lawns.

Month by month, it's getting bigger. Today, some 4,500 students at a time attend the 16-week course for new employees. By September, there will be space for 13,000.

Infosys spent $350 million on the campus, and will spend $140 million this year on training, said Pai, the human resources chief.

"This is the enormous cost we have to pay to ensure we have enough people," he said.

They're not the only ones.

IBM's technical skills programs reached well over 100,000 Indians last year, from children to university professors. At Tata Consultancy Services, measures range from a talent search as far afield as Uruguay to having executives teach university classes -- all designed simply to make people employable.

Most industry leaders believe these investments will pay off, and India will remain competitive. But most are also guarded in their optimism.

"We should be able to get through this year, but if we don't get things like finishing schools into place we'll see an actual shortage," said Kiran Karnik, the NASSCOM chairman.

Much of the problem is rooted in a deeply flawed school system.

As India's economy blossomed over 15 years, spawning a middle class desperate to push its children further up the economic ladder, the higher education system grew dramatically. The number of engineering colleges, for instance, has nearly tripled.

But the problems have simply grown worse.

India has technical institutes that seldom have electricity, and colleges with no computers. There are universities where professors seldom show up. Textbooks can be decades old.

Even at the best schools -- and the government-run Indian Institutes of Technology are among the world's most competitive, with top-level professors and elaborate facilities -- there are problems.

The brutal competition to get into these universities means ambitious students can spend a year or more in private cram schools, giving up everything to study full-time for the entrance exams.

Instruction is by rote learning, and only test scores count.

"Everything else is forgotten: the capacity to think, to write, to be logical, to get along with people," Pai said. The result is smart, well-educated people who can have trouble with such professional basics as working on a team or good phone manners.

"The focus," he said, "is cram, cram, cram, cram."

Things are different at the Infosys campus.

"The premier concern in college was to get maximum marks," said Sanjay Joshi, a 22-year-old engineer midway through Infosys' training course. "Here, the focus is totally on learning."

Much of that learning is technical, mostly focusing on programming. But "soft skills" classes, as they're called, also include such things as e-mail etiquette and problem-solving.

Then there are off-hours. The average age on campus is 22 and for some of them it's their first time away from home. There's a soccer field, a cricket field, a swimming pool with a juice bar, a bowling alley and a gym. There are racks of bicycles to ride.

You could drown in politeness. "Ride Carefully" a sign warns bicyclists at a gentle curve in the road; "Enjoy your visit," a passing student tells a visitor.

Everywhere, there are well-groomed, well-mannered young people.

On a recent morning, students filed into a large classroom for a programming course.

By 8:45 a.m. -- 15 minutes before class began -- the room was nearly full. Row after row of students sat quietly, waiting for the teacher.

404-letter words

The time has come to celebrate yet another of those all-too-unrecognized geek-centric holidays (which I may have just made up): 404 Day! Every April 4th, Web surfers of every persuasion should take time out to celebrate that one universal experience of all Internet consumers and professionals -- the 404 Page Not Found error. No matter which sites you frequent, which ISP you use, or where on the browser-zealot spectrum you fall, we've all had our share of 404s.

So, where did the 404 come from (besides the server, of course)? Like pretty much everything World Wide Web-related, the 404 is an official component of the Hypertext Transfer Protocol (HTTP) specification ratified by the World Wide Web Consortium (W3C).

It first appeared in the version 0.9 HTTP spec, adopted in 1992. If you track down that document, you'll notice a rather telling signature: TimBL. That's the byline of one Tim Berners-Lee, he of the "I invented the World Wide Web and the first Web browser" fame. The same guy who made the modern Web page possible also invented the Page Not Found.

Genius though he was, Berners-Lee didn't spin the HTTP status codes out of whole cloth but based them on the preexisting File Transfer Protocol (FTP) status codes. If you compare the two code listings, you'll find only 10 overlapping codes: 100, 200, 202, 425, 426, 500, 501, 502, 503, and 504.

Only 100 and 200 have similar meanings under both standards -- Continue and OK, respectively -- so it's clear Berners-Lee didn't copy FTP into HTTP. For the record, there is no code 404 in FTP, so that infamous error message is original to the Hypertext Transfer Protocol by way of TimBL.
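You don't have to take the spec's word for it: any HTTP server will hand you a 404 on demand. A minimal sketch using only Python's standard library -- the local server and the made-up path are illustrative, not anything from the column -- looks like this:

```python
# Start Python's built-in HTTP server on a spare local port, then
# request a path that doesn't exist and inspect the status code.
import http.client
import threading
from http.server import HTTPServer, SimpleHTTPRequestHandler

# Port 0 asks the OS for any free port; the handler serves the
# current directory, so a nonsense path is guaranteed to be missing.
server = HTTPServer(("127.0.0.1", 0), SimpleHTTPRequestHandler)
port = server.server_address[1]
threading.Thread(target=server.serve_forever, daemon=True).start()

conn = http.client.HTTPConnection("127.0.0.1", port)
conn.request("GET", "/no-such-page")
resp = conn.getresponse()
print(resp.status)  # 404
conn.close()
server.shutdown()
```

The same three-digit answer comes back whether the server is a toy like this one or a production machine; that universality is exactly why the 404 became a shared cultural experience.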

Rumor has it that, whether or not Berners-Lee suspected that code 404 would become famous by virtue of link rot and lazy sysadmins, he intended that particular number to include a sly inside joke. You see, the HTTP status code system bears a striking resemblance to the CERN laboratory building numbering system. CERN, the Swiss techno-mecca, is the birthplace of the World Wide Web, leading some to infer that code 404 is a subtle reference to room 404 at CERN.

The only problem with that theory -- or, rather, that urban legend -- is that there is no room 404 at CERN, and there never has been. The real meaning and origin of the 404 code is far more mundane, with each digit having a specific significance.
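That mundane structure is easy to see in code. A rough sketch, using Python's standard-library `HTTPStatus` enum (the class labels in the dictionary are my own shorthand for the five classes the HTTP spec defines):

```python
# The first digit of an HTTP status code names its class; the
# remaining digits select the specific condition within that class.
from http import HTTPStatus

status_classes = {
    1: "informational",
    2: "success",
    3: "redirection",
    4: "client error",
    5: "server error",
}

code = HTTPStatus.NOT_FOUND
print(code.value, code.phrase)              # 404 Not Found
print(status_classes[code.value // 100])    # client error
```

So 404 sits in the 4xx class -- the client asked for something wrong -- with the trailing digits identifying which client error it was.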

WHAT DO THE NUMBERS IN STATUS CODE 404 SIGNIFY UNDER THE FORMAL HTTP SPEC?