Fresh out of high school, I got a job working in a large bank. My favorite task was inputting sales figures into a DOS-based system and watching it crash when I tried to print reports. I extended my high school computing knowledge by working over the phone with second-level support. I confessed that I could understand what they were doing, but I wouldn’t have known where to start.


They saw some raw potential in me and invited me onto an IT project to roll out new computers to the bank branches nationwide. I was to learn everything on the job: DOS commands, device drivers, Windows NT architecture, TCP/IP addressing, Token Ring networks, SMTP communications and LDAP architecture. My first IT task was cloning computers from an image on a portable backpack drive, back when the registry didn’t exist and Windows was just a file copy away.


Very little was done with a GUI. The only wizard was the Windows installer. When you learn IT from the ground up and understand the concepts, you can troubleshoot. When things didn’t work (and you couldn’t just google the error), I could apply my knowledge of how it should be working, knew where to start with resolution steps, and could trial-and-error my way out of it. Now that knowledge means I can cut through the noise of thousands of search results and apply ‘related but not quite the same’ scenarios until I find a resolution.


More than one person has commented to me that this generation of teenagers isn’t as tech-savvy as the previous one, because they are used to consumer IT devices that generally just work. So, in a world where technology is more prevalent than ever, do we get back to basics enough? Do we teach the mechanics of it all to those new to the industry, or to someone looking for a solution? Or do we just point them to the fix?