Skills to Retire On
Published 02 Jun 2016. Tags: computer-science.
Albert, Jenny, and I were chatting yesterday on the way back from PyCon about “skills to retire on.” By that we meant skills or technologies that:
- Are timeless, in the sense that they’re unlikely to go stale over the course of your career, and
- Have a lot of depth. There’s always more to learn about them, and the incremental things you learn will still be useful.
These are skills that you can usefully spend your whole career improving.
TCP is an example. It’ll still be around in thirty years, and there’s always more to learn about it.
Conversely, Web development has a lot of depth—I’m always learning new stuff about it—but the Web’s younger than I am and might not still be around when I retire. There’s also enough churn that plenty of the things I’ve learned in the last few years are already obsolete. It certainly doesn’t feel “timeless.” Since those skills expire rapidly, getting better at Web development might not be the best investment of my time.
We put together a little list of technologies that felt like they were both permanent(-ish) and had significant depth:
- IP, UDP, and TCP
- shell scripting (and, more generally, munging Unix tools together)
- C, including Unix syscalls
- SQL and relational databases
- Emacs and/or vi
- boring old lock-based multithreading
- math, especially discrete math, probability, statistics, and linear algebra
- object-oriented design patterns
- regular expressions
- build systems, especially make
- algorithms & data structures
- assembly and computer architecture (x86 is probably a safe bet)
- classic distributed systems algorithms (consensus, snapshots, etc)
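The “munging Unix tools together” item is a good example of depth: each tool is simple, but composing them is a skill you keep refining for decades. A minimal sketch (the input text here is made up for illustration):

```shell
# Count the most common words in some text using only classic Unix
# tools. Each stage does exactly one thing; the pipeline composes them.
printf 'the cat sat on the mat\nthe dog sat\n' \
  | tr -s ' ' '\n'  \
  | sort            \
  | uniq -c         \
  | sort -rn        \
  | head -2
```

The same shape—generate, normalize, sort, count, rank—solves a surprising fraction of everyday data-wrangling problems, which is part of why these tools have lasted forty-plus years.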
The fact that this is all old stuff isn’t an accident. A convenient way to estimate how long a technology will last is to assume that it’s in the middle of its useful life.[2] Since C has been around for forty years, we should assume it won’t die for another forty. Elixir first showed up four years ago, so we’d guess that it’ll be around for about four more years. This is a really crude way to estimate, but it’s probably not a bad first approximation.
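The heuristic above is simple enough to sketch in a couple of lines of shell (the release years are assumptions for illustration: C appeared around 1972, and the post dates Elixir to roughly 2012):

```shell
# "Middle of its useful life" heuristic: the expected remaining
# lifetime of a technology is simply its current age.
remaining_life() {
  # $1 = year the technology first appeared, $2 = current year
  echo $(( $2 - $1 ))
}

remaining_life 1972 2016   # C: ~44 more years
remaining_life 2012 2016   # Elixir: ~4 more years
```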
Anyway, this was an interesting conversation for me. Is it better to pick up new trends as they come along, or should we hunker down and really invest in learning a few things in depth? Probably a bit of both! Learning new technologies is a high-risk/high-reward proposition: you may be learning the Next Big Thing, but more likely not.[3] These skills aren’t as flashy, but you can be confident that your time learning them is well-spent.
By the way, if you’re interested in learning more about some of these, I wrote an article a while back on Reading the CS Canon. That has a bunch of specific book and paper recommendations covering a number of these topics.
1. I’m not sure that git will be in common use in thirty years, but I suspect most of the concepts will still be valid.
2. I can’t find a citation, but I’ve seen this idea pop up here and there.
3. That’s not really true, of course: you’ve still learned something, lots of skills transfer well, and the process of learning itself probably improves your brain. But that’s not quite what I mean here, so let’s not quibble about it.