I think he is confusing knowledge with skill.
My knowledge gets outdated: APIs change, and the topic of the hour is always a different one.
Skills, and especially abstract skills, don’t get outdated as fast: writing CUDA kernels is surprisingly like the stuff we did in my first-ever C class 25 years ago, and I am still reading docs the same way my teacher taught me in 7th grade.
The more you look the more things are the same. All that is done was done and has been done before; there is nothing new under the sun.
Does math fall into skills or knowledge? I think it falls into knowledge. Math never goes obsolete.
I think there are skill and knowledge components to math—skill at manipulating equations, knowledge of theorems and identities. These play into each other, of course.
I think advanced math is a form of art.
It is similar to programming or software in that it involves both skills and knowledge, but the knowledge does not quickly become obsolete. If you think about it, just about every activity involves both knowledge and skill. There are also different depths of knowledge: the knowledge required to be functional in a field may not be the level of knowledge required to advance that field.
Not all knowledge. Basic POSIX knowledge should serve you for a long time
Learning a specific technology for a single project may have a short half-life. However, good coders aren't defined by knowledge of a particular tech, but by their deep understanding. If you can make great presentations in PowerPoint, everyone knows you'll still make good presentations in Google Slides.
Exactly. It's like the difference between thinking about tech proficiency in terms of "being good at C++" vs. being good at software engineering and being language agnostic.
I am not sure about that example. A lot of people learn to use a specific piece of software and memorise how to do stuff and are therefore confused by the slightest difference.
Those are the people who will fall behind...
Not really, there are quite a few organizations that won't hire you unless you have 15 years experience in Google Slides exactly.
Good engineers can easily swap “PowerPoint” for “Google Slides” as needed in their CVs.
Do people really believe that knowledge from more than 30 months ago has no value? Even the people doing keyword searches on resumes are smarter than that.
I started doing NLP in 2014. First, I was using SVMs and feature vectors, then word embeddings, then handcrafted neural network models, then fine-tuning transformer encoders, then working with LLMs. In that time I worked with a huge number of technologies, libraries, and algorithms. A hiring manager recently asked me what my experience with AI agents is, and I had to say that it's basically zero.
Okay, he was obviously very new to the field and had no idea, but it illustrates how far the field has progressed in the past 10 years: a person who is just joining starts from a very similar line to the old-timers. The breadth of knowledge I have is of course extremely useful, and I am able to pick up new concepts really fast, as there are many similarities. But the market in general does not care that much, really.
We were arguing about agents 25 years ago. Everything goes around.
The article compares its value to that of a Nokia flip phone. Nokia flip phones, while not as valuable as an iPhone, aren't worthless; they can still fetch something on the open market.
They are talking about the half-life, so it should never become valueless, just drop exponentially.
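To make the half-life framing concrete, here is a minimal sketch of exponential decay, assuming the article's 30-month half-life (the numbers are illustrative, not from any measured data):

```python
# Exponential decay with a half-life: value halves every `half_life` months,
# but never actually reaches zero.
def skill_value(months, initial=1.0, half_life=30):
    """Fraction of a skill's original value remaining after `months` months."""
    return initial * 0.5 ** (months / half_life)

print(skill_value(30))   # 0.5    -- one half-life
print(skill_value(60))   # 0.25   -- two half-lives
print(skill_value(120))  # 0.0625 -- still nonzero after a decade
```

So under the article's own model, "half-life" implies depreciation, not expiry: the curve approaches zero asymptotically.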
Also, skills and knowledge are different things, right? I’d believe that half the skills picked up in a fast-growing field are obsolete after a couple years.
Mary Meeker apparently believes it, but she's made a career out of being confidently wrong about things.
The Lindy effect can be a useful heuristic: something invented 30 months ago probably has less long-term value than something that was invented 10 years ago and is still being used.
Yes. Memorizing how a framework works is not knowledge. Knowledge is the deep understanding thing LLMs can't do.
Easily half of the crap I learn from over 2 yrs ago is completely worthless. Mental models of code bases that I no longer work on as one example.
What is "half of what you learn"? Frankly, I think people underrate the amount of learning that goes into even the smallest things in software development. Think about system utilities: bash/zsh, git, vim, tmux, make, ssh, rsync, docker, LSPs, grep -- all of these are useful and have been useful for a decade or more. C, C++, Java, Python -- all languages which have been useful and will continue to be useful; languages like Go and Rust are really the exceptions, not the rule, and even when new ones come onto the scene, by and large languages stay the same more than they change. Then there are things like threading and concurrency, and how to manage mutexes. Or background knowledge about the Linux kernel: how paging, processes, system calls, and the like work under the hood. Of course, architectural information is essential if you're doing anything performant, and the minimum time for things to change in that space is the three-year hardware development cycle, more practically five years or more. Even with GPUs, many things have changed, but practically speaking, if you learned CUDA 5 years ago you'd still be doing great today.
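One of those durable fundamentals in a minimal sketch: guarding shared state with a mutex. The pattern below is written in Python for brevity, but it looks essentially the same in C, Java, or Go, and has for decades:

```python
import threading

counter = 0
lock = threading.Lock()

def increment(n):
    """Add to the shared counter n times, holding the mutex per update."""
    global counter
    for _ in range(n):
        with lock:  # acquire/release the mutex around the critical section
            counter += 1

threads = [threading.Thread(target=increment, args=(10_000,)) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(counter)  # 40000 -- deterministic because the lock prevents lost updates
```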
It goes even further than single tools like those.
Very little about Go is entirely novel. It’s a newer descendant of C with some additions and excisions but mostly similar syntax. The language is newer, but the way you use it isn’t that new. Java is about equivalent to C++. PHP and Ruby were inspired by Perl and took things in different directions, while Perl itself was an amalgamation of C, BasicPlus, Lisp, shell, and the common shell tools like sed, awk, and tr. From Ruby comes Crystal. From PHP comes Hack.
tmux isn’t entirely different from screen.
git works differently, but on the surface isn’t drastically different from Subversion.
ssh is basically rsh with encryption.
Zsh and Bash are basically in the Bourne / Korn family trees of shells.
Docker has a nicer CLI interface, but a lot of the same concepts as LXC, LXD, jails, and more. Podman inherited a lot from Docker.
make, cmake, imake, Ant, SCons, and other build systems cover the same tasks with different syntax.
GitHub, GitLab, and Jenkins all cover CI/CD differently, but how to build a reliable pipeline is a skill that transfers to new syntax and configuration file names.
That’s like saying the workout you did 2 years ago or the food you ate 2 years ago are now worthless.
That some skills decay in value as time goes on is a very old concept and already has a name: https://en.wikipedia.org/wiki/Half-life_of_knowledge
Beyond that, the essay is a rambling mishmash of ideas and unsourced assertions with no real point to it.
>Consider the mathematics of futility: a company investing $2m in training its technical team today can expect half that knowledge to depreciate faster than a luxury car.
It's not like companies are molting their stacks every two years or something. If you are hired today, your skills and knowledge will evolve as the company evolves.
I tried and failed to find some kind of concrete methodology that they used to get to the number 30 months. I'm still waiting for quadratic algebra to make my knowledge of linear algebra obsolete.
This is lacking a whole lot of nuance.
The things I've learned about the JVM in the 2000s are still mostly true, perhaps with a bit of tweaking.
The things I've learned about process, project management, and distilled concepts around refactoring and testing are all still very valuable and as true as when I learned them. Perhaps not the specific tools, but the concepts are valuable.
Learning C decades ago still has lots of value. Not to mention SQL - come on.
Learn that cool JS tech stack a few years back? Yeah, it's probably dead or radically changed. That integration with Company X? Same.
So clearly there's some distinction to be made here. People are still programming in FORTRAN in some niches. You can decide to invest in boring and stable approaches, or live on the bleeding edge relying on someone's weekend vibe code session.
"courtesy of Harvard Business Review" - there's your problem. Don't look to some MBAs to give you nuanced tech insight. The author of this article: "Harald Agterhuis" is just some recruiter. Of course he's got an incentive to push this BS.
My recommendation? Flag this low quality article.
It seems to be relevant to hiring and startups, that is, if you hire people for particular tech skills, or build your startup around some specific area of expertise, half of these skills will be irrelevant in 2.5 years.
On a personal level, the fundamentals will be useful for your entire career, and the more you know, the faster you will be able to get the skill-du-jour. But the idea is that on your résumé, expect to change half of the lines in your "skills" section every 2.5 years, even if it takes you no more than a few hours to add these lines.
It also brushes aside tech in industries like defense, aviation, assembly lines, etc... where you have big, expensive machines, certifications, and projects that span decades. I wouldn't be surprised to find some Fortran code somewhere in the foundries that build the latest AI chips as EUV lithography literally took decades of R&D before it went to production.
I started my career working with Apache, jquery, mysql, and php. Moved to Angular and Java. Then Go, k8s, postgres and a large etc. Back to php. Kotlin. Some python. Some serverless…
And that’s just the tangible stuff. Not to speak about DDD, TDD, clean code, cicd, debugging, and a large etc.
My skills don’t degrade. Each thing I have learned in the past helps me when learning the next hot thing. These “outdated” skills are an investment, not a waste; they are what makes me a software engineer with X years of experience.
The most important tech "skill" is understanding the underlying nature of how the tech works and the theory of why it works that way. Implementations and permutations will come and go, but this never changes.
Even so, there are specific skills you can learn that are older than most people and will continue to be relevant, like SQL, vi, the terminal.
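As a quick demonstration that SQL from decades ago still runs unchanged, here is a plain SELECT against the stdlib sqlite3 module (the table, its contents, and the release years are illustrative, the years approximate):

```python
import sqlite3

# In-memory database with a toy table of long-lived tools.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE skills (name TEXT, first_release INTEGER)")
conn.executemany(
    "INSERT INTO skills VALUES (?, ?)",
    [("SQL", 1974), ("vi", 1976), ("make", 1976)],
)

# A query any 1980s database course would recognize.
rows = conn.execute(
    "SELECT name FROM skills WHERE first_release < 1980 ORDER BY name"
).fetchall()
print([r[0] for r in rows])  # ['SQL', 'make', 'vi']
```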
If you think about all the software you use or interact with on a daily basis, the median one is probably around 30 years old. The issue isn't that skills are becoming less valuable, it's that folks have attention spans that are too short to learn how existing systems work and/or finish what they start.
You have to advance your career, acquiring skills, reputation, and status, as your skills depreciate. Budget time for "career maintenance", and consider the future when selecting a position.
Or you can pick a field where things change less fast, and enjoy life instead of doing "career maintenance" in your free time.
Of course, if all the career maintenance happens on paid company time, then I see no problem. But unfortunately, for SWEs at least, that's often not the case.
tell this to a debian maintainer.