"Have you done any programming in OpenGL?"
"Do you have experience with JQuery?"
"Have you shipped at least one commercial app that was made with Flutter?"
These kinds of questions come up all the time during an engineering recruitment process.
Such questions certainly have their place: they help ensure that a candidate can jump right into the role as soon as possible and leverage the full maturity of their expertise in whichever technical framework the company has been building on top of. This is definitely understandable, especially from the standpoint of business owners and marketing experts whose main concern is to complete a viable product and ship it in a timely manner.
As a developer, however, I cannot shake the recurring notion that the emphasis on such specific technical details is sometimes unbearably superficial, and that it is often rooted in short-sighted mannerism, politics, and gatekeeping in the IT industry rather than in any plan with a long-term goal in mind.
Staying up to date with popular contemporary technologies is indeed a valuable habit in the job market. After all, one needs to learn Sass in order to avoid writing repetitive chunks of CSS, learn C++ in order to ensure that a AAA game is as computationally optimized as possible, and learn TypeScript because type ambiguity in JavaScript can annoy the hell out of everyone.
It is all understandable, and I probably would've had a much better chance at high-paying jobs if I had focused on mastering quantifiable, bite-sized techniques such as specific programming languages, game engines, and APIs, going so far as to prove my mastery over those techniques by taking tests and building a collection of certificates. And that would've helped me not only land a coding job more easily, but also acquire enough skills to build functioning products more quickly.
There is a major problem with this, though.
My brain won't stay fresh forever. I am still fairly young and can adapt to new trends in the software industry quite easily, but what about decades from now? When I become a middle-aged man in his 50s, will I still be able to learn the details of whatever technologies emerge then, and do so competitively?
Let's imagine that it is the year 2050, and the most popular game engine of my early years (Unity) has become obsolete. A new generation of incredibly smart kids has already come up with brand-new species of hardware, a set of much more powerful programming languages, and far more efficient development pipelines. The aged brain of my older self will never stand a chance against these fiercely intelligent youngsters, who will never hesitate to replace every grumpy old developer who has nothing better to do than spend the rest of his lifetime blaming the world for not keeping in tune with his obsolete, old-school way of thinking. No matter how hard I try, the biological consequences of aging will eventually pull me out of the software development industry and into the coffin of retirement.
Just as Fortran, COBOL, Visual Basic, and myriad other hot technological standards of the 20th century are now nearly gone from the mainstream, today's hot technological standards will eventually sink into oblivion. Unity will be old, Unreal will be old, Go will be old, and Rust will be old. Sure, some of them have established themselves as de facto industry standards and will therefore last much longer than others (e.g. C/C++, OpenGL/DirectX, HTML/CSS/JavaScript), but it is only a matter of time and luck until they are out of the spotlight. Once some bright young mind comes up with a technology that is unquestionably superior to the one I've been working with for the last two or three decades, I will be screwed. At that point, the only shield protecting my decades of experience from obsolescence will be the unwillingness of old corporate executives to adapt to new technologies, along with internal corporate politics and gatekeeping.
However,
There are also things that will not become obsolete, no matter how much time passes. Academic subjects such as mathematics, the natural sciences, and other purely theoretical domains of knowledge will hardly become obsolete even as centuries sweep through countless generations.
Calculus was invented in the 17th century, yet it is still the foundation of many contemporary engineering topics such as optimization and control. Statistics was popularized in the 19th century, yet it still forms the backbone of scientific reasoning and is becoming ever more important thanks to ongoing research in machine learning and data mining. Discrete mathematics is just as old as the two aforementioned fields, yet it is the foundation of the theory of computation upon which all sorts of algorithms, data structures, architectural models, design patterns, and languages are formulated.
This is why I am focusing on learning fundamental engineering topics rather than specific techniques. Academic subjects whose existence does not depend on industrial standards, such as programming paradigms (functional programming and logic programming), graph theory, computational logic, mathematical morphology, and other theoretical bodies of ideas which provide deep insights into how one should structure and implement computer programs, will outlive contemporary technologies while remaining relevant to practical applications.
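To make that point concrete, consider the fold from functional programming, an idea rooted in the algebra of recursive data rather than in any particular product. The sketch below is merely illustrative (the `fold` helper and its naming are my own, written in TypeScript only because it happens to be mentioned above); the concept itself survives unchanged in whatever language comes next.

```typescript
// A fold collapses a list into a single value by repeatedly applying
// a combining function. The idea predates every framework named above.
function fold<A, B>(items: A[], initial: B, combine: (acc: B, item: A) => B): B {
  let acc = initial;
  for (const item of items) {
    acc = combine(acc, item);
  }
  return acc;
}

// Many seemingly different operations are the same fold with different arguments:
const sum = fold([1, 2, 3, 4], 0, (acc, x) => acc + x);     // 10
const product = fold([1, 2, 3, 4], 1, (acc, x) => acc * x); // 24
const longest = fold(["a", "bbb", "cc"], "", (acc, x) =>
  x.length > acc.length ? x : acc);                         // "bbb"
```

Whether a given ecosystem spells this `reduce`, `foldl`, or something else is a detail that expires; the underlying structure of the reasoning does not.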
This learning habit, I believe, will grant me the honor of having conversations with younger generations of engineers that are more productive than a rant like:
"IN MY DAYS, I had to use a tool called "Game Engine" and memorize a bunch of optimization tricks to circumvent the limited computing power of my time! Kids these days just don't understand how hard it used to be!"