The internet has had an amazing impact on the world. Not only does it give us access to more information faster than ever before, it also puts software and libraries at our fingertips instantly. As a developer, I rely on tools like Maven or Gradle to automatically retrieve code libraries from the internet on demand. But this speed of disseminating information and code has also created a ‘flavor of the week’ environment for programming frameworks and technologies. Twenty years ago, if you were a professional developer you probably worked in C, C++, Visual Basic, or Delphi. Today, it’s hard to count all the different languages out there. In the Java world alone we have Java, Scala, Groovy, Clojure, Kotlin, and others. For JavaScript, developers can choose from jQuery, React, and Angular, and those are just the most popular options. With each new project, companies examine the available tools and change their development stack to use the latest and greatest. Code that once took a decade or more to become legacy is now obsolete within a few years.
When I entered the IT world, I needed to know one programming language, and that was it. Now, reading the technical requirements posted by employers is a daunting task, as each company has cherry-picked what it believes to be the best technologies across a variety of realms. Developers are increasingly expected to be experts in an unimaginable number of technologies. Worse, few technologies ever reach maturity before they’re completely overhauled. Consider Angular, for instance. While it’s an excellent framework for application development, it’s nearly impossible to find answers to questions because the framework has changed so much since its creation. Long-term, the IT world is going to end up with innumerable projects that are no longer maintainable because their technology stack has become obsolete or developers have moved on to the new flavor of the week. Even worse, programmers are becoming jacks-of-all-trades and masters of none: code quality suffers, bugs multiply, and time-to-market stretches because nobody has mastered the tools they’re working with.
None of this brings value to end users or tech companies. In fact, it just creates more code that will be obsolete before it’s even deployed.