Basics remain
Recently reviewing technologies across the industry led me to reflect on how today's tech stack compares to what was common 12 years ago.
Surprisingly, many tools have remained in use and popular: GitHub, Docker, Python, PostgreSQL, MongoDB, Elasticsearch, Redis, REST/WebSockets, NGINX, Jira/Asana, Sentry, and Jenkins. The web frontend has evolved significantly with the shift to event-driven, modular architectures. Slack and Discord have replaced, or now complement, older communication methods, and Trello has simplified Kanban. Additionally, GitHub Actions is a strong CI/CD alternative, and VSCode seems to be edging out PyCharm.
If you adopted any of these technologies more than a decade ago, you likely never needed to migrate to or learn an emerging alternative – a clear win. However, it wasn't always obvious which technologies would endure. Judging whether a new technology will last is a challenge for any technical leader. The fear is building a product only for its core dependency to become obsolete later (the more fundamental the technology, the higher the potential cost). One strategy is to assess the level of support behind a technology, or to use its longevity as an indicator of reliability (a.k.a. the Lindy effect). Typically, the more widely a technology is used, especially at a lower level of the stack, the longer it is likely to survive – consider Flask, a straightforward library that has endured for years (although some have upgraded to FastAPI).
Taking those lessons to the ML stack (including agents), one might feel overwhelmed by the plethora of options available. It’s worth favoring the stable and well-supported libraries (such as PyTorch and FAISS).¹
Two principles emerge: invest in foundational, lower-level technologies, and weigh a technology's age along with the size and responsiveness of its community.
¹ This is probably why many developers hesitate to invest in Google's GCP/Vertex: the company has a history of discontinuing products, even ones with substantial active user bases.

