What do we do from here?
I find it a bit troubling that, knowing what we know, the world keeps spinning as if everything that has happened is the natural course of development. Some say it's the sheer pervasiveness of today's social issues, some say it's social media, some say it's the compounding economic pressure of misplaced government policies. Maybe all of it combined has numbed us to what's happening. As someone who graduated from two technology-aligned degrees in the last five years, and who has put significant effort into staying constantly updated on the world's developments (because technology and geopolitics are intricately linked), I find the last few years more than just concerning.
Where do we start... the AI development race, the energy usage of AI, the job cuts in the name of efficiency, AI scraping private data for training (and getting away with it), people in the workforce embracing and "leveraging" AI while ignoring all of its harmful side effects, the junior job pipeline being cut off before our eyes, and don't even get me started on politics. But hey, now we can generate code just by talking to our computers! Hurrah!
On a serious note though, these technologies are not just being used to code up a few indie games or to-do pages for junior developers. AI companies are selling their services to the US government to fuel spying tech and unmanned missiles. Just a few years ago, one might have had a hopeful, positive outlook on the world's trajectory. It's amazing how all of those wishful dreams could be washed away in one fell swoop, within the span of a couple of years. Regardless of the state of the world we're in, in technology it feels like everything has been moving at the speed of light. When I started my master's degree, for example, coding with AI assistance was deeply frowned upon. Nowadays even lecturers give presentations on how to leverage AI to code, and senior engineers are vibe coding with a plethora of tools to choose from: Cursor, Claude Code, Copilot, and more.
It's already March of 2026. We don't know whether God-like AI is on the horizon (though some claim it is), but one thing is for damn sure: even though no one fully understands its upsides or downsides, almost everyone is pushing everyone else toward "AI adoption", and those who resist are "left behind".
To be honest with you, from an ethical standpoint, does this really make sense? The technology is what it is: we give AI certain inputs, and it garbles that information, as context, into outputs based on patterns learned from its training data. I suppose the fact that it so powerfully accelerates work that would previously have been considered mundane is the alluring part for businesses looking for yet more ways to cut costs.