I disagree. I used to be a software engineer (and may be again at some point) and the problem with avoiding junior developers is that we need them if we ever want to have any senior developers.
Also, LLMs don’t replace 90% of what a software engineer does. Copilot or whatever is a nice tool that spits out code. It’s not able to architect shit or choose the right tech to use in the first place.
And to be honest, it seems like A.I. progress has hit a bit of a wall and the reality is that it may take decades, trillions of dollars, and maybe even an energy revolution to ever reach its imagined potential. Look at full self-driving cars. The tech seemed like it was 90% there about a decade ago but that last 10% of any big project is the real challenge.
I actually personally fully agree with you.
I just see a different picture in the industry. Decision makers also use AI to evaluate your work. If the AI judges that your solution is not good, you face more resistance than if you had submitted a solution close to the AI's expectations. You are inherently incentivized not to introduce original thought beyond what your executives can have explained to them by AI anyway.
I fully understand that this is short-sighted behavior, but it's the real bottom-line thinking of today.