(Go stick your head in a pig!)
Come to think of it, “share and enjoy” is exactly the way I would expect an AI-generated YouTube video to end.
Maybe you should, considering your career could be at stake.
AI is quickly becoming an integral part of basically every career imaginable. Those who actually take the time to learn how to use it properly are inevitably going to be in a far better position than those too scared to figure it out. The real challenge is finding the balance between using AI as the tool that it is and just grabbing an easy answer (which, considering all the downvotes I’m getting, is probably the part y’all are justifiably concerned with). We need to teach the world (ourselves) how to use AI, not avoid it and run away like we keep doing. This cat is out of the bag and ain’t never going back.
I am reminded of an argument I had years ago about people relying on Google to do their jobs.
I argued that using Google to give you the answer to a problem doesn’t help you in the long run. Instead of understanding the solution and being able to use that understanding to solve problems in the future, you just become dependent on Google to get you through the day.
It is much more important to learn why a solution fixes a problem and the steps you take to understand the elements of the solution. It opens more doors, and you learn how to use your brain.
Both thinking and googling will get people far, but if Google ever went away, only the thinkers would survive.
This is happening again, but this time it’s AI.
The funny thing is, the people who made Google Search, and the people who created AI, are likely the thinkers.
We had a fresh CS graduate who could only function with ChatGPT. He gave up thinking completely, ChatGPT whatever-latest-model was his thing. He was always arguing with us that the AI told him to do it this way or another. He could not take input from folks with two decades plus of experience during review. He bragged that AI would replace us all in a year. He did not last two months with us - my boss cut him loose after lots of bugs and hideous refactorings. He was more of a drag on the team than any help. Don’t become that guy.
When I code using AI, I get the best results by being very specific and writing a class with pseudo-code for it to fill out with the missing code.
If I just ask it to write me a class that can do X, I often get some simple example code lifted straight from Stack Overflow.
It’s decent at writing simple string tools and the like, because that’s what is out there; the day it starts writing code from API documentation will be a big milestone.
Currently it’s just a parrot that knows Python.
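To make the pattern concrete, here is a rough sketch of the kind of skeleton you might hand the model, plus what a reviewed, filled-in version could look like. The class name and task are made up for illustration; the point is the shape of the prompt, not this particular example.

```python
# Skeleton you paste into the prompt: method stubs with pseudo-code
# comments, so the model only fills in the bodies.
class SlugMaker:
    """Turn arbitrary titles into URL-safe slugs."""

    def make_slug(self, title: str) -> str:
        # pseudo code for the model to implement:
        # 1. lowercase the title
        # 2. replace runs of non-alphanumeric characters with "-"
        # 3. strip leading/trailing "-"
        ...


# What a reasonable filled-in version should look like after you
# review the model's output:
import re

class SlugMakerFilled:
    """Turn arbitrary titles into URL-safe slugs."""

    def make_slug(self, title: str) -> str:
        slug = title.lower()
        slug = re.sub(r"[^a-z0-9]+", "-", slug)
        return slug.strip("-")
```

Because the skeleton pins down the interface and the steps, you can diff the model’s answer against your own expectations instead of accepting a black-box blob.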
I work with Linux and was recently obligated to work with “Linux admins” from another company. One of them had apparently never used Linux before. I don’t begrudge anyone their lack of experience, but they shouldn’t be in positions that require fairly extensive experience.
Anyway, at one point they were doing a screenshare of some (very simple) code that I wrote but that I’m pretty sure they didn’t know I wrote. They were all collectively trying to figure out how the (again, very simple) script worked (it literally just changed permissions and renamed some things, IIRC). For every single line, they would copy and paste it into ChatGPT and ask what the line did. It was kind of amazing to watch.
That sounds super painful to work with. But it’s also a hilarious anecdote, so you’ve got that going for you.
Honestly, it was painful, but mainly because of the ridiculous number of meetings they forced on us. Watching them bumble through messing up their tasks was pretty entertaining.
My job for the last decade has been working with sysadmins on Linux systems. Notice I didn’t say “Linux sysadmins” because most of them aren’t. They know a few commands by rote, but anything beyond that is impossible magic. The concept of the working directory, navigating the file path, permissions, and networking are all beyond their understanding.
I call them “turtles on posts” because they couldn’t have gotten themselves in that position and are now stuck. And since this has been happening for years it’s got nothing to do with AI.
Fortunately for me, I’m probably in the lower half of my company in terms of qualifications; it’s one of the best workforces in which I’ve ever participated. It actually bothered me a lot when I started, but as the saying goes, if you’re the smartest person in the room you’re probably in the wrong room.
The underqualified staff were with another company with whom we were required to work.
This right here is the big, glaringly obvious problem with AI, especially in academics. But it’s also exactly why this whole issue isn’t really a big deal, as long as enough people learn to use AI correctly. Those who don’t learn, and fall into the trap of easy solutions and laziness, will inevitably fail as soon as they get to the real world, and must then either learn or fade into obscurity. Those who do learn how to utilize AI will find far more success and will hopefully be able to pass on their skills and knowledge. Thus, given enough time, the system kinda corrects itself. It’s just a bit dangerous until then, hence why we need to teach and learn rather than fear what’s coming.
I feel like you are so close to realising why your argument is rubbish.
AI is absolutely a good tool. But only for people who understand what they are doing already.
It’s good to help you with arduous tasks, but you need to be able to review what it does with knowledge and experience, or you won’t understand what it gets wrong.
I use it in my job to help me write large access lists. If I give it the parameters (the addresses I want to grant access to, and on which ports and protocols, etc.), it can dump a huge ACL, and I can review it and correct any errors I find.
If I didn’t know how an ACL was written, didn’t know the correct syntax, and didn’t understand where it should be placed, I could very easily apply a dodgy ACL to a live network and fuck things up for everyone.
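That review step can even be partially automated. Here is a rough sketch of a sanity check you might run over AI-generated ACL entries before applying them. The regex assumes Cisco IOS-style extended ACL lines and is illustrative only, nowhere near a full parser for any real platform, and the sample entries are made up.

```python
import re

# Very loose shape of a Cisco IOS-style extended ACL entry:
# action, protocol, then at least a source specification.
ACL_LINE = re.compile(r"^(permit|deny)\s+(tcp|udp|icmp|ip)\s+\S+.*$")

def check_acl(lines):
    """Return the entries that fail the basic syntax check."""
    bad = []
    for line in lines:
        line = line.strip()
        if not line or line.startswith("!"):  # skip blanks and comments
            continue
        if not ACL_LINE.match(line):
            bad.append(line)
    return bad

generated = [
    "permit tcp 10.0.0.0 0.0.0.255 any eq 443",
    "permit udp 10.0.0.0 0.0.0.255 any eq 53",
    "premit tcp 10.0.1.0 0.0.0.255 any eq 22",  # typo an AI might produce
]
print(check_acl(generated))  # flags only the misspelled entry
```

A check like this catches obvious garbage, but it can’t tell you whether a syntactically valid entry opens the wrong port to the wrong subnet; that part still takes a human who understands the network.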
You keep saying you need to learn how to use it and then it’s fine.
But it’s not. You need to learn that it’s mostly dumb, and you need to scrutinise everything it does.
That scrutiny is exactly what I’m getting at when I say we need to learn how to use it. AI is really powerful, but it is so incredibly far from being the magic bullet that people think it is. It is just a tool that needs to be applied carefully and responsibly; of course only the people who understand what they are doing are going to succeed. My argument is that we need to be building that understanding and sharing it as widely as possible so that even more people can use the tools properly. And, yes, that means check the fucking output; use your brain instead of replacing it.
AI is quickly replacing a lot of careers.
And it will continue to do so.
I’m amazed that you think otherwise when it’s happening right now.
Also, “taking the time to learn to use it” takes all of what, a couple of days of reading at most if you want it to do something really unusual? We’re not talking about advanced coding here.
AI is not replacing much of anything, not yet anyway. It is evolving and forcing the world to evolve with it. While AI is used to write notes, summarize content, generate content, integrate data, organize life, etc., all of that still requires input of some kind from someone. Careers are going to be all about performing that input and interpreting the result. People will not be replaced (except the ones that refuse to keep up), they will just fill a different role.
You clearly understand nothing about AI if this is all you think it is. Sure, anyone can type a prompt and get a garbage result in about 30 seconds, but there is a hell of a lot more to it if you want to actually solve a real problem using AI. Learning advanced coding isn’t actually a bad idea for the future.
Maybe you can understand a different perspective if you stop thinking of AI as a gimmicky solution and start thinking of it as what it really is: a powerful set of tools meant to make finding the solution easier, nothing more.
https://www.cnn.com/2023/03/01/media/axel-springer-ai-job-cuts/index.html
https://www.wypr.org/2024-05-30/wall-street-journal-layoffs-continue-despite-lucrative-ai-deal-and-record-profits
https://www.ccn.com/news/technology/biggest-tech-layoffs-in-2024-2025-focus-on-ai/
Edit: Here’s one more for kicks: https://www.hollywoodreporter.com/business/business-news/buzzfeed-ai-creators-news-shut-down-1235483607/
First, we are discussing careers, not individuals. No shit people are losing jobs, but guess what, that is exactly what happens when careers evolve or new ones are created. Every. Single. Time.
Think about when precision machining was invented, when printing presses were invented, when cars were invented, when computers were invented, when the fucking internet was invented, etc. Yes, a fuck-ton of people were suddenly out of a job. But then suddenly there were also a whole bunch of brand-new jobs and careers to fill. People either learn to adapt and fill those roles, or they don’t, and they get left behind. AI isn’t really any different, except that it is happening right now and it’s therefore hard to see what’s to come.
That’s just how the world works. It’s sad and frustrating, I know, but being scared and hiding your head in the sand doesn’t change that fact. Learning how to live and thrive with the new stuff does, though, so maybe let’s try that instead.
Second, and this isn’t to discount everything you linked, but you understand that there is a huge bias going on here, right? People are understandably scared about the future, and the media latches onto that fear and creates articles that feed the narrative beast. But oftentimes, the articles completely neglect to talk about the other side of the coin, which is what we are discussing here.
Okay, well maybe you see things like the death of journalism and the death of criticism and the death of voice-over acting and the death of music composition to be good things, but I don’t know that you’re in the majority there.
And you may also see the massive ecological disaster that AI is becoming as a good thing too. I certainly do not.
https://en.wikipedia.org/wiki/Environmental_impacts_of_artificial_intelligence
What I’m trying to get you to understand is that this isn’t the death of anything, not really, though it can certainly feel like it, especially right now. These are just the growing pains that go along with literally every single major advancement in human history, and we always have this same exact unproductive argument. Yes, people get hurt along the way, but that is exactly why it is our collective responsibility to learn properly and mitigate as much of the damage as possible; being scared is never the way to do that. This isn’t the doom of society. It’s simply the dawn of a new version of society.
Continue being scared and wither into obscurity, or learn to adapt and thrive with what is inevitably going to be an integral part of our lives, career or not. The choice is really yours alone, but I want to see everyone succeed, including you and anyone else that reads this garbage.
Also, the ecological impact of AI is an entirely separate discussion, and I would appreciate it if you didn’t pretend to know my stance on the matter to support your arguments. If you want to have that discussion, we can, but not like this.
You mean sort of like it wasn’t the death of artisan textile production after the industrial revolution? Except it absolutely was and people literally starved while the rich got richer on the new machines. Sounds familiar.