2 more trillion, bro. 2 more trillion and I swear it will work, bro. You have given how much already, bro. I’m just asking for 2 trillion more, bro. That’s nothing. 2 more trillion bro, and it will work.
oniiBash2 on
A hidden statistic in these reports is companies *claiming* AI adoption as the basis for mass layoffs, but they aren’t actually using AI.
It’s just easier to say you’ve implemented AI and made employees redundant than it is to say you made bad business decisions and costs are piling up.
More palatable to the investors.
JMurdock77 on
You’ll generally spend as much time fixing problems with AI-generated computer code as you’d spend just writing it yourself.
pablo5426 on
bubble will pop before the year ends
Melodic-Account9247 on
My main problem with AI isn't even the tech itself, because at its core the tech is groundbreaking for things like medical research, or even simpler things like learning to code. But everyone is shoveling so much money into it that they've started cramming it everywhere just to make some sort of return, which isn't going to happen. People don't want your chatbots. They don't need your cringe image generation. What they want to see is this technology applied to places where it would actually matter. It's like everyone is putting on the shittiest, most expensive magic show that's supposed to wow the world, and it's just a six-fingered lady in a bikini. No one wants to see that. Stop burning the entire world's economy for this shit.
virtual_adam on
>Gownder believes jobs that AI takes won’t come back, as typically happens with job losses after economic rebounds.

People are upvoting this thinking it’s an anti-AI article, when it’s actually an article saying AI is so good it will replace employees outright rather than just requiring them to hold the wheel.
sweetno on
Yes, this is _precisely_ what’s going on.
Worth_Heart_2313 on
Coding with AI is torture that even expert programmers find taxing. And that was supposed to be the one good use case, where you could get some help, though certainly not full automation.
MicesNicely on
I still have no clue how I am supposed to deploy AI when I can’t trust its results. How do we put something inherently unreliable into business processes?
Bob_Spud on
That is a good question for the shareholders’ AGM.
Konukaame on
>During our conversation, Gownder cited US Bureau of Labor Statistics figures suggesting the advent of the personal computer also did not improve productivity, which improved by 2.7 per cent annually from 1947 to 1973, but just 2.1 per cent between 1990 and 2001.
This is a baffling conclusion. “Did not improve productivity” would mean that the increase in productivity is zero. “The line didn’t go up as fast as it did in some other timeframe” is not the same as “the line didn’t go up,” especially when dealing with compounding growth rates.
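A quick back-of-the-envelope check makes the point: even the “slower” 2.1 per cent annual rate compounds to a large total gain over 1990–2001 (treated here as an 11-year window, an assumption about how the article counts the period).

```python
# Compounding growth: a "slower" annual rate is still far from zero growth.
def total_growth(annual_rate, years):
    """Cumulative growth (as a fraction) from compounding an annual rate."""
    return (1 + annual_rate) ** years - 1

# 2.1% per year over the 1990-2001 period (assumed 11 years)
pc_era = total_growth(0.021, 11)
# 2.7% per year over the same length of time, for comparison
postwar = total_growth(0.027, 11)

print(f"2.1%/yr over 11 years: +{pc_era:.1%} total")   # roughly +25.7%
print(f"2.7%/yr over 11 years: +{postwar:.1%} total")  # roughly +34.1%
```

So the period the article calls “did not improve productivity” still saw output per hour rise by about a quarter.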
ZealousidealWinner on
Listen to AI snake oil salesmen, win stupid prizes
Resaren on
I’m guessing a lot of people are not going to read the article, so they’ll miss that this was also the case for the computer boom. Yet I’m sure most would agree that investing in computers back then was a good idea.
It’s possible that AI is in a similar situation to computing in the 1970s and ’80s, where it actually produces a lot of value, but a combination of overinvestment and productivity measurement problems means the value is not clearly visible in the aggregate statistics. We also have to account for the very real lag between when an investment is made and when the value materializes, which can be 2-5 years.
There is a very common concept in computer science called “garbage in, garbage out”: https://en.wikipedia.org/wiki/Garbage_in,_garbage_out

If you feed rubbish into a tool, you get rubbish as output. It’s unfortunate, but that is just how it works 😉
Fishtoart on
If there is no increase in productivity, how are Amazon and MS managing to get work done after laying off tens of thousands of employees?
koolaidismything on
Most productive thing I’ve done is avoid that trash
Choles2rol on
Claude has made me more productive, but I have to babysit the hell out of it. Gotta work very incrementally. It’s nice for getting code written when I’m stuck in meetings all day, though. All that said, I have caught it doing the most insanely dumb shit. If you don’t know how to write code, you shouldn’t be using tools like this or you’ll get your ass bit.
CelebrationFit8548 on
There are some metrics it is excelling at: most hated, least trusted, most annoying, poorest quality, lowest grade, etc.
It’s garbage
Ban it