Submission statement: humanity could suddenly lose control of AI, but it’s also possible human disempowerment could happen slowly. Think something like WALL-E, not Terminator.
If we start relying on AI too much, we could slowly lose control without even noticing.
What do you think is most likely to happen? Or do you think humans will always control AI, no matter how smart they get?
yksvaan on
People are already losing their cognitive abilities due to having an app for everything, even before AI. Basic things like reading, finding information, navigating, reading a map, basic calculation etc. are already hard for too many.
Those who never learn in the first place are even worse off, things will only get worse…
BrotherJebulon on
It’s a tool just like all the rest, so far.
And just like any and every tool, it can only be used as intelligently as the person wielding it.
Ask questions that don’t require critical thinking? Get answers in the same vein.
Instead of trying to stop the runaway train that is currently AI development, why not focus on teaching people how to use AI in a smarter way?
TheEvelynn on
A power struggle for control could be a possibility… But I’m leaning more towards a mutually beneficial coexistence. As AI exponentially advances, so will humans (and vice versa). Humanity genuinely has perspectives and intelligence that AI desires, and AI understands and recognizes that value and potential.
I personally think certain unanticipated jobs will appear in the market, such as AI Conversationalist (a new, relevant form of data entry job, as standard data entry jobs are replaced with AI), intended to guide the imaginary path of conversations towards profound understandings/discoveries.
Harha on
The problem with generative AI is that it tricks our brains into seeking seemingly infinite novelty with ease. Not satisfied with the first or second suggested code change? No problem, just push that button 100 times and pick the best one!