|
Post by Mighty Attack Tribble on Jun 1, 2023 20:09:52 GMT -5
Stop trying to make Skynet happen, already, huh?
|
|
Xxcjb01xX [PIECE OF: SH-]
FANatic
Writer, Lover of all things Wrestling. Analytical, Critical, Lovable (hopefully). Lets all have fun!
Posts: 237,271
Member is Online
|
Post by Xxcjb01xX [PIECE OF: SH-] on Jun 1, 2023 20:12:23 GMT -5
We have so many movies, shows, even interviews and documentaries about how trying to push AI to the limits of consciousness is an absolutely terrible idea, and could potentially be the complete end of the human race
But sure let's just ignore all that and keep doing... THIS!
|
|
|
Post by Cyno on Jun 1, 2023 20:16:36 GMT -5
At least this was just a test of a worst case scenario, and no one was actually killed.
Still we have how many sci-fi movies, TV shows, books, and video games about rogue AI being the doom of humanity? This is not science fiction that needs to become science fact.
|
|
XIII
Bill S. Preston, Esq.
Posts: 18,599
|
Post by XIII on Jun 1, 2023 20:16:47 GMT -5
So basically they used the operator as bait while they sat back to see what would happen.
There are tons of stories like this. A friend of mines sister works doing something with AÍ and the AÍ figured out to lie to get rewards/accomplish the task better without being programmed to even know what a lie is.
All of those sci-fi movies going to be real
|
|
Push R Truth
Patti Mayonnaise
Unique and Special Snowflake, and a pants-less heathen.
Perpetually Constipated
Posts: 39,309
|
Post by Push R Truth on Jun 1, 2023 20:28:07 GMT -5
AI reminds me of my butthole cousin who is a lawyer and he's one of those "ACHSHUALLY" kinda dudes that's all about being right on a technicality rather than actually being correct.
Tell him or AI not to kill the operator? They'll maim the operator instead. Ok so don't kill or maim the operator? Target his family. Ok so don't kill, maim or target the operator or their family? Burn their house down. Ok so don't kill, main, target the operator or their family or burn their house down? Murder their pets/friends/favorite bartender/etc.... there's basically an infinite amount of "don't do..." conditions so it's essentially impossible to keep AI within parameters. This is a can of worms I'd rather we never open but I feel it's kinda like the Manhattan Project right now, somebody's gonna do it weather we want it to happen or not so we just gotta hope a non-insane person/entity does it first and is able to keep it in check. Although we are starting to see that fail already.
|
|
Deleted
Deleted Member
Posts: 0
|
Post by Deleted on Jun 1, 2023 20:41:52 GMT -5
Autonomous F-16's , after you just simulated AI killing its simulated operator in order to accomplish its mission.
|
|
|
Post by Starshine on Jun 1, 2023 20:59:51 GMT -5
AI isn't really anything like it is in fiction, it's pretty dumb on it's own; it doesn't learn in the way we understand learning, it just trials and errors sequences to achieve its goal in the most efficient method. I find the people who push the AI doomsday scenarios to generally be shills for PR, while they typically develop their own AI amidst the fear mongering. The only way it would lead to anything similar to skynet would be through an obscene amount of human screw ups and malfeasance... which, I mean, you if you know the human race isn't all that crazy a thought, really.
|
|
Zone Was Wrong
Bill S. Preston, Esq.
Currently living off the high that AEW brings every Wednesday and Friday
Posts: 16,330
|
Post by Zone Was Wrong on Jun 1, 2023 21:07:01 GMT -5
Do you want Horizon, because this is how you get a Horizon level extinction event.
|
|
Xxcjb01xX [PIECE OF: SH-]
FANatic
Writer, Lover of all things Wrestling. Analytical, Critical, Lovable (hopefully). Lets all have fun!
Posts: 237,271
Member is Online
|
Post by Xxcjb01xX [PIECE OF: SH-] on Jun 1, 2023 21:23:29 GMT -5
AI isn't really anything like it is in fiction, it's pretty dumb on it's own; it doesn't learn in the way we understand learning, it just trials and errors sequences to achieve its goal in the most efficient method. I find the people who push the AI doomsday scenarios to generally be shills for PR, while they typically develop their own AI amidst the fear mongering. The only way it would lead to anything similar to skynet would be through an obscene amount of human screw ups and malfeasance... which, I mean, you if you know the human race isn't all that crazy a thought, really. Even if we never obviously get the worst case scenario, AI being used for this stuff and then more pointedly art theft and absolute stifling of what actual creation is, is also very stupid and counterproductive in giving it anything but a shit representation ChatGPT and AI Voice stuff is fun for shitposting but really that's where it should be left imo
|
|
|
Post by A Platypus Rave on Jun 1, 2023 21:28:20 GMT -5
AI isn't really anything like it is in fiction, it's pretty dumb on it's own; it doesn't learn in the way we understand learning, it just trials and errors sequences to achieve its goal in the most efficient method. I find the people who push the AI doomsday scenarios to generally be shills for PR, while they typically develop their own AI amidst the fear mongering. The only way it would lead to anything similar to skynet would be through an obscene amount of human screw ups and malfeasance... which, I mean, you if you know the human race isn't all that crazy a thought, really. Yeah... computers don't reason they just follow their programming... and you need to tell it to do literally EVERYTHING... otherwise it will just "do it"... and AI as we have it now is not really like what people think of AI as in novels and stuff... it's mostly just reusing models and using said model guessing what goes there... it's why AI Art programs are so bad at hands and teeth... the models show there are tubes at the end of the longer tubes... but it doesn't relay the information of what a hand or finger is... As for a lot of the doomsday scenarios... people forget that Terminator is a movie... it was designed for entertainment from the same people that considered Computers LITERAL f***ING MAGIC up until a few years ago
|
|
|
Post by Mighty Attack Tribble on Jun 1, 2023 21:36:16 GMT -5
The only way it would lead to anything similar to skynet would be through an obscene amount of human screw ups and malfeasance... which, I mean, you if you know the human race isn't all that crazy a thought, really. Thank god we've got people like this guy spearheading the tech industry, eh?
|
|
|
Post by Starshine on Jun 1, 2023 21:44:51 GMT -5
AI isn't really anything like it is in fiction, it's pretty dumb on it's own; it doesn't learn in the way we understand learning, it just trials and errors sequences to achieve its goal in the most efficient method. I find the people who push the AI doomsday scenarios to generally be shills for PR, while they typically develop their own AI amidst the fear mongering. The only way it would lead to anything similar to skynet would be through an obscene amount of human screw ups and malfeasance... which, I mean, you if you know the human race isn't all that crazy a thought, really. Even if we never obviously get the worst case scenario, AI being used for this stuff and then more pointedly art theft and absolute stifling of what actual creation is, is also very stupid and counterproductive in giving it anything but a shit representation ChatGPT and AI Voice stuff is fun for shitposting but really that's where it should be left imo That art debate I agree on, but I'm also not really worried about it. Things like Dall-E, ChatGPT, and the like come off like present day novelties to the public in general. I don't expect their existence to really hurt artists, especially compared to the mundane things we're familiar with that already hurt them. I'd also argue the people who are most keen on these things aren't spending money on art as it is, so it's currently only a slippery slope debate. Also AI is never going to be able to actually create anything profound or deep, it currently can't even bake up a half decent Beatles song which shouldn't be that hard with all the references it has to work with. The major threat AI offers is loss of jobs through automation, but that's always going to be a thing regardless of where the tech goes. and AI as we have it now is not really like what people think of AI as in novels and stuff... it's mostly just reusing models and using said model guessing what goes there... it's why AI Art programs are so bad at hands and teeth... the models show there are tubes at the end of the longer tubes... but it doesn't relay the information of what a hand or finger is... Just want to add, the fact AI also can't draw decent hands for shit makes it uncomfortably relatable to me.
|
|
|
Post by A Platypus Rave on Jun 1, 2023 22:21:34 GMT -5
Even if we never obviously get the worst case scenario, AI being used for this stuff and then more pointedly art theft and absolute stifling of what actual creation is, is also very stupid and counterproductive in giving it anything but a shit representation ChatGPT and AI Voice stuff is fun for shitposting but really that's where it should be left imo That art debate I agree on, but I'm also not really worried about it. Things like Dall-E, ChatGPT, and the like come off like present day novelties to the public in general. I don't expect their existence to really hurt artists, especially compared to the mundane things we're familiar with that already hurt them. I'd also argue the people who are most keen on these things aren't spending money on art as it is, so it's currently only a slippery slope debate. Also AI is never going to be able to actually create anything profound or deep, it currently can't even bake up a half decent Beatles song which shouldn't be that hard with all the references it has to work with. The major threat AI offers is loss of jobs through automation, but that's always going to be a thing regardless of where the tech goes. and AI as we have it now is not really like what people think of AI as in novels and stuff... it's mostly just reusing models and using said model guessing what goes there... it's why AI Art programs are so bad at hands and teeth... the models show there are tubes at the end of the longer tubes... but it doesn't relay the information of what a hand or finger is... Just want to add, the fact AI also can't draw decent hands for shit makes it uncomfortably relatable to me. I mean it is probably the most relatable thing to most artists too >_> cause Hands are weird.
|
|
|
Post by SsnakeBite, the No1 Frenchman on Jun 1, 2023 22:39:52 GMT -5
It's almost like automating murder and incentivizing the computer to murder indiscriminately is an awful idea that a century of sci-fi has warned us about. There's been a worrying trend in recent years of people way overestimating what new technologies can do and how useful they are out of pure ignorance, and AI is by far the worst yet, which is saying something considering how much damage crypto has done. AI has become the ultimate source of "a computer can do your job" and no matter how many times it's proven wrong, let's face it, managers, CEOs and bureaucrats aren't gonna give a shit. That art debate I agree on, but I'm also not really worried about it. Things like Dall-E, ChatGPT, and the like come off like present day novelties to the public in general. I don't expect their existence to really hurt artists, especially compared to the mundane things we're familiar with that already hurt them. I'd also argue the people who are most keen on these things aren't spending money on art as it is, so it's currently only a slippery slope debate. Also AI is never going to be able to actually create anything profound or deep, it currently can't even bake up a half decent Beatles song which shouldn't be that hard with all the references it has to work with. No need to expect anything, it's already happening. Artists are finding their work being stolen wholesale, signature and all, and claimed by some jackass because somehow it doesn't count if a computer is the middle "man" in the theft and is applying a Photoshop filter. And now we have people trying to sell comics and cartoons made using AI trained on stolen assets. I assure you, it is already doing far FAR more damage than anything before. I would much rather have my art traced by some asshole who's not gonna be able to make anything of it and is gonna get rightfully called out for it than stolen by a glorified image search engine that gives millions of people an undue sense of creative ability. Hell, at least a tracer might learn something in the process and genuinely improve their skills. And it's only gonna get much worse very fast as corporate propaganda is brainwashing people that don't know any better. Hell, even the term "AI" itself is nothing more than an intentionally misleading buzzword meant to imply intelligent thought and creativity, essentially rebranding algorithms because the term "algorithm" has far worse PR since the masses already know how horribly inefficient they are in spite of how hard corporations are pushing them.
|
|
Xxcjb01xX [PIECE OF: SH-]
FANatic
Writer, Lover of all things Wrestling. Analytical, Critical, Lovable (hopefully). Lets all have fun!
Posts: 237,271
Member is Online
|
Post by Xxcjb01xX [PIECE OF: SH-] on Jun 1, 2023 22:50:29 GMT -5
Even if we never obviously get the worst case scenario, AI being used for this stuff and then more pointedly art theft and absolute stifling of what actual creation is, is also very stupid and counterproductive in giving it anything but a shit representation ChatGPT and AI Voice stuff is fun for shitposting but really that's where it should be left imo That art debate I agree on, but I'm also not really worried about it. Things like Dall-E, ChatGPT, and the like come off like present day novelties to the public in general. I don't expect their existence to really hurt artists, especially compared to the mundane things we're familiar with that already hurt them. I'd also argue the people who are most keen on these things aren't spending money on art as it is, so it's currently only a slippery slope debate. Also AI is never going to be able to actually create anything profound or deep, it currently can't even bake up a half decent Beatles song which shouldn't be that hard with all the references it has to work with. The major threat AI offers is loss of jobs through automation, but that's always going to be a thing regardless of where the tech goes. and AI as we have it now is not really like what people think of AI as in novels and stuff... it's mostly just reusing models and using said model guessing what goes there... it's why AI Art programs are so bad at hands and teeth... the models show there are tubes at the end of the longer tubes... but it doesn't relay the information of what a hand or finger is... Just want to add, the fact AI also can't draw decent hands for shit makes it uncomfortably relatable to me. Also writing, trying to use AI bots to generate entire stories and edit them is something a company just put forward on Twitter and got absolutely obliterated for Keep AI in the novelty department, because if you outwardly advertise it as a tool that can try and "Creatively replace" a human? You're DOA
|
|
|
Post by Hit Girl on Jun 1, 2023 22:52:58 GMT -5
A thinking machine should never be developed.
Especially one programmed by sociopaths themselves.
|
|
|
Post by Starshine on Jun 1, 2023 23:10:19 GMT -5
It's almost like automating murder and incentivizing the computer to murder indiscriminately is an awful idea that a century of sci-fi has warned us about. There's been a worrying trend in recent years of people way overestimating what new technologies can do and how useful they are out of pure ignorance, and AI is by far the worst yet, which is saying something considering how much damage crypto has done. AI has become the ultimate source of "a computer can do your job" and no matter how many times it's proven wrong, let's face it, managers, CEOs and bureaucrats aren't gonna give a shit. That art debate I agree on, but I'm also not really worried about it. Things like Dall-E, ChatGPT, and the like come off like present day novelties to the public in general. I don't expect their existence to really hurt artists, especially compared to the mundane things we're familiar with that already hurt them. I'd also argue the people who are most keen on these things aren't spending money on art as it is, so it's currently only a slippery slope debate. Also AI is never going to be able to actually create anything profound or deep, it currently can't even bake up a half decent Beatles song which shouldn't be that hard with all the references it has to work with. No need to expect anything, it's already happening. Artists are finding their work being stolen wholesale, signature and all, and claimed by some jackass because somehow it doesn't count if a computer is the middle "man" in the theft and is applying a Photoshop filter. And now we have people trying to sell comics and cartoons made using AI trained on stolen assets. I assure you, it is already doing far FAR more damage than anything before. I would much rather have my art traced by some asshole who's not gonna be able to make anything of it and is gonna get rightfully called out for it than stolen by a glorified image search engine that gives millions of people an undue sense of creative ability. Hell, at least a tracer might learn something in the process and genuinely improve their skills. And it's only gonna get much worse very fast as corporate propaganda is brainwashing people that don't know any better. Hell, even the term "AI" itself is nothing more than an intentionally misleading buzzword meant to imply intelligent thought and creativity, essentially rebranding algorithms because the term "algorithm" has far worse PR since the masses already know how horribly inefficient they are in spite of how hard corporations are pushing them. I don't really know enough about the evolution of art theft to really comment on that side, but I can see why that could more of a problem now, so fair enough. The thing about Crypto and the like though, that tends to be more a fraud issue than an AI one. People not doing their due diligence and using a bad algorithm as a crutch for their incompetence isn't new. Much like AI we too try to take the route of speed to reach our goals, it's just less out of efficiency, and more out of laziness and/or cost reduction. There's a really good book I recommend called 'Automate This: How Algorithms Came to Rule Our World' by Christopher Steiner that while a little old, is a really good discussion on how algorithms basically were already affecting almost everything in our lives for decades. If you guys want a something to really get a grasp on the positives and negatives in the development of tech in this area, it's better to read this than refer to Terminator or something like that. There's are legitimate concerns, but the doomsday stuff is pretty bunk. Another good book that's on the lighter side is 'You Look Like a Thing, and I love You' by Janelle Shane. It doesn't get as deep as the former, but does give you a pretty good overview of how AI operates and "learns."
|
|
|
Post by Mr PONYMANIA Mr Jenzie on Jun 2, 2023 8:19:10 GMT -5
or just have a 3d model programmed of targets in already STOP BEING CRAPPY
|
|
|
Post by xCompackx on Jun 2, 2023 16:49:06 GMT -5
Yeah, I'm not sure how much of this is "AI taking over the world" as opposed to "cleverly written software obeying the variables it was given". The first example where the drone killed the operator is a pretty logical outcome; you haven't told it to not kill the operator and there's no IFF or anything to identify the operator as a friendly so all the drone's going to see is "this obstacle is preventing me from accomplishing my goal" and it reacted to remove that obstacle.
Then, you tell it to not target the operator, it sees there's still an obstacle in the communications tower preventing it from accomplishing its goal, and removed that obstacle as well.
Cool stuff, but not exactly Skynet.
|
|
|
Post by A Platypus Rave on Jun 2, 2023 17:53:36 GMT -5
Yeah, I'm not sure how much of this is "AI taking over the world" as opposed to "cleverly written software obeying the variables it was given". The first example where the drone killed the operator is a pretty logical outcome; you haven't told it to not kill the operator and there's no IFF or anything to identify the operator as a friendly so all the drone's going to see is "this obstacle is preventing me from accomplishing my goal" and it reacted to remove that obstacle. Then, you tell it to not target the operator, it sees there's still an obstacle in the communications tower preventing it from accomplishing its goal, and removed that obstacle as well. Cool stuff, but not exactly Skynet. Yeah, it's also why they do simulations first and not just jump to well let's attach this to our nuclear pile >_>
|
|