Maybe we do need tin foil hats?
Discussion
This is exciting and interesting science
https://www.theguardian.com/technology/2023/may/01...
But I do wonder whether it will eventually be used to read people's minds without their consent. It is noted that the AI has to be trained for every person, but if given the opportunity to observe people and also hear what they are hearing, presumably it could learn without you knowing? At the moment that's not possible, as they have to sit in a scanner, but if they could develop a way of reading without you knowing you were being read…
I was reading about Geoffrey Hinton, widely regarded as the chief architect of AI, who resigned from Google the other day stating that he regretted his life’s work. Anyone interested can pick up on the article, but the point that struck a chord with me is his fear of what people like Putin would do if they had control of the technology. Proper James Bond stuff and no need for tin foil.
AI has been sneaking its way into everyday life for years, without thought as to consequences. At the moment, it is not intelligent of course, but I get the feeling it's only a matter of time. Oppenheimer's 'I am become death, the destroyer of worlds' was felt to be a bit OTT at the time, but we all love a good aphorism. Now the epithet is more applicable to AI.
I like predictive text, woolly searches and such, but reading minds - that's scary.
Someone suggested he's scared what Putin would do with it. But then there's the US, China, us; that scares me more.
Blue62 said:
I was reading about Geoffrey Hinton, widely regarded as the chief architect of AI, who resigned from Google the other day stating that he regretted his life’s work. Anyone interested can pick up on the article, but the point that struck a chord with me is his fear of what people like Putin would do if they had control of the technology. Proper James Bond stuff and no need for tin foil.
How far along in this AI thing is Russia? Or will wily old Vlad sign up to ChatGPT via a nom de plume and continue along his path of world domination? Or is it "people like" various US intelligence agencies?
DeejRC said:
What the US (or UK) govts would do with advanced AI has never remotely scared me for the simple reason of institutional incompetence.
Very highly competent US commercial companies however are a very different story…
Surely the risk is that governments use AI to become competent.
Derek Smith said:
AI has been sneaking its way into everyday life for years, without thought as to consequences. At the moment, it is not intelligent of course, but I get the feeling it's only a matter of time.
I don't believe it'll ever be intelligent. It will be very, very convincing a lot of the time, but when it gets it wrong it'll be laughable and very damaging. Humans seem to have a habit of offloading vital things to systems that aren't ready. In this case caution should be uppermost in our minds, but there'll be less of it than ever, as it's all sparkly and cheap.
What they're doing is sort of analogous to power analysis attacks on a crypto processor: looking at patterns of brain energy use to see what's busy, and associating those patterns with the known text used as training data.
This is a very crude system though, non-transferable between different people and needing long, detailed training with a friendly subject and lots of material similar to what you're trying to 'read'. The AI bit is nothing too clever.
Sounds good for dealing with some disability but general purpose 'mind reading' ain't happening with anything like this so leave the tinfoil in the kitchen.
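To illustrate the analogy above, here's a minimal toy sketch of correlation-based template matching, the same basic idea behind both power-analysis side channels and this kind of decoder. Everything here is synthetic and illustrative (the word list, the fake "activity patterns", the function names are all made up for the sketch); it just shows why templates recorded from one subject don't transfer to another:

```python
# Toy template-matching decoder. Synthetic data throughout; this is an
# illustration of the general technique, not the system in the article.
import random

WORDS = ["hello", "world", "mind", "reading"]

def subject_signature(word, subject_id):
    # Each subject produces a different "activity pattern" for the same
    # word -- which is why a decoder trained on one person doesn't
    # transfer to another. A string seed keeps this deterministic.
    rng = random.Random(f"{word}-{subject_id}")
    return [rng.gauss(0, 1) for _ in range(16)]

def noisy(signal, noise=0.3):
    # Add measurement noise to a recorded pattern.
    return [x + random.gauss(0, noise) for x in signal]

def correlation(a, b):
    # Pearson correlation between two equal-length signals.
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    num = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    da = sum((x - ma) ** 2 for x in a) ** 0.5
    db = sum((y - mb) ** 2 for y in b) ** 0.5
    return num / (da * db)

def decode(observation, templates):
    # Guess the training word whose template best matches the signal.
    return max(templates, key=lambda w: correlation(observation, templates[w]))

# "Train" on subject A: record one template per known word.
templates_A = {w: subject_signature(w, subject_id=1) for w in WORDS}

# Decoding subject A's (noisy) signal tends to work...
print(decode(noisy(subject_signature("mind", 1)), templates_A))

# ...but the same templates give essentially random answers for subject B.
print(decode(noisy(subject_signature("mind", 2)), templates_A))
```

The "long detailed training with a friendly subject" in the post corresponds to recording the templates; the "lots of material similar to what you're trying to read" is the limitation that the decoder can only ever output words it has templates for.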