So, let me get this straight... This article is warning that people could figure out your PIN by sticking a brain scanner on your head and showing you numbers until they pick up a specific recognition signal. For anyone worried about personal data being revealed this way, I have a very effective counter-measure: don't let suspicious people stick brain scanners on your head and ask you questions.
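For what it's worth, the attack the article describes boils down to a simple loop: flash each candidate digit, record the brain response, and pick the digit that evokes the strongest "recognition" bump. Here's a toy sketch of that idea in Python. To be clear, the EEG here is entirely simulated (the `SECRET_DIGIT`, the noise model, and the response bump are all my own made-up assumptions); real detection of this kind of signal needs actual hardware and far more signal processing.

```python
import random
import statistics

SECRET_DIGIT = 7  # the digit the simulated "victim" recognises (assumption)

def simulated_response(digit, trials=50):
    """Return the averaged simulated amplitude for one flashed digit.
    A recognised digit gets a small bump on top of baseline noise."""
    bump = 1.0 if digit == SECRET_DIGIT else 0.0
    samples = [random.gauss(0, 1) + bump for _ in range(trials)]
    return statistics.mean(samples)

def guess_digit():
    """Flash each digit 0-9 and pick the strongest average response."""
    responses = {d: simulated_response(d) for d in range(10)}
    return max(responses, key=responses.get)

print(guess_digit())  # with enough trials, this usually recovers SECRET_DIGIT
```

Averaging over many trials is the whole trick: one flash is buried in noise, but fifty flashes of the same digit let the recognition bump stand out. That's also why the counter-measure above works so well.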
Oh oh! Sounds like maybe we should get the Bat-signal dusted off! [ame="http://www.youtube.com/watch?v=qJit_tLOV7E"]Jim Carrey as The Riddler (Best Bits) - YouTube[/ame]
I actually saw something similar in my cognitive development class, where the prof showed us new research along the same lines, albeit with a far more expensive set-up. Without being able to see the screen or the subject, only the readings, the investigators were able to discriminate between different shapes, simple objects, and a couple of other basic categories.
Well, I think the article is quite specific about where the danger is =). BCIs are progressively being built into videogames and other forms of entertainment. Some day they will be commonplace, and then you won't need a stranger to plug a brain scanner into your head. There is a department at my uni that deals with side-channel attacks, and my own is more or less involved with BCI, so I'll see if I can get the two to talk and give me their opinion on this.
You're right, some day they will be, so if you're ever wearing one and four-digit numbers start flashing up on your TV, best to take it off. I just think the headline about 'hacking' the brain is more than a touch sensationalist. Frankly, someone who can read body language well could probably do the same thing without the scanner.
Oh of course it's sensationalist, there is not much they can read so far. Still, it's an exciting field with a great future.
Definitely, and I'm really excited about the potential of it. It just annoys me when various media try to make it appear sinister or dangerous. :bang:
Yep. There is a big problem with funding too, especially in the field of robotics. One of the PhD students at my uni is working on a very exciting project, trying to understand from a functional point of view what certain emotions mean and apply that to machines. She's making great progress, but when you show a potential investor an integrated circuit and say it feels pain or love, it doesn't have much of a punch. However, show them a brainless animatronic robot that looks funny and can smile, blush, or look sad, and they'll throw their money at you. It ****es me off a little because it reinforces the wrong line of research, methinks.
If someone wants your PIN they'll probably just threaten you with violence until you give it to them. Like they do already.
Will they be able to do "Inception"? [ame="http://www.youtube.com/watch?v=HPH2T9o_Dz8"]Inception - YouTube[/ame]
The minor difference being that a perfect poker face isn't sufficient. Which of course is not a minor difference, it's a major one. It's essentially a better lie detector test for interrogation. There's also the issue that it processes facial recognition, not just numbers, and I'm sure future research will successfully identify many other forms of neural recognition.
I expect that's because the IC doesn't actually feel pain or love, it just synthesises a behavioural approximation.
So do you think it is impossible for technology to achieve emotions, thought, or possibly sentience? What are your criteria for 'real' in this case?
I think Star Trek pretty well answered these questions. [ame="http://www.youtube.com/watch?v=GYp2dx652ho"]http://www.youtube.com/watch?v=GYp2dx652ho[/ame]
Well if you subscribe to the idea that we are machines, and you agree that we are sentient, then the answer is: of course artificial machines could be made to 'think', whatever that means. But we are nowhere near the point where we could do it and every attempt at doing it has fallen flat.