Recently, I came across a Reuters article with the headline: “U.S. scientist operates colleague’s brain from across campus,” which, as you may suspect, piqued my interest.
All sorts of wonderful and horrible fantasies were triggered, ranging from professors being able to lecture to their classes without preparation, to students taking exams for each other. However, after actually reading the content, which began with the claim that they had achieved the first “mind-meld,” I found a much more mundane but still potentially exciting scenario.
Scientist A, wearing a cap with electrodes, sent his brain’s electrical signals to a colleague on the other end of campus. His colleague, scientist B, wearing a similar cap, received the signals directed to his left motor cortex, which controls right-hand movement. When scientist A imagined moving his right hand to press the space bar on his keyboard, scientist B involuntarily moved his right index finger in response. While this hardly qualifies as a bona fide “Star Trek” mind-meld, the scientists are hopeful that this technology, when fully developed, could be put to productive use, offering the example that “it might one day be harnessed to allow an airline pilot on the ground to help someone land a plane whose own pilot is incapacitated.”
As I previously mentioned, this aroused my interest, and further research led me to an interesting destination on the Web where one can view a two-part video hosted by Alan Alda (Hawkeye himself). Actually, Alda’s presentation is excellent, combining thoughtful interview and analysis in a well-constructed PBS video with just a soupçon of mischief.
The premise of the video is a mock trial of an attempted robbery and shooting that raises questions about law, neuroscience and privacy. As the trial progresses, Alda breaks in to examine a new technique in neuroscience, focusing on the privacy issues it creates, particularly for the legal profession.
As background, we first learn about the technology that makes all of this possible: the fMRI (and no, that’s not the stock symbol for a new form of government-backed mortgages). It stands for “functional MRI,” and if you’ve had an MRI, you may already know that it is magnetic resonance imaging and doesn’t hurt at all (unless you’re claustrophobic like me). The “functional” part comes in because the fMRI can show which locations in the brain are working hard, and researchers are mapping the brain to match each unique function with its precise location. For example, the fusiform face area of the brain has the job, or function, of recognizing and categorizing faces. If a person suffers brain damage in that specific area, they will have difficulty recognizing faces. Most important for this discussion, if the fusiform area lights up (shows activity, as evidenced by increased blood flow) during an fMRI, we can reasonably surmise that the subject is in face-recognition mode.

This is one school of thought in current neuroscience; the other important school holds that while functional areas exist, they are not so centralized, and several locations may actually collaborate. In either case, the scientist’s goal is to correlate active, physical parts of the brain with human behavior, and great progress is being made.
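For readers who like to see an idea in code, here is a toy sketch of what “correlating brain activity with behavior” means in the simplest statistical sense. Everything below is invented for illustration (the numbers are not real fMRI data, and real analyses are far more sophisticated): we compare a simulated signal from the fusiform face area against a record of when faces were shown, using an ordinary Pearson correlation.

```python
def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var_x = sum((x - mean_x) ** 2 for x in xs)
    var_y = sum((y - mean_y) ** 2 for y in ys)
    return cov / (var_x * var_y) ** 0.5

# Task design: 1 at time points when the subject views a face, 0 otherwise.
task = [0, 0, 1, 1, 0, 0, 1, 1, 0, 0]

# Made-up blood-flow signal from the fusiform face area:
# it rises whenever faces are shown (plus a little noise).
fusiform = [1.0, 1.1, 2.9, 3.1, 1.2, 0.9, 3.0, 2.8, 1.1, 1.0]

# Made-up signal from an unrelated region: no relationship to the task.
unrelated = [2.0, 1.9, 2.1, 2.0, 1.8, 2.2, 2.0, 1.9, 2.1, 2.0]

print(round(pearson_r(task, fusiform), 2))   # close to 1: region "lights up" with faces
print(round(pearson_r(task, unrelated), 2))  # near 0: no link to the task
```

A strong correlation between a region’s activity and a behavior is exactly the kind of evidence that lets researchers say “this area lights up for faces,” while the second, near-zero result shows what an uninvolved region looks like.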
Using the above as the backstory, the video proceeds to raise the following questions:
The process of eyewitness identification is murky at best; the jury must decide whether the prosecutor or the defense attorney makes the better case. What if fMRI technology could examine the relevant brain area to determine what the witness actually saw? What if it could decide whether the witness or the defendant was lying? What if it could detect which members of the jury were showing racial bias? And even if a person, say the defendant, voluntarily submitted to the fMRI procedure, what about the Fifth Amendment to the Constitution, which protects a witness from being compelled to testify against themselves?
All interesting and provocative questions. If neuroscience technology evolves to the point where we literally can read another person’s mind, what are the implications for society? Will it destroy society as we know it, will we pass more regulations to protect privacy, or will we learn to live in a fishbowl?
Dr. Stewart A. Denenberg is an emeritus professor of computer science at Plattsburgh State, retiring recently after 30 years there. Before that, he worked as a technical writer, programmer and consultant to the U.S. Navy and private industry. Send comments and suggestions to his blog at www.tec-soc.blogspot.com, where there are additional text and links. He can also be reached at firstname.lastname@example.org.