The Closing of the Scientific Mind

Pic: Repdan (CC)

Yale Professor of Computer Science David Gelernter thinks that science has become an “international bully”. You may recall that Gelernter was severely injured after receiving a mail bomb from Ted Kaczynski. One wonders what an open dialogue between these two would have been like had Kaczynski chosen a more peaceable tactic for his activism.*

Via Commentary:

The huge cultural authority science has acquired over the past century imposes large duties on every scientist. Scientists have acquired the power to impress and intimidate every time they open their mouths, and it is their responsibility to keep this power in mind no matter what they say or do. Too many have forgotten their obligation to approach with due respect the scholarly, artistic, religious, humanistic work that has always been mankind’s main spiritual support. Scientists are (on average) no more likely to understand this work than the man in the street is to understand quantum physics. But science used to know enough to approach cautiously and admire from outside, and to build its own work on a deep belief in human dignity. No longer.

Belittling Humanity.

Today science and the “philosophy of mind”—its thoughtful assistant, which is sometimes smarter than the boss—are threatening Western culture with the exact opposite of humanism. Call it roboticism. Man is the measure of all things, Protagoras said. Today we add, and computers are the measure of all men.

Many scientists are proud of having booted man off his throne at the center of the universe and reduced him to just one more creature—an especially annoying one—in the great intergalactic zoo. That is their right. But when scientists use this locker-room braggadocio to belittle the human viewpoint, to belittle human life and values and virtues and civilization and moral, spiritual, and religious discoveries, which is all we human beings possess or ever will, they have outrun their own empiricism. They are abusing their cultural standing. Science has become an international bully.

Nowhere is its bullying more outrageous than in its assault on the phenomenon known as subjectivity.

Keep reading at Commentary.


*Introduction to this excerpt written by editor (MS).


  • gustave courbet

    While I don’t entirely agree with his assessments, it’s good to see someone in the intelligentsia taking a shot at what Gelernter describes as the ‘Kurzweil Cult.’ Unsurprisingly, most transhumanist advocates are scientific materialists who have never applied empiricism to the investigation of their own consciousness, and are thus crippled by their partial perspective. As MLK said, “The means by which we live have outdistanced the ends for which we live. Our scientific power has outrun our spiritual power. We have guided missiles and misguided men.”

  • Calypso_1

    1. You can transfer a program easily from one computer to another, but you can’t transfer a mind, ever, from one brain to another.
    >>>Try loading a current OS on a Commodore 64. Not only are prosthetic hippocampi being tested in rats & primates; artificial memories have been created in fully biological mice with optogenetics.

    2. You can run an endless series of different programs on any one computer, but only one “program” runs, or ever can run, on any one human brain.
    >>>BS – The brain uses a myriad of programs and generates new ones. There are trillions of separate circuits w/ totally independent function.

    3. Software is transparent. I can read off the precise state of the entire program at any time. Minds are opaque—there is no way I can know what you are thinking unless you tell me.
    >>> Reverse engineering is fun! Recording and imaging of active networks of up to 1000 neurons was achieved in 2013 in living mice.

    4. Computers can be erased; minds cannot.
    >>>Data remanence is real. Amnesia is real. Individual memories have been successfully erased from mice.

    5. Computers can be made to operate precisely as we choose; minds cannot.
    >>> 1) Precise is relative. 2) Windows. 3) This -> http://www.youtube.com/watch?v=PhRa3REdozw

    • Adam’s Shadow

      “Computers can be made to operate precisely as we choose; minds cannot.”

      I don’t understand how anyone, whether an average consumer or a programmer, could think that on a practical level computers (which in this age also include smart phones) ever operate precisely as intended. Telecommunications networks in particular are especially subject to the whims of weather.

      • Calypso_1

        And squirrels; remote terminal boxes are great places for nuts.

      • Lookinfor Buford

        Actually no, Adam. The quote did not say “intend”, it said “choose”. You’re correct, it is highly improbable for any given program to work as intended. But this is not a mystical phenomenon; it’s perfectly identifiable.
        There’s an old cliché in coding (proper quote forgotten, so here’s a paraphrasing): Software won’t do what it’s supposed to do, because it only does exactly what we tell it to do.
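        A toy Python sketch of that cliché (an illustrative example, not from the original thread; the function and numbers are made up): the author intends the sum of 1 through n, but the program does only what it was literally told.

        ```python
        def sum_up_to(n):
            """Intended: return 1 + 2 + ... + n."""
            total = 0
            for i in range(n):   # bug: range(n) yields 0..n-1, so n itself is never added
                total += i
            return total

        print(sum_up_to(5))  # prints 10, not the intended 15: no mystery, just instructions followed
        ```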

        • Adam’s Shadow

          I did not intend it to be a reflection on something mystical, just a comment on the unpredictability inherent in everything, even something as supposedly exact as computer science and information technology.

    • kowalityjesus

      ARRRGGH, that may have been an awesome comment, but the icing on the cake was BACH!!!!!!!XD

    • Lookinfor Buford

      >>>Try loading a current OS on a Commodore 64. Not only are prosthetic hippocampi being tested in rats & primates; artificial memories have been created in fully biological mice with optogenetics.
      Windows on a Commodore is just a port; his statement is correct. Prosthetic organs do not change the fact that the program in one mind cannot be transferred to another.

      >>>BS – The brain uses a myriad of programs and generates new ones. There are trillions of separate circuits w/ totally independent function.
      This is the equivalent of multi-threading/multi-core, etc. I believe his statement was valid in that he means your mind’s forefront, the outward projection of it, at any given time is singular. You can do two things at once, but the ability to do that is a result of your final consolidation of tasks, resulting in an action.
      You missed his point: another individual’s mind cannot simultaneously exist parallel to yours, in your brain, whereas on a computer, completely foreign programs, which produce distinct actions deriving from distinct sets of processes, can.

      3. Software is transparent. I can read off the precise state of the entire program at any time. Minds are opaque—there is no way I can know what you are thinking unless you tell me.
      >>> Reverse engineering is fun! Recording and imaging of active networks of up to 1000 neurons was achieved in 2013 in living mice.

      Recording and imaging (i.e., mapping) is not equal to understanding the thoughts of these mice; it yields only vague interpretations. He did say ‘precise’ state, which is accurate in computing.

      4. Computers can be erased; minds cannot.
      >>>Data remanence is real. Amnesia is real. Individual memories have been successfully erased from mice.

      Agreed, but he’s speaking in the context of falsifying the analogy between mind and computer. You cannot erase one’s memories or thoughts without basically destroying the mind, and you certainly can’t do it as easily and precisely as you can with a computer.

      5. Computers can be made to operate precisely as we choose; minds cannot.
      >>> 1) Precise is relative. 2)Windows. 3) This -> http://www.youtube.com/watch?v
      Did Bach have any stray thoughts while his heavily conditioned motor skills carried him through this performance? Any emotions? Were these thoughts and emotions precisely the same as the last time he performed it?
      In a computer program accomplishing the same feat, they would be.

    • Lookinfor Buford

      1. You can transfer a program easily from one computer to another, but you can’t transfer a mind, ever, from one brain to another.
      >>>Try loading a current OS on a Commodore 64. Not only are prosthetic hippocampi being tested in rats & primates; artificial memories have been created in fully biological mice with optogenetics.

      Windows/Unix on Commodore is just a (relatively) simple port. Prosthetic components of the brain do not allow for the transfer of one person’s programming to another. Brain transplants are pretty much a dead end because of the inability of science to figure out how to transfer consciousness.

      2. You can run an endless series of different programs on any one computer, but only one “program” runs, or ever can run, on any one human brain.
      >>>BS – The brain uses a myriad of programs and generates new ones. There are trillions of separate circuits w/ totally independent function.

      This is the equivalent of multi-threading/multi-core, but at any given moment you can only produce one coherent result from all these processes. But I think his point was one of utility: your brain will never support someone else’s mind, whereas computers are utilities for any program. Think of it as your identity being your one and only program; you can change it, modify it, but not replace it (notwithstanding severe trauma and relearning, but that is not an intended utilization of the brain).

      3. Software is transparent. I can read off the precise state of the entire program at any time. Minds are opaque—there is no way I can know what you are thinking unless you tell me.
      >>> Reverse engineering is fun! Recording and imaging of active networks of up to 1000 neurons was achieved in 2013 in living mice.

      Imaging and recording do not equal understanding the state. We can do these things, but the result is entirely vague and subjective, and it is impossible to accurately and definitively explain the state. The same is not true of computing.

      4. Computers can be erased; minds cannot.
      >>>Data remanence is real. Amnesia is real. Individual memories have been successfully erased from mice.

      Ok, fine. But you can’t do it consistently, or without damaging other facets you did not intend to damage. Furthermore, the mind has defense mechanisms that would not consent, whereas computers do not.

      >>> 1) Precise is relative. 2) Windows. 3) Bach

      Precise is not relative. Did Bach have any stray thoughts while his finely honed motor skills carried him through this performance? Any emotions? Were these thoughts and emotions precisely the same as the last time he performed it? They would be, in a computer accomplishing the same feat.

      • Calypso_1

        1 – a) Windows 7 on a Comm-64: show me. You can emulate the reverse with ease.
        b) If you can’t follow incremental steps of progress, you are completely unqualified to even contemplate the process. Memory is part of a person’s programming, and that is in the early stages of being hacked. The brain/consciousness divide is moot. No one knows; at this time, no one can know. Anyone who says otherwise is deluded. We may find ‘consciousness’ is not any one thing, or that it transfers quite readily – or not. It may be emergent from any sufficiently complex cybernetic system. Or it may be non-local.
        2 – You do not know if a brain can support another mind or not. Much of the ‘mind’ is hardwired. In that sense a computer cannot support another ‘mind’ either without replacing parts. A better analogy to software is ideas. If another mind has an idea (music, literature, an equation) and another hardware mind is capable of running that idea, then you have indeed run another software ‘mind’. The idea of identity being a ‘one and only’ program is how people end up with extremely inflexible minds. An identity is made of many components, many of which have little to do with each other beyond sharing the same body. We all have multiple dissociative identities; it’s simply a matter of degree.
        3 – Your comment allows me to discern the level of knowledge you have of both neuro & computer sciences. Refer back to b), 1st sentence.
        4 – Consistently? In an interventional state, absolutely. Both short- and long-term memory can be stopped. Computers don’t have defenses? …
        5 – Precise is utterly relative. Precision with a 10 lb sledgehammer v. a scalpel. Miles v. mm.
        Did Bach have stray thoughts? Undoubtedly. I promise you that a computer accomplishing the same feat could not possibly do so without variation in processes. The more complex the system, the more variation and regulation must occur.

        • Lookinfor Buford

          1-a) meh, why?

          1-b) Agreed. True, no one knows. But the reality is we can ‘hope’ or ‘speculate’ all day about a breakthrough; there is currently zero understanding of it, and zero chance that our current capabilities will create it (consciousness). And I am qualified to say that.

          2. Again, blah blah – what if this – you don’t know that, blah. Really, I love your out-of-the-box stuff, I really do, but again, we are no closer to emulating human thought with silicon than we were in the days of Alan Turing. The best we have is very complex algorithms that can mimic behaviors.

          3 – No it doesn’t. You have no idea what my qualifications are. If you think snapping an image of a monkey’s brain and recognizing patterns and connections to behavior is a complete understanding of the state of the system, you are again wrong. And what I said is correct: this absolutely can be achieved in ALL computing applications today, even the most complex systems.

          4 – Wut? No. And you’re wrong. You can’t identify specific memories and erase them, and do this with any sort of accuracy. Neither can neurosurgeons.

          5 – Context != Relativity.

          “I promise you that a computer accomplishing the same feat could not possibly do so without variation in processes.”
          AND HERE is your most incorrect statement of all. Absolutely 100% false. Every single thing the computer does with today’s tech, in order to play Bach, is predetermined, a finite state machine, and there would be absolutely zero deviation from its programming. Zero.
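          For concreteness, a toy sketch of that determinism (illustrative only; the four-note "score" is invented): a trivial finite state machine that steps through a fixed sequence and produces identical output on every run.

          ```python
          # A minimal deterministic finite state machine: the state is the position
          # in the score, and each transition emits a note and advances the state.
          NOTES = ["C", "E", "G", "C5"]   # hypothetical four-note phrase

          def play(notes):
              state = 0
              output = []
              while state < len(notes):
                  output.append(notes[state])
                  state += 1
              return output

          assert play(NOTES) == play(NOTES)   # identical on every run: zero deviation
          print(play(NOTES))
          ```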

          • Calypso_1

            1) Because you said it was possible.
            2) Behavior is algorithms: light.move toward, dark.move away. Build in complexity, add feedback (a toy sketch follows after this list).
            3) Please.
            4) Ever watched a brain surgery? Watched as specific memories are triggered by an electrical impulse? Up the amps.
            5) OK.
            6) Define “play.” As in play an audio sample? As in hardware specifically designed only to play audio, it is close to possible. Even in this scenario there are active audio systems that can determine optimum acoustics for a given environment; this produces variance just as a human would in making adjustments. Playback on a machine such as a PC will not produce zero deviation in process, as the operating system is performing other tasks. If you actually want to play it via a cybernetic/instrument interface there will be variation.
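            A toy sketch of the point-2 rule, “light.move toward, dark.move away,” with feedback (the sensor model, light position, and step rule are all invented for illustration): the rule is a fixed algorithm, yet each move depends on what was just sensed.

            ```python
            def sense(position, light_source):
                """Brightness falls off with distance from the light."""
                return 1.0 / (1.0 + abs(position - light_source))

            def step(position, light_source):
                here = sense(position, light_source)
                ahead = sense(position + 1, light_source)
                # feedback: compare what was just sensed, then move toward the light
                return position + 1 if ahead > here else position - 1

            position, light_source = 0, 10
            for _ in range(15):
                position = step(position, light_source)
            print("final position:", position)   # ends hovering around the light at 10
            ```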

    • Kevin Leonard

      I would also challenge the assertion that the optogenetics/mice experiment is implanting false memories. Rather, it may be explained as an advanced stimulus/response.

      • Calypso_1

        I don’t disagree. However, one could also say the same thing about many of our own natural memories, which have been shown to be constantly modified or entirely constructible. Fundamentally, all that constitutes ‘memory’ is unknown. It is undoubtedly an amalgamation of many functions, but it cannot be a true representation of actual events given the inherent limitations of perception, functional cognitive biases, and the tremendous amount of neural filtering.
        Now if you get into eidetic memory, it’s crossing over into a ‘recording’ of sense perception. This appears to be one aspect of memory function in the brain. It is not one that is readily available as a primary memory attribute. There is no reason to believe that the fundamental encoding of these perceptions will not be deciphered at some point. Many steps have already been made. That we can store exact audio and visual representations in other media hints that very detailed depictions of reality could be transferred to biological format once they are decoded.

        • Kevin Leonard

          I will not argue against imperfect memories and our manipulations of them, nor that we are essentially clueless about what constitutes memory.

          However, I have seen no evidential reason to believe that the “media” of the brain/mind will ever be decoded. I understand we have been “imaging active networks,” which, to me, amounts to little more than monitoring the active paths of electrical circuits in a radio receiver. It does not describe the content of what is passing through the circuits in a completely meaningful way. Or perhaps an appropriate metaphor might be that it is like trying to appreciate Mozart (or Bach) by viewing a spectrogram. It does not translate.

          • Calypso_1

            Meaningful content has been deciphered in areas such as motor functions, visual perception, speech recognition, temporal encoding, and many other subprocesses (coupled neural oscillators) below perceptual thought. It’s called neural decoding. That you haven’t seen the evidence I can’t account for, other than that it takes a certain level of knowledge to understand it. It’s all around. There are unknowns. There will be new unknowns and limits and things never dreamed of knowing. That’s the nature of the endeavor. The brain doesn’t use a single way to process information. Discovering one way will no doubt reveal other mysteries.
            There are plenty of people who can appreciate music from a spectrogram – I can. Much of modern music production is done on nothing but spectrograms. I can rapidly make mental extractions of phase into a 3rd plane just by watching a 2d wave – so can engineers watching an oscilloscope. It’s just a different level of appreciation, but you can still discern in music the nature of order, complex interactions, harmonies, rhythms, timbres – even genius. It does translate. One just needs to know the languages involved.

          • Kevin Leonard

            Well, there is an ontological discussion in there, which is why I prefaced “meaningful” with “completely.” Your experience of extracting meaning from spectrograms is far removed from my experience of listening to Mozart. I will be impressed when someone can take raw visual data from a spectrogram and turn it back into music that can be appreciated by an untrained person. That is what I meant by my “translating” comment. Something is always lost in translation.

            There was an underlying invitation to provide evidence about the decoding of the media of the brain. That I have not seen it likely has more to do with my not being exposed to it than with my ability to comprehend it – thank you very much.

            Even something as advanced as cochlear implants and the ability of deaf children to learn to communicate does not mean that their experience of acoustic data is the same as that of a “natural hearing” person. And I find neuroplasticity to be detrimental to “brain-as-computer” metaphors. Something else seems to be happening, to me, rather than “stimulate x, get y.” We are not binary creatures.

            Likewise, even if sensory memory in the brain were perfectly recordable and translatable to another person, the subjective experience of that data would be completely different. For instance, let us suppose that we could translate all sensory data from one person to another, and for fun’s sake, let us suppose that we could do that ex post facto, and that we translated the proverbial moment that Archimedes stepped in the bathtub. It is highly unlikely that someone else would have the same Eureka moment.

          • Calypso_1

            “Ontological” discussions, in my experience, are often an escape into vagaries. I do not know what you envision as a framework for such a direction. If the intent is to define the basis for organizing the necessary knowledge frameworks, processes, and so on, it becomes something functional.

            I’m not sure what you mean by saying you would want to see someone transform a spectrogram back into music. That is done all the time digitally. The data represented in a spectrogram is merely a graph. If all you have is the graph printed out on paper you are losing a dimension of data. Much of it can still be extrapolated with mathematical techniques and turned back into sound. But what you are asking is no different than placing a Mozart score in front of you when you don’t know how to read music. It contains the information, but you are not trained in reading it. No different than literacy. It’s a translation of information media.
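            For concreteness, a small sketch of that round trip (a synthetic 440 Hz tone stands in for real audio, and NumPy/SciPy are assumed dependencies): the complex STFT inverts back to the waveform almost exactly, whereas a printed spectrogram keeps only the magnitudes, so the phase, the lost dimension at issue, would have to be estimated (e.g., by Griffin-Lim).

            ```python
            import numpy as np
            from scipy import signal

            fs = 8000                            # sample rate in Hz (arbitrary for the demo)
            t = np.arange(fs) / fs               # one second of samples
            x = np.sin(2 * np.pi * 440 * t)      # a 440 Hz sine "note"

            # Forward: waveform -> complex spectrogram (the graph plus phase)
            f, seg_t, Zxx = signal.stft(x, fs=fs, nperseg=256)
            # Inverse: complex spectrogram -> waveform
            _, x_rec = signal.istft(Zxx, fs=fs, nperseg=256)

            # Reconstruction error is tiny when phase is retained
            print("max reconstruction error:", np.max(np.abs(x - x_rec[:len(x)])))
            ```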

            As to presenting evidence: alright, you could have searched for neural decoding. I’ll break it down a little further. Visual decoding: http://neurosurgery.washington.edu/Lectures/science.1234330.full.pdf

            http://newscenter.berkeley.edu/2011/09/22/brain-movies/

            http://www.youtube.com/watch?v=FLb9EIiSyG8

            Speech recognition:
            http://www.plosbiology.org/article/info:doi/10.1371/journal.pbio.1001251

            http://www.bu.edu/npl/research/neural-prosthetics-for-speech-restoration/

            Motor/sensory:

            http://gizmodo.com/darpas-crazy-mind-controlled-prosthetics-have-gotten-e-510649096

            http://www.nature.com/srep/2013/130228/srep01319/full/srep01319.html?WT.ec_id=SREP-631-20130301

            Tactile/cortical oscillators: http://www.pnas.org/content/94/21/11633.full.pdf

            I do not find neuroplasticity detrimental to such metaphors. Look up how artificial neural networks learn, or look at dynamic reconfigurable processors.
            None of this implies we have to be binary creatures. Computers don’t have to be binary. Check out DNA-based computing.
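            A toy sketch of how an artificial neural network “learns” by reweighting its connections (a tiny NumPy net trained on XOR; the architecture, seed, and learning rate are arbitrary illustrative choices, not a model of the brain):

            ```python
            import numpy as np

            rng = np.random.default_rng(0)
            X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
            y = np.array([[0], [1], [1], [0]], dtype=float)        # XOR targets

            def sigmoid(z):
                return 1.0 / (1.0 + np.exp(-z))

            def with_bias(a):
                return np.hstack([a, np.ones((a.shape[0], 1))])    # append a constant input

            W1 = rng.normal(size=(3, 4))    # input + bias -> 4 hidden units
            W2 = rng.normal(size=(5, 1))    # hidden + bias -> output

            for _ in range(10000):
                h = sigmoid(with_bias(X) @ W1)            # hidden-layer activity
                out = sigmoid(with_bias(h) @ W2)          # network output
                d_out = (out - y) * out * (1 - out)       # output error signal
                d_h = (d_out @ W2[:-1].T) * h * (1 - h)   # error backpropagated to hidden layer
                W2 -= 0.5 * with_bias(h).T @ d_out        # learning = nudging connection weights
                W1 -= 0.5 * with_bias(X).T @ d_h

            print(np.round(out, 2).ravel())   # typically converges toward [0, 1, 1, 0]
            ```

            Nothing in the sketch is binary; the “knowledge” lives in continuously adjusted weights, which is the loose analogy to plasticity being gestured at.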

            When does something become subjective? How many components are actually in a memory like the Eureka moment? Not just the bathtub; you would need every bit of insight that led to it and more.
            What if you could detect the neurochemical/structural/energetic imprint of a gestalt-type experience in memory? You could impose that on any experience… whether any insight was truly there or not. You would feel like there was. If, however, you could implant information (or access to it) at that point….

          • Kevin Leonard

            My apologies for my vague use of the word “ontological.” I feel any discussion journeying into the nature of meaning to be inextricably linked to a discussion on the nature of being. I had forgotten the other meaning of the word.

            “The data represented in a spectrogram is merely a graph. If all you have is the graph printed out on paper you are losing a dimension of data.” Precisely. That is why I feel my analogy stands. IMO, any neural decoding will lose a dimension of data.

            I did search “neural decoding” which led to my analogies. But why survey all of the literature when your informed pointers could get to the heart of the matter? ;) Thank you for the links. I have not looked at them, yet, but will. I do find this a fascinating topic.

            I agree with the sentiment of your last two paragraphs. You are expressing the subtext of my comment, though I am highly skeptical one could impose the gestalt type of experience you implied. I don’t actually want to dismiss the possibility that “given sufficient complexity” a non-biological system might one day hold human-originating consciousness. I must hold the idea that it is possible, for it supports my long held position that the mind is not tied to the brain.

            The irony is that I keep finding myself wanting to use language I hear from skeptical folks discussing non-material reality… “I’ll believe it when I see it.” “I’ll change my tune when I put on that ‘Brainstorm’ type of helmet and experience others’ senses and memories, but until then…”

            When does something become subjective? Indeed. Discussions with mystics who have had sustained non-duality experiences have yielded reports that they have perceived the body, along with all of its ‘mechanical’ processes, including its behavioral patterning, going about its day, interacting with its environment and others, while “they” were somewhere else. Their I AM was experiencing itself as something far greater than the bodily/behavioral processes. They report this experience as unsustainable, to which they have a choice: leave the body behind, or come back as a type of bodhisattva. Taking Huxley’s “reducing valve” metaphor, I can envision that an advanced computer, as hinted at, could act as a larger valve, enabling more of the “Self” to be present in material form.

            I do, however, think this being’s experience would be qualitatively different than that of a purely carbon-based creature, so I maintain, for now, my position that we cannot replicate and transfer an individual experience to another. Also, I would hesitate to call this being “Homo sapiens.”

          • Kevin Leonard

            Having examined the links you’ve provided, I understand more clearly the “hints” you referred to. Advances are clearly being made in eidetic decoding. Some of the studies, naturally, are more impressive than the others. I found the “Neural Decoding of Visual Imagery During Sleep” very compelling, as it is working with data gleaned from further along the visual processing pathways than the Cat study. But it raises other questions about the nature of dreams and the nature of the hypnagogic state. Does this relate to memory? to creative thought? to neuron firing after some “transfer phenomena” (such as in the recent study confirming that video games create “hallucinations”)?

            The work with the locked-in patient (Ramsay) is downright inspiring, as are the robotic prosthetics. But even those with an actual neural interface are not interpreting data driven by the brain. It is the brain (and the driver of that brain) that is adjusting to the device. I don’t think they are in the same category of neural decoding that lends itself to shared experiences or “mind reading” by computers.

            The “brain to brain interface” working the M1 cortex was impressive, but still, is it really anything more than advanced stimulus/response?

            I concede that the “Tactile/cortical oscillators” study was beyond my understanding without significant extra study.

            I also did some reading on dynamic reconfigurable processors. And I do want to make it known that I view the material world as mathematical, so perhaps, one day…

            … but I wonder… in our perfect modeling of brain processes, will we model our imperfections as well? Or will we transcend them? Will memories become infallible using eidetic interfaces and storage systems? And the perennial question: what does it say about who, or what, we are?

            Thanks again for the shares.

          • Lookinfor Buford

            Geordi’s visor.

  • kowalityjesus

    Anything “established” must maintain some kind of facade. That’s why all the real MCs are underground.

  • heinrich6666

    This is just the usual scientists’ fantasy, but moderated to appear more reasonable. “Scientists have acquired the power to impress and intimidate every time they open their mouths”. Really? The fantasy of a special elect chosen by God (whoops!) to save the world betrays itself. Almost without exception, scientists are *not* intimidating thinkers. They are usually what in former days their caricature depicted them as: specialists with an extremely narrow range of knowledge, sometimes breathtakingly naive. Cf. the posts on Sagan here in the last few days. Though I don’t care much for R.A. Wilson, he pointed out the obvious when it came to Sagan: someone trained only in astronomy (essentially) appointed himself expert on every topic under the sun. Why? Because he was a Man of Science (TM)! There isn’t a scientist today with a mind like Goethe’s. Yet because our standards are so low, we accept popularizers like Sagan and Tyson as ‘great minds’, and Stephen Hawking gains world renown essentially because he fits our sci-fi view of an all-cerebral being buzzing around in his robotic chair.

  • festernaecus

    Damn, Michael Cera looks like shit these days
