What does it mean to be conscious?

Discussion in 'Off Topic Area' started by Socrastein, Jan 20, 2011.

  1. Socrastein

    Socrastein The Boxing Philosopher

    This is a big question and one could start the ball rolling with many different aspects of this issue. I'll start with some general claims and we'll see where the discussion goes from there.

    Not a lot of people are thrilled by the notion that their mind can be explained as nothing more than the workings of billions of tiny robots, by which I mean cells of course. Many people believe that the mind is full of so much content, so much mystery, and so many untouchable things that it's foolish, downright arrogant to even suggest that we could understand our consciousness as mechanical, unconscious brain activity.

    I contend that humans are barely conscious in the first place, as in not nearly as conscious as it seems we are. A lot of the amazing features of consciousness that we feel are there, simply aren't. Consciousness isn't nearly as hard to explain as it seems when we realize that there isn't nearly as much to consciousness as it seems. As Daniel Dennett put it, "What isn't there, doesn't need to be explained".

    I believe consciousness is not just a single unified feature of our brain, but an assortment of tricks and judgments, expectations and (dis)confirmations, that help us, as well as many other mammals and possibly some birds, perform the kind of tasks that are relevant to our survival.

    Obviously saying consciousness isn't all it's cracked up to be doesn't actually begin to explain what it is, but I believe a big part of that question is explaining what it isn't.

    I'll start with what I hope some will find to be interesting statements that we can further explore as the thread progresses.

    Consciousness is the serial software that runs on the parallel processing of the brain's hardware.

    It is a method of auto-stimulation. Thoughts and speech help communicate information between different parts of the brain that aren't directly wired together.

    Consciousness is what it's like to be a brain with two hemispheres that process different information, in different ways, with different tools, but ultimately guide the actions of a single organism.

    Consciousness is the tool of informavores, a term coined by psychologist George Miller I believe. It is all the extra information gathering that can occur in animals that can receive, process, and analyze more data than is immediately necessary for their survival.

    I think that's a good place to start. I won't pretend to be able to explain and illustrate the entire proverbial elephant, but perhaps we can all blindly fondle this topic together and glean some deeper understanding :)

    This question came up in another thread about vegans and vegetarians while discussing whether or not fish feel pain.
  2. Talyn

    Talyn Reality Hacker

    If fish have nociceptors, fish feel pain. That's not a discussion, that's a biological observation. Pain is data, and if an organism has nociceptors then it stands to reason that it will feel pain; otherwise those nociceptors have no reason to exist.

    As for consciousness, the wetware explanation is fine. The distinction between nociception, the data of pain, and the phenomenal experience of pain all need to be noted as happening in body, brain, and mind (consciousness) respectively.
  3. Socrastein

    Socrastein The Boxing Philosopher

    It was discussed in fact. The kind of research that you just linked to was addressed in the thread; the gist is that consciousness does not simply occur whenever and wherever a nervous system processes data, even if that data is injury detection. If by pain you mean conscious suffering, then no, that is not simply data.

    An organism needn't feel anything to respond to input in a way that keeps it alive. As I said in the other thread, nature doesn't put conscious experience where the behavior of an automaton will suffice.

    This is the same reason why many of the things that seem to exist in our own human conscious experience do not: they don't need to.

    As for the wetware explanation, it isn't an explanation at all. It's a classification of the different components of information gathering and processing that occur in conscious animals into different abstract groupings. It explains consciousness no more than taxonomy explained where all the variety of life came from, though I don't mean to imply that wetware is analogous to taxonomy in its scientific detail or support.

    It's a cool term coined years ago. Not a scientific or philosophical explanation of the actual phenomenon of consciousness.
  4. Talyn

    Talyn Reality Hacker

    I should've made a distinction in my original post with my terminology use. The confusion is stemming from the word feel, which doesn't refer to a phenomenal conscious experience but to simple awareness of data in the self, which fish have as a result of having nociceptors. A creature must have self-awareness if it is to interact with an environment. Here self-awareness and self-consciousness are not interchangeable, and it is the latter that humans and some other intelligent species have (see the mirror test on primates, for instance).

    I'll just quote myself: "Pain is data". The phenomenal experience that results from that data may or may not require consciousness on the level of humans (or mammals in general). Unfortunately because we are reliant on behaviour to determine whether something is suffering in the conscious sense of the word, we are unable to determine whether a fish feels pain. We have no measure of any potential individual behaviour with regards to fish suffering (I'm reminded of the super spartan thought experiment).

    Be less vague please.

    It was not meant to be an explanation of consciousness, but rather was intended to determine whether the discussion was about consciousness being a feature of the brain or a feature of the mind; or whether the mind is a feature of consciousness.
  5. Socrastein

    Socrastein The Boxing Philosopher

    Why must a creature have self-awareness to interact with an environment? Do you draw a distinction between self-awareness and complicated algorithms that produce behavioral reactions without experience? As in, are there things that interact with their environment on some level but aren't self-aware: robotic automatons? Individual cells can react to environmental conditions; are they self-aware in the sense that you speak of?

    If pain is data, and animals can feel pain, then are you saying animals feel all data processed by their brain?

    While behavior certainly can help, it's not the only thing that we rely on when answering the question "Can this thing suffer?" The neurobiology of the creature in question can tell us quite a bit, for instance.

    I'll give an example of something that seems to exist in our consciousness but in fact does not: the panoramic field of vision you have. It feels like you are now reading some text and all around your focus is detail and color. The fact is that you're only processing a tiny amount of visual data of any meaningful detail (as in resolution and color recognition) compared to how much there seems to be. We don't actually need our eyes to work like cameras, taking in all the color and resolution of very wide scenes, so they don't. But it sure feels like they do.

    Just as there can be things we seem to consciously experience but don't, there are many things that are processed by the brain that we don't ever experience. We humans, with the most advanced consciousness in the animal kingdom, are barely conscious of anything! To suppose that a fish can feel pain in any sense of the word is to just apply the same overestimation of their experience as we do to our own minds.

    Thanks for clearing up your usage of wetware, I see what you meant now :)
  6. Nathaniel Cooke

    Nathaniel Cooke Valued Member

    Sorry to get into semantics, but isn't it entirely subjective whether humans are conscious of anything? Cognisance can be measured in the interpretation of stimuli, as in your vision example above, but, as you say, cognisance and behavioural responses could arguably be machinations and not brought about by consciousness at all.

    The argument for consciousness, I reckon, is the fact that you and I are able to sit in front of a computer, envisage a person at the other end, interpret what they are saying, understand their metaphors and have an abstract discussion about consciousness. To wit: I think, therefore I am. Just because imagination and interpretation of sensory information can be described by evolutionary and survival biology does not mean that this is evidence of the absence of consciousness.

    There is a demonstrable difference between instinctual behavioural responses and reasoned, judgement-based behavioural responses to your environment, and to me that is an argument for consciousness. I think the mistake is to set the bar for consciousness too high: a sense of existence, rather than a sense of self, would be a much easier case to argue.

    I do, also though, agree that most of what we think of as free thought or individuality is, in fact, controlled by our baser urges - a fascinating subject, particularly when applied to self defence and the psychology of violence.
    Last edited: Jan 20, 2011
  7. Fish Of Doom

    Fish Of Doom Will : Mind : Motion Supporter

    will read up and give a more informed answer later, but one thing i consider to be a foundation of consciousness as i (vaguely) understand it is, for lack of a better way to explain it (so it might not make much sense), the continuous feedback loop between sensory data and cognitive abilities that enables animals (maximally exemplified in humans) to exert a change on their environment, instead of being completely controlled by it. this varies proportionately to both the cognitive and sensory development of a given species: a deaf species, or one with limited vocal range, even if intelligent enough, would be very unlikely to develop a formal spoken language such as the ones humans employ. yet vocal cues are used by many species, in the shapes of the natural noises different animals make, which are known to convey distinct meanings.

    did that even make sense?
  8. Talyn

    Talyn Reality Hacker

    Because it must be aware that it is an object that takes up a certain space within a larger space, it must be aware that it is an object distinct from other objects. Not necessarily intellectually, but perhaps 'instinctually'. A fish can eat plants because it instinctually has awareness of the distinction between itself and the plant, and the plant from another fish.

    Let's say the robot has been given needs that will result in its shut down should they not be met, and it has been given commands to tell it what will satiate those needs. If you were to put it in the middle of an environment, it would probably not behave in the same way as an organism that experiences. It has no ability to experience, so it doesn't change its 'behaviour' in response to events that are not described within its pre-existing algorithms. The robot, left to its own devices, will eventually be destroyed because it cannot adapt its behaviour to unforeseen circumstances. Organisms that experience can, although granted their ability to adapt varies both in their willingness and in how much change results, some with very minor and slow changes and others with massive and rapid changes (humans?).
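    A robot like that can be sketched as a tiny state machine. Everything here (the class name, the 'needs', the thresholds) is invented for illustration; the point is just that every behaviour is a pre-written rule, and nothing outside the rule table can change what the robot does:

    ```python
    # A minimal sketch of the hypothetical robot described above: fixed needs,
    # fixed responses, no ability to handle events outside its pre-written
    # algorithms. All names and numbers are illustrative assumptions.

    class AutomatonRobot:
        def __init__(self):
            self.charge = 100          # the robot 'shuts down' at 0 charge
            self.alive = True

        def step(self, event):
            """React to one environmental event using only pre-existing rules."""
            if not self.alive:
                return "shut down"
            self.charge -= 10          # every step costs energy (the 'need')
            if event == "power_source" and self.charge < 50:
                self.charge = 100      # satiate the need: 'feeding'
                return "recharge"
            if event == "obstacle":
                return "turn"          # a pre-written avoidance rule
            if self.charge <= 0:
                self.alive = False     # an unforeseen circumstance: no rule
                return "shut down"     # applies, so the robot is destroyed
            return "wander"            # default behaviour
    ```

    Any event not named in `step` falls through to the default, which is exactly the brittleness being described: the rule set never grows from experience.
    
    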

    There's nothing to say that if we were to program a robot with an ever-increasing complexity in its algorithms that it would not at some point develop self-awareness as a result. See science fiction for more. The level of complexity required and whether we would even know it had developed self-awareness are another matter entirely.

    Certainly there are organisms that are mere automata, but I question whether an organism that has nociception should be determined to have the same level of self-awareness as a bacterium. That conclusion seems to result when you do not consider self-awareness to be graduated, or when you try to force self-awareness to have a definitive and measurable starting point.

    Pain is a discouragement to a particular action, a kind of biological precursor to classical conditioning. If it were a slight flash in the eyes, we wouldn't necessarily feel inclined to behave in a way that prevents the injury that caused the pain. I'm reminded of the humans who have a disorder that prevents their brains from processing pain data the way the rest of our brains do. Fish probably do not feel pain as intensely as us because pain is heightened by secondary consciousness.

    A big sweeping statement. Again, be less vague.

    No it isn't. If an animal responds to being poked by a pin (assuming it isn't, say, a hippopotamus whose thick hide wouldn't register it), then that behaviour means it has felt the pain. To suppose that a fish cannot feel pain because it does not experience it in the same way as humans is absurd.
    Last edited: Jan 20, 2011
  9. Socrastein

    Socrastein The Boxing Philosopher

    Talyn I think a lot of your points are founded on an underestimation of how powerful stupid robotic processes can actually be. Plasticity in a system, the ability to change behavior over time in response to new input and outcomes, does not require consciousness, nor does it indicate it.

    You propose that for an animal to navigate an environment it has to know on some level that it's an object, and that for an animal to react to potentially dangerous and damaging stimuli it must feel pain. These behaviors can be explained without introducing a conscious state, so why would we do so? Is that very parsimonious, or scientific? A computer can spell check without knowing how to speak English, so why can't a fish navigate its body without actually knowing it's a body?
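    The spell-check analogy can be made concrete: a checker needs only a word list and string matching, with no grasp of English at all. The tiny dictionary here is an illustrative stand-in for a real lexicon:

    ```python
    # Competence without comprehension: a spell checker needs only a word list
    # and string matching; it 'knows' no English. The tiny DICTIONARY is an
    # illustrative stand-in for a real lexicon.

    DICTIONARY = {"the", "fish", "swims", "in", "water"}

    def misspelled(text):
        """Return the words in `text` not found in the lexicon (case-insensitive)."""
        return [w for w in text.lower().split() if w not in DICTIONARY]
    ```

    The function competently flags errors while representing nothing about meaning, which is the sense in which competence and comprehension come apart.
    
    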

    Why would nature give fish the ability to feel and understand themselves on any level if that wasn't necessary? Evolution doesn't try to produce cool things, or aware things, it produces things that are the bare minimum of complexity, often marvels of engineering simplicity, for accomplishing the task at hand.

    It reminds me of the age-old tale of the impossible flying bumblebee. People couldn't quite understand how the simple engineering could possibly account for the function they were observing, so they assumed there must be something more waiting to be discovered and understood.

    As for things processed by our brains that we don't consciously experience, I grant that is sweeping but I didn't think it was controversial in any way.

    We're not aware of all the data that is received and processed pertaining to our heart regulation, the regulation of our endocrine system, the transportation of nutrients, etc.

    For things that are a little "closer" to our conscious experience: we aren't aware of the saccading of our eyes that happens many times a second, and we aren't aware that words spoken in our native language actually run together. We experience a pause between each word, but if you record someone speaking English, the periods of least sound don't line up with the separations between the words. This is why people speaking languages we don't understand sound like they're talking really fast and running things together: we sound just like that too when we talk, but that's not what we actually experience.

    We also aren't aware of the breaks in our vision that occur with blinking. The break in our visual data is a bit more severe than our experience tells us, because our brain sort of just leaves that stuff out of what we consciously experience. There are many other examples.

    I also agree that a complicated enough robot could become conscious in every way that a human is.

    I also believe that consciousness does exist on a spectrum, but many organisms aren't really on that spectrum. Telling the difference between a light orange and a dark yellow can be difficult, but it's much easier to distinguish dark blue from bright orange. I'd say consciousness is dark blue and the mindless processing of a fish is bright orange.

    I'm curious what your justification is for inferring consciousness where we can only observe competence. I think that's an important point in our discussion thus far.
    Last edited: Jan 20, 2011
  10. Socrastein

    Socrastein The Boxing Philosopher

    I think you hit pretty close to encapsulating what gives an animal free will, which I'd say is a 'trick' even more powerful than consciousness. Consciousness is more about exerting change in our own minds, rather than being completely controlled by the robotic processes that dictate the behavior of other species. By our I don't just mean humans, but any conscious creature.

    I was touching on this point when I said thought and speech are methods of auto-stimulation, a type of self-communication that is necessary in a brain of sufficient complexity with many different parts that have their "own agendas" so to speak to work together in a manner that produces a series of thoughts and behavioral decisions.

    The idea you bring up of overcoming inevitability and opening up evitable options is a key point in understanding consciousness and why it exists, which helps us understand when and where it is likely to exist.
  11. Talyn

    Talyn Reality Hacker

    Re-read: I didn't say they do. I said they require awareness.

    I see no reason to be overly reductionist about it. Here Ockham's Razor primarily eliminates mind-body dualism.

    I said aware, not know. Knowledge is not the same as awareness.

    "Awareness is the state or ability to perceive, to feel, or to be conscious of events, objects or sensory patterns. In this level of consciousness, sense data can be confirmed by an observer without necessarily implying understanding. More broadly, it is the state or quality of being aware of something. In biological psychology, awareness is defined as a human's or an animal's perception and cognitive reaction to a condition or event."

    ... Hence human intelligence? Or the solar-powered hornet?

    Also, evolution does not 'try' to do anything.

    Do not conflate inferring with simply not denying.
    Last edited: Jan 20, 2011
  12. Fish Of Doom

    Fish Of Doom Will : Mind : Motion Supporter

    hypothesis: a conscious mind is one that is, to a greater or lesser degree, capable of interpreting stimuli to form concepts which are then used to re-interpret stimuli, which may or may not modify behaviour.

    ^a more coherent version of my prior opinion.

    note: IMO this does not give an animal free-will, as free-will is actually necessary for the animal to be able to choose in the first place. also in b4 determinism.
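    The hypothesis above can be caricatured in code: stimuli update a stored 'concept', and the concept in turn re-frames how later stimuli are interpreted, which modifies behaviour. The concept here is just a running average, and all numbers are invented for illustration:

    ```python
    # A caricature of the stimulus -> concept -> re-interpretation loop.
    # The 'concept' is a running average of past stimuli; each new stimulus
    # is interpreted relative to that learned expectation. The thresholds
    # and weights are invented for illustration.

    def run_loop(stimuli):
        concept = None                 # no expectation before any experience
        behaviour = []
        for s in stimuli:
            if concept is None:
                concept = s            # the first stimulus seeds the concept
            # interpret the stimulus relative to the learned expectation
            if s > concept * 1.5:
                behaviour.append("startle")   # surprising: behaviour changes
            else:
                behaviour.append("ignore")    # expected: no change
            # feedback: the stimulus also reshapes the concept itself
            concept = 0.9 * concept + 0.1 * s
        return behaviour
    ```

    A pure automaton has only a fixed rule table; the loop adds the feedback step, where each stimulus reshapes the expectation against which the next one is judged.
    
    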
  13. Socrastein

    Socrastein The Boxing Philosopher

    Which is equally unfounded, and only raises the question: why do they need to be aware of what they're doing in order to do it competently?

    What does overly reductionist mean to you? To me it means you're reducing something for the sake of doing so without properly accounting for what's been observed. We can account for many competent behaviors in animals without needing to bring up notions of awareness or consciousness, so it seems like very appropriate reductionism to me.

    So are you arguing the awareness may exist where we have no reason to suspect it does and even reasons to suspect it does not? It's hard to really address that in any way unless you support it.
  14. Talyn

    Talyn Reality Hacker

    If the robot's internal sensors determine that it needs re-charging (hungry), then it will enact a pre-written algorithm to seek out a power source and re-charge (feeding). It is aware of its needs and can navigate the environment and interact with it (self-awareness), even though it could not necessarily pass the Turing test (intelligent) or reflect on itself (self-conscious).

    I think there may be a problem of ordering going on. An organism/species does not obtain self-awareness like a computer game avatar unlocks a skill tree and then goes on to gain capabilities from that purview. Self-awareness is the description for a creature that has a certain level of existent capabilities.

    Because behaviours A through Z can be explained without self-awareness does not mean that the organism is not self-aware.

    No, because we have reason to suspect the awareness does exist- they have senses.

    You've also changed your terminology from conscious to awareness; as I said in an earlier post, awareness and conscious are not interchangeable. Your earlier post was:
  15. Socrastein

    Socrastein The Boxing Philosopher

    You seem to be very good at taking single snippets from my posts and addressing them one little bit at a time, but I wonder if you could address the relevant points I've made without quote mining? Not simply reassert your unsupported points, but actually support them with something other than "you can't prove they're wrong". I could come up with an infinite set of absurd statements that you could never prove wrong, but that wouldn't make for a meaningful discussion, would it?
  16. Atre

    Atre Valued Member

    Checking in so I can see what happens here via subscription updates and all that :).

    What I said on the original thread in brief: Behavioural experiments don't tell us about consciousness, because we can only observe behaviour and not the reasoning behind it.

    I am generally skeptical about consciousness as a useful tool/concept for understanding brains (which may arise from my not being a specialist in that field) because it SEEMS like us trying to force the research into taking the form we expected to find rather than letting experiment inform the theory.

    I think this is a prime example of where science could benefit from having its own very technical terms for what's going on, because consciousness can just mean too many confusing things.

    PS. This does mean that my position could be seen as backing out of a vital part of neuro research (what controls abstract thought and reasoning?), but I think not. It is interesting, and work should be done, but we need very careful experiments to pull out the underlying mechanisms. Right now, talking of consciousness in neuro/behaviour seems a bit like asking why the composition of the earth is the way it is before understanding accretion discs and stellar nucleosynthesis.

    Please feel free to rip apart my reductionist view :).
    Last edited: Jan 21, 2011
  17. cejames

    cejames Valued Member

    Question 1: Is this merely your viewpoint, your perspective, or your hypothesis?
    Question 2: Can you tell me your credentials as to the study of the brain?
    Question 3: Can you tell me your credentials as to the study of Psychology?
    Question 4: What sources do you reference to substantiate your hypothesis?


  18. Socrastein

    Socrastein The Boxing Philosopher


    This is certainly nothing more than my humble opinion on a very difficult and fascinating topic. I have no credentials in psychology or neuroscience, and I do apologize if I somehow gave that impression.

    I've merely read a lot of books, articles, and studies by people much smarter than me, and have been debating and discussing the issue with various people in various fields for a little over 7 years.

    Daniel Dennett's work has influenced my thoughts the most over the years, particularly Consciousness Explained. Many of the points I make when discussing consciousness are rooted in the science and philosophy found in that book.
    Last edited: Jan 21, 2011
  19. cejames

    cejames Valued Member

    Hey Socrastein,

    Thanks, I wanted to make sure I was not reading anything into it. Much like my blog posts, my views tend to come less from any certifiable source than from my studies, i.e. books and discussions with others.

    You were adamant about it, and what I first read seemed to indicate you had some additional expertise in those fields, so I wanted to clarify.

    Much appreciated,

  20. Atre

    Atre Valued Member

    This bit intrigues me (along with the idea that language is a tool for allowing better co-opting of disparate brain functions in one task - correct me if I am misinterpreting)

    What are the thoughts behind how the disparate left/right brain exists without this conscious tool (I am interpreting the OP as saying that consciousness is what allows communication between regions, which leads to the question of why brain sections became mutually incompatible in the first place and what this meant for animal behaviour pre-consciousness. Again, correct me if I'm wrong about the gist of the argument or putting in details that aren't there :))
