Tuesday, October 9, 2007

yes... functionalism again...

Ok... so... going back to the point of functionalism: we are saying that the mind is a function of the body (the brain), so we wouldn't have to get into dualism or any Cartesian take on the mind-body problem where the body is aware and the mind understands. However, if we are talking about qualia and the understanding of these qualitative experiences, then we start talking about semantics. So take the example of Mary and the room: Mary has been living in a black and white room without any kind of qualitative experience... only learning, but without direct empirical experience. When Mary leaves the room she starts experiencing all this stuff... so the point is: before leaving the room, was she learning? Do we learn without experiencing?
I was thinking this: before she left the room she had a syntactic grasp of things; after she left the room she had a semantic grasp. Now the crucial point is the fact that she left the room and started 'experiencing', or experimenting... and finally realized whatever it was she had been learning all along.
Since I started reading philosophy I've been quite skeptical towards distinctions: analytic and synthetic, a priori and a posteriori, syntax and semantics, and so on... and maybe this sounds funny, but there has to be a detonator that bursts each distinction... so where is the fine line that divides one side from the other? In the case of Mary before and after the room, I would say there is something that makes Mary conscious or aware of something by herself. She stepped from the third-person perspective to the first-person perspective... she was finally doing it by herself. I think this is a huge criticism of functionalism, because it brings back the relevance of metaphysics by taking into account the existence (without committing to any particular ontological view) of the mind as a weird thing to talk about, not as a function of the body.

5 comments:

Gatiio said...

Lemme read my books again, I think you're onto something, mate!
But if a line is what you need, for now at least, what purpose does this line accomplish? And learning is a process of experience all by itself, so you cannot learn without experiencing at least your own awareness of learning. :s I'm confused.


n____n

Pablo said...

I think that Ale is right, and behind everything you wrote we should keep in mind the following question: "What is the difference between this and a computer?" In other words, would a computer EVER be able to be like us, in both syntax and semantics? Another very strong thought experiment against materialism, in my opinion, is the Chinese room.

Gatiio said...

It is only because the Chinese room, in our time, does not have a syntactic device such as the human one that it is different. If it were ever the case that such a "machine" existed, then it would no longer be a machine in that sense, but a thinking entity, like humans. I see no real connection between this and what the post is trying to get us to think about. Then again I might be wrong, like I always am.


n____n

Unknown said...

I think there is no 'detonator' from one side to the other... quite the contrary: we have sides because there is a relevant difference. If it were just one step from one side to the other, they wouldn't be acknowledged as distinctions but rather as levels, and these are not levels. It is very romantic, I would say, to think that if we found out about this metaphysical entity Ale calls the 'detonator' we could better understand the why of the distinctions. I just don't see this happening.

Pablo said...

Gatiio, what I was getting at is that there is an underlying assumption behind physicalism (materialism): if consciousness really is just made up of the physical components of the brain, then presumably, once we are able to replicate the human brain down to the last detail, consciousness should arise. I still have much to learn, but so far I am skeptical towards materialism because, as the Chinese room and Jackson's Mary thought experiment argue, what the robot (or the exact replica of the human brain) lacks is the semantics. Remember, syntax refers to the symbols, like the computer's 1s and 0s, and all artificial machines use these.