
The Case for Idealism

[Sparhafoc]
Can computers be conscious or not? If yes, then you're admitting there's consciousness without brains; if no, then you're contradicting your objections to me when I object to computers being conscious lol. You're just a lil contrarian for whatever I say, aren't you?

Or I am contesting vapid absolutism - either/or.

Unwarranted certainty has long plagued humanity; it's always worthy of the time spent squishing it.
 
[Sparhafoc]
Sparhafoc said:
This is an argument for Idealism: In philosophy, the group of philosophies which assert that reality, or reality as we can know it, is fundamentally mental, mentally constructed, or otherwise immaterial.

It's not really much of an argument though; of course 'reality as we can know it' is fundamentally mental - that's a tautology.

i) there is reality ii) we can 'know' stuff by employing our sensory organs and brain iii) our knowledge is necessarily 'mental'.

But that's quite different to saying that reality itself is mental, which doesn't follow from the stated notions.


D. C. Stove said:
We can eat oysters only insofar as they are brought under the physiological and chemical conditions which are the presuppositions of the possibility of being eaten.

Therefore,

We cannot eat oysters as they are in themselves.


And of course, I am far from the first to note that certain forms of idealism seem to be stuck in a use/mention loop.

Does Santa Claus exist? Well, of course Santa Claus exists, because the mind causes Santa Claus to exist.

Russell's arguments were, of course, much better, rightly identifying the fundamental problems of Idealism as being conceptual errors and category mistakes.


We can talk about reality-as-we-can-know-it until the bovines return to their domicile and still be not a jot closer to talking about reality.
 
[Master_Ghost_Knight]
Monistic Idealism said:
I don't see anything incoherent about a program spitting out such a conclusion.
So do you therefore also agree that whatever reasoning it uses to conclude that itself is conscious must therefore be necessarily fallacious, given that it concluded that it is conscious but in reality it isn't?
 
[Monistic Idealism]
I was including hypothetical hard AI in that sentence, which you seem unwilling to do.

Because phenomenal consciousness is not reducible to functional states. Artificial Intelligence is exactly that: artificial. It's a simulation. You can have a very complicated one, but it's still just a simulation of various behaviors.
What does Commander Data experience? I dunno, ask him. How about Zero or Mega Man X? Beats me, ask them

Those are fictional characters, they exist in the imagination alone.
The man is just a tool for executing those instructions, like how blood and electrical conduction by potassium and sodium ions are tools for executing functions in the human brain. It's the algorithm itself, when combined with the operations of the man, that understands Chinese.

Consciousness is not magic.

Agreed. I don't recall saying it was.
I know the code alone doesn't give rise to consciousness. Something has to run the code before it can do that.

That would be the hardware. From there it's just code and more code. Syntax and more syntax.
Demonstrate that he's aware and he has experience

Direct quote from you:
We do experience. "I think, therefore I am", and all that crap.

You know there's experience and that we have it. You can argue in bad faith and arbitrarily turn up the skepticism dial, but you have yourself already granted that there is experience and we have it. An AI is just a simulation of this experience we have of other minds. It's just a copy, not the real thing.
All I said is that it doesn't make sense to me to try to "tell us what it's like to have experience from the first person" without first having first person experience.

That right there is an admission that consciousness is irreducible. If the mental were reducible to the non-mental then you should be able to describe the mental independently of the mental. But as you admitted, we can't. This implies reductive materialism is false.
There is a problem; you're arbitrarily excluding artificial intelligence from having the potential to be conscious. Idealism has nothing to do with it.

Idealism comes up when you assumed I would lapse into some kind of non-reductive materialism but with idealism it's all good. And I'm not excluding AI arbitrarily, I have reasons as stated above.
You said you had a degree in psychology, right? Why don't you draw from psychology instead?

Please read what I write in context. My mention of my degree came when another user initiated talk of credentials and I merely stated a correction. He assumed I had no contact with people in fields like neuroscience, and I merely showed them that they're wrong. What we're primarily discussing is philosophy of mind, which informs and is informed by psychology.
You're misapplying the tools of philosophy here.

How? You're just saying this with no support.
We can say consciousness is analogous to software, because it runs on the hardware that is the brain and cannot exist without it.

When you say consciousness runs on the brain, are you saying consciousness is reducible or irreducible? If it is reducible, you face the hard problem of consciousness. If it is irreducible, you face the exclusion problem and a potential lapse into substance dualism.
 
[Monistic Idealism]
Or I am contesting vapid absolutism - either/or.

Then you're literally contesting logic:
The Principle of Excluded Middle: The principle that asserts that any statement is either true or false.
P∨~P

Source: Copi, I. M., Cohen, C., & McMahon, K. (2011). Introduction to Logic (14th ed.) p. 333-335. Upper Saddle River, NJ: Prentice Hall.
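For readers who like the principle in formal dress, it can be stated and proved in Lean (a minimal sketch; the theorem name is mine, and it relies on the classical axioms):

```lean
-- The principle of excluded middle: every proposition P satisfies P ∨ ¬P.
-- Lean 4 provides this classically as Classical.em.
theorem excluded_middle (P : Prop) : P ∨ ¬P :=
  Classical.em P

-- Worth noting: in constructive (intuitionistic) logic this is NOT a
-- theorem, which is one formal sense in which "either/or" can coherently
-- be contested without "contesting logic" wholesale.
```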

So it's come to an idealist who affirms logic and the non-idealist who contests logic... rly makes u think
 
[Monistic Idealism]
We can talk about reality-as-we-can-know-it until the bovines return to their domicile and still be not a jot closer to talking about reality.

How many times do I have to correct you on this? I'm making the case for idealism using commitments from monism and causation. I don't jump from epistemological idealism to ontological idealism, I have an independent argument that leads us to idealism independent of an idealist epistemology.
 
[Monistic Idealism]
So do you therefore also agree that whatever reasoning it uses to conclude that itself is conscious must therefore be necessarily fallacious, given that it concluded that it is conscious but in reality it isn't?

There would have to be a formal or informal fallacy. Either the reasoning would be invalid or there's some equivocation or other kind of informal fallacy.
 
[Master_Ghost_Knight]
Monistic Idealism said:
There would have to be a formal or informal fallacy. Either the reasoning would be invalid or there's some equivocation or other kind of informal fallacy.

But here is the thing, it's a program that is made to perfectly emulate you, it employs exactly the same reasoning as you. If the reasoning it uses is invalid for it, it must also be invalid for you.

So again, are you really conscious? Or do you just believe that you are because you are programmed to do so?
 
[Akamia]
Monistic Idealism said:
I was including hypothetical hard AI in that sentence, which you seem unwilling to do.

Because phenomenal consciousness is not reducible to functional states. Artificial Intelligence is exactly that: artificial. It's a simulation. You can have a very complicated one, but it's still just a simulation of various behaviors.
Why is it not reducible to functional states? You haven't demonstrated this.
What does Commander Data experience? I dunno, ask him. How about Zero or Mega Man X? Beats me, ask them

Those are fictional characters, they exist in the imagination alone.
So does everything else, according to you.
Agreed. I don't recall saying it was.
Then stop putting it on a goddamned pedestal.
I know the code alone doesn't give rise to consciousness. Something has to run the code before it can do that.

That would be the hardware. From there it's just code and more code. Syntax and more syntax.
That would be the brain. From there it's just neurons and more neurons. Chemicals and more chemicals. Molecules and more molecules...
Demonstrate that he's aware and he has experience

Direct quote from you:
We do experience. "I think, therefore I am", and all that crap.

You know there's experience and that we have it. You can argue in bad faith and arbitrarily turn up the skepticism dial, but you have yourself already granted that there is experience and we have it. An AI is just a simulation of this experience we have of other minds. It's just a copy, not the real thing.
Is a copy of a book not a real book? Is a copy of a movie not a real movie? Is an ebook copy of a book originally written on paper suddenly not a book? Is an ebook that was never printed on paper a book?
All I said is that it doesn't make sense to me to try to "tell us what it's like to have experience from the first person" without first having first person experience.

That right there is an admission that consciousness is irreducible. If the mental were reducible to the non-mental then you should be able to describe the mental independently of the mental. But as you admitted, we can't. This implies reductive materialism is false.
Not necessarily. I don't know what your first person experience is. Strictly speaking, the only first person experience I truly know (and even that I'm dubious of) is my own. You try to tell me yours, at best, it's received as second-person experience, and that's true whether it's coming from a human or some hard AI construct. But just because these specific barriers exist does not mean that we can't describe how consciousness occurs.
 
[Sparhafoc]
Monistic Idealism said:
We can talk about reality-as-we-can-know-it until the bovines return to their domicile and still be not a jot closer to talking about reality.

How many times do I have to correct you on this?

Perhaps you could try once, but that would necessitate you being correct in the first place, and secondly having the capacity to formulate coherent and compelling arguments.

So perhaps the question should be: how many times will you ineffectively flail at me?

And the answer to that, it seems, is 'lots'.

Monistic Idealism said:
I'm making the case for idealism using commitments from monism and causation.

And I've addressed that. Your inability to do other than repeat your assertions, plus your ensuing contradictions, certainly ensures I am under no obligation to restrain myself to what you think it is you believe.

Monistic Idealism said:
I don't jump from epistemological idealism to ontological idealism, I have an independent argument that leads us to idealism independent of an idealist epistemology.

I think one of the main problems I am seeing is that you think in labels, and consequently your understanding is precisely as shallow as the label. Therefore, when you make other statements which conflict with your position, you don't recognize yourself doing so because you proudly bear a label you've decided you subscribe to.

What's most intriguing is how in conflict with your own premises you are. You talk about introspection, but yet you don't seem even passingly aware that you've made the carthorse to fit the cart.
 
[Sparhafoc]
Monistic Idealism said:
Or I am contesting vapid absolutism - either/or.

Then you're literally contesting logic:

No, I am contesting vapid absolutism.

Monistic Idealism said:
The Principle of Excluded Middle: The principle that asserts that any statement is either true or false.
P∨~P

Statement: the universe seen from the outside is yellow and smells like fried raspberries.

Is this statement true or false?

It's neither, because there can be no such knowledge.

Again, your comprehension is nanometers deep. Certainty cannot be used as a proxy for knowledge.

Your declaration was wholly about your unwitting certainty, and had no bearing whatsoever on whether computers/AI could be or become conscious.

Monistic Idealism said:
So it's come to an idealist who affirms logic and the non-idealist who contests logic... rly makes u think

I very much doubt that.

Amusingly, though, the implication you've forwarded is that it's not expected to be routine for an idealist to affirm logic.
 
[Monistic Idealism]
But here is the thing, it's a program that is made to perfectly emulate you, it employs exactly the same reasoning as you. If the reasoning it uses is invalid for it, it must also be invalid for you.

Like I already said, either there's a formal or informal fallacy here. It can use the same form of reasoning but there will be an equivocation on certain terms. An AI is just a simulation, it's just a bunch of code, no first-person subjective awareness or any of that. For yourself, and other minds, it's clear as day that we are conscious and have first-person subjective awareness. A copy of someone saying this doesn't mean the copy is itself conscious.
 
[Sparhafoc]
Monistic Idealism said:
But here is the thing, it's a program that is made to perfectly emulate you, it employs exactly the same reasoning as you. If the reasoning it uses is invalid for it, it must also be invalid for you.

Like I already said, either there's a formal or informal fallacy here.

A motivated deduction.

The real question is how many pages it will take until you concede your error.

Monistic Idealism said:
It can use the same form of reasoning but there will be an equivocation on certain terms. An AI is just a simulation, it's just a bunch of code, no first-person subjective awareness or any of that. For yourself, and other minds, it's clear as day that we are conscious and have first-person subjective awareness. A copy of someone saying this doesn't mean the copy is itself conscious.

Show that you are not just a simulation performing the same routines and arriving at the same illusory conclusion.

"Clear as day" is the kind of language used by people who can't substantiate their claim, but want it to be lent validity anyway.
 
[Sparhafoc]
Mind exists: show that it's absolutely impossible for mind to be an illusion.

You can't. You can offer ideas that can present grounds to believe it's not an illusion, but they could themselves be an illusion subsisting on the original illusion.

Tell me oh great logician - if your first premise is faulty or incomplete... what does that make of the argument?
 
[Monistic Idealism]
Perhaps you could try once, but that would necessitate you being correct in the first place

Are you a troll or something? By "correcting you" I clearly meant I was showing you how you misunderstood my argument. It is, after all, my argument; you don't make it for me.
And I've addressed that.

By straw manning it into some old epistemological argument that I didn't use. Learn to stop attacking straw men.
I think one of the main problems I am seeing is that you think in labels, and consequently your understanding is precisely as shallow as the label. Therefore, when you make other statements which conflict with your position, you don't recognize yourself doing so because you proudly bear a label you've decided you subscribe to.

If I call this idea labelism and say it's true, are you going to contradict yourself and say it's not true cuz it's got a label now? lol
What's most intriguing is how in conflict with your own premises you are. You talk about introspection, but yet you don't seem even passingly aware that you've made the carthorse to fit the cart.

Assertions with no support, great. So enlightening.
No, I am contesting vapid absolutism.

You said you rejected either/or so you're rejecting the principle of excluded middle.
Statement: the universe seen from the outside is yellow and smells like fried raspberries. Is this statement true or false?

I don't know which side of the dichotomy is true, but the dichotomy itself is still true: every statement of the form P or not P is necessarily true. Just because you don't have knowledge of the matter doesn't mean there is no truth of the matter.
I very much doubt that.

well you're the guy denying the principle of excluded middle so... yeah...
 
[Master_Ghost_Knight]
Monistic Idealism said:
But here is the thing, it's a program that is made to perfectly emulate you, it employs exactly the same reasoning as you. If the reasoning it uses is invalid for it, it must also be invalid for you.

Like I already said, either there's a formal or informal fallacy here. It can use the same form of reasoning but there will be an equivocation on certain terms. An AI is just a simulation, it's just a bunch of code, no first-person subjective awareness or any of that. For yourself, and other minds, it's clear as day that we are conscious and have first-person subjective awareness. A copy of someone saying this doesn't mean the copy is itself conscious.
Well isn't that what a computer programmed to think like you would say?
 
[Monistic Idealism]
Why is it not reducible to functional states? You haven't demonstrated this.

Yes I have, many times now actually. All the way back in the OP I have support for all my premises. All you have to do is read the OP in its entirety. My arguments, with scholarly citations, are included.
So does everything else, according to you.

I literally never said that. You're just attacking straw men.
Then stop putting it on a goddamned pedestal.

I never did. I only noted that consciousness is irreducible and gave reasons for this, citations included.
That would be the brain. From there it's just neurons and more neurons. Chemicals and more chemicals. Molecules and more molecules...

Yup, and that would be an elimination of consciousness, which we know cannot be true since consciousness exists.
Is a copy of a book not a real book? Is a copy of a movie not a real movie? Is an ebook copy of a book originally written on paper suddenly not a book? Is an ebook that was never printed on paper a book?

Your own language betrays you: notice how you subtly move the goalposts by asking "is a copy of a book not a real book?" instead of asking "is a copy of a book not the real book?" and of course we would all know the answer to that question: no. A copy of, say, the Mona Lisa is not the Mona Lisa, it's just a copy of it. It's not the real Mona Lisa.
Not necessarily.

It does actually. If A and B are identical then everything true of A is true of B, so describing A would describe B. But as we can see this is not the case, so A cannot be identical to B. Reductionism would have to be false.
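The principle being invoked here, the indiscernibility of identicals, can be put formally in Lean (a minimal sketch; the theorem names are mine):

```lean
-- Indiscernibility of identicals: if a = b, then any property P
-- that holds of a also holds of b.
theorem indiscernibility {α : Type} {a b : α} (h : a = b)
    (P : α → Prop) (ha : P a) : P b :=
  h ▸ ha

-- The contrapositive is what drives the argument in the post:
-- if some property holds of A but not of B, then A ≠ B.
theorem distinct_of_discernible {α : Type} {a b : α}
    (P : α → Prop) (ha : P a) (hb : ¬ P b) : a ≠ b :=
  fun h => hb (h ▸ ha)
```

Whether "being describable independently of the mental" is a genuine property of the referent, rather than of our mode of describing it, is exactly the point the two sides dispute.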
I don't know what your first person experience is. Strictly speaking, the only first person experience I truly know (and even that I'm dubious of) is my own

I would like to see you justify this, but quite frankly you'll only be giving the idealist the epistemological foothold. By your own epistemological commitments, the idealist has the high ground over the materialist.
 
[Sparhafoc]
Monistic Idealism said:
Perhaps you could try once, but that would necessitate you being correct in the first place

Are you a troll or something?

Says the guy who's just joined the forum and has already tried foisting off the notion that there are sock puppets out to manipulate viewership against him.

Get out your own arse, James.

Monistic Idealism said:
By "correcting you" that clearly meant I was showing you how you misunderstood my argument. It is after all my argument, you don't make it for me.

An entirely pointless non-sequitur. The fact that you made an argument doesn't mean you understood it, nor does owning that argument mean you are in a position to correct people who critique it.

Monistic Idealism said:
And I've addressed that.

By straw manning it into some old epistemological argument that I didn't use. Learn to stop attacking straw men.

The dozens of times you've appealed to this fallacy suggests you should also learn to stop using the fallacist's fallacy. Declaring something a strawman doesn't make it a strawman.

Monistic Idealism said:
I think one of the main problems I am seeing is that you think in labels, and consequently your understanding is precisely as shallow as the label. Therefore, when you make other statements which conflict with your position, you don't recognize yourself doing so because you proudly bear a label you've decided you subscribe to.

If I call this idea labelism and say it's true you going to contradict yourself and say it's not true cuz it's got a label now? lol

That's about the closest you've come to having an independent thought in this thread. Congratulations.

Sadly it was in the employ of evading the thrust of the point, but those be our crosses to bear.

Monistic Idealism said:
What's most intriguing is how in conflict with your own premises you are. You talk about introspection, but yet you don't seem even passingly aware that you've made the carthorse to fit the cart.

Assertions with no support, great. So enlightening.

Only you are allowed to assert without support?

Or do you think that adding more words equates to support?


Monistic Idealism said:
No, I am contesting vapid absolutism.

You said you rejected either/or so you're rejecting the principle of excluded middle.

No, your reading comprehension is flawed.

You said 'it is X' and I said 'or it is Y - either/or'.... as in, either you were right and it is X, or I am right and it is Y.

Of course, you wouldn't have made this mistake if you were here to discuss rather than evangelize.


Monistic Idealism said:
Statement: the universe seen from the outside is yellow and smells like fried raspberries. Is this statement true or false?

I don't know which side of the dichotomy is true,...

Err, then it's a trichotomy.


Monistic Idealism said:
... but the dichotomy itself is still true: every statement of the form P or not P is necessarily true. Just because you don't have knowledge of the matter doesn't mean there is no truth of the matter.

What it means is that declaring P or declaring not P is the problem, both fall foul of assigning a status to something that cannot be known. As I clearly said: my problem is with absolutism, it's just compounded by certainty of things that cannot be known.


Monistic Idealism said:
I very much doubt that.

well you're the guy denying the principle of excluded middle so... yeah...

And you're the guy who can't read plain English so... yeah...

How on earth did you get a degree with such poor reading comprehension?
 
[Monistic Idealism]
Well isn't that what a computer programmed to think like you would say?

It can say whatever it wants, it's still just a computer program that's making some formal or informal fallacy. You can try to equate me to the computer but you're just committing what I already predicted: an informal fallacy.
b-but that's something the computer would say!

Good thing I'm not a computer program but am a human being with flesh and bone, no computer parts or anything. If you want to go full retard with skepticism you're going to blow up metaphysics and epistemology in general. Your attempts to go after idealism will result in philosophical suicide. The suicide bomber equivalent of a refutation lol
 