Quantum Mechanics and Consciousness

As the linked-to article describes, a hybrid computer uses an analogue “computer” to generate a reasonably good initial value (or set of values) that forms the starting point for digital processing by a normal computer in cases where problem-solving requires iterative techniques. The reason for doing this is basically to reduce the solve time on the digital computer: the better the initial guess, the fewer iterations are needed to satisfy some predefined tolerance or accuracy criterion.
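As a toy illustration of why the quality of the starting value matters (a hypothetical Newton-iteration sketch in Python, not anything from the linked article – the tolerance and guesses are arbitrary):

```python
def newton_sqrt(a, x0, tol=1e-12):
    """Newton iteration for sqrt(a); returns the root and the iteration count."""
    x, n = x0, 0
    while abs(x * x - a) > tol:
        x = 0.5 * (x + a / x)    # Newton update for f(x) = x^2 - a
        n += 1
    return x, n

# A decent initial value (as an analogue stage might supply) versus a poor one:
print(newton_sqrt(2.0, x0=1.5))     # converges in a handful of iterations
print(newton_sqrt(2.0, x0=100.0))   # needs roughly three times as many
```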

An analogue “computer” must not be thought of as comparable to a digital computer because it doesn’t per se perform any calculations. Instead, it simulates one physical process by another that is mathematically similar – hence the “analogue” descriptor. Something as simple as an electronic inductance-resistance-capacitance (LRC) circuit built with variable resistors, capacitors and/or inductors can qualify as an analogue “computer.” Such a circuit can, for example, be used to simulate forced damped vibration behaviour in materials (or mechanical wave propagation in them) by driving the circuit with an alternating current of an appropriate frequency and measuring certain electrical responses in it. Another example is to simulate stress/strain state changes in materials subjected to impulsive forces by measuring transient electrical behaviour in the circuit. Such problems have quite complex formulations – usually a set of coupled non-linear partial differential equations – of the same or a very similar mathematical form to that of the analogue used to simulate them.
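To make “mathematically similar” concrete, here is a minimal Python sketch (component values invented for illustration). A series LRC circuit driven by a sinusoidal voltage and a forced damped mass-spring system obey the same second-order differential equation, so one numerical routine serves both:

```python
import numpy as np

# Series LRC circuit:        L·q'' + R·q' + q/C = V0·sin(wt)
# Forced damped oscillator:  m·x'' + c·x' + k·x = F0·sin(wt)
# With L<->m, R<->c, 1/C<->k and V0<->F0 the two equations are identical,
# so the solution below reads as charge q(t) or displacement x(t),
# depending purely on which physical system you have in mind.

def forced_damped(m, c, k, F0, w, t_end=10.0, dt=1e-4):
    """Semi-implicit Euler integration of m·y'' + c·y' + k·y = F0·sin(w·t)."""
    n = int(t_end / dt)
    y, v = 0.0, 0.0                       # rest initial conditions
    out = np.empty(n)
    for i in range(n):
        a = (F0 * np.sin(w * i * dt) - c * v - k * y) / m
        v += a * dt                       # update velocity / current
        y += v * dt                       # update displacement / charge
        out[i] = y
    return out

# Illustrative values only (L = 0.5, R = 2, 1/C = 100, V0 = 1, w = 8):
response = forced_damped(m=0.5, c=2.0, k=100.0, F0=1.0, w=8.0)
print(response[-5:])                      # steady-state oscillation
```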

Put briefly, an analogue computer is a clever way of initialising the analysis of a problem in order to reduce the computational effort – and hence the processing time – that a digital computer alone would need. Analogue computers are not general computing machines the way a digital computer is. They are limited in their applicability to a small set of problems and are purpose-built for specific ones, which is perhaps the main reason why they, and hybrid computers with them, have fallen out of favour.

So much for the background.

It should be clear from the above that a hybrid computer is fully deterministic because the digital aspect of it greatly refines the approximate answer provided by the analogue component. In contrast and as described in an earlier post, a quantum computer is not deterministic, and that is the essential difference between the two types. One possible way of thinking about a quantum computer is to picture it as an array of bits (i.e. binary digits) that have a curious property: until each bit is actually examined, it exists in a superposed state of both 0 and 1, and it becomes definitely 1 or 0 only once it is examined. Such special bits are called “qubits.” Moreover, the state that each qubit will assume upon examination depends on its neighbours and what operations have been performed on the whole collection of them and in what order (this only works if the qubits are entangled, and this is the main technical obstacle in the way of the “quamputer”). These operations and their sequence can be thought of as the algorithm.

Here’s a very simple example for the sake of illustration: Suppose you have a four-qubit computer. It has 16 (= 2⁴) possible states. Suppose further that an algorithm is loaded that forces the third qubit always to be the complement (not = negation) of the exclusive-or (xor) result of the first two, and the fourth qubit is the logical-and (and) result of the second and third qubits. This algorithm has two degrees of freedom because the third and fourth qubits are fixed by the value of the first two. The algorithm also predisposes the first qubit to come up high (= 1) 80% of the time, and low (= 0) for the remaining 20%, while the second qubit comes up high 33% of the time and low over the remaining 67%. Because this is a trivial problem, it’s not hard to work out the probability of each of the four possible states, but it should be clear that the complexity of the algorithm and the quantum computer can in theory be vastly increased to address more meaningful problems. While the quantum computer can be used to implement such a simulation directly, a deterministic computer must either analyse the problem symbolically or use a source of randomness (or, more usually, pseudorandomness) to compute the likelihood of the possible outcomes.
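For what it’s worth, a minimal Python sketch of the “analyse the problem symbolically” route for this toy example (my own illustration): since the third and fourth qubits are fixed by the first two, only four cases need enumerating.

```python
from itertools import product

# Toy four-qubit example: q3 = NOT(q1 XOR q2), q4 = q2 AND q3.
# Only q1 and q2 are free; their biases are as given above.
p1 = {1: 0.80, 0: 0.20}          # first qubit: high 80% of the time
p2 = {1: 1/3,  0: 2/3}           # second qubit: high 33% of the time

probs = {}
for q1, q2 in product((0, 1), repeat=2):
    q3 = 1 - (q1 ^ q2)           # complement of the exclusive-or
    q4 = q2 & q3                 # logical-and of second and third
    probs[(q1, q2, q3, q4)] = p1[q1] * p2[q2]

for state, p in sorted(probs.items()):
    print(state, round(p, 4))
# The four probabilities sum to 1; the other twelve of the 16
# possible states never occur under this algorithm.
```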

'Luthon64

Hmm, I guess I just assumed that a quantum computer would also have a deterministic, digital component which we would use to interface with the quantum part. I’m more interested in the analogue part at the moment, though. Would it be deterministic? If not, is it non-deterministic for a very different reason than the quantum computer?

This is the part which really grabbed my attention:

Consider that the nervous system in animals is a form of hybrid computer. Signals pass across the synapses from one nerve cell to the next as discrete (digital) packets of chemicals, which are then summed within the nerve cell in an analog fashion by building an electro-chemical potential until its threshold is reached, whereupon it discharges and sends out a series of digital packets to the next nerve cell. The advantages are at least threefold: noise within the system is minimized (and tends not to be additive), no common grounding system is required, and there is minimal degradation of the signal even if there are substantial differences in activity of the cells along a path (only the signal delays tend to vary). The individual nerve cells are analogous to analog computers; the synapses are analogous to digital computers.

But wouldn’t a programmable analogue computer be seriously cool? Imagine being able to write programs which ran simulations that were actually real! (At least in the numerical sense)

NEC just started miniaturisation in June this year:

http://www.necel.com/news/en/archive/0906/1802.html

Also check out Factorizing RSA Keys, An Improved Analogue Solution:

http://www.springerlink.com/content/k5355j45w452537m/

Apologies for the delayed reply – I have been indisposed these past few days.

Well, yes, there would probably be such a component, but it would merely serve an input-output (IO) function. Its presence would most assuredly not suddenly change a quantum computer into a deterministic machine. That would be a bit like saying that the presence or absence of a speedometer in your car changes the type of fuel it needs between petrol and diesel.

Ideally, yes – or as close to it as macroscopic (i.e. non-quantum) models will allow in theory. In actuality, all real analogue devices are subject to certain inaccuracies, however small they may be. These inaccuracies are for all practical purposes random, if not entirely unknowable, and they could be the result of any number of environmental factors or conditions. The errors may be tiny but no instrument can measure with 100% precision the real quantity it was designed to measure, and it is furthermore highly doubtful whether perfect accuracy is even achievable.

Sure it would, but it is hard to see how one might go about constructing a general-purpose programmable analogue machine. As outlined earlier, an analogue machine makes use of a process that is mathematically similar to the one of interest, whereas a digital computer (usually) operates on the mathematics of the process of interest itself – which is why a digital computer isn’t inherently limited to a rather narrow range of simulations or physical problems. It is not apparent how one might find an analogue of sufficient generality (short of reality itself, which clearly isn’t an analogue) to cover all of the requirements adequately for a super analogue computer to be built.

'Luthon64

No worries, gave me time to do some more reading. ;D

Agreed, in the same way that, with this quantum consciousness theory, the digital parts of our nervous system do not suddenly change our brains into deterministic machines. The same could be said for an analogue/digital hybrid consciousness theory.

Are you sure? This quote from Wikipedia seems to say the opposite:

Although digital computer simulation of electronic circuits is very successful and routinely used in design and development, there is one category of analog circuit that cannot be simulated digitally, and that is an (analog) circuit made to exhibit chaotic behavior. Because everything in the analog circuit is essentially simultaneous, but a digital simulation is sequential, simulating a chaotic circuit fails.
http://en.wikipedia.org/wiki/Analog_computer

They have designed programmable analogue computers, but they still use punch cards! I’m talking about miniaturisation and bringing the complexity up to that of today’s digital computers. As to finding analogues of problems, as with digital computing, one breaks them down into components. Analogue computers are great at the following (see the sketch after the list):

* summation
* integration with respect to time
* inversion
* multiplication
* exponentiation
* logarithm
* division, although multiplication is much preferred
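As a toy illustration of how those primitives compose (a hypothetical digital emulation of an analogue patch, not a real device): the classic analogue “program” for simple harmonic motion, x'' = -x, is two integrators in a loop with an inverter.

```python
# Digital emulation of the classic analogue patch for x'' = -x:
# two integrators in series, with an inverter closing the loop.
dt = 1e-3
x, xdot = 1.0, 0.0                 # initial conditions set on the integrators

def integrate(state, inp, dt):
    """One analogue integrator: accumulate the input over time."""
    return state + inp * dt

for step in range(10_000):         # 10 simulated seconds
    xddot = -x                              # inverter: feed back -x as x''
    xdot = integrate(xdot, xddot, dt)       # first integrator:  x'' -> x'
    x = integrate(x, xdot, dt)              # second integrator: x' -> x

print(round(x, 3))                 # ~cos(10) ≈ -0.84
```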

BTW did you get a chance to check out that Factorizing RSA Keys, An Improved Analogue Solution:

http://www.springerlink.com/content/k5355j45w452537m/

Isn’t this one of those BQP problems?

Yes I am, and not really, respectively. The first thing to realise is that the terms “chaos” and “chaotic behaviour” have precise mathematical meanings. Second, “chaos” does not imply indeterminism or non-computability or some such. Third, the phrasing of the cited Wikipedia excerpt is a little misleading because true chaotic behaviour can always be digitally simulated to arbitrary precision. If it were not so, then, for example, the Mandelbrot set would come out looking different every time it was computed. It’s just a question of how long you wish to wait for answers, and of the capability of the resources at your disposal.
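To underline the determinism point, a minimal sketch (the standard escape-time test; the sample points are arbitrary). Run it as many times as you like, on any machine: the outputs never vary.

```python
def mandelbrot_escape(c, max_iter=100):
    """Escape-time test for membership of the Mandelbrot set.

    Returns the iteration count at which |z| exceeds 2, or max_iter
    if the point has not escaped (and so is taken to be in the set).
    """
    z = 0 + 0j
    for n in range(max_iter):
        z = z * z + c
        if abs(z) > 2.0:
            return n
    return max_iter

# Pure arithmetic on defined inputs: the answers are always the same.
print(mandelbrot_escape(complex(-0.75, 0.1)))
print(mandelbrot_escape(complex(0.4, 0.4)))
```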

That things happen essentially simultaneously in an analogue circuit of a certain kind also does not in principle preclude digital simulation; it is not a good reason at all. For example, in brittle failure modes (which are mathematically chaotic), things like stress redistributions and strain-energy releases also happen essentially simultaneously, yet we can model such scenarios quite satisfactorily on powerful digital machines. I think that what the article really means is just what I wrote earlier, namely that while analogues are fully deterministic in theory, it is an enormously difficult task to predict or model the behaviour of certain types in practice – so much so that it is fair to call the task intractable or even infeasible. The important distinction to bear in mind here is between “practically undoable” and “impossible even in principle.”

Yes, true enough all around. However, the observation that what you suggest hasn’t much been happening should tell you a few important things: for reasons of physics, analogues do not generally lend themselves well to miniaturisation; the individual components are very limited in their capabilities and ranges of application; assembling a solution involves a hands-on approach to arranging the components in a particular way according to some design (sort of like using bits and pieces from a programming library, except that the analogue constituents are palpable); constructing a general-purpose machine that automatically assembles an analogue and runs it according to some schematic concept merely shifts the problem back by one level; and so on. In short, the limitations of analogues and the practical difficulties of implementing and using them are daunting. That is not to say, however, that they do not find good use in certain dedicated niches, nor that these difficulties are technically insurmountable.

Yes, I’ve read the analogue factorisation paper – thank you for locating it. It’s a very interesting theoretical exercise that has as much to say about complexity theory as about integer factorisation. As for the complexity class of general integer factorisation, much is still unknown. The decision version lies in NP (and in co-NP), no polynomial-time classical algorithm for it is known, and most number theorists suspect it is not NP-complete – but definitive proof on all of these counts is still lacking. It is, however, one of the few natural problems known to lie in BQP: Shor’s algorithm factorises integers in polynomial time on a quantum computer – so yes, it is one of those BQP problems. How far the BQP class extends relative to NP as a whole is, on the other hand, not entirely clear.

'Luthon64

OK, went and read up on Chaos theory and it IS about a deterministic mathematical model which shows that “tiny differences in the starting state of the system can lead to enormous differences in the final state of the system”.

Chaotic behaviour is a real phenomenon, observed in natural systems like weather and, I would propose, analogue circuits.

If used in the mathematical model sense, no, but when applied to a real life observation, why not?

This is where chaos theory seems to fall short of observed chaos. Considering the exponential growth of error in a chaotic system, how can a digital simulation ever be precise enough?

They look kinda pretty, but I’m betting that they take long to compute and they still look digital. Imagine what cool wavy patterns an analogue computer could make, and in a fraction of the time.

Agreed but it takes longer. That is why this quantum consciousness was proposed originally, wasn’t it?

Also with an analogue simulation it is possible to manipulate the variables and see the results in real time. With digital you have to run the algorithm again from scratch.

Still think it would be easier than constructing quantum computers.

Still not too sure I get this P complexity thing. Is the analogue solution not at least as plausible as the quantum one?

Yes, correct, but don’t confuse “chaos” with “unpredictability.” They’re not the same thing.

We have been talking about simulations all along, have we not? And simulations are by definition not the real thing.

How so? I don’t follow. Perhaps you should define “observed chaos” separate from mathematical chaos as relevant to the cited passage, in particular how one might distinguish the one from the other.

Very simply by defining the initial and governing conditions with sufficient accuracy to meet the precision requirements of the model in question.
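To make that concrete, a minimal Python sketch (my own toy illustration using the logistic map, a textbook chaotic system; the digit counts are arbitrary). Chaos consumes roughly a fixed number of digits per iteration, so the cure is simply to carry enough working digits for the number of steps to be simulated:

```python
from decimal import Decimal, getcontext

def logistic(x0="0.3", digits=30, steps=60):
    """Iterate the logistic map x -> 4x(1-x) with a set number of working digits."""
    getcontext().prec = digits
    x, r = Decimal(x0), Decimal(4)
    for _ in range(steps):
        x = r * x * (1 - x)
    return x

# At r = 4 the map loses roughly one bit (~0.3 decimal digits) of accuracy
# per step, so 60 steps need about 18 guard digits. Raise the working
# precision past that and the computed trajectory stabilises: the 15-digit
# run is garbage by step 60, while 40 and 80 digits agree to many figures.
for d in (15, 40, 80):
    print(d, logistic(digits=d))
```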

Where to begin? Appearance counts for nothing in this context beyond exciting the sensibilities with an intriguing pattern. Mathematically, though, such renderings would be useless and – much worse – mostly uninformative because of the accuracy issues plaguing analogue computation. Then there’s the question of recursion and iteration, at which analogues are quite clumsy. Once you understand the mathematical nature of the Mandelbrot set, you should have no trouble seeing that high-precision computation is the only way to generate it for any purpose beyond the aesthetic.

Not really. The Copenhagen interpretation (CI) of QM implies that nothing is definite until it is observed by a conscious entity. This led Schrödinger to propose his dead-alive cat thought experiment in order to illustrate the apparent absurdity of this conception. Based thereon, Roger Penrose much later hypothesised that quantum wave function collapse (reduction) – the technical term for finding a particle or group of coordinated particles in a definite state – is missing some essential ingredient that is perhaps also instrumental in manifestations of consciousness. Penrose conjectures that this is to be found in a proper quantum gravity formulation (still conspicuously lacking), and calls it “objective reduction” (OR), as opposed to the “subjective reduction” done by conscious observers. A neuroscientist, Hameroff, proposes that OR on a (relatively speaking) large coordinated scale within neural microtubules accounts for consciousness. That’s the picture painted in very broad strokes.

At present, undoubtedly so, but quantum computers aren’t just some theoretical possibility that excites only nerds. If achievable, these machines will revolutionise computation as surely as the digital one has, and probably in ways we can barely imagine today.

I’m not sure what to make of “plausible” in this context. Sure, the analogue variety is doable with today’s technology – after all, it historically precedes the digital kind – but the analogue computer is not a general-purpose machine for attacking a significant cross-section of real-world problems using a single device. That kind of flexibility is reserved for the digital computer and accounts in large part for its huge growth and success. The basic issue is that analogue computers are deterministic, extremely limited and very cumbersome to implement. In truth, and in light of the above responses, I am beginning to doubt whether you fully appreciate the severity of these objections and constraints. Moreover (and not that analogue and quantum computers are properly comparable – it would be a bit like comparing an abacus with an IBM Roadrunner), assuming that the technical difficulties with quantum computers are soluble, these machines have the potential to obviate many of the limitations of both analogue and digital computers. That we don’t have them yet is perhaps the most frustrating thing in all of this.

'Luthon64

But wouldn’t the observed chaotic behaviour in the final state ultimately be influenced by quantum unpredictability in the starting state?

OK, but digital simulations are less real. Analogue simulations are real physical processes which operate on real numbers, the results of which we can observe.

Mathematical chaos is a deterministic model which shows that “tiny differences in the starting state of the system can lead to enormous differences in the final state of the system”. When it is used to model a real chaotic system digitally, each iteration is performed on an approximation of the real values. Each of these approximations, in turn, is itself a tiny difference which can lead to enormous differences later on. Even if we are to increase the precision down to the quantum level, at enormous cost in processing time, we are still left with further unpredictability.
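Here’s a minimal sketch of the effect I mean (the logistic map again, a textbook chaotic system, with a perturbation standing in for a single rounding error):

```python
# Logistic map x -> 4x(1-x): two starting values differing by one part in
# 10^12 - think of it as a lone rounding error - part company completely
# within a few dozen iterations.
r = 4.0
a, b = 0.3, 0.3 + 1e-12
for n in range(1, 61):
    a, b = r * a * (1 - a), r * b * (1 - b)
    if n % 10 == 0:
        print(f"step {n:2d}: a={a:.6f}  b={b:.6f}  |a-b|={abs(a - b):.3e}")
```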

But through those digital iterations we lose the intrinsic unpredictability in a naturally chaotic system. Analogue computers may not be as precise, but they perform the calculations themselves far more accurately than a digital computer could given a reasonable amount of time.

OK, will read up more on these Mandelbrot sets, but I think my comment still applies to CGI generally.


I don’t doubt it, but isn’t an analogue computer more likely to evolve naturally than a quantum one?

I guess what I mean is, is the proposed analogue solution sufficiently efficient to explain our brain’s computational speed?

I agree it would be seriously cool if we built quantum computers. ;D

Sorry, but I’m somewhat confused. Is there a specific point, or more than one, that you wish to make? Or are you just whipping up conversation for the interest of it? Because truthfully I’m beginning to feel just a little beset and harassed over matters that are either trivial, only peripherally relevant or misconstrued, or all of the above – matters that you could easily do research on by yourself.

Do you think that, failing a quantum computer, the analogue kind is the answer to our computing needs and consciousness, maybe? Or that in reality analogues are quasi-quantum computers, maybe? If so, then reality and the computing status the world over resoundingly refute that stance. Of course, you are welcome to persist in such beliefs (if indeed you hold them), but it would in my view be unwise to do so, considering these rather imposing countermanding indications.

It’s quite simple, really: For a variety of technical reasons already given, analogue computers are not the answer. If they were, we’d all be using them already.

'Luthon64

I’m really sorry, that was not my intention at all. If you come to the next Skeptics in the Pub I’ll buy you a drink to try and make up for it.

I’m genuinely interested in artificial intelligence and, I must admit, somewhat disappointed by the lack of progress in this field. I’d hate to have to wait for quantum computers before we can build machines that can think and feel, so I’m looking for alternatives.

As for doing more research myself, I have been. Reading up on QM and responding to your posts has been taking up most of my free time lately.

I’m more concerned with consciousness, and how we could simulate it artificially. We already know that the human brain has both digital and analogue components. Digital alone is obviously insufficient, considering that we are starting to reach the limits of that technology and still haven’t found a solution. Analogue computing, however, hasn’t really progressed much in the last 40 years.

Concerning my understanding of quantum mechanics: It seems logical to me that if everything is based on quantum mechanics, then the entire universe is essentially one big quantum computer. This isn’t a belief I hold, I just don’t know what else to think. What is it I am missing?

But if everyone thought that way, no one would ever invent – or, in my case, just imagine – anything! Also, considering how closely scientific and technological progress has been tied to digital computing over the last few decades, it might simply be a lack of interest or focus.

Not likely. Read what Mefi wrote. Ask yourself what’s special about it that the digital revolution happened so quickly and comprehensively. Even in things like photography and music that are as analog as analog can be. Hint: it has a lot to do with convenience & versatility.

I agree completely. For storing and transmitting information, digital is far more efficient. But, it is more efficient because it actually contains less information. An analogue photograph or recording contains far more information than your eyes or ears are capable of processing.

Hey, cool, this is post number 42 ;D

Uh-uh, not necessarily. You can rasterize a photo at a resolution higher than the grain size of the photographic film it was taken on (or of your eyes’ photoreceptor spacing) and apply a complex interpolation scheme in between. Then the digital pic will contain more info. You can sample a sound clip at time intervals shorter than the period of the highest frequency the recording device can capture (or that your eardrums can) and again interpolate between samples. Then the digital sound will contain more info. In any case, “information” is formally a digital concept. Ask Claude Shannon, who came up with its definition. Again, why should that be?
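A minimal sketch of the resample-and-interpolate step (linear interpolation via numpy.interp; the tone and sample counts are invented for illustration):

```python
import numpy as np

# A 'recording' sampled coarsely, then resampled at ten times the rate,
# with linear interpolation filling in between the original samples.
t_coarse = np.linspace(0.0, 1.0, 20)             # 20 original samples
signal   = np.sin(2 * np.pi * 3 * t_coarse)      # say, a 3 Hz tone

t_fine    = np.linspace(0.0, 1.0, 200)           # 10x as many sample points
resampled = np.interp(t_fine, t_coarse, signal)  # values between samples

print(len(signal), "->", len(resampled), "samples")
```

A fancier scheme (cubic, sinc) just makes the in-between values smoother; the principle is the same.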

Your real name Arthur Dent maybe? :wink:

Is this like spreading the pixels out and filling in the spaces between with the averages of the adjacent pixels?

Well, I think it’s because our nervous systems sense information that way.

Hey, a fellow Adams fan! :slight_smile:

Look who has finally got it… Do you think it is testable?

What do you think these guys are proposing and doing here?
Information processing mechanisms in microtubules at physiological temperature: Model predictions for experimental tests.
Towards Quantum Superpositions of a Mirror
Comments on Proposed Gravitational Modifications of Schrodinger Dynamics and their Experimental Implications

I don’t blame you. You might like the Simulation Argument. What kind of simulation hypothesis though?

The simulation argument linked to seems pretty unlikely to me although it makes for great sci-fi. However, one could argue that the “reality” we experience is just a simulation or approximation of reality generated in our own minds.

Something like that. Now you have the analog image plus the interpolation stuff. Thus more info than you started with. You can make the interpolation as complicated as you like.

What, we sense information as digital? I doubt it. Shannon went digital for reasons of formal rigor, precision and convenience.

“Finally”, eh? Yup, your own genius knows no bounds. That’s clear to anyone who reads your asinine drivel.

The Simulation Argument is antiscientific BS - except if you’re a cretinist or IDiot.

This is more like guessing or estimating what the missing info would have been. It will result in a larger file size but not really more information.

Even our individual nerve cells communicate with each other digitally via tiny packets of chemicals. Where communication is concerned, digital is definitely the way to go.

Brrraaaap! Wrong answer, thanks for playing. Look up how information is quantified before shooting from the hip like that. In one form or another, you’ve added the info to actually do the interpolation. Its amount is not zero or less.
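For the record, the standard quantification in a minimal sketch (Shannon entropy; the example distributions are arbitrary):

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p)) over the p > 0."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(shannon_entropy([0.5, 0.5]))   # 1.0 bit: a fair coin
print(shannon_entropy([0.8, 0.2]))   # ~0.72 bits: a biased coin
print(shannon_entropy([1.0]))        # 0 bits: a certainty tells you nothing
```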

How does that make sensory apprehension of information digital?

You argue just like a creationist, you know that? Not the content but the method. It’s like intellectual hopscotch. All over the place. Damn annoying.

Of what use is a digital definition of information in comparing analogue and digital complexity?

Try doing that repeatedly to an image of writing which is too small or pixelated to read. I doubt it will become much clearer. Compare this to the old analogue slide and projector.

It has already been digitized before it gets to your brain.

I don’t really see what I am doing as arguing. Why do you?