Saturday, October 27, 2012

From the naturalism workshop, part I

by Massimo Pigliucci

Well, here we are in Stockbridge, MA, in the middle of the Berkshires, sitting at a table that features a good number of very sharp minds, and yours truly. This gathering, entitled “Moving Naturalism Forward,” is the brainchild of cosmologist Sean Carroll; its point is to see what a bunch of biologists, physicists, philosophers and assorted others think about life, the universe and everything. And we have three days to do it. Participants included: Sean Carroll, Jerry Coyne, Richard Dawkins, Terrence Deacon, Simon DeDeo, Dan Dennett, Owen Flanagan, Rebecca Goldstein, Janna Levin, David Poeppel, Alex Rosenberg, Don Ross, Steven Weinberg, and myself.

Note to the gentle reader: although Sean has put together an agenda of broad topics to be discussed, this post and the ones following it will inevitably have the feel of a stream of consciousness. But one that will be interesting nonetheless, I hope!

During the roundtable introductions, Dawkins (as well as the rest of us) was asked what he would be willing to change his mind about; he said he couldn’t conceive of a sensible alternative to naturalism. Rosenberg, interestingly, brought up the (hypothetical) example of finding God’s signature in a DNA molecule (just as Craig Venter has actually done). Dawkins admitted that that would do it, though he immediately raised the more likely possibility that it would be a practical joke played by a superhuman — but not supernatural — intelligence. Coyne then commented that there is no sensible distinction between superhuman and supernatural, in a nod to Clarke’s third law.

There appeared to be some interesting differences within the group. For instance, Rosenberg clearly has no problem with a straightforward functionalist computational theory of the mind; DeDeo accepts it, but feels uncomfortable about it; and Deacon rejects it outright, though without thereby embracing any kind of mystical woo. Steven Weinberg asked the question of whether — if a strong version of artificial intelligence is possible — it follows that we should be nice to computers.

The first actual session was about the nature of reality, with an introduction by Alex Rosenberg. His position is self-professedly scientistic, reductionist and nihilist, as presented in his The Atheist’s Guide to Reality. (Rationally Speaking published a critical review of that book, penned by Michael Ruse.) Alex thinks that complex phenomena — including of course consciousness, free will, etc. — are not just compatible with, but determined by and reducible to, the fundamental level of physics. (Except, of course, that there appears not to be any such thing as the fundamental level, at least not in terms of micro-things and micro-bangings.)

The first response came from Don Ross (co-author with James Ladyman of Every Thing Must Go), who correctly pointed out that Rosenberg’s position is essentially a statement of metaphysical faith, given that fundamental physics cannot, in fact, derive the phenomena and explanations of interest to the special sciences (defined here as everything that is not fundamental physics).

Weinberg made the interesting point that when we ask whether X is “real” (where X may be protons or free will) the answer may be yes, qualified by what one means by the term “real.” Protons, in other words (and contra both Rosenberg and Coyne), are as real as free will for Weinberg, but the qualifier means something different when applied to protons than when applied to free will.

In response to Weinberg’s example that, say, the American Constitution “exists” not just as a piece of paper made of particles, Rosenberg did admit that the major problem for his philosophical views is the ontological status of abstract concepts, especially mathematical ones as they relate to the physical description of the world (like Schrödinger’s equation, for instance).

Dennett asked Rosenberg if he is concerned about the political consequences of his push for reductionism and nihilism. Rosenberg, to his credit, agreed that he has been very worried about this. But of course from a philosophical and epistemological standpoint nothing hinges on the political consequences of a given view, if such a view is indeed correct.

Following somewhat of a side track, Dennett, Dawkins and Coyne had a discussion about the use of the word “design” when applied to both biological adaptations and human-made objects. Contra Dawkins and Coyne, Dennett defends the use of the term design in biology, because biologists ask the question “what is this [structure, behavior] for?” thus honestly reintroducing talk of function and purpose in science. A broader point made by Dennett, which I’m sure will become relevant to further discussions, is that the appearance on earth of beings capable of reflecting on things makes for a huge break from everything else in the biosphere, a break that ought to be taken seriously when we talk about purpose and related concepts.

Owen Flanagan, talking to Rosenberg, saw no reason to “go eliminativist” on the basic furniture of the universe, which includes a lot more than just fermions qua fermions (see also bosons): it also includes consciousness, thoughts, libraries, and so on. And he also pointed out that, again, Rosenberg’s ontology potentially gets into serious trouble if we decide that things like mathematical objects are real in an interesting sense of the term (because they are not made of fermions). Flanagan pointed out that what we were doing in that room had to do with the meaning of the words being exchanged, not just with the movement of air molecules and the propagation of sounds, and that it is next to impossible to talk about meaning without teleology (not, he was immediately careful to add, in the Cartesian sense of the term).

Again interestingly, even surprisingly, Rosenberg agreed that meaning poses a huge problem for a scientistic account of the world, for a variety of reasons brought up by a number of philosophers, including Dennett and John Searle (the latter arguing along very different lines from the former, of course). He was worried that this will give comfort to anti-naturalists, but I pointed out that not being able to give a scientific (as distinct from a scientistic) account of something, now or ever (after all, there are presumably epistemic limits to human reason and knowledge), does not give much logical comfort to the supernaturalist, who would simply be arguing from ignorance.

Poeppel asked Rosenberg what he thinks explanations are, I assumed in the context of the obvious fact that fundamental physics does not actually explain the subject matters of the special sciences. Rosenberg’s answer was that explanations are a way to allay “epistemic hitches” that human beings have. At which point Dennett accused Rosenberg of being an essentialist philosopher (à la Parmenides), making a distinction between explanations in the everyday sense of the word and real explanations, such as those provided by science. But, argued Dennett, this is a very old-fashioned way of doing philosophy, and it treats science in a more fundamentalist (not Dennett’s term) way than (most) scientists themselves do.

The afternoon session was devoted to evolution, complexity and emergence, with Terrence Deacon giving the introductory remarks. He began by raising the question of how we figure out what does and does not fit within naturalism. His naturalistic ontology is clearly broader than Rosenberg’s, including, for instance, teleology (in the same sense as espoused earlier in the day by Dennett). Deacon rejects what Dennett calls “greedy” reductionism, because there are complex systems, relations, and other things that don’t sit well with extreme forms of reductionism. Relatedly, he suggested (and I agreed) that we need to get rid of talk of both “top-down” and indeed “bottom-up” causality, because it constrains us to think about the world in ways that are not useful. (Of course, top-down causality is precisely the thing rejected by greedy reductionists, while the idea that causality only goes bottom-up is the thing rejected by antireductionists.)

Ross concurred, and proposed that another good thing to do would be to stop talking about “levels” of organization of reality and instead think about the scale of things (the concept of “scale” can be made non-arbitrary by referring to measurable degrees of complexity and/or to scales of energy). Not surprisingly, Weinberg insisted on the word “levels,” because he wants to say that every higher level does reduce to the lowest one.

Deacon is interested in emergence because of the issue of the origin of life understood (metaphorically speaking) as a “phase transition” of sorts, which is in turn related to the question of how (biological) information “deals with” the constraints imposed by the second law of thermodynamics. In other words: the interesting question here is how a certain class of information-rich complex systems managed to locally avoid the constant increase in entropy mandated by the second law. (Note: Deacon was most definitely not endorsing a form of vitalism according to which life defies — globally — the second law of thermodynamics. So this discussion is relevant because it sets out a different way of thinking about what it means for complex systems to be compatible with but not entirely determined by the fundamental laws of physics.)
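
To make the “locally” in that claim concrete (my gloss, not Deacon’s own formulation): the standard entropy bookkeeping for an open system is

\Delta S_{\text{system}} + \Delta S_{\text{surroundings}} \geq 0,

so \Delta S_{\text{system}} < 0 is perfectly allowed provided the surroundings pick up the tab, which is what organisms do by exporting entropy to their environment as heat and waste.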

All of the above, said Deacon, is tied up with what we mean by information, and he suggested that the well-known Shannon formulation of information — as interesting as it is — is not sufficient to deal with the teleologically-oriented type of information that characterizes living organisms in general, and of course consciously purposeful human beings in particular.
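
For readers who want the formula on the table (my addition, not Deacon’s notation): Shannon information measures the statistical uncertainty of a source,

H(X) = -\sum_i p_i \log_2 p_i,

and is deliberately blind to what a message is about or what it is for; a random string and a gene with the same symbol frequencies score identically. The gap between that purely statistical quantity and functional, end-directed information is roughly where Deacon locates the problem.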

Dennett seemed to have quite a bit of sympathy with Deacon’s ideas, though he focused on pre- or proto-Darwinian processes as a way to generate those information-rich, cumulative, (locally) second-law-defying systems that we refer to as biological.

Rosenberg, as usual, didn’t seem to “be bothered by” the fact that we don’t have a good reductionist account of the origin of life. Methinks Rosenberg should be bothered a bit more by things for which reductionism doesn’t have an account and where emergentism seems to be doing better.

At this point I asked Weinberg (who had actually read my blog series on emergence on his way to the workshop!) why he thinks that the behavior of complex systems is “entailed” by the fundamental laws. He conceded two important points, the second of which is crucial: first, he readily agreed that of course nobody can (and likely never will be able to) actually reduce, say, biology to physics (or even condensed matter physics to sub-nuclear physics); so, epistemic reduction isn’t the game at all. Second, he said that nobody really knows if ultimate (i.e., ontological) reduction is possible in principle, which was precisely my point; his only argument in favor of greedy reductionism seems to be a (weak) historical induction: physicists have so far been successful in reducing, so there is no reason to think they won’t be able to keep doing it. Even without invoking Hume’s problem of induction, there is actually very good historical evidence that physicists have been able to do so only within very restricted domains of application. It was gratifying that someone as smart and knowledgeable in physics as Weinberg couldn’t back up his reductionism with anything more than this. However, Levin agreed with Weinberg, insisting on the a priori logical necessity of reduction, given the successes of fundamental physics.

Weinberg also agreed that there are features of, say, phase transitions that are independent of the microphysical constituents of a given system; as well as that accounts of phase transitions in terms of lower level principles are only approximate. But he really thinks that the whole research program of fundamental physics would go down the drain if we accepted a robust sense of emergence. Well, maybe it would (though I don’t think so), but do we have any better reason to accept greedy reductionism than fundamental physicists’ amor proprio? (Or, as Coyne commented, the fact that if we start talking about emergence then the religionists are going to jump the gun for ideological purposes? My response to Jerry was: who cares?)

Don Ross argued that fundamental physics just is the discipline that studies patterns and constraints on what happens that apply everywhere at all times. The special sciences, on the contrary, study patterns and constraints that are more spatially or temporally limited. This can be done without any talk of bottom-up causality, which seems to make the extreme reductionist program simply unnecessary.

Flanagan brought up the existence of principles in the special sciences, like natural selection in biology, or operant conditioning in psychology. He then asked whether the people present imagine that it will ever be possible to derive those principles from fundamental physics. Carroll replied — acknowledging Weinberg’s earlier admission — that no, that will likely not be possible in practice, but in principle... But, again, that seems to me to amount to a metaphysical promissory note that will never be cashed.

Dennett: so, suppose we discover intelligent extraterrestrial life that is based on a very different chemistry from ours. Do we then expect them to have the same economics, or an entirely different one? If lower levels (logically) entail higher phenomena, we should expect an entirely different one. And yet, one can easily imagine that similar high-level constraints would act on the alien economy, thereby yielding a convergently similar economy “emerging” from a very different biochemical substrate. The same example, I pointed out, applies to the principle of natural selection. Goldstein and DeDeo engaged in an interesting side discussion on what exactly logical entailment, well, entails, as far as this debate is concerned.

Interesting point by Deacon: emergence is inherently diachronic, i.e., emergent properties are behaviors that did not appear up to a certain time in the history of the universe. This goes nicely with his contention that talk of causality (top-down or bottom-up) is unhelpful. In answer to a question from Rosenberg, Deacon also pointed out that this historical emergence may not have been determined by things that happened before, if the universe is not deterministic but contingent (as there are good reasons to believe).

Simon DeDeo took the floor talking about renormalization theory, which we have already encountered as a major way of thinking about the emergence of phase transitions. Renormalization is a general technique that can be used to move from any group/level to any other, not just in going from fundamental to solid state physics. This means that it could potentially be applied to connecting, say, biology with psychology, if the processes involved take a finite number of steps. However, and interestingly, when systems are characterized by effectively infinite steps, mathematicians have shown that this type of renormalization group analysis is subject to fundamental undecidability (because of the appearance of mathematical singularities). Seems to me that this is precisely the sort of thing we need to operationalize otherwise vague concepts like emergence.
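
For the curious, a textbook sketch of the machinery (mine, not DeDeo’s own formalism): a renormalization step coarse-grains the microscopic degrees of freedom and tracks how the effective couplings K of the model transform,

K' = R(K), \qquad K \to R(K) \to R(R(K)) \to \cdots,

with the macroscopic behavior read off from the fixed points of the map R. As I understood DeDeo, the undecidability arises when this iteration has, in effect, to be carried through infinitely many steps, which is where the singularities come in.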

Another implication of what DeDeo was saying is that one could, in practice, reduce thermodynamics (macro-model) to statistical mechanics (micro-model), say. But there is no way to establish (it’s “undecidable”) whether there isn’t another micro-model that is equally compatible with the macro-model, which means that there would be no principled way to establish which micro-model affords the correct reduction. This implies that even synchronic (as opposed to diachronic) reduction is problematic, and that Rosenberg’s refrain, “the physical facts fix all the facts,” is not correct. (As a side note, Dennett, Rosenberg and I agreed that DeDeo’s presentation is a way of formalizing the Duhem-Quine thesis in epistemology.)
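
To put a concrete face on the macro/micro bridge at issue (a standard textbook example, my choice): Boltzmann’s relation

S = k_B \ln \Omega

connects the thermodynamic entropy S of a macrostate to the number \Omega of microstates compatible with it. DeDeo’s point, as I take it, is that a macro-level regularity such as the ideal gas law PV = N k_B T can be recovered from more than one candidate micro-model, and nothing at the macro level adjudicates which reduction is the correct one.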

It occurred to me at this point in the discussion that when reductionists like Weinberg say that higher level phenomena are reducible to lower level laws “plus boundary conditions” (e.g., you derive thermodynamics from statistical mechanics plus additional information about, say, the relationship between temperatures and pressures), they are really sneaking in emergence without acknowledging it. The so-called boundary conditions capture something about the process of emergence, so that it shouldn’t be surprising that the higher level phenomena are describable by a lower level “plus” scenario. After all, nobody here is thinking of emergence as a mystical spooky property.
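
An illustration of how much work the “plus” does (again my example, following the temperature-pressure case just mentioned): in statistical mechanics the thermodynamic variables enter through bridge definitions such as

1/T = (\partial S / \partial E)_{V,N}, \qquad P/T = (\partial S / \partial V)_{E,N},

which identify which statistical quantities are to count as temperature and pressure in the first place. Those identifications are supplied over and above the micro-dynamics, exactly the sort of extra input that talk of “boundary conditions” quietly brings in.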

And then the discussion veered into evolution, and particularly the relationship between the second law of thermodynamics and adaptation by natural selection. Rosenberg’s claim was that the former requires the latter, but both Dennett and I pointed out that that’s a misleading way of putting it: the second law is required for certain complex systems to evolve (in our type of universe, given its laws of physics), but the mere existence of the second law doesn’t necessitate adaptation. Lots of other boundary conditions (again!) are necessary for that to be the case. And it is this tension, between fundamental physics requiring certain complex phenomena (in the strong sense of logical entailment) and its merely being necessary (but not sufficient) for and compatible with them, that captures the major division between the two camps into which the workshop participants are divided (understanding, of course, that there is some porosity between the camps).
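
The logical shape of the disagreement can be put in one line (my formalization, not anyone’s exact words). Writing S for the second law and A for adaptation by natural selection:

A \Rightarrow S \quad \text{(no adaptation without the second law)}, \qquad S \not\Rightarrow A \quad \text{(the second law alone does not entail adaptation)}.

Necessity of the lower level, in other words, is not the same as entailment of the higher.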

Tomorrow: morality, free will, and consciousness!

4 comments:

  1. Thanks for the very detailed write-up; it was a joy to read and I very much look forward to part 2!

    One minor thought about using renormalization to "operationalize" notions like reduction and emergence: did Robert Batterman's work (e.g. _The Devil in the Details_ and a number of subsequent papers) come up at all in that context? It might (or might not) be relevant...

    1. Shane,

      yes, I referred to Batterman's work in the workshop. It's a must read.

  2. Sounds like you all should be aware of some of the work being done on Physical Intelligence coming out of the Ecological Psychology community. A lot of progress is being made on the relationship between intelligence, evolution, and thermodynamics, drawing on physics, psychology, and philosophy. This special issue should be a good place to start:
    http://www.tandfonline.com/toc/heco20/24/1

    I know Dennett at least has been a guest at Turvey's pub and is probably aware of the approach.

    Specifically, with regard to your paragraph, a proposal has been made for a fourth law of thermodynamics, that entropy always increases maximally, which would predict the emergence of dissipative structures and complex systems as inevitable.

    What's lacking is an account of the divide between autocatakinetic systems in general (e.g., hurricanes), which maintain themselves away from equilibrium by consuming negentropy but are slaves to the negentropy gradient and will themselves dissipate when the negentropy source is depleted, and biological systems, which can fight the gradient and forage for remote sources of negentropy. An explanation of this divide is the motivation for this special issue (and another one two issues later).

    Finally, I would have liked to see Robert Rosen come up in the discussion on reductionism. His view is that impredicativity is necessary to explain complex systems, and in fact impredicativity is the general case, not a specific one. In order to understand complex systems, science needs to abandon a Newtonian view of causality (and the closely related model of predicative logical entailment). Similarly, Gödel showed that in formal systems logical completeness and closure are not the general case but a specific one, and Rosen claims that physics cannot explain biology because biological systems are erroneously viewed as specific cases. In his view, the complex is the general, and science as a whole needs to reorganize around complexity rather than explanations built up from simple systems.

  3. One more thing with regard to the insufficiency of Shannon information: this has been espoused by ecological psychologists going back to Gibson. Specifically, there's an article in the second special issue on physical intelligence that's highly relevant:
    http://www.tandfonline.com/doi/full/10.1080/10407413.2012.702615

    Yates, "On varieties of information"

