Sunday, March 30, 2014

U of Texas' Granville Sewell on entropy and the evolutionary materialist meta-narrative and origins story

Food for thought (HT UD, vid at YouTube):

[Embedded YouTube video: Granville Sewell on entropy and evolution]
For background, cf. this on the basics of thermodynamics:

Thermodynamics, the entropy concept and the famous second law were first developed in analysing heat engines. In (a) we see the second law: in the isolated system, the overall increment in entropy dS must be at minimum zero. Of course the heat-evolving subsystem loses entropy and the heat-absorbing one gains it, but because entropy change depends on temperature (dS = d'Q/T), the gain at the lower temperature exceeds the loss at the higher one. This also highlights how a system that gains heat or similar forms of energy naturally tends to increase in entropy, there being more ways for mass and energy to be distributed at micro level than before d'Q passed into it. Then, in a heat engine at (b), the possibility of shaft work leads to various possibilities for B', which both absorbs and emits energy: some as exhaust heat to D, some as shaft work, dW.
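To make the bookkeeping concrete, here is a minimal numerical sketch (my own illustration; the temperatures and heat quantity are assumed, not from the diagram) of how the gain at the cooler body exceeds the loss at the hotter one, plus the Carnot limit on how much of d'Q a heat engine can turn into shaft work:

```python
# Minimal sketch: entropy bookkeeping for heat flow between two reservoirs.
# Temperatures and heat quantity are illustrative assumptions.

T_hot = 500.0   # K, heat-evolving subsystem (A)
T_cold = 300.0  # K, heat-absorbing subsystem (B)
dQ = 100.0      # J, increment of heat d'Q passed from A to B

dS_hot = -dQ / T_hot    # entropy lost by A
dS_cold = dQ / T_cold   # entropy gained by B (larger, since T_cold < T_hot)
dS_total = dS_hot + dS_cold

print(f"dS(A) = {dS_hot:+.4f} J/K")
print(f"dS(B) = {dS_cold:+.4f} J/K")
print(f"dS(isolated system) = {dS_total:+.4f} J/K  (>= 0, per the second law)")

# For the heat engine at (b): the Carnot limit on converting d'Q to shaft work dW
eta_carnot = 1 - T_cold / T_hot
print(f"Max fraction of d'Q convertible to shaft work: {eta_carnot:.2%}")
```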

This leads to the question of the spontaneous formation of a heat engine, which is indeed possible, e.g. a hurricane. But when such an engine shows intricate, functionally specific complex organisation, its formation by blind chance and mechanical necessity becomes maximally implausible, as can be seen for the case of a piston engine (or a turbine engine) . . . here (HT: Wiki), the famous Rolls-Royce Merlin V-12 used in the Spitfire, Hurricane and Mustang fighters -- with the shaft shown sticking out:

[Image (HT: Wiki): the Rolls-Royce Merlin V-12 engine]
Moreover, the associated concepts are now understood to be far broader and far more powerful: first, once the concept of molecules and energy states was added (leading to statistical thermodynamics), and later, once the conceptual link to and from Shannon's information theory was appreciated -- though the latter link is still subject to vigorous objection in some quarters.

However, as Wikipedia acknowledged in its article on informational entropy, as at April 2011:

At an everyday practical level the links between information entropy and thermodynamic entropy are not close. Physicists and chemists are apt to be more interested in changes in entropy as a system spontaneously evolves away from its initial conditions, in accordance with the second law of thermodynamics, rather than an unchanging probability distribution. And, as the numerical smallness of Boltzmann's constant kB indicates, the changes in S / kB for even minute amounts of substances in chemical and physical processes represent amounts of entropy which are so large as to be right off the scale compared to anything seen in data compression or signal processing.

But, at a multidisciplinary level, connections can be made between thermodynamic and informational entropy, although it took many years in the development of the theories of statistical mechanics and information theory to make the relationship fully apparent. In fact, in the view of Jaynes (1957), thermodynamics should be seen as an application of Shannon's information theory: the thermodynamic entropy is interpreted as being an estimate of the amount of further Shannon information needed to define the detailed microscopic state of the system, that remains uncommunicated by a description solely in terms of the macroscopic variables of classical thermodynamics. For example, adding heat to a system increases its thermodynamic entropy because it increases the number of possible microscopic states that it could be in, thus making any complete state description longer. (See article: maximum entropy thermodynamics.[Also, another article remarks:  >>in the words of G. N. Lewis writing about chemical entropy in 1930, "Gain in entropy always means loss of information, and nothing more" . . .   in the discrete case using base two logarithms, the reduced Gibbs entropy is equal to the minimum number of yes/no questions that need to be answered in order to fully specify the microstate, given that we know the macrostate.>>]) Maxwell's demon can (hypothetically) reduce the thermodynamic entropy of a system by using information about the states of individual molecules; but, as Landauer (from 1961) and co-workers have shown, to function the demon himself must increase thermodynamic entropy in the process, by at least the amount of Shannon information he proposes to first acquire and store; and so the total entropy does not decrease (which resolves the paradox). [Cf. more detailed discussion here on.]
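To make the quoted connection concrete, here is a minimal sketch (my own illustration, not from the article; the microstate distribution and temperature are assumed) of Shannon entropy in bits as yes/no questions, the smallness of the corresponding thermodynamic entropy via Boltzmann's constant, and Landauer's per-bit bound:

```python
import math

def shannon_entropy_bits(probs):
    """Shannon entropy H = -sum p_i log2 p_i, in bits (yes/no questions)."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Illustrative microstate distribution (assumed, for demonstration only):
# four equally likely microstates consistent with a given macrostate.
probs = [0.25, 0.25, 0.25, 0.25]
H = shannon_entropy_bits(probs)
print(f"H = {H:.2f} bits -> {H:.0f} yes/no questions to pin down the microstate")

# Reduced Gibbs entropy: S/k_B = H * ln 2, so the thermodynamic entropy is
k_B = 1.380649e-23  # J/K, Boltzmann's constant
S = k_B * H * math.log(2)
print(f"S = {S:.3e} J/K (note how small k_B makes per-bit entropy)")

# Landauer's bound: erasing one bit at temperature T dissipates >= k_B T ln 2
T = 300.0  # K, assumed room temperature
print(f"Landauer bound at {T:.0f} K: {k_B * T * math.log(2):.3e} J per bit erased")
```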

The relevance of this to the origin and development of life lies in the challenge to explain the origin of functionally specific complex organisation and associated information -- much of it in the form of complex digital code that operates the cell and guides development of body plans from embryos.

Namely, as Sewell challenges, too much of what is being claimed is materially similar to a tornado video run backwards, untangling chaos into the fabric of an intact town. That becomes especially evident in the case of the origin of cell-based life from some claimed prebiotic soup or similar environment, but it also obtains for the claimed spontaneous origin of major body plans by blind-watchmaker-thesis macro-evolution.
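To see why the configuration-space challenge bites, here is a back-of-envelope sketch (illustrative numbers assumed, not Sewell's own) comparing the sequence space for one modest 300-residue protein against a generous upper bound on blind trials in the observed cosmos:

```python
import math

# Back-of-envelope sketch (illustrative assumptions): sequence space for one
# modest protein of 300 residues drawn from the 20 standard amino acids.
residues, alphabet = 300, 20
space_log10 = residues * math.log10(alphabet)
print(f"Sequence space: 20^300 ~ 10^{space_log10:.0f} configurations")

# A generous upper bound on blind trials: ~10^80 atoms, each changing state
# ~10^14 times per second, for ~10^17 seconds of cosmic history.
trials_log10 = 80 + 14 + 17
print(f"Generous trial budget: ~10^{trials_log10}")
print(f"Fraction of the space searchable: ~10^{trials_log10 - space_log10:.0f}")
```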

So, the Sewell video is food for thought.

As is this Vuk Nikolic video on how the DNA and RNA information "tape" controls the production of proteins in the living cell:

Protein Synthesis from Vuk Nikolic on Vimeo.
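To underscore the "digital tape" point, here is a minimal sketch (my own illustration; only a handful of standard genetic-code entries are included) of reading an mRNA codon stream into an amino-acid chain, much as the ribosome does:

```python
# Minimal sketch of the 'digital tape' idea: translating an mRNA codon
# stream into an amino-acid chain, as the ribosome does. Only a few
# standard genetic-code entries are included here for illustration.
CODON_TABLE = {
    "AUG": "Met",  # start codon
    "UUU": "Phe", "GGC": "Gly", "GAA": "Glu",
    "UGG": "Trp", "UAA": "STOP",  # a stop codon
}

def translate(mrna: str) -> list[str]:
    """Read the tape three letters (one codon) at a time until a stop codon."""
    chain = []
    for i in range(0, len(mrna) - 2, 3):
        residue = CODON_TABLE[mrna[i:i + 3]]
        if residue == "STOP":
            break
        chain.append(residue)
    return chain

print(translate("AUGUUUGGCGAAUGGUAA"))  # -> ['Met', 'Phe', 'Gly', 'Glu', 'Trp']
```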

Where this illustration (HT Wiki) shows the ribosome in action:

[Image (HT: Wiki): the ribosome translating messenger RNA into protein]
Sewell patently has a serious point, one well worth reflecting on.

And for a first step for more, cf. here on OoL (origin of life) and here on OoBPs (origin of body plans). END