April 27th, 2013
The following is the abstract of a recent paper (“Life Before Earth,” 28 March 2013), posted to arXiv by Alexei A. Sharov, Ph.D. (Staff Scientist, Laboratory of Genetics) and Richard Gordon, Ph.D. (Theoretical Biologist, Embryogenesis Center). What is startling and significant about this paper is that it takes the growth of complexity found in biology and compares it to Moore’s Law, a measure of growth in computational complexity. What matters is not complexity as such but the specific coding elements required for specific functions in conjunction with that complexity. Thus, the information content is highly complex, robust, and specified.
An extrapolation of the genetic complexity of organisms to earlier times suggests that life began before the Earth was formed. Life may have started from systems with single heritable elements that are functionally equivalent to a nucleotide. The genetic complexity, roughly measured by the number of non-redundant functional nucleotides, is expected to have grown exponentially due to several positive feedback factors: (1) gene cooperation, (2) duplication of genes with their subsequent specialization (e.g., via expanding differentiation trees in multicellular organisms), and (3) emergence of novel functional niches associated with existing genes. Linear regression of genetic complexity (on a log scale) extrapolated back to just one base pair suggests the time of the origin of life = 9.7 ± 2.5 billion years ago.
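The extrapolation described in the abstract is straightforward to reproduce in outline: fit a straight line to genetic complexity on a log scale against the approximate time each group of organisms arose, then solve for the time at which the fitted line reaches a single base pair. The Python sketch below is only a minimal illustration of that procedure; the data points are rough placeholder values standing in for functional, non-redundant genome sizes, not the authors’ actual dataset, so the printed numbers will not match the paper’s 9.7 ± 2.5 billion-year figure.

```python
import numpy as np

# Illustrative placeholder data (NOT the paper's dataset): approximate time of
# origin in billions of years ago, and a rough functional genome size in base
# pairs, for a handful of hypothetical groups of organisms.
time_bya = np.array([3.5, 2.0, 1.0, 0.5, 0.1])        # billions of years ago
complexity_bp = np.array([5e5, 3e6, 1e8, 5e8, 3e9])   # functional base pairs (placeholders)

# Fit a straight line to log10(complexity) as a function of time.
# Time runs negative into the past, so use -time_bya as the x coordinate.
slope, intercept = np.polyfit(-time_bya, np.log10(complexity_bp), 1)

# Extrapolate back to a complexity of one base pair: log10(1) = 0,
# so solve slope * t + intercept = 0 for t.
t_origin = -intercept / slope   # negative value, in billions of years before present

print(f"Complexity doubling time: {np.log10(2) / slope:.2f} billion years")
print(f"Extrapolated origin of life: {-t_origin:.1f} billion years ago")
```

With real genome-size estimates in place of the placeholders, the same two-line fit-and-solve procedure is what yields the paper’s pre-Earth date for the origin of life.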
June 25th, 2012
I have a new paper in moderation at arXiv. The two papers below are currently listed there:
- “Epistemological-Scientific Realism and the Onto-Relationship of Inferentially Justified and Non-Inferentially Justified Beliefs,” arXiv:1205.2896 (May 2012)
- “Albert Einstein and Scientific Theology,” arXiv:1205.4278 (May 2012)
The multiverse hypothesis is the leading alternative to the fine-tuning hypothesis. The multiverse blunts many aspects of the fine-tuning argument by suggesting that each universe has different initial conditions, varying constants of physics, and laws of nature that lose their seemingly arbitrary values; this makes the earlier single-universe argument from fine-tuning considerably weaker. There are four options for why fine-tuning is either unnecessary to invoke or illusory if the multiverse hypothesis is used as an alternative explanans. Fine-tuning might be (1) illusory if life could adapt to very different conditions or if the values of the constants could compensate for one another.
April 18th, 2012
Information theory is the branch of probability theory that deals with uncertainty, accuracy, and information content in the transmission of messages. It can be applied to any system of communication (electric signals, fiber-optic pulses, speech, etc.). Random signals, known as noise, are often added to a message during transmission, altering the signal received from the one sent. Information theory is used to work out the probability that a particular signal received is the same as the signal sent. In transmitting a sequence of numbers, their sum might also be transmitted so that the receiver will know there is an error when the sum does not correspond to the rest of the message. The sum itself gives no extra information, only a confirmation. The statistics of choosing a message out of all possible messages (letters of the alphabet or binary digits, for example) determine the amount of information it contains. Information is measured in bits (binary digits). If one out of two possible signals is sent, the information content is one bit. A choice of one out of four possible signals contains more information (two bits), although the signal itself might be the same.
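To make the two ideas above concrete: choosing one message out of N equally likely possibilities carries log2(N) bits of information, and a transmitted sum lets the receiver detect (though not locate or correct) an error. The short Python sketch below is my own illustration of both points; it is not drawn from the dictionary entry cited next.

```python
import math

def information_bits(num_possible_messages: int) -> float:
    """Information, in bits, of selecting one message out of
    `num_possible_messages` equally likely alternatives."""
    return math.log2(num_possible_messages)

print(information_bits(2))   # 1.0 bit  (one of two possible signals)
print(information_bits(4))   # 2.0 bits (one of four possible signals)
print(information_bits(26))  # ~4.7 bits (one letter of the alphabet)

# A transmitted sum adds no new information, but it lets the receiver
# confirm the message or detect that an error has crept in.
sent = [3, 1, 4, 1, 5]
sent_sum = sum(sent)              # transmitted alongside the message

received = [3, 1, 4, 7, 5]        # noise has altered one number
if sum(received) != sent_sum:
    print("Error detected: received sum does not match the transmitted sum.")
```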
For more information see John Daintith and John Clark’s The Facts on File Dictionary of Mathematics (New York: Market Book House, 1999), 97.