August 31st, 2014
Whenever we are considering two competing hypotheses, an observation counts as evidence in favor of the hypothesis under which the observation has the higher probability.
The Likelihood Principle of confirmation theory states the following. Let h1 and h2 be two competing hypotheses (in this case the existence of X and ~X, where X is a first cause, fine-tuner, a particle, etc.). According to the Likelihood Principle, an observation e counts as evidence in favor of hypothesis h1 over h2 if the observation is more probable under h1 than under h2. Thus, e counts in favor of h1 over h2 if P(e|h1) > P(e|h2), where P(e|h1) and P(e|h2) denote the conditional probability of e on h1 and h2, respectively. The degree to which the evidence counts in favor of one hypothesis over another is proportional to the degree to which e is more probable under h1 than h2: specifically, it is proportional to the ratio P(e|h1)/P(e|h2). The Likelihood Principle appears to be sound under all interpretations of probability; the form used here is concerned with epistemic probability.
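As a rough illustration of the ratio P(e|h1)/P(e|h2), here is a minimal Python sketch; the probabilities used are hypothetical placeholders, not values drawn from any particular argument.

```python
# A minimal sketch of the Likelihood Principle with hypothetical numbers.

def likelihood_ratio(p_e_given_h1: float, p_e_given_h2: float) -> float:
    """Return P(e|h1) / P(e|h2), the degree to which e favors h1 over h2."""
    return p_e_given_h1 / p_e_given_h2

# Suppose (purely for illustration) an observation e has probability 0.8 under h1
# and 0.1 under h2.
ratio = likelihood_ratio(0.8, 0.1)
print(f"P(e|h1)/P(e|h2) = {ratio:.1f}")  # 8.0: e counts as evidence for h1 over h2
```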
read more »
November 17th, 2013
The odds against an event are the ratio of the probability that the event will fail to occur (failure) to the probability that the event will occur (success).
Most of the time you hear about “odds,” people are referring to the odds against.
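As a small illustration, here is a minimal Python sketch assuming a hypothetical probability of success of 0.2:

```python
# A sketch of "odds against," using an illustrative probability of success.

p_success = 0.2            # hypothetical probability that the event occurs
p_failure = 1 - p_success  # probability that the event fails to occur

# Odds against = P(failure) : P(success)
odds_against = p_failure / p_success
print(f"Odds against the event: {odds_against:.0f} to 1")  # 4 to 1
```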
read more »
November 15th, 2013
Below is a brief introduction to theoretical probability. Probability is used very frequently in philosophy and science, so I would encourage further research and greater familiarity with it beyond this brief introduction.
Definition: If each outcome of an experiment has the same chance of occurring as any other outcome, they are said to be equally likely outcomes.
Let E be an event having equally likely outcomes; then the probability of E may be calculated with the following formula: p(E) = n(E)/n(S), where n(E) is the number of outcomes in E and n(S) is the number of outcomes in the sample space S (a short worked sketch follows the list below).
- The probability of an event that cannot occur (impossible event) is 0. p(ø) = 0
- The probability of an event that must occur is 1. p(S) = 1
- Every probability is a number between 0 and 1, inclusive. 0 ≤ p(E) ≤ 1
- The sum of the probabilities of all possible outcomes of an experiment is 1.
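Here is a minimal Python sketch of the formula and the properties above, using a fair six-sided die as a hypothetical example:

```python
# A minimal sketch of the equally-likely-outcomes formula p(E) = n(E)/n(S),
# illustrated with a fair six-sided die.

sample_space = {1, 2, 3, 4, 5, 6}   # S: all equally likely outcomes
event_even = {2, 4, 6}              # E: rolling an even number

p_event = len(event_even) / len(sample_space)
print(f"p(even) = {p_event}")       # 0.5

# The properties listed above hold for this assignment:
assert len(set()) / len(sample_space) == 0          # p(ø) = 0, the impossible event
assert len(sample_space) / len(sample_space) == 1   # p(S) = 1, the certain event
assert 0 <= p_event <= 1                            # every probability lies in [0, 1]
total = sum(1 / len(sample_space) for _ in sample_space)
assert abs(total - 1) < 1e-9                        # probabilities of all outcomes sum to 1
```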
read more »
August 27th, 2012
In our experience, intentions get actualized in any number of ways: a sculptor by chiseling stone, a musician by writing notes, an engineer by drawing up blueprints. In general, all actualizations of intentions can be realized in language. A sufficiently precise set of instructions in a natural language can tell the sculptor how to form the statue, the musician how to record the notes, and the engineer how to draw up the blueprints.
Why should an act of speech be God’s mode of creation? Language is the universal medium for actualizing intentions. The language that proceeds from God’s mouth in the act of creation is the divine Logos (Jn. 1.1-5). In the act of creation God the Father speaks the divine Logos in the power of the Holy Spirit. The divine Logos is not just language in the ordinary sense (utterances that convey information), but the very ground and possibility of language. Words need power to accomplish their end and God’s Word has that power (Is. 55.11).
Given that we are made in God’s image, the Trinitarian structure of creation is reflected in human speech.
“The word [goes] out of the mouth of God in such a manner that it likewise ‘[goes] out of the mouth’ of men; for God does not speak openly from heaven, but employs men as his instruments, that by their agency he may make known his will.”
read more »
June 26th, 2012
Whenever probability is being considered there must be some relevant or total background information (usually denoted k). The immediate objection to applying a probability rule or calculus to the fine-tuning of the universe in a multiverse scenario is that this universe is not an appropriate random sample. In other words, if we know of only one universe with these values, the sample size is precisely 1; thus, no random sample can be used to assess the probability of certain values of physics in the argument. In statistics, a random sample must have the same chance of being drawn as every other possible sample. Since we know of only one universe, we do not know what the range of values for the constants and laws of physics could be; and since we do not know how narrow or broad these ranges could be, there is no way of drawing out any probability-based argument for fine-tuning. However, we can know what other universes would be like if the values were different. If the counterfactuals of our natural laws are in no way incoherent, then this is an appropriate sampling. Moreover, one cannot both make this objection and advocate that we just so happen to live in a life-permitting universe within the multiverse, since the claim that we live in a life-permitting universe among countless others presupposes that we can know what the other samples would be like.
read more »
May 9th, 2012
Word of the Week: Mathematical Invariance
Definition: In Einstein’s use of the word, mathematical invariance established a genuine ontology in which the subject comes to grips with the objective structures and intrinsic intelligibility of the universe.
More about the word: Throughout Einstein’s work, the mechanistic universe proved unsatisfactory. This was made evident after the discovery of the electromagnetic field and the failure of Newtonian physics to account for it in mechanistic concepts. Then came the discovery of four-dimensional geometry and with it the realization that the geometrical structures of Newtonian physics could not be detached from changes in space and time with which field theory operated.
read more »
April 18th, 2012
Information theory is the branch of probability theory that deals with uncertainty, accuracy, and information content in the transmission of messages. It can be applied to any system of communication (electric signals, fiber-optic pulses, speech, etc.). Random signals, known as noise, are often added to a message during transmission, altering the signal received from the one sent. Information theory is used to work out the probability that a particular signal received is the same as the signal sent. In transmitting a sequence of numbers, their sum might also be transmitted so that the receiver knows there is an error when the sum does not correspond to the rest of the message; the sum itself gives no extra information, only confirmation. The statistics of choosing a message out of all possible messages (letters of an alphabet or binary digits, for example) determine the amount of information contained in it. Information is measured in bits (binary digits). If one of two possible signals is sent, the information content is one bit. A choice of one out of four possible signals contains more information (two bits), even though the transmitted signal itself might be the same.
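For illustration only, here is a minimal Python sketch of the two ideas above; the message values and signal counts are invented examples, not drawn from the cited dictionary.

```python
import math

# 1. Information content of choosing one message out of N equally likely messages.
#    One of two signals carries 1 bit; one of four carries 2 bits.
for n in (2, 4):
    print(f"1 of {n} equally likely signals -> {math.log2(n):.0f} bit(s)")

# 2. A transmitted sum as a simple error check (the sum adds confirmation, not information).
message = [3, 1, 4, 1, 5]          # hypothetical sequence of numbers
transmitted_sum = sum(message)      # sent along with the message

received = [3, 1, 4, 7, 5]          # suppose noise corrupted one number in transit
if sum(received) != transmitted_sum:
    print("Error detected: received sum does not match the transmitted sum")
```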
For more information see John Daintith and John Clark’s The Facts on File Dictionary of Mathematics (New York: Market Book House, 1999), 97.