Friday, September 30, 2005
Paleocurrents and the Flood
For people wanting to do more research on the subject, he said that the only text that really deals with this in depth is Paleocurrents and Basin Analysis.
Wednesday, September 28, 2005
Two From Dembski
- A Primer on Probability for Design Inferences
- Design by Elimination vs. Design by Comparison (from The Design Revolution)
Tuesday, September 27, 2005
Additional Mechanisms of Inheritance
The main points are as follows:
- the more we study cells, the more complex their biochemistry becomes
- while genes are inherited, the cells themselves have control over genetic expression through epigenetics
- this implies that DNA is more-or-less a "library" of information to be used by cellular processes as needed
- the cell, and not the genes, controls inheritance
- cell stability in the face of large genomic change is readily viewable in the bacterial world
- transplanted membranes can be inherited even when transplanted to cells with different DNA
He points out:
For example, a study of the bacterium Escherichia coli over 10,000 generations found that at the end, 'almost every individual had a different genetic fingerprint', yet they were still Escherichia coli. Only if the cell is in control can we explain these observations.
He then explains the five levels of Gitt information theory: statistics, syntax, semantics, pragmatics, and apobetics. Though I have not read Gitt's book, it looks like Williams gives semantics far too small a role. Within computer science, semantics is much more all-encompassing than Williams suggests. What Williams says is true, but there is much more to be said.
He then concludes with Barbieri's semantic view of biology (also available in dead tree format). I have not heard of this work before, but since it is online I hope to read it shortly. Williams sums up Barbieri as saying that:
- The cell is fundamentally an epigenetic, rather than a genetic, system (the cell is in charge, not the genes)
- Genes provide genetic memories for the cell, but there are other memories, some of which are waiting to be discovered, which participate in cell life, including many aspects of embryological development
- Each memory has an established semantic code
I agreed with Williams overall in principle, but I thought there were some things he overlooked.
First of all, he seemed to be trying to reconcile the changing nature of the genome with the semi-static nature of life, and therefore proposed that the genome is completely changeable while the structure of the cell provides the stasis. This may in fact be true, but it is beside the point. Codal systems are codal systems, whether they exist within the genome or within the structure of the cell itself. I don't doubt that we will find parts of cellular structure that change, too.
From the point of view of a computer programmer, all information is essentially on an equal footing. In order for a program to work, you must have a base system, which can then be fitted with attachments. The base system, however it is coded, must change very little in order for the rest of the system to change in an orderly and consistent manner. In computer programs, both the changing and unchanging parts are stored on the same system (i.e., program and data). Therefore, saying "the unchanging part is here and the changing part is there" is rather beside the point and will probably be disproved in short order. The fact is simply that the unchanging part is more important than the changing part, and is in fact precisely what allows the changing parts to change in an orderly fashion.
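To make the programming analogy concrete, here is a minimal sketch (all names hypothetical, my own illustration rather than anything from Williams) of a system whose unchanging core is precisely what lets its changeable parts vary in an orderly way:

```python
# A stable "base system" whose fixed invocation protocol is what makes
# orderly change in the variable parts possible. All names are invented.

class Core:
    """The unchanging base system: it defines how plugins are invoked."""
    def __init__(self):
        self.plugins = {}  # the changeable part, stored alongside the core

    def register(self, name, fn):
        self.plugins[name] = fn

    def run(self, name, data):
        # The invocation protocol never changes; only the plugins do.
        return self.plugins[name](data)

core = Core()
core.register("double", lambda xs: [2 * x for x in xs])
core.register("total", sum)
print(core.run("double", [1, 2, 3]))  # [2, 4, 6]
print(core.run("total", [1, 2, 3]))   # 6
```

The plugins can be swapped freely, but only because the registration and invocation machinery stays fixed -- which is the point about the unchanging part enabling orderly change.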
Along similar lines, there are many aspects of program semantics that go beyond a mere mapping of nucleotide sets to amino acids. A good look at type theory in computer science shows a beginning of the semantic information that must be accounted for in any codal system.
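As a toy illustration of that point (my own example, not from Williams or Gitt): the very same syntactic payload carries entirely different meanings depending on the type at which it is read, which is why a codal system needs more than a symbol-to-symbol mapping:

```python
import struct

# The same four bytes mean different things at different "types".
payload = b"\x00\x00\x80\x3f"

as_int = struct.unpack("<i", payload)[0]    # read as little-endian int32
as_float = struct.unpack("<f", payload)[0]  # read as little-endian float32

print(as_int)    # 1065353216
print(as_float)  # 1.0
```

The byte string (the syntax) is identical in both cases; the meaning comes entirely from the interpretive machinery, just as a nucleotide sequence means nothing apart from the cellular machinery that reads it.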
However, I do think that his main point -- that we are only scratching the surface of inheritance and the control of cellular structures -- is quite correct.
There was also a note in there about an article discussing drastic chromosomal structure changes in rock wallabies (TJ 17(1):19-21), which should be of interest.
Friday, September 23, 2005
Alternative Splicing
Anyway, while this is secular research (I assume), it does point out how fearfully and wonderfully made we are! This is probably biochemically similar to what computer scientists would call higher-order functions.
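To illustrate the analogy (a toy model of my own, not anything from the research itself): alternative splicing lets a single gene yield several products depending on which exons are selected, much as a higher-order function yields different functions depending on its arguments:

```python
# Toy model of alternative splicing as a higher-order function:
# one "gene" (exon list), several "proteins", depending on the pattern.

def make_splicer(exon_indices):
    """Higher-order function: returns a splicing function for one pattern."""
    def splice(exons):
        return "-".join(exons[i] for i in exon_indices)
    return splice

gene = ["EX1", "EX2", "EX3", "EX4"]

isoform_a = make_splicer([0, 1, 3])  # pattern that skips exon 3
isoform_b = make_splicer([0, 2, 3])  # pattern that skips exon 2

print(isoform_a(gene))  # EX1-EX2-EX4
print(isoform_b(gene))  # EX1-EX3-EX4
```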
More info on alternative splicing can be found here.
BACKDATED: Slow news day
So, in the meantime, these are some articles that are more directly anti-Darwin than I usually like to post, but are interesting for the fact that they were published in secular journals by ID'ers or creationists.
- Enzymic editing mechanisms and the origin of biological information transfer - Lambert, a creationist, showed that without complex editing enzymes, the transcription error rate in DNA would cause error catastrophe (see the quick arithmetic after this list). However, the editing enzymes are themselves coded for by DNA. Other papers by Lambert (mostly biochemical research) can be found here.
- Simulating evolution by gene duplication of protein features that require multiple amino acid residues - Behe calculates the mathematical probability of evolving new proteins by gene duplication when the new feature requires more than one amino acid change. Other papers by Behe (mostly biochemical research) can be found here.
- The Origin of Biological Information and the Higher Taxonomic Categories - a review article discussing the relative merits of various hypotheses upon the origin of the major animal taxonomic groupings. This was pulled after-the-fact from the journal for political reasons (you can't have an openly ID paper in a secular biology journal!) but a review of the peer-review file by others reveals that it did indeed pass peer-review.
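On the first item, a quick back-of-the-envelope calculation shows why editing matters (rough textbook figures of my own choosing, not numbers from Lambert's paper):

```python
# Expected copying errors per generation in E. coli, with and without
# editing enzymes. Rates are rough textbook approximations, not Lambert's.

genome_size = 4.6e6       # base pairs in the E. coli genome

raw_error_rate = 1e-5     # per base, polymerase alone (approximate)
edited_error_rate = 1e-9  # per base, with proofreading and mismatch
                          # repair (approximate)

print(genome_size * raw_error_rate)     # ~46 errors per genome copy
print(genome_size * edited_error_rate)  # ~0.005 errors per genome copy
```

Dozens of new errors every generation accumulate without bound; a rate well below one error per copy does not. That gap is what the editing enzymes buy.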
Tuesday, September 20, 2005
Research Mode
Sunday, September 18, 2005
Rim Gravels, Paleohydraulic Analysis, and the Flood/Post-flood boundary
Deposits Remaining from the Genesis Flood: Rim Gravels in Arizona
Here they examine the gravels in Arizona and determine that the flood/post-flood boundary falls shortly after the deposition of the gravels, in the "late Cenozoic" portion of the standard geologic column for that area. They conclude that the material for the gravels was eroded from the south and west. The gravels include large, rounded quartzite rocks bearing many percussion marks; the roundness of the stones indicates large volumes of water, while the percussion marks indicate very energetic transport.
They subjected the rocks to paleohydraulic analysis, using conservative numbers, and came up with the following results:
- Estimated minimum depths range from 3.3 meters (11 feet) to 9.9 meters (32.5 feet). Actual depths may have been greater.
- Estimated minimum current speeds range from 11.5 m/s (26 mph) to 21.5 m/s (48 mph). Actual peak current speeds may have been greater. These are far in excess of the recommended maximum allowable current speed for channels excavated in hard rock, which is 3 to 4.5 m/s (6.7 to 10 mph) (Julien, 1995), indicating that very rapid erosion would have taken place. Peak current speeds in excess of 30 m/s (67 mph) may result in cavitation and extremely rapid destruction of rock masses (Holroyd, 1990a,b).
- Estimated discharge per meter width ranges from 38 to 198 m³/s per meter width (410 to 2,130 ft³/s per foot width). Actual peak unit discharge may have been greater. The estimated unit flows exceed historic peak flood unit flows for the Colorado River at Bright Angel. Unit discharge estimates indicate a very different environment of deposition for the Rim Gravels from current environments.
- Paleocurrents were supercritical (Fr > 1.0). To reduce the Froude number to 1.0 (critical flow) would require a flow depth of 4.6 kilometers (2.86 miles)! Flow, therefore, was almost certainly rapid, not tranquil. (A quick sanity check of these Froude numbers appears below.)
- Estimated minimum Reynolds numbers are near the boundary between laminar and transitional flow. If actual peak depths and current speeds exceed the minimums estimated here, Reynolds numbers would have been higher, and flow would have been turbulent.
Minimum paleocurrents would have been very energetic, capable of eroding hard rock, planing off obstructions, rounding clasts, and transporting large amounts of sediment.
It's a very interesting article, and very well displays how creationists view geologic features of the earth.
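As a rough sanity check on the supercritical claim (my own arithmetic, assuming the standard open-channel formula Fr = v/√(g·d), and pairing the low estimates together and the high estimates together, which may not match the article's actual pairings), the reported minimum depths and speeds do give Froude numbers above 1:

```python
import math

g = 9.81  # gravitational acceleration, m/s^2

def froude(v, d):
    """Open-channel Froude number: Fr = v / sqrt(g * d)."""
    return v / math.sqrt(g * d)

# Reported minimum speeds and depths from the article:
print(round(froude(11.5, 3.3), 2))  # ~2.02, supercritical
print(round(froude(21.5, 9.9), 2))  # ~2.18, supercritical
```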
BACKDATED: The Flood and the La Brea Tar Pits
La Brea Tar Pits: Evidence of a Catastrophic Flood
The article includes an interesting but poorly-documented experiment with insects in water. Apparently, an insect that dies in water can become disarticulated just from the water itself, even if the water is not moving.
The article includes information about the order of the fossils and their combination, and possible models for how they came to be buried as they were.
Friday, September 16, 2005
BACKDATED: Helium Zircons and Radioisotope Dating
The RATE group's purpose is to investigate the phenomena presented by radioisotopes in rock. Their goal was to see whether other, equally stable measurements using wholly different phenomena would give similar or differing dates.
When uranium decays within zircon crystals, it leaves behind helium. Helium, being a noble gas, is more-or-less free to move within the crystalline structure. The RATE group determined that the rate of helium diffusion within a zircon crystal should be very reliable, and they performed experiments to determine precisely the rate at which helium diffuses through the crystal. Also, because of the hardness of the zircon crystal, this rate would be negligibly impacted by the pressures exerted by the Earth's crust.
What they discovered was that the crystals indeed had quite a bit of helium left, which limited the age of the rocks to 4,000 to 14,000 years. The RATE group believes this is evidence that some radioisotope decay rates have changed over the years. What is certain is that using radically different, but equally reliable, dating methods yields radically different answers. This indicates that the assumptions underlying one or both dating methods are wrong.
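As a heavily hedged illustration of the kind of reasoning involved (a standard diffusion-theory approximation, not the RATE group's actual model, which is considerably more sophisticated): for helium diffusing out of a sphere of radius a with diffusivity D, the slowest-decaying mode has a characteristic timescale τ = a²/(π²D), so a measured diffusivity plus the amount of helium still retained puts bounds on the elapsed time.

```python
import math

def diffusion_timescale_years(radius_m, diffusivity_m2_s):
    """Fundamental-mode timescale for diffusion out of a sphere:
    tau = a^2 / (pi^2 * D). Textbook diffusion theory, not RATE's model."""
    tau_seconds = radius_m ** 2 / (math.pi ** 2 * diffusivity_m2_s)
    return tau_seconds / (365.25 * 24 * 3600)

# Placeholder inputs, chosen only for illustration:
a = 30e-6   # a 30-micron effective radius, a typical zircon scale
D = 1e-21   # an arbitrary assumed diffusivity, in m^2/s

print(f"{diffusion_timescale_years(a, D):,.0f} years")  # ~2,890 years here
```

The actual numbers depend entirely on the measured diffusivities and helium retentions, which is exactly what the RATE experiments set out to pin down.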
What was learned here? For sure, we learned two things:
1) Helium diffusion rates through crystal lattices are very regular
2) What the diffusion rates of helium through zircon crystals actually are
The preliminary conjectures about this data by the researching team are these:
1) The age of rocks given by these diffusion rates
2) That radioisotope decay rates have changed through history
3) That possibly radioisotope decay rate changes are the act of God in Creation and judgment
Personally, I wholeheartedly agree with 1 & 2, but take 3 as merely a possibility.
Anyway, here is the research paper which presents the results:
Helium diffusion rates support accelerated nuclear decay
Followed up by:
Helium Diffusion Age of 6,000 Years Supports Accelerated Nuclear Decay
This concept was mentioned in the first RATE book, but had not been confirmed by experiment until recently (I have not read this book myself).
This idea is not without its critics. Criticisms lodged against it include:
Young-Earth Creationist Helium Diffusion "Dates"
Talk.Origins Claim CD015
What about Humphreys excess helium arguments
Humphreys and others have responded to these criticisms, and you can read their responses here:
Helium Evidence for A Young World Remains Crystal-Clear
Russ Humphreys refutes Joe Meert’s false claims about helium diffusion
NWCreation's Response to CD015
Wednesday, September 14, 2005
BACKDATED: Sedimentation, Stratification, and Moving Water
Berthault even has his own website concerning problems and answers to modern stratigraphy. This links to additional papers that I have not had time to review (not that I reviewed his others in any great depth).
Berthault does more than criticize, as his experiments show. He has even developed new principles of stratigraphy, which he calls Paleohydraulic Analysis.
BACKDATED: The pre-flood/flood/post-flood boundaries and Creationist stratigraphy
Secular geology usually employs a time-equivalence view of rocks. This means that, generally, a given amount of time will produce a given amount of rock. Now, no geologist thinks this is exactly true, but that is the basic understanding of how rocks form: a large formation was built over a long time and a small formation over a short time. I do understand that this is oversimplifying, but again, this blog is not meant to debate which model is correct, just to help people understand what the creationist model is.
In the creationist model, rocks represent energy-events. Thus, a small amount of rock in a local area means a small amount of energy, while vast amounts of rock over vast areas mean large amounts of energy. In creationism, the two largest energy events were Creation and the Flood. Therefore, the majority of the rocks are going to come from these two events. Likewise, the rocks which exhibit the greatest discontinuity from the rocks below, combined with the greatest size and breadth, are likely rocks from the flood. Rocks towards the top of the column which are local in extent and show only minor discontinuities from the rocks below them are more likely to be post-flood rocks.
Anyway, the most important idea is that in creationism rocks are energy-equivalent rather than time-equivalent.
Anyway, all of that to introduce two articles about these boundaries:
- A Note on the Pre-Flood/Flood Boundary in the Grand Canyon
- Assessing Creationist Stratigraphy with Evidence from the Gulf of Mexico
The latter describes more fully how Creationists view geology. The former is also worthwhile, with an especially interesting discussion of created systems and whether or not they were created with the appearance of age.
If you want a good primer on Creationist stratigraphy, see Tas Walker's Biblical Geology site, especially his Geological Model page.
Tuesday, September 13, 2005
BACKDATED: Baraminologist Taxonomy
Baraminology is described very well in the book Understanding the Pattern of Life (link is to my book review of it).
You can also see The Refined Baramin Concept for information on baraminology.
The Hybridization Database
One of the primary tasks for baraminologists is determining what the original created kinds were. Historically, the gold standard for determining created kinds has been hybridization, or cross-breeding. This has been practiced since at least Linnaeus. In order to pursue research along these lines, the Bryan College Center for Origins Research has been working on putting together a hybridization database of all known hybridization studies.
BDIST Software
Recently, several statistical methods have been developed for doing taxic analysis, which is especially useful when hybridization is impractical or impossible. Hybridization is an inclusive, not an exclusive, criterion: successful hybridization places two species in the same baramin, but failure to hybridize does not necessarily place them in different baramins. For example, if a husband and wife can't have kids, it does not mean that they cease to be human! There are certain animals that have diverged in ways that make breeding difficult (though sometimes it can be forced) or prevent it altogether. Therefore, statistical evaluation of character traits becomes important. Note, however, that all statistical methods rely on the choice of traits for examination.
BDIST, which stands for "baraminic distance", is the first of these. BDIST takes two species and charts their character traits -- number of fingers, number of toes, average size, etc. -- on a graph, using one species for the X axis and the other for the Y axis. The correlation between the species is then determined from the "best fit" line generated through those points. The nearer the line's slope is to 1, the more likely the two species are members of the same baramin.
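Here is a minimal sketch of that idea as I read it (invented trait values, and a simplification of the actual published method):

```python
# Sketch of the trait-vs-trait best-fit idea described above.
# Trait values are invented; the real BDIST method is more involved.
import numpy as np

species_x = np.array([5.0, 5.0, 1.2, 34.0, 2.1])  # traits of species X
species_y = np.array([5.0, 4.0, 1.1, 30.0, 2.3])  # same traits, species Y

# Least-squares best-fit line through the trait-vs-trait plot.
slope, intercept = np.polyfit(species_x, species_y, 1)
r = np.corrcoef(species_x, species_y)[0, 1]

print(f"slope = {slope:.2f}, r = {r:.2f}")  # slope = 0.88, r = 1.00 here
# A slope near 1 with a tight fit suggests baraminic closeness.
```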
Here is a description and download for the software.
A Quantitative Approach to Baraminology With Examples from the Catarrhine Primates is the original paper describing the method.
ANOPA
ANOPA stands for "analysis of patterns", and it is used to view n-dimensional patterns in 1-, 2-, or 3-dimensional space. Each character trait is considered a dimension. So, if you are studying 20 character traits across a number of species, your basic plot would have 20 dimensions. However, we can't really visualize more than 2 or 3 dimensions, so ANOPA projects the multidimensional pattern into 1, 2, or 3 dimensions. The projection is done as follows:
- First determine the center point of the distribution (called the "centroid")
- Next, choose an "outlier" -- a species that is plotted but is assumed from the start not to be a member of the group. The patterns that are developed are in relation to the line between the centroid and the outlier.
- For each taxon, calculate:
- its distance along the line (the line between the centroid and the outlier),
- its distance _from_ the line,
- its rotation about the line, using the origin as a point of reference
- Convert these cylindrical coordinates to Euclidean coordinates
The three values calculated for each taxon are its cylindrical coordinates with respect to the centroid-outlier line; converting them yields an ordinary low-dimensional plot of the n-dimensional pattern. A rough sketch of the projection follows.
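This is my reading of the procedure, sketched with invented data (the published method has more to it, and I omit the rotation angle, which requires fixing a reference direction):

```python
# Sketch of an ANOPA-style projection: coordinates of each taxon
# relative to the centroid-outlier line. Data are invented.
import numpy as np

def anopa_project(taxa, outlier):
    """Project n-dimensional taxa onto (distance along, distance from)
    the line running from the data centroid to a chosen outlier."""
    centroid = taxa.mean(axis=0)
    axis = outlier - centroid
    axis_unit = axis / np.linalg.norm(axis)

    coords = []
    for t in taxa:
        v = t - centroid
        along = v @ axis_unit         # distance along the axis
        perp = v - along * axis_unit  # component perpendicular to the axis
        coords.append((along, np.linalg.norm(perp)))
    return np.array(coords)

rng = np.random.default_rng(0)
taxa = rng.normal(size=(10, 20))        # 10 taxa, 20 character traits
outlier = rng.normal(loc=5.0, size=20)  # an assumed non-member taxon
print(anopa_project(taxa, outlier))
```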
I am personally working on an ANOPA calculator/viewer, though I haven't had time to work on it recently.
The original paper describing ANOPA: Analysis of Morphological Groupings Using ANOPA, a Pattern Recognition and Multivariate Statistical Method: A Case Study Involving Centrarchid Fishes.
Its use in determining whether taxonomic groups are members of the same created kind is discussed here:
An Evaluation of Lineages and Trajectories as Baraminological Membership Criteria
I haven't read that one yet.
Classic Multidimensional Scaling
I haven't had time to look into this one yet. Sorry :( Anyway, here is the paper describing it:
Visualizing Baraminic Distances Using Classical Multidimensional Scaling
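I can at least say what classical multidimensional scaling does in general (standard statistics, nothing specific to this paper): given a matrix of pairwise distances, here baraminic distances, it recovers low-dimensional coordinates that approximately reproduce them:

```python
# Classical (Torgerson) MDS from a pairwise distance matrix.
# Standard technique; the distance values below are invented.
import numpy as np

def classical_mds(D, k=2):
    """Recover k-dimensional coordinates from a distance matrix D."""
    n = D.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n  # centering matrix
    B = -0.5 * J @ (D ** 2) @ J          # double-centered Gram matrix
    eigvals, eigvecs = np.linalg.eigh(B)
    idx = np.argsort(eigvals)[::-1][:k]  # top-k eigenpairs
    L = np.sqrt(np.maximum(eigvals[idx], 0))
    return eigvecs[:, idx] * L

# Toy distances among four taxa: two tight pairs, far apart.
D = np.array([[0.0, 1.0, 4.0, 4.2],
              [1.0, 0.0, 4.1, 4.0],
              [4.0, 4.1, 0.0, 1.1],
              [4.2, 4.0, 1.1, 0.0]])
print(classical_mds(D))  # coordinates mirroring the two clusters
```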
Monday, September 12, 2005
Donations Accepted
- [UPDATE: this has been donated!] A subscription to AiG's TJ
- [UPDATE: this has been donated!] A subscription to CRSQ
- CORE issues in creation
- Any other creation research magazine you can think of
I also could use some books. If you author a book on creation and/or evolution and want me to review it here or on my crevo or crevobits blogs, send a copy to:
Jonathan Bartlett
4208 W San Antonio
Broken Arrow, OK 74012
If you send me a book on creation and evolution, I will read it and review it in one of my blogs. You can even send me evolutionist books! If I get overwhelmed with books (one can only hope), I'll let you know. It only takes me about a week to get through a book, but I've got a pretty large backlog.
Anyway, sorry to be mooching in the early stages of the blog, but this could really help me get the word out about creationist activities. Part of the problem creationism has had is that, from the outside, it looks too much like a movement of inactive, back-seat-driver whiners. Publicizing the active research projects being taken on by Creationists, and letting creationists know about them, will help people see that creationists have positive, insightful, and profound ideas that can alter science for the better!
Dembski's Mathematical Foundations of Intelligent Design
- Information as a Measure of Variation
- Uniform Probability
- Searching Large Spaces
- Specification: The Pattern That Signifies Intelligence
The only one I've had time to read is the last one, and that only in a cursory manner. However, "Searching Large Spaces" seems very interesting, though I should probably pick up a copy of No Free Lunch first.
A Grander View of Life -- the 4th BSG Conference
Interesting stuff contained within:
- Using systems theory and pattern design to study biological entities and their limits of variability
- Studying stress-adapted changes in bacterial genomes
- Studying the relationship between taxonomy and the ease of taxic identification to man (since God asked man to name the animals)
- The role of polyploidy in diversification
- Several taxic studies (including snakes and whales)
- The correlation between gene interaction networks and their conservation in species where common descent actually holds.
- Lots, lots more.
Unfortunately, they didn't include full papers. Hopefully those presenting at the conference will make their full research available online. If anyone knows where these might be found, please drop me a line.
BACKDATED: A New Way of Thinking about the Genome
Anyway, this is a whole collection of articles to give you new ideas of how to think about the genome. Not all of these are by creationists or even ID'ers, but they do give a new view of how the genome works which is radically different from the more-or-less static genome normally considered. This can be called designed variability, or natural genetic engineering, and is basically the ability for the genome to create its own variability and even engineer new enzymes under environmental stress.
Modular, self-changeable systems are a clear indicator of design. Anyway, here's the stuff:
Shapiro's work on Genome Engineering (all these articles are excellent, even if we disagree in major ways with many of his ideas)
Genetic Variability by Design (especially check out the references -- they are all excellent). Also see this overview and discussion of the significance of it.
The AGE-ing Process This is a creationist explanation for how genomes reorganize themselves. It contains some excellent ideas. However, this view is probably too limited, as it is still more genome-centric than the real answer probably is.
What we are finding out about the genome is simply amazing. Expect that in the coming years we will continually discover new wonders in how the cell works and how it interacts with the environment. The design elements that keep popping up in the cell are nothing short of amazing.
The terms of importance are recombination (recombining parts of chromosomes in sexual reproduction), natural genetic engineering (the ability for a cell to produce new genes based on need), and genomic modularity (the ability for a genome to have modular parts which can be switched on and off, duplicated, or reconfigured on the fly in times of environmental stress).
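To make the last term concrete, here is a toy model of my own (far simpler than anything in the articles above): a genome with modules that can be switched on or duplicated in response to stress:

```python
# Toy model of genomic modularity: modules toggled or duplicated under
# stress. Entirely illustrative; real mechanisms are vastly more intricate.

genome = {"metabolism": "active", "heat_shock": "dormant", "repair": "active"}

def respond_to_stress(genome, stress):
    """Return a reconfigured copy of the genome for a given stress."""
    g = dict(genome)
    if stress == "heat":
        g["heat_shock"] = "active"              # switch a dormant module on
    elif stress == "starvation":
        g["metabolism_copy"] = g["metabolism"]  # duplicate a module
    return g

print(respond_to_stress(genome, "heat"))
```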