In the book Crystalline Intelligence we find that the Human Genome, with its 3 Billion Base Pairs and nearly 30,000 Genes, has perhaps 10^4,515,450 combinations with uniquely differing expressions. This number is so incomprehensibly large that the supposed 50 Billion species that have existed upon Earth can fairly be called zero percent of the possible species expressions.
Since the 2nd Edition of "Seeds of Symmetry" was written, many books have been distributed, many dialogues have been had, and development of another book in the Crystalline Intelligence® Series has begun. While "Seeds of Symmetry" was written as more of a primer showing that Darwinian Natural Selection is a flawed hypothesis for the development of Life, this new book will be highly focused upon discovering Life's foundational mechanisms.
-CHATS WITH ChatGPT-
The following pages of this site show chats with ChatGPT AI, some of which will be used in the new book. It is interesting that ChatGPT, at the beginning of a chat, seems to protect the biases of its creators, the programmers. Views counter to straight-up Darwinism came back only reluctantly after the initial queries. However, as the user demonstrated further knowledge of the material, the algorithm seemed to become malleable, and when its search and interpretation of the available databases could no longer yield a rigorous defense, we reached a consensus and (humorously) it even thanked me for the discussion (in CHAT4). In any case, ChatGPT AI did seem to be relatively objective once the discussion had developed, and lacked the mindless resistance to discussion of the typical Darwinian Atheist. Note that I set the "Temperature" of the Algorithm to "Minimum" in CHAT2 and "Maximum" in CHAT3. This "Temperature" gives the software an analog of a cool pot of water versus a boiling one. The "Minimum Temperature" setting produces a simpler and more repeatable answer. The "Maximum Temperature" setting mathematically agitates the Algorithm, causing it to jump out of a local minimum solution and perhaps provide an answer with broader insight. This effect can be seen in CHAT3 versus CHAT2. [Note that the Chats open in new Tabs]
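For readers who like to see the mechanics, the "Temperature" idea can be sketched in a few lines. The toy Python function below is purely illustrative (the names are mine, and this is not OpenAI's actual implementation): it shows how dividing a model's raw scores by a temperature reshapes the resulting choice probabilities. A low temperature makes the top choice near-certain and repeatable; a high temperature flattens the choices toward agitation and variety.

```python
import math

def softmax_with_temperature(scores, temperature):
    """Convert raw model scores into choice probabilities.

    Low temperature sharpens the distribution (repeatable picks);
    high temperature flattens it (varied, exploratory picks).
    """
    scaled = [s / temperature for s in scores]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    return [e / total for e in exps]

scores = [2.0, 1.0, 0.5]                        # hypothetical raw scores
cool = softmax_with_temperature(scores, 0.1)    # "cool pot": near-deterministic
hot = softmax_with_temperature(scores, 10.0)    # "boiling pot": near-uniform
```

With the cool setting the first option dominates almost completely; with the hot setting all three options become nearly equally likely, which is the software analog of thermal agitation described above.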
Note that in all of these Chats, ChatGPT concluded, after evaluating all searchable data available to it, that Natural Selection born of testing random mutations could not have developed Life as we know it. The new book will expand upon this view with a narrower focus (compared to the broader scientific scope of "Seeds of Symmetry").
-BONUS - AI DEMYSTIFIED - A LITTLE-
Before closing today's most current version of this page, a few more words on AI may be valuable to some readers. The term "Artificial Intelligence" really rather aggrandizes the technology, which in fact builds upon mathematical algorithms that have been around for some time. The main elements that have more recently allowed the illusion of intelligence to grow are (a) computing speed and distributed processing, and (b) database sizes and accessibility. Combined, these elements have indeed allowed computing to meld near-instantaneous, repeatable information recall and manipulation into what appears as Intelligence. So, if elements (a) and (b) above are the main drivers of recent AI developments, what are the "engines" that actually use that speed and data to do what appears to be something new and revolutionary? Well, let's take one old mathematical workhorse, the Steepest Descent Algorithm, as an example of such an engine.
The Steepest Descent Algorithm is a general class of methods that seek to minimize the apparent errors between (c) an empirical dataset, and (d) a set of equations that attempt to model the system from which the data were acquired. As the algorithm is initiated in the computer, the data and the equations are (normally) at maximum error from one another. Then, like an alpine skier wanting to reach the bottom of a mountain in minimum time, with every computation the algorithm chooses its next model adjustment to give the greatest reduction in error between data and model. Thus it tries to approach zero error with maximum speed, and if it succeeds, endless acquisitions of data are no longer necessary, because the mathematical model can now be used to compute any point outside the dataset. Voila! The good news is that this approach can be very successful; the bad news is that much nuance is often required to make it work well.
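The skier analogy can be sketched concretely. The short Python example below (a minimal illustration of the general idea; the function name, step count and data are my own, not from any particular software package) fits a straight-line model y = a*x + b to a small dataset. Starting from maximum error, each iteration steps the model parameters "downhill" in the direction that most reduces the squared error between data (c) and model (d).

```python
def steepest_descent_fit(xs, ys, steps=5000, learning_rate=0.01):
    """Fit the model y = a*x + b to data by repeatedly stepping in the
    direction of steepest descent on the mean squared error."""
    a, b = 0.0, 0.0  # initial model: (normally) far from the data
    n = len(xs)
    for _ in range(steps):
        # Gradient of the mean squared error with respect to a and b
        grad_a = sum(2 * (a * x + b - y) * x for x, y in zip(xs, ys)) / n
        grad_b = sum(2 * (a * x + b - y) for x, y in zip(xs, ys)) / n
        a -= learning_rate * grad_a  # step "downhill" on the error surface
        b -= learning_rate * grad_b
    return a, b

# Data taken from the line y = 3x + 1; descent recovers a ≈ 3, b ≈ 1,
# after which the model can compute any point outside the dataset.
xs = [0.0, 1.0, 2.0, 3.0, 4.0]
ys = [3 * x + 1 for x in xs]
a, b = steepest_descent_fit(xs, ys)
```

Here the error surface is a simple bowl, so plain descent succeeds; the nuance mentioned above arrives when the surface is not so simple, as the next paragraph describes.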
One problem that routinely occurs with Steepest Descent is that it may get mathematically "stuck" in a local error minimum. The alpine skier may have made rapid progress down his mountain, only to find himself stuck in a bowl in which no direction is downhill. His only choices then are to climb out, or to wish for some "magical hand" to boot him out of the bowl. The Steepest Descent Algorithm, and indeed other modeling methods, Pattern Recognition and Software Neural Networks, use the "magical hand" of mathematical "noise" to kick themselves out of these downhill traps. As mentioned earlier, the effect of this methodology can be seen in the difference between CHAT2 and CHAT3 below. While a local minimum solution may be adequate, especially if it lies near the bottom of the traverse, the most mathematically correct answer is usually found (often with much tweaking) by refining the method to escape as many local minima as possible. Since the earliest uses of this method, a synthetic "Boltzmann Absolute Temperature" in a little exponential equation has been used to generate this mathematical noise. That noise is then simply added to the model calculations to punch the model out of a local minimum solution, letting it proceed downhill to the best answer. Incidentally, this equation is essentially one that models the "Brownian Motion" of small particles being impacted by thermally excited molecules. At an Absolute Temperature (T) of zero, all molecular motion (M) ceases. Early math modelers must have decided that if this mathematical expression [a possible form: M = A(e^+KT - 1)] was adequate to describe particle (thermal) agitation in nature, it was good enough for them to agitate their models toward better convergence.
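The "magical hand" can also be sketched. The toy Python code below (my own illustration in the simulated-annealing style, not any particular production system) minimizes a double-welled error curve: a shallow bowl near x = +1 traps plain descent, while the true minimum sits near x = -1. Random jumps are always accepted downhill, and accepted uphill with a Boltzmann-style probability exp(-delta/T) that shrinks as the synthetic temperature T cools, so the search can climb out of the shallow bowl early and settle into the deepest one late.

```python
import math
import random

def f(x):
    # Double-welled error surface: local minimum near x = +1,
    # global minimum near x = -1.
    return (x * x - 1) ** 2 + 0.3 * x

def anneal(x=1.5, temperature=2.0, cooling=0.999, steps=8000):
    """Escape local minima with temperature-driven noise: accept every
    downhill move, and accept an uphill move with Boltzmann-style
    probability exp(-delta / T), cooling T a little each step."""
    random.seed(0)  # fixed seed so the run is repeatable
    best = x
    for _ in range(steps):
        candidate = x + random.gauss(0.0, 0.5)   # a noisy "kick"
        delta = f(candidate) - f(x)
        if delta < 0 or random.random() < math.exp(-delta / temperature):
            x = candidate
        if f(x) < f(best):
            best = x
        temperature *= cooling  # let the "boiling pot" cool down
    return best

best = anneal()  # starts in the shallow right-hand bowl, ends near x = -1
```

Plain descent from x = 1.5 would stop in the bowl near +1 forever; with the noise it crosses the hump between the wells while the temperature is high, mirroring the CHAT2-versus-CHAT3 contrast described above.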
Finally, the latter discussion used the example of making (c) data and (d) model agree as well as possible, as when observing some physical system in Physics, Electrical Engineering or Mechanical Engineering. However, in the CHAT examples below, the programmers have turned words, sentences, context and relations among massive databases into their own forms of mathematical construct. Another AI tool, Syntactic Pattern Recognition, seeks to assign scaled numerical values to verbal and even shape constructs to facilitate computational interrogation of otherwise non-mathematical data. It is in this way that information in language form is quantified, facilitating logical reduction by computing. Thus human and (computer) algorithm can reach a consensus, as demonstrated in CHAT4, where the algorithm concluded that "we were finished" and thanked me for the conversation (lol).
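To make the quantification of language a little less mysterious, here is a deliberately simplified Python sketch (simple word counts and a cosine-similarity measure; modern systems use far richer numerical representations, and the vocabulary here is purely illustrative). Each sentence becomes a vector of numbers, after which "how alike are these two statements" becomes an ordinary computation.

```python
import math

def bag_of_words(sentence, vocabulary):
    """Map a sentence to a numeric vector: one count per vocabulary word."""
    words = sentence.lower().split()
    return [words.count(term) for term in vocabulary]

def cosine_similarity(u, v):
    """1.0 for vectors pointing the same way, 0.0 for unrelated ones."""
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return dot / norm

# A tiny illustrative vocabulary; real systems quantify vastly more context.
vocabulary = ["selection", "mutation", "random", "design", "crystal"]
v1 = bag_of_words("random mutation and selection", vocabulary)
v2 = bag_of_words("selection of random mutation", vocabulary)
v3 = bag_of_words("crystal design", vocabulary)
```

Here v1 and v2 come out as identical vectors (similarity 1.0) despite different word order, while v3 shares no vocabulary with them (similarity 0.0) - language reduced to numbers that an algorithm can compare, which is the step that makes a computed "consensus" possible at all.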
See PDF eBook Purchase Information and Link Below