Is it reasonable to think that, since no one knows what caused the Big Bang, another one could occur at any time, anywhere, even right here, if the circumstances were right? I mean, if it were a random fluctuation of... the vacuum... of nothing... then another one could occur at any moment. And if it did occur, would it wipe us out, or would it explode into another dimension so that we wouldn't even know it had happened?
Stephen Hawking says that the Heisenberg uncertainty principle means there could be black holes anywhere, tiny ones that swallow up subatomic particles and then disappear. He was making a case against scientific determinism at the time, in his lecture Does God Play Dice? If tiny black holes could appear anywhere, doesn't it follow that tiny bangs, wherein particles emerge from nothing, could occur anywhere, anytime, too? Little acts of creatio ex nihilo. Holy mother of pearl, what if there wasn't a bang? What if what occurred were tiny little bangs that accumulated over Planck time, then rapidly expanded? There was no bang. The universe is expanding and creating new particles in the process as space-time is stretched out. Fred Hoyle was right about the steady state, but he was premature in his predictions! The universe isn't a steady state. It's heading
towards a steady state. Given enough time, we wouldn't know whether or not the universe is expanding, since we have a universal speed limit, c, the speed of light. The farther away a galaxy is, the faster it appears to recede from us: if galaxy A is twice as far from us as galaxy B, galaxy A appears to be moving away twice as fast as galaxy B. Therefore, at some point, the nearest galaxy to us would be receding at the speed of light, and so would appear to be standing still. We would have reached a steady state. There would be no more empirical evidence for expansion. One day, our descendants would think the Big Bang was just a silly superstition.
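The proportionality described above is just Hubble's law, v = H₀d. As a rough check on the arithmetic, here is a short sketch computing the distance at which the recession velocity would reach c. The Hubble constant value of 70 km/s/Mpc is my assumed round number, not something from the text:

```python
# Rough Hubble's-law arithmetic: recession velocity v = H0 * d.
# H0 = 70 km/s/Mpc is an assumed round value for illustration.

C_KM_S = 299_792.458    # speed of light, km/s
H0 = 70.0               # Hubble constant, km/s per megaparsec (assumed)
LY_PER_MPC = 3.2616e6   # light-years per megaparsec

# If galaxy A is twice as far as galaxy B, it recedes twice as fast:
d_b, d_a = 100.0, 200.0            # distances in Mpc (arbitrary example)
v_b, v_a = H0 * d_b, H0 * d_a
assert v_a == 2 * v_b

# Distance at which the recession velocity equals c (the Hubble distance):
d_hubble_mpc = C_KM_S / H0
d_hubble_gly = d_hubble_mpc * LY_PER_MPC / 1e9
print(f"Hubble distance = {d_hubble_mpc:.0f} Mpc, about {d_hubble_gly:.1f} billion light-years")
```

With these assumed numbers, the "steady state" horizon comes out at roughly 14 billion light-years.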
***
With the rapid developments in genome sequencing, I may have found a way in which Michael Behe's irreducible complexity in living things could be explained by traditional Darwinian mechanisms. As you probably know, more and more evidence is being uncovered that the so-called junk DNA isn't really junk. Darwinians explained junk DNA as mere by-products of evolution, but scientists pursuing the theory of intelligent design predicted that the non-coding sequences actually had a purpose, and research has borne them out. Some act as start-stop signals, for instance, telling a certain set of instructions when to start and when to stop. As more of the genome is decoded, if researchers find redundancies in the sequence, that is, different sets of instructions that do more or less the same thing, irreducible complexity can then be explained thus:
For example, suppose we had a biological process we'll call IRREDUCIBLYCOMPLEXPROCESS, where each letter stands for a biochemical step in an irreducibly complex system. We can explain it via traditional Darwinian processes if it can be found that a process IRREDUCIBLYXXXXXXXXX, another process XXXXXXCOMPLEXXXXXXX, and another process XXXXXXXXXXPROCESS joined together. The X's are processes identical in function to IRREDUCIBLY, COMPLEX, and PROCESS. These three joined to form something like IRREDUCIBLYXXXXXXXCOMPLEXXXXXXXXPROCESS, with the concomitant redundancies of the X's. Natural selection would later select against the X's, et voilà! That leaves our IRREDUCIBLYCOMPLEXPROCESS.
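The fusion-plus-pruning story above can be sketched as a toy string model. This is purely illustrative: the segments and the "selection" step are stand-ins, not biology. Lowercase x marks the redundant filler (the X's of the example), chosen lowercase here only so it doesn't collide with the uppercase X inside COMPLEX:

```python
# Toy model of the fusion-plus-pruning scheme. Each string stands for a
# stretch of instructions; lowercase x's are redundant copies of functions
# already supplied by the named segments. All names are illustrative only.

part1 = "IRREDUCIBLY" + "x" * 14          # supplies IRREDUCIBLY, plus filler
part2 = "x" * 11 + "COMPLEX" + "x" * 7    # supplies COMPLEX, plus filler
part3 = "x" * 18 + "PROCESS"              # supplies PROCESS, plus filler

# Step 1: the three processes join into one larger, redundant system.
fused = part1 + part2 + part3

# Step 2: selection later removes the redundant filler, since those
# functions are already covered by the named segments.
pruned = fused.replace("x", "")

assert pruned == "IRREDUCIBLYCOMPLEXPROCESS"
print(pruned)
```

The end product looks irreducible, because every scaffolding step that once made it reducible has been pruned away.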
The problem with this process is time. We have had only about 4 billion years to account for the complexity we see today. Could random processes produce that much variety of information? There is no consensus yet on how much new information a random process can actually create, and some scientists are skeptical that 4 billion years is enough. That's why theories like
Directed Panspermia were invented. Directed, because its proponents don't believe that a one-time random insemination of the earth could account for all the information and for the seeming explosion of lifeforms over the years. The insemination, they say, had to be deliberate and to occur regularly. But we have powerful computers now, and maybe one day some computer genius will design an algorithm to test whether the specified complex information we see today can be explained by random processes.
***
Addendum, 10 May. There is one thing that scientists can't do with a computer: model how Darwinian evolution works. It's just impossible. To do so would require programmers, but that's not allowed, because Darwinian theory denies the participation of intelligence in the formation of species. It denies real design, affirming only apparent design. But real design is exactly what is needed to create a computer model of Darwinian evolution. The moment you introduce an intelligence into the mix, you're playing right into the hands of the ID theorists.
That's what Richard Dawkins tried to do in his book The Blind Watchmaker. In chapter 3, he tries to demonstrate how iterated random mutation of a string of 28 characters could come up with METHINKS IT IS LIKE A WEASEL. In one run, the program did it in just 41 generations. Pretty impressive. But wait, the book says:
The computer examines the mutant nonsense phrases, the 'progeny' of the original phrase, and chooses the one which, however slightly, most resembles the target phrase, METHINKS IT IS LIKE A WEASEL.
Did you see it? A target phrase, METHINKS IT IS LIKE A WEASEL, is written into the program; that is, the program already knows what it's looking for. The process therefore is not blind but 'intelligent.' He demonstrates another computer model in the same chapter, but it's more of the same: what is supposed to be a demonstration of natural selection becomes a demonstration of artificial--intelligent--selection.
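The procedure in the quoted passage can be reproduced in a few lines, and doing so makes the objection concrete: TARGET sits right inside the fitness function that chooses each generation's winner. The mutation rate and number of offspring per generation below are my guesses, since the book doesn't specify them:

```python
import random

random.seed(0)  # fixed seed so the run is reproducible

TARGET = "METHINKS IT IS LIKE A WEASEL"   # the target phrase, baked into the program
ALPHABET = "ABCDEFGHIJKLMNOPQRSTUVWXYZ "  # 26 letters plus space
MUTATION_RATE = 0.05   # assumed; Dawkins does not give his parameters
OFFSPRING = 100        # assumed number of mutant copies per generation

def fitness(phrase):
    # The 'intelligence' in the loop: every candidate is scored
    # against the known target, one matching character = one point.
    return sum(a == b for a, b in zip(phrase, TARGET))

def mutate(phrase):
    # Each character has a small chance of being replaced at random.
    return "".join(random.choice(ALPHABET) if random.random() < MUTATION_RATE else ch
                   for ch in phrase)

parent = "".join(random.choice(ALPHABET) for _ in range(len(TARGET)))
generation = 0
while parent != TARGET:
    generation += 1
    # Breed mutant 'progeny' and keep the one closest to the target.
    parent = max((mutate(parent) for _ in range(OFFSPRING)), key=fitness)

print(f"Reached the target in {generation} generations")
```

Cumulative selection really does converge in a few dozen generations, but only because `fitness` measures distance to a goal the programmer wrote in. Remove TARGET and there is nothing to select toward, which is precisely the complaint.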
Trying to model Darwinian processes using computers is an impossible task.
I don't know how it could be done, anyway. But maybe some computer genius will think of a way.