A Macro World Enzyme Analogy


It was pointed out in the “Fake Biology” post that the world of biology defines an enzyme to be a protein catalyst, and a catalyst to be a molecule that “speeds up a chemical reaction.”

This post is being written while spending time with family and friends at a celebration in Santorini, Greece.  Santorini is the remains of a volcano that exploded 3,700 years ago, destroying an advanced city thought by many to be the lost city of Atlantis.  Yesterday, we visited the archeological dig site of an ancient city on the volcanic island, and the tour guide pointed out what she called the rock “bombs” that flew out of the mouth of the exploding volcano and destroyed buildings.  A number of them are on display there.

This picture, taken from the villa where we are staying, shows one of these “bombs” on the hillside opposite the villa.  The rock is precariously situated on the hillside and will eventually roll down the hill due to erosion and/or an earthquake.

Someone in our group mentioned that a person with a shovel and a pry bar could dislodge it in 10 minutes.  Such an action by a human (an intelligent machine) would speed up the natural process of erosion that reduces the activation energy required to dislodge the “bomb.”  This is analogous to the intelligent work an enzyme does to “build” a molecule, compared to the molecule forming by a natural chemical reaction.  The analogy works from a specificity point of view as well as a thermodynamic one, because the human could control the timing of the event and the direction in which the “bomb” rolls down the hill.

A close-up view shows that someone has wedged stones under the rock to, in effect, increase the activation energy.  This is an example of intelligent work expended to offset the effects of natural causes.

© 2016 Mike Van Schoiack

Fake Physics


Page 552 of my sophomore physics class textbook reads: “By means of the statistical definition of entropy, Eq. 25-13, we can give meaning to the entropy of a system in a non-equilibrium state. A non-equilibrium state has a definite entropy because it has a definite degree of disorder. Therefore, the second law can be put on a statistical basis, for the direction in which natural processes take place (toward higher entropy) is determined by the laws of probability (toward a more probable state). From this point of view a violation of the second law, strictly speaking, is not an impossibility. If we waited long enough, for example, we might find water in a pond suddenly freeze over on a hot summer day. Such an occurrence is possible, but the probability of it happening, when computed, turns out to be incredibly small; for this to happen once would take on the average a time of the order of 10^10 times the age of the universe. Hence, the second law of thermodynamics shows us the most probable course of events, not the only possible ones. But its area of application is so broad and the chance of nature’s contradicting it is so small that it occupies the distinction of being one of the most useful and general laws in all science.”
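
For reference, the “statistical definition of entropy” that the quote calls Eq. 25-13 is presumably the familiar Boltzmann relation (the equation number is the textbook’s; the relation itself is standard):

```latex
S = k \ln W
```

where k is Boltzmann’s constant and W is the number of microscopic arrangements (microstates) consistent with the observed macroscopic state.  The quote’s “more probable state” is shorthand for a state with a larger W.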

That was almost 60 years ago, and I remember thinking at the time that this was false – it didn’t make common sense. More significantly, equation 25-14 discussed the statistical variation of the velocity (think momentum) of the molecules.  The molecules of water in a lake are exchanging momentum through collisions with each other. By the first law of thermodynamics (conservation of energy), this momentum (the temperature of the water) has to be conserved. Therefore the average temperature of the lake is constant, except for the slow heating and cooling due to energy exchange with its environment.  In other words, the statistical improbability applies when looking at just a few molecules, but not to the average of all the molecules as a function of time.  So the probability is zero, not some incredibly low number.

Even if the probability curve did represent the instantaneous temperature of the macro object (the pond), the changes in temperature would be happening so fast that one would be unaware of them; the lake would still have the same average temperature.  In other words, the freezing events would be so fleeting that no one would notice them, and this would be true at both the micro and macro levels.

While in college, common sense told me that this could not be true, and the first-law-of-thermodynamics rationale answered the question for me.  Later, I heard of the thought experiment of reversing time while looking at just a few molecules: one would not be able to tell whether time was going forward or backward.  If you watched a video of a tornado ripping a house apart and then ran it backward, you would know which direction the video was running.  But if you had a video of just a few molecules from some piece of wood in the house, you would not know which direction the video was playing – it would just be molecules bouncing around.  The point is that at the micro level, all energy is conserved as far as the observer can tell.  It is at the macro scale that potential energy wastes away.  In other words, one cannot always tell what is happening at the macro level by looking at the micro level.  The physics textbook made a statement that is not even true at the micro level, because even if a system of a few molecules instantaneously “froze,” it would be “unfrozen” instantaneously as well.

The first and second laws of thermodynamics, taken together, provide, at least in this case, a theoretical reason that the macro reality we experience is not problematic.  And the direction of time (the video running backward) is an excellent way to visualize why natural causes reduce specified complexity (increase entropy) as well.

Think about how we experience a pond freezing.  The freezing process starts as heat exchange between the water and its boundaries due to temperature gradients.  This conduction process occurs at different rates depending upon the details of the temperature gradients and the conductivity at various positions at the edges of the lake, eventually causing the first water molecules to freeze.  More molecules will freeze onto these.  Other freeze points will appear at the borders.  The freezing will expand, meeting other expanding frozen areas, eventually freezing the whole surface of the lake, assuming enough time and cold weather.  In all cases, thermal equilibrium dictates when the process stops or reverses.  Bottom line: lake freezing is a natural process that takes time; it is not an event.  If one looks at tiny volumes in the pond that consist of just a few molecules (say 2-10), then yes, there is a probability that at some instant in time one will find the average momentum equivalent to frozen water.  But the boundaries of these tiny volumes are other tiny volumes at a higher temperature.  Make the volume bigger and there is a smaller instantaneous ΔT.  Make the volumes the size of a teaspoon, and the instantaneous ΔTs would probably be too small to measure (a numerical sketch after the list below illustrates the scaling), for two reasons:

  1. The variation would occur faster than any means of measurement could track – probably impossible to measure at all due to Heisenberg’s uncertainty principle.
  2. It would be a violation of the First Law of thermodynamics – that heat energy is conserved.
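
A simple numerical sketch (added here for illustration; it is not from the textbook) shows why the fluctuation argument dies off so quickly with size: the relative fluctuation in the mean energy of N molecules shrinks roughly as 1/√N.  The distribution used is a generic stand-in for a thermal distribution; only the trend matters.

```python
# Sample the mean energy of N molecules drawn from the same distribution and
# watch the relative fluctuation of that mean shrink roughly as 1/sqrt(N).
import random
import math

def relative_fluctuation(n_molecules, n_samples=2000):
    """Standard deviation of the sample mean divided by the true mean."""
    means = []
    for _ in range(n_samples):
        # Exponential energies stand in for a thermal distribution; the
        # 1/sqrt(N) trend does not depend on the exact distribution chosen.
        energies = [random.expovariate(1.0) for _ in range(n_molecules)]
        means.append(sum(energies) / n_molecules)
    mu = sum(means) / n_samples
    var = sum((m - mu) ** 2 for m in means) / n_samples
    return math.sqrt(var) / mu

for n in (5, 100, 10_000):
    print(n, round(relative_fluctuation(n), 4))
# Typical output: ~0.45 for 5 molecules, ~0.10 for 100, ~0.01 for 10,000.
# A teaspoon of water holds on the order of 10^23 molecules, so its
# "instantaneous temperature" fluctuation is utterly unmeasurable.
```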

Such an event would require the immediate release of a massive amount of energy, like an explosion.  By my calculation, if the pond were the size of Walton’s pond and the ambient temperature were 70 degrees Fahrenheit, the blast would be about ¼ the size of the atomic bomb detonated over Hiroshima.
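
Here is the back-of-envelope arithmetic behind that estimate, written out so it can be checked.  The pond volume is an assumed, illustrative number, not a surveyed figure; the material constants are standard handbook values.

```python
# Back-of-envelope estimate of the heat that would have to be dumped instantly
# if a pond at 70 °F froze solid.  The pond volume is an assumed, illustrative
# value; the material constants are standard handbook numbers.
POND_VOLUME_M3  = 40_000      # assumed pond size (~100 m x 80 m x 5 m deep)
WATER_DENSITY   = 1_000       # kg/m^3
SPECIFIC_HEAT   = 4_186       # J/(kg*K), liquid water
LATENT_HEAT_FUS = 334_000     # J/kg, heat released on freezing
DELTA_T_K       = 21.1        # 70 °F (21.1 °C) cooled down to 0 °C
HIROSHIMA_YIELD = 6.3e13      # J, roughly 15 kilotons of TNT

mass = POND_VOLUME_M3 * WATER_DENSITY
heat_released = mass * (SPECIFIC_HEAT * DELTA_T_K + LATENT_HEAT_FUS)

print(f"Heat released: {heat_released:.2e} J")
print(f"About {heat_released / HIROSHIMA_YIELD:.2f} Hiroshima-size bombs")
# With these assumptions the result is on the order of a quarter of the
# Hiroshima yield; a larger pond scales the number up proportionally.
```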

Treating creations as the result of an event leads to the belief that there is a finite probability of the event occurring.  The truth is different: all natural macro processes must move toward a reduction of free energy.  To achieve a deterministic raising of free energy requires machines performing specified, deterministic work, or mechanisms created by machines, that convert energy from one form to another to raise local free energy.
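
This is just the standard spontaneity criterion from thermodynamics: at constant temperature and pressure, a natural (spontaneous) process can only lower the Gibbs free energy.

```latex
\Delta G = \Delta H - T\,\Delta S \le 0
\qquad \text{(spontaneous process at constant } T,\ P\text{)}
```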

Machines, as I define them, will not function without matter/energy held in an away-from-equilibrium condition.  Therefore they cannot be created by natural causes.  This claim includes the machines in life, and it is falsifiable.  This line of thinking is the basis for the “Fake Biology” post.

Calling my physics book “fake physics” is going overboard.  I’ve always had high regard for this textbook (two books, Part I & Part II).  I never parted with them because they have been valuable references throughout my engineering career.  I believe the quoted passage reflects an error of incomplete thought, which can result in false conclusions.  There is another possibility – I may be wrong.  If I am, I hope someone will explain why.

© 2016 Mike Van Schoiack

Intelligent Work


Author’s Definition:

  1. work accomplished by a machine.
  2. work not solely the result of natural causes.

Background

Intelligent Work is work accomplished by a machine.  A machine’s functionality is the result of intelligent manipulation of matter and energy to enable sensing, signaling, logical information processing, and energy-control functionalities.  Such manipulation embeds intelligence in the machine.

The technical definition of work is a force applied over a distance; in vector algebra, W = F · d, the dot product of the force and displacement vectors.  The term “work” also has many non-technical meanings.  One Webster definition is: sustained physical or mental effort to overcome obstacles and achieve an objective or result.  Another Webster definition is: energy expended by natural phenomena.  The definitions of interest here involve the expenditure of energy to cause something to happen.  In the natural world it can mean the energy of the sun heating the earth, the energy of wind and rain causing erosion, or the energy released by burning, by a battery, or by metal rusting.  For humans it might mean building a fence, cleaning the house, or putting out the garbage.  All of these examples involve the consumption and/or release of energy that performs some sort of work.  The first list of examples involves work by natural causes; the human examples involve intelligent work, where the human is performing the function of a machine.

Work by natural causes has no intelligence involved.  Intelligent work is a different paradigm, as it can only be performed by a machine.  A machine requires embedded (logical) intelligence because it must gather information, process that information to determine the action required, and have embedded means (control over the expenditure of energy) to perform the specified work.
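
A toy sketch of the three functions named above – gathering information, processing it logically, and controlling the expenditure of energy.  This is purely illustrative code of my own; it is not a model of any real machine or of anything in a cell.

```python
# Toy illustration of the definition of a machine used in this post:
# it senses, applies logic, and then controls when stored energy is spent.

class ToyMachine:
    def __init__(self, threshold):
        self.threshold = threshold          # embedded "specification"

    def sense(self, environment):
        return environment["signal"]        # gather information

    def decide(self, signal):
        return signal >= self.threshold     # logical processing

    def act(self, energy_store):
        energy_store["joules"] -= 5.0       # controlled expenditure of energy
        return "work performed"

    def run(self, environment, energy_store):
        if self.decide(self.sense(environment)):
            return self.act(energy_store)
        return "idle"

machine = ToyMachine(threshold=0.7)
print(machine.run({"signal": 0.9}, {"joules": 100.0}))   # -> work performed
print(machine.run({"signal": 0.2}, {"joules": 100.0}))   # -> idle
```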

An example that is commonly thought of as work due to natural causes, but is actually intelligent work, is the sun’s energy being used, through the process called photosynthesis, to make food.  Photosynthesis is the result of both molecular machines and natural chemical reactions – the process cannot occur as a result of natural chemical reactions alone.  Here is how Wikipedia describes the area in a plant cell that performs this process: “A photosynthetic reaction center is a complex of several proteins, pigments and other co-factors that together execute the primary energy conversion reactions of photosynthesis. Molecular excitation, either originating directly from sunlight or transferred as excitation energy via light-harvesting antenna systems, give rise to electron transfer reactions along the path of a series of protein-bound co-factors. These co-factors are light-absorbing molecules (also named chromophores or pigments) such as chlorophyll and pheophytin, as well as quinones. The energy of the photon is used to excite an electron of a pigment. The free energy created is then used to reduce a chain of nearby electron acceptors, which have subsequently higher redox-potentials. These electron transfer steps are the initial phase of a series of energy conversion reactions, ultimately resulting in the conversion of the energy of photons to the storage of that energy by the production of chemical bonds.”  This is a description of intelligent work being performed by molecular machines in conjunction with ordinary chemical reactions.
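
For context, the familiar net reaction of photosynthesis – the end result that all of that machine-mediated activity accomplishes:

```latex
6\,\mathrm{CO_2} + 6\,\mathrm{H_2O} + \text{light energy}
\;\longrightarrow\; \mathrm{C_6H_{12}O_6} + 6\,\mathrm{O_2}
```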

There are two levels of intelligent work that we regularly experience.  The first is logic-level intelligent work: work that has at least logic-level functionality.  This includes all man-made objects, but it also includes things made by other living things, such as bird nests and beehives1.  It also includes the work done by every molecular machine in every living cell. That is a lot of machines doing a lot of work!

Humans can perform a higher level of work: abstract intelligent work.  We can think, conceptualize, design, and build machines and processes that logic-level intelligent entities cannot.

Then there must be an even higher level intelligent entity that designed us.2

Intelligent work, performed by machines, requires the continual expenditure of energy to power the logical functionality of the machine.  Natural work does not have this requirement, as there is no intelligence involved.  For natural work there is a physics-defined relationship between the work performed and the energy required to perform it.  Such a fixed relationship does not exist for intelligent work, because the energy involved in performing a task includes variable, logical factors in addition to the physical work required for the task.

Understanding the difference between Intelligent work and “natural work” is key to understanding the conclusions presented herein.  Life depends upon the intelligent work that takes place in every cell.  Everything we (humans) make and do is the result of the intelligent work we do.

© 2016 Mike Van Schoiack

Scraps

Factory

To design and build something like a factory requires many coordinated steps.  It is first a “top-down” effort, starting with objectives and resources of all types.  A plan must be made that figures out how to utilize the available resources to put together the pieces that will make the plant work: the details of how the plant will perform each process step that will be needed, and how all of the steps can be coordinated to make sure they will work together.  Then you have to go to the bottom and design the parts, test them, assemble multiple parts, make sure they work together, fix problems as they arise, and work back upward until the whole plant is assembled and tested.

As part of the design effort, the means of moving the materials required by the overall process must be designed, built, and installed.

When the plant is all together, the process must be started.  In order to do this successfully, one must preload materials and place all machines in a particular state – metaphorically speaking, set all the switches to the right positions and prime all the pumps – before hitting the “start” button that applies power to everything.

The means of doing all of this has to be designed and in place.  And this is but a small fraction of the tasks that must be performed.

Concepts to be added:  Life is a process that uses intelligent machines to enable a living entity.  There is a need to recognize, based on the fact that life is a process, the functionality that has to exist and the detail involved.  It is far greater than presently acknowledged by those writing about life.  The “process” elements involve all of the detailed functions that have to be performed to allow life to exist.  This is far more involved than general categories such as “reproduction.”  Reproduction involves dozens, maybe hundreds or thousands, of actual functional steps.

Ever since Darwin’s book Origin of Species was published in 1859, there has been a steady advance toward the idea that our existence can be explained by science – that there is no need to invoke a God or other supernatural entity.  Darwin’s theory seemingly provided an explanation of how all life forms evolved from the first cell and, by extension, assumes a natural-cause explanation for the origin of life.

Since Darwin wrote his book, much has been learned about life.  It is much more complicated than expected.  We now know that organisms have what appears to be a designed mechanism for evolution, one which can account for the adaptation of plant and animal forms, but not for the evolution of new forms.  And there has been no progress in the development of a theory for the beginning of life from natural causes.

Studying the inner workings of the cell is science, and the knowledge gained makes the argument for Neo-Darwinism weaker by the day.  The reality is that if Darwin knew what we know today about life and the record of life, and if he was honest, he would not have written his book, given the doubts he expressed and the methods of science he expounded.  Even so, the mainstream field of biology clings to the belief in Darwin’s theory and to the idea that life began by natural causes.

So today there is a huge chasm in the field of biology – Darwinists vs. those who embrace the evidence of what is being learned.

This engineer has become interested in this field partly because of the natural need to know how and why we exist, and partly because of what seems to be a denial – or maybe simply a lack of knowledge – within the field of chemistry of the difference between chemical reactions and the physical building of macro-molecules.

A probable explanation of this chasm and how it has evolved is that biologists early on did not realize that actual molecular machinery was involved in the cell.  Once they did, they assumed that these machines could be described as advanced catalysts, hence the term enzyme, defined as a protein catalyst.  They were not, and, it seems, for the most part still are not, aware of the fact that they are dealing with a whole different paradigm, one that involves machinery, sensors, actuators, and process control systems in addition to the chemistry in which they are trained.

It is easy, as an engineer, to look at this situation and recognize what seem to be pitfalls, but limited knowledge of biology and chemistry leaves many questions unanswered and leads to the possibility that the conclusions drawn may be wrong.

Two Paradigms
These are two separate paradigms.  Chemical reactions are due to natural causes driven by the laws of physics and chemistry.  The physical building of macro-molecules is the work of intelligent machines operating in a process control system, intelligently using the laws of physics to accomplish things that natural causes cannot.  Life uses both to exist.  The world of biology still seems to think that enzymes are just catalysts that speed up a natural chemical reaction.

Life is a Process
From this engineer’s perspective, the fact that life is an intelligently controlled process is the easiest way to appreciate and understand the difference between living and non-living matter: life is matter with embedded intelligence.

The field of electronics deals with devices in the nm range, which is the same size range as proteins.  But there are several differences between ultra-small electronic entities and the molecules of life.  Electronic chips are layered, two-dimensional, static devices, whereas the molecules of life, in the living condition, move in all three dimensions.  The other big difference is that even though electronic structures at these dimensions are hard to see, the designs normally supply test points that allow the engineer to discern what is happening, even at these small dimensions, by electrical measurements.  The point is that we do not have the means to track exactly what is happening inside the cell because “our fingers are too large” – we cannot see or probe inside the cell walls to “reverse engineer” the life process that is in motion.

Past, Present, Future
However, the bio-medical field is doing amazing things involving manipulation of the genome.  This engineer is clueless as to how these feats are accomplished.  The assumption is that researchers are learning how to use the cellular machinery to determine functionality indirectly.  This is scary, because it is obvious that the overall understanding of the process control system in the cell is very limited.  In addition, a lack of knowledge of process control in general leads to the prospect of not appreciating the probable interrelationships between the numerous control loops that must exist in the cell.  This could lead to unintended consequences, some of which could be very subtle and/or dangerous.

It has been intuitively obvious to man since the earliest recorded history that life is something much different from inanimate nature.  Darwinism has given those who do not want to believe that an intelligent entity was responsible for our existence a way, as Dawkins put it, “to be intellectually fulfilled.”  To others, life itself is proof that there must have been some intelligent intervention in nature in order to create life.

The reality is that we do not understand the totality of physics – dark matter and dark energy, gravity, entanglement.  Perhaps if we did, we would know which side of this chasm is correct.  The science we do understand leaves Darwinism in a black hole.

Frustrations of This Engineer
  1. Lack of appreciation of the impacts of the 2nd law of thermodynamics:
    1. Reason that natural causes cannot build all of the molecules of life
    2. Reason why there is no pathway for complexity to increase over time to “evolve” first life
    3. The reason that life has to be a process
  2. Terminology used in Biology & lack of precision of definitions
  3. Understanding the full implications of life being a process
  4. Appreciation of the difference between information and intelligence
  5. Appreciation of the difference between design and actually building something
  6. Appreciation of the difference between static things and machines that do work.
Purpose of This (ID) Site
This site will be a forum to discuss the frustrations listed above.  The hope is that others – particularly thoughtful engineers – who share some of these frustrations and would like to have discourse on these and similar topics will find this site.

It seems obvious that the fields of biology and engineering are merging, and it is exciting to be able to participate in this merger.  If this engineer were a young man, he would become a bio-engineer.

This engineer, being very interested in understanding what is known about life – what it is and how it works – has spent countless hours reading and thinking about the topic. The engineer’s perspective is quite different from the Darwinist point of view, and the differences have many implications: thermodynamic, theoretical, philosophical, political, and theological.

It is the belief of this engineer that the approximately 97% of DNA that is not used to express proteins is essentially the life-process “code”; that chunks of this DNA are converted to RNA that works with proteins to provide the functionality needed for the life process.  This assumption is based on what seems to be the obvious reality that life is a process, on knowing what is required to make a process work, and on the knowledge that the protein/RNA combination is already known to provide some of the functionality of the life process, e.g., the ribosome.

The key to understanding life, it seems to this engineer, is to understand the details of how this process works.  As mentioned, this is hard because our fingers and eyes are too big.  Engineers like myself have ideas about what to look for, including some thoughts regarding where and how.  But we (speaking for myself) have only a little knowledge of what is currently known, and almost zero knowledge of the tools available to gain additional knowledge.

So the purpose of this site is to serve as a forum for this engineer and others to share ideas and learning resources and, in general, to be a place to kindle thinking on the topic.

To submit a post, please use the Contact form in the footer.

This is a quote from my sophomore physics class textbook: “By means of the statistical definition of entropy, Eq. 25-13, we can give meaning to the entropy of a system in a nonequilibrium state. A nonequilibrium state has a definite entropy because it has a definite degree of disorder. Therefore, the second law can be put on a statistical basis, for the direction in which natural processes take place (toward higher entropy) is determined by the laws of probability (toward a more probable state). From this point of view a violation of the second law, strictly speaking, is not an impossibility. If we waited long enough, for example, we might find water in a pond suddenly freeze over on a hot summer day. Such an occurrence is possible, but the probability of it happening, when computed, turns out to be incredibly small; for this to happen once would take on the average a time of the order of 10^10 times the age of the universe. Hence, the second law of thermodynamics shows us the most probable course of events, not the only possible ones. But its area of application is so broad and the chance of nature’s contradicting it is so small that it occupies the distinction of being one of the most useful and general laws in all science.”

That was over 50 years ago, and I remember thinking at the time that this was false – it didn’t make common sense. More significantly, equation 25-14 discussed the statistical variation of the velocity (think momentum) of the molecules.  The molecules of water in a lake are exchanging momentum through collisions with each other, and by the first law of thermodynamics (conservation of energy) this momentum (the temperature of the water) has to be conserved; therefore the average temperature of the lake as a whole would be constant.  In other words, the statistical improbability is true looking at one molecule over time, but is not true for the average of all the molecules as a function of time.  So the probability is zero, not some incredibly low number.

But there is also another problem.  Assume that all of the lake’s water molecules did have the probability distribution described.  The changes in temperature would be happening so fast that one would be unaware of them; the freezing events would be so fleeting that no one would notice them.

It has been my belief that there is a corollary between this example and the study of the probability of a given protein being produced by natural causes.  It is my belief that the probability that natural causes can produce all of the functional proteins needed for life is also zero.  The explanation for this belief is part of the purpose of the ID section of this website.

The process of thinking about this topic has led to considering other issues, such as what life is from a physics/engineering point of view, specified information, specified work, machines, and intelligence. From this come definitions, theories, and proofs that natural causes could not have been responsible for beginning life.

The main difference between the way I, as an engineer, think about these topics and the way non-engineers do is the focus on nature’s ability to actually build and start life, and on thinking of life as a process as opposed to a “material state.”

The result of this engineer’s deliberations on this topic leads to the conclusions that follow.  More detail is contained in the referenced posts, which provide the reader the opportunity to respond to specific concepts.  The reality is that I am not a biologist and very possibly draw some incorrect conclusions about things that I have (or think I have) learned.  I hope to get corrective feedback.

 Frustrations of This Engineer

  1. Lack of appreciation of the impacts of the 2nd law of thermodynamics:
    1. Reason that natural causes cannot build all of the molecules of life
    2. Reason why there is no pathway for complexity to increase over time to “evolve” first life
    3. The reason that life has to be a process
  2. Terminology used in Biology & lack of precision of definitions
  3. Understanding the full implications of life being a process
  4. Appreciation of the difference between information and intelligence
  5. Appreciation of the difference between design and actually building something
  6. Appreciation of the difference between static things and machines that do work.

A. C. McIntosh, in a scholarly paper3 regarding information in life, concludes by saying: “..that it is the non-material information (transcendent to the matter and energy) that is actually itself constraining the local thermodynamics to be in ordered disequilibrium and with specified raised free energy levels necessary for the molecular and cellular machinery to operate.”  The idea that information, a static product of intelligence, could be “constraining” thermodynamics makes an ambiguous connection, at least from an engineering point of view.  But this line of thinking and inquiry is inherent in the field of biology today, asking the question “where does the information come from?”

Information is a static, non-material construct of intelligence which can be, and is, embedded in matter by the use of language, codes, media, protocols, etc.  Life, to this engineer, is a process that is continually doing things and burning energy just to maintain its living status.  McIntosh points out in his paper, “The functional machinery of biological systems such as DNA, RNA and proteins requires that precise, non-spontaneous raised free energies be formed in the molecular bonds which are maintained in a far from equilibrium state.”

The idea of life being a process that “constrains thermodynamics” to maintain “a far from equilibrium state,” using machines that make intelligent choices, resolves the ambiguity of information vs. intelligence expressed by McIntosh.


Process


Merriam-Webster Definition

  1. a series of actions or steps taken in order to achieve a particular end.

This Engineer’s Definition

  1. A series of actions or steps that result in an end.
  2. Natural Process: a series of actions or steps that result in an end that does not involve intelligence.
  3. Intelligent Process: a series of actions or steps taken in order to achieve a desired end.

These definitions allow the term “process,” without an adjective, to include both Natural and Intelligent Processes.

The outcomes of natural processes are determined by their initial conditions (resulting from previous natural process steps), the free energy available, and the laws of physics; they include the formation of atoms, molecules, suns, black holes, solar systems, planets, weather, plate tectonics, and erosion. Intelligent processes are the same except that they involve intelligence that controls outcomes based on sensed conditions.
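
A schematic contrast between the two definitions, in code form (illustrative only; the “laws” here are toy stand-ins): a natural process step is fixed by the current state and the governing law alone, while an intelligent process step also compares a sensed value against a goal and switches energy expenditure accordingly.

```python
# Schematic contrast between the two kinds of process defined above.
# Purely illustrative; the "laws" are stand-ins, not real physics.

def natural_step(height):
    """Erosion-like step: outcome fixed by current state and the 'law' alone."""
    return max(0.0, height - 0.1)            # the hill wears down; no goal involved

def intelligent_step(temperature, setpoint):
    """Thermostat-like step: a sensed value is compared with a goal,
    and energy expenditure is switched accordingly."""
    heater_on = temperature < setpoint        # sense and decide
    return temperature + (0.5 if heater_on else -0.2)   # controlled outcome

h, t = 10.0, 18.0
for _ in range(5):
    h = natural_step(h)
    t = intelligent_step(t, setpoint=20.0)
print(round(h, 1), round(t, 1))   # the hill only shrinks; the temperature is steered toward 20
```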

If an end is accomplished by doing intelligent work, the process involves the use of machines and is therefore an intelligent process.  If the process involves only natural causes, the outcomes are limited by the initial conditions of previous process steps.  Life, and all the creations of life, are the result of intelligent processes.

All processes, including natural processes, involve changes in the state of matter and energy over time; in other words, a process is not a static entity.  For example, a machine, when not running, is still a machine.  When it is running, it is running a process to accomplish whatever task it was designed to do.  This may seem like a trivial distinction, but it is not when discussing life.  A distinction between designed processes and natural processes is that designed processes use a local, specified power source, whereas a natural process uses free energy – energy made available for work by previous natural process steps – with no machines involved.


© 2016 Mike Van Schoiack