In silico DNA

Imagine an organism designed and replicated by software.  Synthetic life?  Has it been successfully achieved?  The synthetic aspects, yes.  Life?  That is subject to personal interpretation.

In silico, Latin for “in silicon”.  Performed on a computer or via computer simulation.

A key tenet of chemistry is the notion of “synthesis as proof”.  The first synthetic organism was reported to have been successfully created on May 20, 2010.

Abstract

Creation of a Bacterial Cell Controlled by a Chemically Synthesized Genome

We report the design, synthesis, and assembly of the 1.08–mega–base pair Mycoplasma mycoides JCVI-syn1.0 genome starting from digitized genome sequence information and its transplantation into a M. capricolum recipient cell to create new M. mycoides cells that are controlled only by the synthetic chromosome. The only DNA in the cells is the designed synthetic DNA sequence, including “watermark” sequences and other designed gene deletions and polymorphisms, and mutations acquired during the building process. The new cells have expected phenotypic properties and are capable of continuous self-replication.

Published Online May 20 2010
Science 2 July 2010:
Vol. 329 no. 5987 pp. 52-56
DOI: 10.1126/science.1190719

In vitro, Latin for “in glass”.  Performed with cells or biological molecules studied outside their normal biological context.  Often referred to as “test tube experiments”, due to the use of test tubes, flasks and petri dishes.  Proteins, DNA and RNA for example are studied in solution, and cells in artificial culture medium.

In vivo, Latin for “within the living”.  Performed within living organisms, including animals, humans and plants.  Biologicals are tested on living organisms, as opposed to in vitro, or in silico.

Imagine now, an organism that takes on many forms; a polymorph.  Polymorphism is when two or more clearly different phenotypes exist in the same population of a species, occupying the same environment simultaneously.  Each phenotype takes on specific characteristics as the result of the interaction of its genotype with its environment.

We’ve heard of genes undergoing mutation, more specifically, mutagenesis.  This is when an organism’s genetic information is changed in a stable manner.  It is a change in the nucleic acid sequence which can be replicated, enabling a mutation to be inherited from one generation to the next.

Because in silico computer modelling of cellular behavior runs faster than real-time growth rates, results are obtained within minutes.  And DNA sequencing produces digital genetic sequences, stored in sequence databases.  These can then be analyzed, digitally altered and used as templates for creating, through artificial gene synthesis, new forms of unnatural DNA.
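The digital-alteration step can be illustrated with a minimal sketch: read a digitized sequence, introduce a point mutation, and emit the edited template for downstream gene synthesis.  The sequence, position and substituted base below are invented purely for illustration.

```python
def point_mutate(seq: str, pos: int, new_base: str) -> str:
    """Return a copy of seq with the base at pos replaced by new_base."""
    if new_base not in "ACGT":
        raise ValueError("base must be one of A, C, G, T")
    return seq[:pos] + new_base + seq[pos + 1:]

# Hypothetical digitized fragment; real workflows pull from sequence databases.
template = "ATGGCCATTGTAATGGGCCGC"
mutant = point_mutate(template, 4, "T")   # C -> T at position 4
print(mutant)  # ATGGTCATTGTAATGGGCCGC
```

Real pipelines operate on database records and add vendor-specific synthesis constraints; this shows only the core in silico edit.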

Ironically enough, once in silico modelling was proven a viable alternative, the great limiting factor became the computational processing power of transistor-based, gate-model computers.  Until the advent of sufficient throughput, a great many assumptions were made regarding the accuracy of the sequences derived from such computer modelling, as well as in predicting the cell’s behavior.

Today, we have a replacement for the gate-model computer, even in its quantum configuration: the adiabatic quantum computer, utilizing qubits rather than transistors.  These operate in a superconducting state, performing quantum annealing without the transfer of heat or matter to the surrounding environment.  The latest model, the 2048, possesses the equivalent processing power of 7 billion human brains.  It is able, for example, to process a combinatorial problem in 10 seconds, while the fastest gate-model supercomputer requires 30 minutes to solve the same problem.  This is accomplished through the employment of supersymmetry, superposition and quantum entanglement.
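Quantum annealing itself requires superconducting hardware, but its classical cousin, simulated annealing, illustrates the same idea: cool a system toward the minimum-energy answer of a combinatorial cost function.  A sketch on a toy number-partitioning problem follows; the weights, temperature schedule and step count are all illustrative, not drawn from any real annealer.

```python
import math
import random

def partition_cost(assign, weights):
    """Cost = |sum(set A) - sum(set B)|; zero means a perfect split."""
    return abs(sum(w if a else -w for a, w in zip(assign, weights)))

def anneal(weights, steps=20000, t0=10.0):
    random.seed(0)                            # reproducible toy run
    assign = [random.random() < 0.5 for _ in weights]
    cost = partition_cost(assign, weights)
    best, best_cost = assign[:], cost
    for step in range(steps):
        t = t0 * (1 - step / steps) + 1e-9    # linear cooling schedule
        i = random.randrange(len(weights))    # propose flipping one "spin"
        assign[i] = not assign[i]
        new_cost = partition_cost(assign, weights)
        if new_cost <= cost or random.random() < math.exp((cost - new_cost) / t):
            cost = new_cost                   # accept the move
            if cost < best_cost:
                best, best_cost = assign[:], cost
        else:
            assign[i] = not assign[i]         # reject: revert the flip
    return best, best_cost

best, cost = anneal([4, 5, 6, 7, 8])
print(cost)   # 0 for this instance: e.g. {4, 5, 6} vs {7, 8}
```

A hardware annealer encodes the same cost function in qubit couplings and lets quantum tunneling, rather than thermal moves, escape local minima.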

Thus, we now have the ability to accomplish within minutes what previously required weeks of laborious DNA sequencing of base pairs and letters of genetic code.  We are immersed within the digital age of biology, synthesizing genomes based on the binary digital code of a computer.

During February 1943, Erwin Schrödinger delivered a seminal lecture entitled “What is Life?” at Trinity College, Dublin.  His book by the same name was published in 1944 and is considered a scientific classic.  Within it, he theorized how hereditary information could be encoded in a chemical structure (aperiodic crystal) in living cells.  The book was cited by Crick and Watson as one leading them in determining the structure of DNA in 1953, for which they received the Nobel Prize in 1962.

In 1958, Fred Sanger won the Nobel Prize for being the first to sequence a protein, insulin, demonstrating that proteins consist of linear chains of amino acids.

1968 witnessed Holley, Nirenberg and Khorana share the Nobel Prize for demonstrating, beginning in 1961, the triplet genetic code for each amino acid, thus showing how the linear DNA code had coded for the linear protein code.  Subsequently, Robert Holley discovered the structure of tRNA, the key link between messenger RNA and the amino acids for protein synthesis.

The next Nobel Prize, in 1978, went to Hamilton Smith, Werner Arber and Daniel Nathans for their discovery of restriction enzymes, the molecular scissors that cut DNA.  This brought on the fields of molecular biology and genetic engineering, and molecular splicing using restriction enzymes.

In 1977, Fred Sanger sequenced the DNA virus Phi X 174, the first DNA genome ever sequenced.  This work introduced the dideoxy sequencing technique, now referred to as “Sanger sequencing”, and earned him a second Nobel Prize in 1980.

Presently, we see DNA as an analogue coding molecule, and when sequenced, we are converting its code into digital code.  Essentially, we are digitizing biology.

DNA is composed of four nucleotides: (A) adenine, (C) cytosine, (G) guanine and (T) thymine, with its code stored within these chemical bases, which pair up as the A–T and G–C base pairs.  Each base (A, C, G or T) can represent a binary value (T and G = 1, A and C = 0).  With this arrangement, data can be stored in DNA.
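The binary mapping above can be sketched in a few lines.  Note that because two bases map to each bit value, an encoder is free to choose either one; real DNA-storage schemes exploit that freedom for error correction and to avoid long runs of one base, which this toy version ignores.

```python
# T and G read back as 1; A and C read back as 0.
BASE_TO_BIT = {"T": 1, "G": 1, "A": 0, "C": 0}

def bits_to_dna(bits):
    """Encode a bit sequence as bases, arbitrarily choosing T for 1, A for 0."""
    return "".join("T" if b else "A" for b in bits)

def dna_to_bits(seq):
    """Read each base back as its binary value."""
    return [BASE_TO_BIT[base] for base in seq]

message = [1, 0, 1, 1, 0, 0, 1, 0]
strand = bits_to_dna(message)
print(strand)                           # TATTAATA
print(dna_to_bits(strand) == message)   # True: round trip is lossless
```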

Sequencing the DNA enables one to read back the stored data in binary form, using the same process as sequencing the human genome.  To aid in this, each strand of DNA carries a 19-bit address block at its start, allowing a large quantity of DNA to be sequenced out of order and then sorted into usable data using the addresses.  The four essential steps involved are the initial encoding, synthesis, sequencing and finally decoding of the data stored in DNA.
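The address-block scheme can be sketched as follows: each strand carries a fixed-width binary address (19 bits in the cited design) ahead of its payload, so strands read back in arbitrary order are reassembled by sorting on that prefix.  The payloads below are illustrative.

```python
ADDR_BITS = 19  # fixed-width address block at the start of each strand

def make_strand(address, payload_bits):
    """Prefix payload bits with a fixed-width binary address block."""
    addr_block = [int(b) for b in format(address, f"0{ADDR_BITS}b")]
    return addr_block + payload_bits

def reassemble(strands):
    """Sort strands by address prefix, then concatenate their payloads."""
    def address_of(strand):
        return int("".join(map(str, strand[:ADDR_BITS])), 2)
    ordered = sorted(strands, key=address_of)
    return [bit for s in ordered for bit in s[ADDR_BITS:]]

data = [[1, 0], [1, 1], [0, 1]]                  # three 2-bit payloads
strands = [make_strand(i, p) for i, p in enumerate(data)]
shuffled = [strands[2], strands[0], strands[1]]  # sequenced out of order
print(reassemble(shuffled))  # [1, 0, 1, 1, 0, 1] -- original order restored
```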

There are three primary reasons for the use of DNA as a storage medium: it is extremely dense (storing one bit per base, with a base being only a few atoms in size); it is volumetric (filling a container) rather than planar (a hard disk); and it is very stable.  Most leading-edge storage media must be maintained in cryogenic, sub-zero vacuums, whereas DNA can survive for hundreds of thousands of years at moderate temperatures.

It is only with recent advances in microfluidics and chip-scale labs that synthesizing and sequencing of DNA has become prevalent.  It took the original Human Genome Project years to analyze a single human genome (over 3 billion DNA base pairs).  Modern lab equipment with microfluidic chips performs the same tasks in minutes.

Existing DNA-based data storage can hold 700 terabytes of data in one gram of DNA, representing 14,000 50-gigabyte Blu-ray discs, or the equivalent of 233 3-terabyte drives weighing in at 151 kilos.  One limitation of this storage is that it cannot be done in living cells, which lack stability and integrity upon extraction.
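The equivalence figures above follow from simple arithmetic (using decimal units, 1 TB = 1000 GB):

```python
capacity_tb = 700   # data stored in one gram of DNA
blu_ray_gb = 50     # capacity of one dual-layer Blu-ray disc
drive_tb = 3        # capacity of one hard drive

blu_rays = capacity_tb * 1000 / blu_ray_gb
drives = capacity_tb / drive_tb
print(blu_rays)        # 14000.0 Blu-ray discs
print(round(drives))   # 233 three-terabyte drives
```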

Recall the earlier mention of the adiabatic quantum computer and its use of qubit processors.  It is time to bring forth nanotechnology; more specifically, nanodiamonds and their employment as self-assembling diamond-biological quantum devices.  In my latest novel, 2048: Diamonds in the Rough, these comprise the basis for my own version of a quantum computer.

Beginning with what R. Buckminster Fuller theorized as the smallest volume of enclosed space, the tetrahedron, the simplest of all ordinary convex polyhedra, I delineate the self-assembly of C60 nanodiamonds taking on the form of this tetrahedron.  While a buckminsterfullerene (or bucky-ball) is a spherical fullerene molecule in the shape of a truncated icosahedron, my quantum computer replicates outward, ending as a 20-metre-diameter 600-cell composed of tetrahedra.  The 600-cell is a convex regular 4-polytope, a four-dimensional analogue of the icosahedron; it is also called the C600, hexacosichoron and hexacosidedroid.  Additionally, I propose the 600-cell as the true model of our known universe.
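The standard published combinatorics of these two shapes can be sanity-checked with Euler's formulas: V − E + F = 2 for a convex polyhedron, and V − E + F − C = 0 for a convex 4-polytope such as the 600-cell.

```python
# Tetrahedron: 4 vertices, 6 edges, 4 triangular faces.
tet_v, tet_e, tet_f = 4, 6, 4
print(tet_v - tet_e + tet_f)  # 2, as Euler's polyhedron formula requires

# 600-cell: 120 vertices, 720 edges, 1200 triangular faces,
# 600 tetrahedral cells (standard published counts).
v, e, f, c = 120, 720, 1200, 600
print(v - e + f - c)  # 0, the 4-polytope Euler characteristic
```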

Connecting these in real time, we have the technology to create such a computer, one based upon the foundational work of sequencing and digitizing DNA, which forms the basis for in silico DNA modelling, and for the creation of biological life.

By combining methodologies, biological with chemical, we arrive at a self-assembling (or self-replicating) hybrid biological-nanodiamond quantum computer.  Once again citing the example of the adiabatic version, we now move from a cryogenic, electromagnetically shielded quantum environment to one of room temperature.

As recently as 2013, self-assembling protein structures served as structural scaffolds for surface-functionalized nanodiamonds.  This allowed the controlled creation of nitrogen-vacancy (NV) structures at the nanoscale and provided a new avenue towards bridging the bio-nano interface.  Three-dimensional structures were formed by interconnecting nanodiamonds using biological protein scaffolds.

By incorporating in silico modelling of design-specific DNA and proteins, high-fidelity quantum entanglement and cluster states can be achieved.  By arranging nanodiamonds on SP1 protein, for example, a hexagon is created with 11 nm faces.  Arranging these in clusters of no more than 100 nm, dipolar coupling interaction with nearby NV centers is achieved, while reducing error-inducing computational decoherence.

Thus, the optimal design of scaffold protein structures can be arrived at in silico, much the same way new pharmaceuticals are designed, without the need for first creating physical biologicals or chemical nanodiamonds.

Moving beyond the use of proteins, in silico modelling and testing of C60 nanodiamond structures can be, and I propose has secretly been, accomplished.  We have learned from the proteins themselves what the optimal structure for a self-assembling nanodiamond quantum computer is.  Using X-ray crystallography, synchrotron particle accelerators have allowed us to create these three-dimensional models of the protein folding encoded within our own DNA.

Nanodiamonds provide the intrinsic three-dimensional, Euclidean arrangement necessary for true quantum combinatorial processing of digital information.  In utilizing the unique arrangement of the 600-cell itself, we are able to achieve solutions derived from the fourth dimension: time.  This is where quantum entanglement and supersymmetry factor in.

I have brought up synchrotron particle accelerators because not only are they useful in generating high-power X-rays for the diffraction-based measurement of proteins and their structural folding, they are of course also used in the discovery and analysis of quantum particles.  Case in point: CERN’s Large Hadron Collider (LHC) near Geneva, Switzerland.

Of the multi-faceted agendas at CERN, one in particular has received a great deal of attention in both the mainstream and the alternative media: their publicly proclaimed goal of discovering new dimensions.  In fact, they recently announced their target was the opening of an actual portal to another dimension.

I cite this within the context of this article to point out the direct correlation of the aforementioned topics with the LHC itself; specifically, what is expected to move from another dimension into our own.

It has been demonstrated how sequenced DNA can be digitized.  And using the four nucleotides, (A) adenine, (C) cytosine, (G) guanine and (T) thymine, each base having a binary value (T and G = 1, A and C = 0), software is employed to design and produce custom DNA from these base chemicals.

Advanced forms of 3-D printers replace ink and metals with ACGT.  Biological species never before seen on our planet have been produced both in laboratories and in competitions held among students on college campuses.  Likewise, entire animal and human organs and other body parts have been printed.  And, hybrids.  Humanoids, already grown and stored in stasis.

CERN is linked via the ESnet5 fiber-optic network to over 160 laboratories and other synchrotron and linear particle accelerators, such as those at U.C. Berkeley and Stanford in California.  These are what I refer to as proof-of-concept labs, forwarding their results on for incorporation into the multi-layered agendas at CERN.

Building upon ESnet5, whose primary function is the distribution of data generated by the seven major particle detectors within the LHC itself, CERN announced the Helix Nebula in April 2015.  Through the European Commission, the European Open Science Cloud is a service to the public European Research Area (ERA): a federated cloud-services marketplace.  This open science cloud is a high priority in the European Commission’s strategy for a single digital marketplace.  It and its fiber-optic predecessor, ESnet5, serve as the infrastructure for the singular worldwide economic system of our near future.

With the advent of digitized biology, sequenced DNA databases are likewise transmitted worldwide.  Within days of an outbreak of a virulent strain of bacteria, a synthetic vaccine can be produced virtually in situ using chip-scale labs.  These same labs can be shipped to another planet, such as Mars; not only for the production of nutritional substances supplying the needs of human explorers, but for producing the humans themselves at more distant locations.

Which brings us back to one of the primary objectives as cited by representatives of CERN.  The opening of an interdimensional portal.  And, not so public is their personal search for human longevity.  The question often arises amongst those observing the activities at CERN: “Why?  Why a portal?”

The answer lies within their multi-layered agendas: immortality.  A goal of humankind extending down through the ages, and carried forward by secret societies bent upon achieving it.  For this is an occult-driven set of agendas.

The use of the LHC in the opening of an interdimensional portal itself appears ludicrous to the average lay person first hearing of such an idea.  Perhaps so.  Even more, the idea of beings, even demonic entities moving from another dimension to that of our own borders upon insanity at best.

And yet, connecting the data of ancient occult practices to that of synthetic DNA and the largest, most powerful machine ever built in the recorded history of humankind gives one reason to pause and consider: what is actually taking place behind closed doors within CERN?  For the LHC itself is positioned at a powerful electromagnetic nexus of ley lines, atop the ancient Roman site of a temple built for the worship of the sun god, Apollo.  In Hebrew, the term is Abaddon, referencing a bottomless pit, the realm of the dead.  In the Book of Revelation, Abaddon is the king of an army of locusts; in the Greek, his equivalent is Apollyon, “the destroyer”.  Additionally, it should be noted that a statue resides at the offices of CERN: the Hindu deity Shiva, “the Destroyer”.  Conspicuously absent are the other two members of the Trimurti: Brahma, “the Creator”, and Vishnu, “the Preserver”.

It may appear as speculation and hyperbole to reach the conclusion that their purpose is to open a gateway: one allowing demonic entities, often referred to as “spirits” or “spiritual entities”, to move into our realm of existence from their own.

“Spirits” perhaps may further be defined and described as “digitized biology”: organisms replicated by software, imitating life in silico via an artificially intelligent, 600-cell nanodiamond tetrahedral, fourth-dimension quantum computer connected to other worlds.

For now is a good time to consult Revelation 9:1-12.  The Fifth Trumpet:  The First Woe.

  • A “star” falls from the sky (9:1).
  • This “star” is given “the key to the bottomless pit” (9:1).
  • The “star” then opens the bottomless pit.  When this happens, “smoke [rises] from [the Abyss] like smoke from a gigantic furnace.  The sun and sky [are] darkened by the smoke from the Abyss” (9:2).
  • From out of the smoke, locusts who are “given power like that of scorpions of the earth” (9:3), who are commanded not to harm anyone or anything except for people who were not given the “seal of God” on their foreheads (from Chapter 7) (9:4).
  • The “locusts” are described as having a human appearance (faces and hair) but with lion’s teeth, and wearing “breastplates of iron”; the sound of their wings resembles “the thundering of many horses and chariots rushing into battle” (9:7-9).

Such is the hubris of mankind.

Anthony Patch

www.anthonypatch.com     [email protected]

As heard on The Hagmann Report