Monday, September 8th, 2008
University of California, San Diego School of Medicine scientist Professor Jamey Marth suggests that the basic components of the cell [Nucleic Acids, Proteins, Lipids and Glycans] are built from essentially 68 molecules. To the commonly accepted 8 nucleosides of nucleic acids and the 20 amino acids are added 8 lipid and 32 glycan molecules. To the Genome and the Proteome we can add the Glycome and the Lipidome if we are to obtain a more complete accounting of the cellular processes that control the origin and development of living systems.
Professor Jamey D. Marth, 2008, A unified vision of the building blocks of life. Nature Cell Biology, Vol 10(9):1015.
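The arithmetic is easy to check. Here is a minimal tally sketch; the counts follow the article, but the grouping labels are my own shorthand, not Marth's exact nomenclature:

```python
# Tally of Marth's proposed molecular building blocks of the cell.
# Counts are from the article; the labels are illustrative shorthand.
building_blocks = {
    "nucleosides (nucleic acids)": 8,   # 4 for DNA + 4 for RNA
    "amino acids (proteins)":      20,
    "lipids (lipidome)":            8,
    "glycans (glycome)":           32,
}

total = sum(building_blocks.values())
for component, count in building_blocks.items():
    print(f"{component}: {count}")
print(f"total building blocks: {total}")  # 68
```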
Thursday, October 2nd, 2008
COGNITIVE EXPANSION TECHNOLOGIES
by W. S. BAINBRIDGE.
Journal of Evolution and Technology, 19[1]:8-16, 2008.
——————————————————————–
Review by Professor George F. Hart, LSU.
Recommended reading.
——————————————————————–
Professor Bainbridge’s theme is that the human mind is being transformed as individuals become more intertwined with electronic technologies that perceive, process and present information. This process will continue with advances in W3 technologies that use human criteria and reasoning in their search methodologies, following the concept of W3 as an extended brain [memory + reasoning] that individuals can draw upon. Professor Bainbridge, however, sees beyond this stage to one in which computers incorporate the personality traits of a user.
From my viewpoint, the excitement of this article is that Professor Bainbridge outlines one way whereby we may eventually be able to identify what I have called the ‘humanity trait(s)’; and help to answer the question I posed, “What of humanity do we want to incorporate into our robotic descendants?” [Hart, 2008]. The kind of development he envisages and documents is a ‘bottom up’ approach to training computers about the human mind. The cleverness of this approach lies in its sound grounding in the Theory of Evolution: the individuals within the cultural gamodeme will generate the important criteria that dominate the system.
Professor Bainbridge outlines an approach that I find strikingly simple but with potentially profound consequences for the evolution of robotic intelligence. His novel suggestion is that an individual personality may be captured by having one person answer many questions set by many other individuals. This could allow an intelligent computer to derive associations that provide a deeper insight into human reasoning. I wonder now if my own estimate of 300 years to develop a robot that has a manufactured consciousness and incorporates the ‘humanity trait(s)’ is too long. Although I think Ray Kurzweil’s estimate is too short, perhaps this century will see Robotico earthensis (Hart, 2008) evolve.
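As a thought experiment only, the questionnaire-based capture Bainbridge envisages could be sketched as a store of question-answer pairs mined for simple associations. Everything below (the class, the topic labels, the crude agreement measure) is my own illustrative assumption, not his implementation:

```python
from collections import defaultdict

# Hypothetical sketch of questionnaire-based personality capture:
# one subject answers many questions posed by many other individuals,
# and the archive is mined for simple associations between topics.
class PersonalityArchive:
    def __init__(self, subject):
        self.subject = subject
        self.answers = {}                  # (asker, question) -> answer
        self.by_topic = defaultdict(list)  # topic -> list of answers

    def record(self, asker, topic, question, answer):
        self.answers[(asker, question)] = answer
        self.by_topic[topic].append(answer)

    def topic_profile(self, topic):
        """Crude 'trait' estimate: fraction of affirmative answers on a topic."""
        answers = self.by_topic[topic]
        if not answers:
            return None
        return sum(1 for a in answers if a) / len(answers)

archive = PersonalityArchive("subject-001")
archive.record("asker-A", "risk", "Would you invest in a start-up?", True)
archive.record("asker-B", "risk", "Do you enjoy rock climbing?", False)
print(archive.topic_profile("risk"))  # 0.5
```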
————————————————————-
Hart, G. F., 2008. Evolution and the Future of Humanity, Homo sapiens’ galactic future. eBook edition. ScienceAnd Publications, Boulder, Colorado. ISBN-13 978-0-9818642-0-4.
Friday, October 3rd, 2008
by R. CAMPA.
Journal of Evolution and Technology, 19[1]:28-34, 2008.
——————————————————————–
Review by Professor George F. Hart, LSU.
——————————————————————–
Having lived and worked in Britain, the former Soviet Union, South Africa, the USA and India, I am sure that the way in which we view pure science depends upon our cultural gamodeme. Dr. Campa’s view that “Since the industrial revolution, humans have tended to reduce science to the ancillary role of (the) engine of technology”, whilst true for certain cultures, especially those that came under the influence of Sovietism, is not a global attribute of scientists. The former Soviet Union trained skilled specialists to a very high level in narrow and specialized fields and, indeed, looked upon science as the tool of technology. Britain provided a broad but still specialized education for its highly skilled scientists. The USA used a broader model still [see Hart, 2008]. My current view is that science is the way in which we understand reality, and technology is essentially a set of ‘trades’ that utilize science. In no way, however, is science subservient to technology.
Dr Campa’s article does not break any new ground, but its usefulness lies in its comments on some of the critical popular literature of the past few decades [this includes the usual suspects: Chomsky, Dennett, Dyson, Fukuyama, Horgan, Kuhn, Minsky and Moravec]. In this regard it offers a starting point for those interested in delving deeper into transhumanism.
————————————————————-
Hart, G. F., 2008. Evolution and the Future of Humanity, Homo sapiens’ galactic future. eBook edition. ScienceAnd Publications, Boulder, Colorado. ISBN-13 978-0-9818642-0-4.
Reference link: www.ScienceAnd.com.
Tuesday, October 28th, 2008
The WWF* 2006 report “confirms that we are using the planet’s resources faster than they can be renewed”, with humankind’s ecological footprint having more than tripled in 45 years – so much so that Earth cannot regenerate its resources quickly enough to avoid a constant deterioration. Moreover, the Living Planet Index “shows a rapid and continuing loss of biodiversity – populations of vertebrate species have declined by about one third since 1970”.
The substance of the report should be frightening, but the authors – like so many others who have a political agenda – fail to point their finger clearly at the root cause of the problem: an increase in population selection pressure. It is population numbers, and their critical index of population density, that are the cause of the stress being placed upon the Earth System. The WWF statement that “The biggest contributor to our footprint is the way in which we generate and use energy” is simply a dumb statement with strong political overtones, suggesting an organization that has set its eyes on increased funding rather than on solving the problem of Earth’s deterioration. Yes, “our reliance on fossil fuels to meet our energy needs continues to grow and that climate-changing emissions now make up 48 per cent – almost half – of our global footprint”; and, yes, “the challenge of reducing our footprint goes to the very heart of our current models for economic development”. But the prime selection pressure on the Earth System is, and has been for 150 years, humankind’s prodigious growth and expansion, which increasingly consumes Earth’s resources. The rate of consumption is a function of population selection pressure – primarily population density. As I pointed out in “Evolution and the Future of Humanity” [Hart, 2008], both global corporations and religious institutions have an interest in increasing global population. The politicians of the Institutionalized Liberal Democracies have a vested interest in keeping in the good graces of both corporate and religious leaders and will do nothing that hints of population culling – more people mean more consumers and more competition for work [i.e. lower wages can be paid]; and more Roman Catholic [or Hindu, or Muslim] children mean a larger flock to be fleeced.
The WWF, like so many other environment-related groups, seems scared of pointing to the real culprit, because it would then have to address the question of reducing humankind’s numbers. Again, as I pointed out earlier [Hart, 2008], this must address the issue of “who shall live and who shall die” and a morass of ethical principles: euthanasia, eugenics, and restrictions on breeding. These are the issues that must be placed before the public, because population reduction is critical – as I have said repeatedly for at least 20 years, the “global population is too large for a sustainable Earth System”.
What the WWF does, it does well – it works “with leading companies that are taking action to reduce the footprint – cutting carbon emission, and promoting sustainability in other sectors, from fisheries to forests.” This is admirable, but the environmental groups need to link the problems to the real cause: unrestricted reproduction by humankind. The meat of the study is good science. The Living Planet Index monitors the health of Earth’s ecosystems by globally tracking trends in biodiversity based upon 1,313 vertebrate species. In the 33-year period ending 2003 it fell 30%. The Ecological Footprint Index tracks the biosphere’s productivity in terms of the “area of biologically productive land and water needed to provide ecological resources and services – food, fibre, timber, land on which to build, and land to absorb carbon dioxide”. The amount of biologically productive area is termed the bio-capacity, and the report shows that since the late 1980s the Ecological Footprint Index has exceeded the Earth’s bio-capacity by about 25%, i.e. resources are being used up faster than they can be replaced. Both of these indices show overload of the Earth System. The fact is that by 2050 humankind’s resource utilization will be twice the amount the Earth System can sustain, and this will lead to ecosystem collapse.
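The arithmetic behind the two indices is simple. The sketch below recomputes the overshoot ratio and the compound annual rate of decline implied by the quoted LPI figures; the input numbers are from the report as cited above, the calculation is my own illustration:

```python
# Overshoot: the report puts the Ecological Footprint at roughly 25%
# above the Earth's bio-capacity since the late 1980s.
footprint_to_biocapacity = 1.25
print(f"resources consumed {footprint_to_biocapacity:.2f}x faster than renewed")

# Living Planet Index: a 30% fall over the 33 years ending 2003
# implies a compound annual decline of about 1.1%.
total_decline = 0.30
years = 33
annual_rate = 1 - (1 - total_decline) ** (1 / years)
print(f"implied annual LPI decline: {annual_rate:.2%}")  # ~1.07%
```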
The WWF recognizes that decreasing the human population is one part of the problem and families that choose to have fewer children should be supported. “Offering women access to better education, economic opportunities, and health care are three proven approaches to achieving this”. Reducing per capita consumption, increasing efficiency in the production of goods and improved land management are other offered solutions.
For anyone concerned about the Earth System and its future, the report is a necessary acquisition: like many, I now download such .pdf reports and archive them on DVD as part of my digital reference library. Please read the report, for it concerns the whole of humankind.
Get the report: http://assets.panda.org/downloads/living_planet_report_2008.pdf
George F. Hart, 291008
*The WWF’s stated mission is to stop the degradation of the planet’s natural environment and to build a future in which humans live in harmony with nature, by:
- conserving the world’s biological diversity
- ensuring that the use of renewable natural resources is sustainable
- promoting the reduction of pollution and wasteful consumption.
Wednesday, November 12th, 2008
The feature article in the November-December 2008 American Scientist alerts us to what many have suspected for a long time. “Genomics Confounds Gene Classification” by Seringhaus and Gerstein points to large-scale genomic studies that question the prevailing hypothesis in molecular biology: that genes are distinct parts of the DNA molecule that operate by producing an mRNA transcript, which translates into a polypeptide, which folds to form a protein that has a specific function within the organism. This one gene – one protein view is at the core of our current understanding of biological processes. Nevertheless, scientists have suspected that, whereas the idea is fundamental, it is too simplistic. As I noted in “Evolution and the Future of Humanity” [Hart, 2008]:
“Those parts of the DNA molecule that do code directly for proteins are termed exons, and those parts which do not are called introns. The intron regions of the chromosome molecule are often referred to as junk sequences. These sequences are probably important in controlling the development of traits in some way or another because chromosome duplication processes are far too precise to allow replication of useless materials.”
Seringhaus and Gerstein note that as “high-throughput genomics is generating data on thousands of gene products ….. biology’s basic unit, it is clear, is not nearly so uniform nor as discrete as once thought”. Although the basic concept of one gene – one protein still stands, there is a need for an enhanced taxonomy of genes that can improve our ability to classify and interpret the molecular products of the DNA – RNA genomes. Current analytical methods that simultaneously examine the relationship of millions of bases along the genome are showing that “creating an RNA transcript from a DNA region is more complex”, involving transcription of areas of the genome beyond the known boundaries of a specific gene, often including areas thought to be relic genes harbored in the introns. These introns were previously thought to be spliced out prior to protein production in the Eukaryotes. It is now seen that introns can be retained in the mature transcript and incorporated into the protein, and exons can be discarded – this complicates the work of the systematist and demands a new taxonomy to allow a more rigorous and comprehensive classification system.
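The complication can be made concrete with a toy data model, entirely my own illustration: a classical gene splices out all introns, whereas the newer observations allow a transcript that retains an intron or skips an exon, so one DNA region yields several distinct products:

```python
# Toy model: a gene as an ordered list of (kind, sequence) segments.
# Classical splicing keeps only the exons; the newer observations mean
# a mature transcript may retain an intron or discard an exon.
gene = [
    ("exon",   "ATGGCC"),
    ("intron", "GTAAGT"),
    ("exon",   "TTTCAG"),
]

def transcript(segments, keep):
    """Assemble a transcript from the segment indices in `keep`."""
    return "".join(seq for i, (kind, seq) in enumerate(segments) if i in keep)

classical       = transcript(gene, keep={0, 2})     # exons only
intron_retained = transcript(gene, keep={0, 1, 2})  # intron kept
exon_skipped    = transcript(gene, keep={2})        # first exon discarded

print(classical, intron_retained, exon_skipped)
# Three different products from one DNA region: the 'gene' is no
# longer a single, discrete unit of classification.
```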
The authors significantly note that our “understanding of gene regulation is also changing”. The classical idea that repressors, operators and promoters are located in close proximity to one another, as exemplified by the classic lac operon in bacteria, is again too simplistic: “in mammalian systems and other higher eukaryotes … genes can be regulated very far upstream by enhancers over 50,000 base pairs away, even beyond adjacent genes”. This is made possible by the folding of DNA. Moreover, we have known for a while that gene activity can be modulated by epigenetic effects such as the addition of methyl groups.
Genomics is at a developmental stage that many other natural sciences passed through. My own early interests were in the taxonomy and classification of microfossils, and in that field it was recognized, early on, that only with a rigorously enforced and standardized nomenclature integrated into a well thought out taxonomic framework could progress ensue. Seringhaus and Gerstein imply this is what is needed for gene classification if genomics is to progress. In Neontology and Palaeontology nomenclature is standardized through international codes, e.g. the International Code of Botanical Nomenclature, and the authors point out the need for such a code for genes. Nomenclatural standardization is necessary as a means of unambiguous communication, but an added value is that a standardized naming system, when viewed within a classification, is also a global knowledge holder about each object classified. The classification structure itself becomes a knowledge web that can be queried as a massive bio-database. Seringhaus and Gerstein use the semantic web of the internet, and the ability of Google to extract information, as an example of a rich classification scheme. However, I would urge caution about any direct approach in this direction, because the web as illuminated by Google will produce a classification in which systematic anarchy prevails. Only with a rigorously enforced and standardized nomenclature within a well thought out [multiple] taxonomic framework would a semantic web produce what is needed. Anyone who uses the web for non-trivial scientific research is aware that the opinion of a single professional is worth more than those of a thousand amateurs [anyone know who said that first?].
It is important to remember the following distinctions: “Taxonomy pertains to a system devised for dividing up things into different types and how they are arranged one to another. Classification pertains to an actual classification that is set up for a group of things. Systematics is the actual classification of individual things within a taxonomic framework.” Hart, 1996. In palaeontology the route to a more stable palaeo-species classification was based in the simple move from a morphospecies definition involving measurable traits to one in which evolutionary [temporal] acquisition of traits is important. This led to a significant advance in the inherent knowledge content of fossil classifications.
Now that we realize that the DNA sequence in a single region does not necessarily define a gene, we can incorporate the biochemical effect [developed by transcription] on the functional phenotype into gene classification. Moreover, the external and internal selection pressures operating during transcription need to be more fully understood and incorporated as well. Clearly, the phenotypic effect does not necessarily capture the function of the gene at the molecular level, and to understand the genotype we need to know how biochemical products affect the metabolic pathways and the resulting biological system. All of these aspects need to be incorporated into an improved gene classification.
The taxonomy for genes needs to be non-hierarchical, i.e. a multi-level taxonomy. The authors hint at a classification based on gene ontology that uses a directed acyclic graph [DAG] structure, within which multiple classifications exist, all pointing towards a single gene. What is interesting in the DAG approach is that the multiple classifications [each for its own purpose], each of which points to a single object, i.e. a gene, allow a web to be built that can be interpreted in semantic terms. It is a system that would lend itself to an AI approach to gene systematics. Recently, I have been looking at the SOAR programming language as a system for understanding complex relationships, like those within a genome or a cultural gamodeme, and perhaps this is the direction in which to develop a useful classification of genes. SOAR can allow a single gene to have multiple functions and multiple genes to have a single function within a semantic framework. This is an area I hope to explore in the future [as soon as I learn how to use SOAR correctly!]. Such a system could result in a greater understanding of evolution and relationships among living systems.
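A minimal sketch of such a DAG, my own illustration loosely in the style of the Gene Ontology (the term names and gene ID are invented): each gene may sit under several classification terms, each term may have several parents, and a query walks upward to collect every context in which a gene participates:

```python
# Minimal DAG sketch: terms with multiple parents, genes annotated
# to multiple terms. Term names and gene IDs are invented examples.
parents = {
    "kinase activity":     ["catalytic activity"],
    "signal transduction": ["cellular process"],
    "catalytic activity":  [],
    "cellular process":    [],
}
annotations = {
    "geneX": ["kinase activity", "signal transduction"],  # one gene, many roles
}

def ancestors(term, seen=None):
    """Collect every term reachable upward from `term` (DAG walk)."""
    seen = set() if seen is None else seen
    for parent in parents.get(term, []):
        if parent not in seen:
            seen.add(parent)
            ancestors(parent, seen)
    return seen

def full_context(gene):
    """All terms, direct or inherited, under which `gene` is classified."""
    terms = set(annotations[gene])
    for t in list(terms):
        terms |= ancestors(t)
    return terms

print(sorted(full_context("geneX")))
# ['catalytic activity', 'cellular process', 'kinase activity', 'signal transduction']
```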
References:
Hart, G. F., 1996. http://www.geol.lsu.edu/hart/NOTES/taxonomy.htm
Hart, G. F., 2008. Evolution and the Future of Humanity: Homo sapiens’ galactic future. ScienceAnd Publications, Boulder, Colorado. ISBN-13 978-0-9818642-0-4. www.ScienceAnd.com
Seringhaus, M. and Gerstein, M., 2008. Genomics Confounds Gene Classification. American Scientist, 96[6]:466-473.
George F. Hart.
Monday, November 10th, 2008.
Wednesday, November 12th, 2008
Dr. James Hughes of the Institute for Ethics and Emerging Technologies chats with George Hart about the eBook “Evolution and the Future of Humanity: Homo sapiens’ galactic future”. Use the following link:
http://www.archive.org/download/BioethicsLawEvolutionAndTheFuture/1109sc
The IEET site link is:
http://ieet.org/index.php/IEET