Thursday, December 10, 2015

The Eugenics Period in Research

Research Domain Criteria (RDoC) is a new classification of diseases—a nosology—championed by the U.S. National Institute of Mental Health (NIMH). It was especially promoted by the NIMH's then director, Thomas Insel. Insel has since migrated to Google Life Sciences, which has become a full-fledged member of Mountain View's Alphabet Inc. and taken on a new name: Verily, a for-profit health company.
RDoC's baptism coincided with the publication of the DSM-5 in 2013, and it heralds a radical diagnostic departure by relying exclusively on biomarkers—biological indicators. The implicit assumption is that behavioral/mental/clinical disorders are manifestations of biological/neurological disorders. Bad behavior is nothing more than shorted circuits in the physical system; finding the bad circuits will fix the problem. The explicit emphasis of RDoC is to “yield new and better targets for treatment.” [1] While demoting the importance of understanding the disease, it elevates the search for a cure. There are emerging criticisms of this new nosology [2] [3] [4], but what remains untold is how RDoC is gaining legitimacy.
RDoC's biological determinism gained traction because the public and scientists found it so easy to believe that Alzheimer's disease was determined by biomarkers. The history of Alzheimer's disease laid the foundation for a wave of biological determinism that has not been seen since the height of the eugenics movement, when the American Eugenics Society was founded in 1923. But this emphasis on biology is unfounded. There is no evidence that biology exclusively determines Alzheimer's disease or many other mental disorders. The illusion was made possible only by the acceptance of such an association—that Alzheimer's disease is purely a neurological disease.
Historically, only tenuous evidence separated Alzheimer's disease from senile (old-age) dementia. Alois Alzheimer's observation—shared by many of his contemporary researchers—was that the biomarkers were unique neither to Alzheimer's disease nor to younger people. But the plaques and tangles were elevated into a unique disease classification by Emil Kraepelin—Alzheimer's supervisor at the Munich clinic. From its inception, Alzheimer's disease was promoted as a unique disease because of a context that: 1) promoted biological psychiatry; 2) encouraged competition between the Munich and Prague laboratories; 3) endorsed the belief that genes and biology determine behavior—eugenics; and 4) reflected ageism, the idea that old age invariably results in diminished capacity but that a similar disease among young people is more noteworthy. These socio-political factors supported the legitimacy of accepting that the plaques and tangles were indicators of Alzheimer's disease—an association that remains unsupported to this day. RDoC's new wave of biological determinism follows from this illusory success in defining Alzheimer's disease as biological. Let's follow the story further.
The story of Alzheimer's disease took a new turn in the US with the inception of the National Institute on Aging (NIA). The first director of the NIA, Robert Butler—reflecting on the NIA's strategy of emphasizing neurobiological research—confessed that such political machinations reflect the “health politics of anguish.” Politics have been germane to the context of Alzheimer's disease from the beginning. The NIA adopted Alzheimer's disease as its war banner—a war to get enhanced funding from Congress—but for this war to be won, the NIA needed to create a push and a pull. The pull came from creating an epidemic, while the push came from massive public pressure on Congress.
To achieve this "push" the NIA co-opted the mandate of the Alzheimer's Association—originally the Alzheimer's Disease and Related Disorders Association—in order to focus on research. In turn, the "pull" was created by "creating" an epidemic, which pure Alzheimer's disease could not generate because it was diagnosed only among a small group of younger adults. So in 1975—in contradiction to Alzheimer, Fischer, Perusini, Bonfiglio, Kraepelin and Pick, who all believed in a dementing disease that afflicted younger adults—the classification of Alzheimer's disease was singlehandedly, and without consultation, modified to include senile dementia: "We should like to make the suggestion, simplistic as it may be, that we should drop the term 'senile dementia' and include these cases under the diagnosis of Alzheimer's disease." [5] Overnight, Alzheimer's disease subsumed the much larger group diagnosed with senile dementia. [6] Older adults became co-opted in a war for the "pull" of research dollars. The following year this new approach was accompanied by a more detailed study on prevalence [7].
Alzheimer's disease instantly became the sixth highest cause of death in the United States—an instant epidemic. With its name—Alzheimer's disease—came all the attributes of a real neurological disease, while abandoning senile dementia meant dumping the muffled reference to old age. Although this was a shrewd political move, it meant that the meaning of Alzheimer's disease expanded and broadened, becoming increasingly muddled and confused. Such lack of clarity was intentional.
By broadening its meaning, Alzheimer's disease instantaneously became the king of all dementias. That year, in 1976, Alzheimer's disease became the most common form of dementia, diagnosed in over 60% of all dementia cases—followed by vascular dementia, dementia with Lewy bodies, frontotemporal dementia, Korsakoff syndrome, Creutzfeldt-Jakob disease, and HIV-related cognitive impairment. The rarer forms, which occur in 5% of cases, include corticobasal degeneration, Huntington's disease, multiple sclerosis, Niemann-Pick disease type C, normal pressure hydrocephalus, Parkinson's disease, posterior cortical atrophy and progressive supranuclear palsy. Different types of dementias might have different and very specific causes. Arnold Pick saw dementia as “. . . a mosaic of localized partial dementias. . .” [8] Disregarding the specificity of dementias, Alzheimer's disease became the focus of the fight to cure all dementias. It attracted billions of research dollars and captivated an entourage of highly talented researchers dedicated to advancing biochemical and neurological science.
The story of how RDoC gained momentum continued in 2011, when the NIA and the Alzheimer's Association (AA) published the Alzheimer's disease guidelines. These guidelines created separate stages of the disease, from pre-clinical to Mild Cognitive Impairment (MCI) to early and advanced stages of dementia. Although promoted as research guidelines, no suggestions were offered to improve research methodology, identify anomalies, formalize and standardize instrumentation, define MCI, establish causality, develop hypotheses, generate theoretical predictions, discuss and assimilate alternative interpretations, summarize research updates or propose a road map for future research—all recommendations that would normally be expected in research guidelines.
Nevertheless, the guidelines promoted a powerful agenda—but it was a political rather than a research agenda. Despite the lack of evidence for this approach—inherent in the Amyloid Cascade hypothesis [9]—the NIA/AA guidelines effectively allowed the pharmaceutical industry to experiment on a clinical disease before it becomes clinical. To define a behavioral disease by ignoring behavior. To develop guidelines without providing any guidance. This has proved to be a surreal policy experiment. Despite mounting evidence that the Amyloid Cascade hypothesis is incapable of explaining even the most rudimentary anomalies, there continues to be a willful effort to maintain the status quo. The handing of the baton to RDoC will continue this status quo, ignoring accumulating anomalies that contradict biological determinism. But the simplicity of a cure inspires a yearning for a panacea, something which RDoC has explicitly embraced with vigor.
Many opportunities exist to address anomalies in research by broadening the study of Alzheimer’s disease to public health. A public health approach to dementias argues that this disease is not only a neurological or chemical disease but that it is also promoted, mediated and/or moderated by other biological, social and psychological conditions and factors. It is time to confront the encroachment of biological determinism in psychology and aim to navigate a way out of this RDoC diagnostic dead-end.
____________
[1] Insel, T. (2013). Director's Blog: Transforming Diagnosis. April 29, 2013. Accessed 12/8/2015: http://www.nimh.nih.gov/about/director/2013/transforming-diagnosis.shtml
[2] Nemeroff, C.B., Weinberger, D., Rutter, M., et al. (2013). DSM-5: a collection of psychiatrist views on the changes, controversies, and future directions. BMC Medicine, 11, 202.
[3] Peterson, B.S. (2015). Editorial: Research Domain Criteria (RDoC): a new psychiatric nosology whose time has not yet come. Journal of Child Psychology and Psychiatry, 56(7), 719-722.
[4] Weinberger, D.R., Glick, I.D., & Klein, D.F. (2015). Whither Research Domain Criteria (RDoC)?: The Good, the Bad, and the Ugly. JAMA Psychiatry, 1161-1162.
[5] Katzman, R., & Karasu, T. (1975). Differential diagnosis of dementia. In W. Fields (Ed.), Neurological and Sensory Disorders in the Elderly (pp. 103-134). New York: Grune and Stratton.
[6] Katzman, R. (1976). The prevalence and malignancy of Alzheimer disease: a major killer. Archives of Neurology, 33(4), 217-218.
[7] Lijtmaer, H., Fuld, P.A., & Katzman, R. (1976). Prevalence and malignancy of Alzheimer disease. Archives of Neurology, 33(4), 304.
[8] Tilney, F. (Ed.). (1919). Neurological Bulletin: Clinical Studies of Nervous and Mental Diseases in the Neurological Department of Columbia University (Vol. 2). Paul B. Hoeber.
[9] Hardy, J.A., & Higgins, G.A. (1992). Alzheimer's disease: the amyloid cascade hypothesis. Science. 256(5054):184-5.
© USA Copyrighted 2015 Mario D. Garrett
Excerpt from Garrett M. (2015) Politics of Anguish. Createspace.  

Monday, December 7, 2015

The Politics of Anguish: How Alzheimer’s disease became the malady of the 21st century.

In 2015, neurological research funding surpassed cancer research at the National Institutes of Health (NIH). It was with great foresight that the National Institute on Aging (NIA) managed to champion a disease that was, for all intents and purposes, a neurological disease—Alzheimer's disease. The NIA and Alzheimer's disease have had a symbiotic relationship since the NIA's conception in 1974. This emphasis meant that the NIH/NIA had to diminish the role of social factors in Alzheimer's disease research. But how effective has this approach been? The final judgment needs to be based on outcomes, and the NIH/NIA outcomes are starkly devoid of substance, both in theoretical development and in practical applications. We still do not know the exact role of the plaques and tangles in the brain, and what knowledge we have we still cannot apply to alleviate the symptoms, let alone cure the disease. After a century of false hopes, it is time to re-evaluate our approach. The constant search for a cure is becoming a worthless meme. Perhaps we can learn something from cancer research.
Cancer research continues to evolve, but one lesson learned is that cancer is not simple and no one drug will cure all cancers. We need a similarly nuanced understanding of dementias. Why such a simple understanding is not embraced might have something to do with the politics of how research funds are managed. In Alzheimer's research there is a hierarchy, a cabal, a virtual club whose members receive most of the federal research funds and who determine the agenda. It is a powerful club that determines the direction of research, how to frame the disease, how to define it for the public and what is prioritized. But the direction this inner sanctum charted has resulted in a research cul-de-sac. For more than a hundred years we have been encouraged to foster a false hope of a pharmaceutical product, a drug, which will cure Alzheimer's disease. This has not happened and this will never happen. And the reason this can be said with such apparent gusto is that we still do not know what we are trying to cure. The construct we now call Alzheimer's disease is so broad that any intervention that shows any diffuse outcome will be heralded as a cure. But despite these advertisements, the disease remains elusive. Numerous researchers have pointed out anomalies in research, stressing that the direction we are taking is incomplete (Ballenger, 2006).
Sixty years ago David Rothschild highlighted anomalies that he optimistically anticipated would “…open(s) up many fields of study—for example, unfavorable hereditary or constitutional tendencies, and unfavorable personality characteristics or situational stress.” (Rothschild, 1953, p. 293) Unfortunately they did not. The science of Alzheimer's disease remains firmly rooted in biology and neurology, despite compelling evidence that this mechanistic approach is too simplistic and does not explain observations. Another physician predicted how future researchers might use the knowledge of plaques and tangles as “…a good playground…” (Perusini, 1911, p. 144). The historical context tells us that researchers today keep ignoring the complex facets of Alzheimer's disease and playing a game of causality—assuming that biological markers translate to behavior. And we are paying for these choices by being denied progress toward understanding the disease, a cure, or even alleviation of its symptoms.
Science is not a destination but a journey. It is purely a method of epistemology, of assimilating knowledge. It is not scientific “knowledge,” but knowledge that is gathered using “scientific methods.” All scientific knowledge is incomplete (or wrong), since science continues to generate more detailed questions, which demand better methodology and result in more complex and accurate findings. As a function of this process, science is based on reviewing all information, assimilating all observations in a model, and being able to predict outcomes. Despite all the science invested in studying Alzheimer's disease, there remain numerous anomalies. Why these anomalies remain unrecognized is due neither to ignorance nor incompetence, but to a political strategy: it is intentional. There is a way out of this research cul-de-sac, but we have to confront the truth that Alzheimer's research is politicized to the detriment of humanity.

Excerpt (edited for this blog) from the book: The Politics of Anguish: How Alzheimer’s disease became the malady of the 21st century. Mario Garrett. Createspace.

References
Ballenger, J. F. (2006). Self, senility, and Alzheimer's disease in modern America: A history. JHU Press.
Rothschild, D. (1953). Senile psychoses and psychoses with cerebral arteriosclerosis (pp. 289-331). In O.J. Kaplan (Ed.), Mental Disorders in Later Life (2nd ed.), Chapter XI.


© USA Copyrighted 2015 Mario D. Garrett

Monday, November 23, 2015

How Would Real Capitalism Work?

The derivatives market is twenty times the total world economy. At $1.2 quadrillion, the derivatives market is undefinable, complex, unregulated, and highly profitable. Derivatives are a complex set of investments based on future events. For example, if I bet that someone who has AIDS is likely to die, I can buy their life insurance policy early and make a profit: that is a derivative transaction. But more importantly, if I take action to make sure that that person does not live longer (by restricting life-extending medications, or restricting access to information that might delay their demise), then such action would increase the value of my derivatives. Then there are those in a "positive feedback loop" who simply follow the market as it goes down, selling a proportion of their stock, hopefully countered by those in a "negative feedback loop" who buy when the market is low and sell when it is high.
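These two feedback loops can be illustrated with a toy price simulation. This is only a sketch, not anything from the original post: the trader counts, coefficients, and the initial shock are all invented numbers chosen to make the dynamic visible.

```python
# Toy sketch of market feedback loops. All numbers are invented for
# illustration; this is not a model of any real market.

def simulate(neg_traders: int, pos_traders: int, steps: int = 50) -> list:
    """Evolve a price after a small downward shock.

    pos_traders follow momentum (positive feedback: sell as the price falls),
    neg_traders trade against deviations from fair value (negative feedback).
    """
    fair_value = 100.0
    prices = [100.0, 98.0]  # an initial downward shock of 2
    for _ in range(steps):
        momentum = prices[-1] - prices[-2]                    # last price move
        trend_pressure = pos_traders * 0.01 * momentum        # amplifies the move
        value_pressure = neg_traders * 0.01 * (fair_value - prices[-1])  # pulls back
        prices.append(prices[-1] + trend_pressure + value_pressure)
    return prices

only_momentum = simulate(neg_traders=0, pos_traders=5)  # the dip compounds
with_value = simulate(neg_traders=5, pos_traders=5)     # drifts back toward 100
```

With only positive-feedback traders the initial dip deepens and the price settles below fair value; adding negative-feedback traders pulls it back toward 100, which is the countering effect described above.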

Saturday, November 21, 2015

We are Becoming Gods

At one of the seminars at the University of Melbourne, students were discussing prostitution (which is legal in Melbourne), and I wondered why we find certain activities distasteful: defecating, getting drunk, binge eating, spitting, drooling, burping, public sex, masturbation, dying, giving birth, nursing a baby, crying, farting, being needy…and a pattern started to emerge. These behaviors are natural—even mechanical—aspects of our biological being, and the only reason we find them distasteful is that there must be an embedded ideal standard to which we aspire.
In our mind we have a model of the world. The sole purpose of having such a complex brain is to represent the world in its entirety, as far as it affects us. Every day we adjust this model to bring it closer to reality. This is an unachievable objective, since reality is ephemeral, but we make reality conform to patterns that help us predict it. Both our dreams and our waking emotions signal a need to modify and adjust this view of the world. This cognitive representation remains mostly unconscious. Our brain interacts unconsciously first, and only grants us consciousness when it requires our full attention to address some complex event. This is the internal world that is driving us to think like gods. This unconscious model of the world furnishes us with a feeling of mastery and control because we can predict and effect change. But this feeling of mastery is an illusion, and it is this illusion that is growing.
Rousseau wrote “Malo periculosam libertatem quam quietum servitium” (“I prefer dangerous liberty to quiet servitude”) and observed that if gods were people, they would govern themselves democratically. In our cognitive model the world is predictable and just (Lerner, 1980). Despite an onslaught of daily news informing us otherwise, we still believe in a just world. We continue to be surprised by disasters and catastrophes, thinking that they are exceptions. They are not. They are exceptions only in our model of the world—the one that we cultivate in our head, the virtual box—because there everything is in harmony, everything is balanced and just. We continue to aspire to a world where we can "cure" death, "regain" our youth, "fight" terrorism, "save" humanity...these are illogical and delusional aims only if you are NOT a god. If we aspire to behave like—or think that we are—gods, then these aspirations are attainable. These aspirations confer a delusional sense of control over our world.
The Emergence of Individuality
The idea that we are somewhat godlike requires the belief that we are unique individuals. Jean Twenge and Keith Campbell's exploration of the narcissism epidemic (Twenge & Campbell, 2009) documents an alarming rise of narcissism at every level of our society. With the pampering of social media promoting a world filled with egos, our individualistic selves are thriving. We only have to look at the world economy, where such individualistic arrogance is promoted by banks despite resulting in risky, unrealistic investments: a financial system that enshrines the motto that it is "too big to fail." These are godlike structures, resilient to reality and to change, that rest above the law and, it seems, above basic economics. And such arrogance comes across from the individuals running these structures. Among the oligarchs—the very rich—there has always been godlike arrogance. Traditionally oligarchs were philanthropic; they wanted to change the world for the better. But it is their view of “better” that is creating inequity. Not only are we turning into a world—not just a society—run by oligarchs, but we are seeing these inequities transferred across generations. The oligarchs have discovered immortality: they are transferring their wealth down successive generations, and their individualism will live into eternity through the management of that wealth. These are not delusions of grandeur, since these individuals have the power to effect great social changes. Oligarchs have always been with us, although perhaps not to such an extent as today. What is uniquely transformative is the emerging belief that anyone can have this power. This form of individualism has infected the general public. And it is this belief—that anyone can become an oligarch—that allows us to sell short our collective futures.

Individualism should not be confused with individuality. Individuality is a psychological concept of a self that is separate and distinct from others. Individualism is a historical interpretation of the self as the center of all interest, and a belief that individual achievement rather than community or societal progress is the ultimate development: that the moral and intellectual imperative resides with the individual and not the community, something that is now shared with corporations (cf. Lukes, 1990). Individualism is a deformity of individuality. There is a certain form, a shape, that individuality takes that fits within society. But through individualism this shape becomes malformed and no longer fits within the bigger puzzle of society. It stands out as separate and, in some cases, an antithesis to the social setting it resides in. Individualism is predicated on the expectation that wellbeing and life satisfaction are achieved only through one's personal goals rather than through community achievements (Diener and Diener, 1995). An offshoot of this is that we base our predictions—judgment, reasoning and causal inference—on the person (or people) rather than the situation or social context (Morris and Peng, 1994). We judge others, and not the context that we find them in. Terrorists are deranged people rather than sane people in deranged contexts. Our godlike behavior attributes control of behavior to internal thought rather than social contexts.
There is evidence that in pre-history—before written records—humans were aware of their individuality but without individualism. Historically our personality was shared with the community we lived in. Bell provides numerous examples where “there is wide agreement among anthropologists, evolutionists and cognitive specialists that early humans had little or no awareness of themselves as independent personalities, but instead felt themselves to be parts of the group (collective) to which they belonged.” (Bell, 2010) Remnants of this common sharing are still seen in marriage rituals, where the marriage is seen as uniting two families as much as a coupling of two individuals.
The growth of individualism has been identified in psychiatry through the work on personality. Personality is a hypothetical entity that cannot be observed or studied other than within interpersonal situations. There is no “I” in personality unless there is an interaction with others; the “I” without a social interpersonal context does not exist. In Social Identity Theory, 'individuals define themselves in terms of their social group memberships and that group-defined self-perception produces psychologically distinctive effects in social behaviour' (Turner, 1982). This socialization is what makes us distinct. As early as the 1950s, Harry Sullivan in “The Illusion of Personal Individuality” argued that “…human beings are human animals that have been filled with culture—socialized…” (p. 323). His personality theory is based on relationships rather than internal psychodynamics (such as the theory proposed by Sigmund Freud). Culture is how we define ourselves as individuals—different cultures promote different versions of individuality—and this is achieved not by defining an individual but by defining an “ideal” individual through a global acceptance of individualism: an emphasis on personal aspects such as personal goals, personal uniqueness and personal control, while marginalizing social aspects such as community, family and civics. The only way that individualism can grow is by developing these unique qualities for the self through abstract traits (Baumeister, 1998). There are no examples in reality that reflect individualism—we have to create them ourselves through our construction of gods. It is this abstract nature of individualism, expressed through our construction of gods, which is driving the narcissism epidemic. But these are not just abstract ideas; they are ideas embedded in our way of thinking.
There is no clear historical demarcation of when individualism gained a significant footing in our personality. The historians Jacob Burckhardt and Jules Michelet discuss how the growth of individualism can be seen around the dawn of the Renaissance (Skidmore, 1996), and we can see how social context promotes individualism. But the first endorsement of individualism as a positive attribute came from Thomas Hobbes. Hobbes's first law of nature states that man has the right to do whatever it takes to get what he wants, even if it means harming others. The only compromise comes through Hobbes's second law of nature, which states that by consensus people can give up some rights (of their individualism) to live peacefully in a society, without conflict. Ayn Rand takes this form of narcissism further with her radical and dysfunctional interpretation that individuals should not compromise. Individualism is enshrined as an ideal despite the harm to society and to the community; the individual trumps all other causes. Both individuality and the malformed extreme that we see in individualism are social constructions. They are both illusions, since they exist relative to their social context. The argument against this self-centered growth toward individualism comes from a very unique place: biology.
Against Individuality : Superorganisms
We talk about biological determinism as a negative philosophy whereby biology overrides any other influence, especially in how we behave. But biologists themselves are eroding this biological imperative by conducting some amazing science. It was the sociologist Emile Durkheim who proposed that humans are “homo duplex,” leading double existences. According to Durkheim, one existence is rooted in biology and one in a social world. This interpretation held amazing foresight for its time. It is an important distinction because while our social self (morally, intellectually, spiritually superior) is moving toward a more narcissistic form of individualism, biology is moving in the opposite direction, showing how biologically diffuse we all are.
We are finding that the more we look at our body the more we see that we are made up of collective external organisms.  Our bodies and our brain are not an exclusive entity—we have parts of other organisms and other people within us. In addition to genes that we inherit (in most cases, but not always) from both our parents, there are viruses, bacteria and potentially, other human cells within our body. Even our genes and brain are not deterministic and are influenced by external events.
Alien Cells in our Body
With 37 trillion cells in our body, Berg (1996) estimates that there are 10 times more bacterial cells in your body than human cells. Although bacteria are smaller and lighter than human cells—weighing only 1-3% of our body weight—the 500-1,000 species of bacteria that inhabit our body have evolved with us for millions of years. Such mutual evolution is found in our mitochondria, "the powerhouse of the cell," because they generate most of the cell's supply of chemical energy. In addition they are used for signaling, cellular differentiation and cell death, as well as maintaining control of the cell cycle and cell growth. The number of mitochondria in our cells varies, with liver cells having more than 2,000 mitochondria each. Without mitochondria we would not survive, since they generate the energy the cell needs to function. It is humbling to learn that these cells, such an integral part of our existence, have their own genetic code and replicate independently of the rest of our cells. The reason is that mitochondria are a form of bacteria that were absorbed into our cells and now form a symbiotic—endosymbiotic—relationship with human cells. In some cases, however, the bacteria stay on as independent contractors.
As independent contractors, bacteria reside all over our body—inside and out—but they have a special place in the human gut. Here in the dark recesses of our plumbing reside trillions of microorganisms engaged in fermenting, killing off other harmful bacteria and viruses, enhancing our immune system and producing vitamins and hormones. This bacterial activity is so necessary to the body that it functions as an independent, virtual "forgotten" organ. Gut bacteria help extract energy and nutrients from our food. This sharing of benefits shows in experiments where bacteria-free rodents have to consume nearly a third more calories than normal rodents to maintain their body weight. Such a symbiotic relationship has direct implications for older adults.
In 2012 Marcus Claesson and Ian Jeffery from University College Cork in Ireland and their colleagues found that institutionalized older adults have different bacteria in their gut than community-dwelling older adults and younger people. And they related this change—caused by a restricted diet—to becoming physically weaker and to increased mortality. That an alien microorganism can have such dramatic life-enhancing properties is startling. But this revelation was overshadowed in December 2014, when Martin Blaser from New York University and Glenn Webb from Vanderbilt University in Nashville, Tennessee, tried to explain how bacteria directly kill older adults. They argue that modern medical problems, such as inflammation-induced early cancer, resistance to infectious diseases and degenerative diseases, are responses to bacterial change as we get older. Bacteria that live with us have learned to kill us off in old age. Using mathematical models, the authors show that these bacteria evolved because they contributed to the stability of early human populations: an evolutionary process that enhanced the survivability of younger adults while increasing the vulnerability of older adults. In our modern world this bacterial legacy is a burden on human longevity. But bacteria are not just passive guests. Sometimes they can call for delivery.
Gut microbes can produce neurotransmitters that alter your mood and may even control your appetite, causing you to crave food the bacteria enjoy but which might be detrimental to your overall health. Such risky behavior, in some cases, causes an earlier death. An infection with the parasite Toxoplasma gondii, for example, makes rats attracted to cats. Since the parasite can reproduce only in cats (its definitive host), it makes rats lethargic around cats, improving the chances of the rat being caught and thus the parasite's chances of infecting the cat and reproducing. In humans the same microbe increases the chance of suffering from schizophrenia or suicidal depression.
Bacteria are not the only alien organisms in our bodies. While we are being incubated as a fetus, cells pass between twins or triplets, and sometimes from previous siblings that occupied the womb. Around 8% of non-identical twins and 21% of triplets, for example, have not one but two blood groups: one produced by their own cells, and one absorbed from their twin. There are even examples (anecdotal, ABC News, 2014) where a mother passed on her twin sister's genes, and not her own, to her children; her eggs carried different genes from the rest of her body.
Alternatively, cells from an older sibling might stay around the mother's body, only to find their way into your body after you are conceived. Lee Nelson from the University of Washington is examining whether cells from the mother may be implanted in the baby's brain, and the other way around, where a baby's genetic material finds its way into the mother's brain. Nelson took slices of women's brain tissue and screened their genomes for signs of the Y-chromosome. Around 63% of the mothers had male Y-chromosome cells in multiple brain regions. The authors cite a correlational observation that these alien cells seemed to decrease the chances that the mother would subsequently develop Alzheimer's, though exactly why remains a mystery.
Our body is home to a universe of external components. Not only is our body permeable to outside organisms; our brain is similarly influenced by external events, both in how it functions and in how it behaves.
Mirror Neurons
We have specialized areas in our brain that "mirror" our environment. In the early 1990s, Giacomo Rizzolatti and his colleagues at the University of Parma first observed mirror neurons in monkeys. Mirror neurons appear to exist in many animals; in humans they have been observed in multiple areas of the brain, with as many as 10 percent of neural cells estimated to be devoted to mirroring. A mirror neuron fires both when a person acts and when that person observes the same action performed by another. Such mirror neurons respond directly to what is observed outside: our brain responds to, and mimics, the activity behind another person's behavior. Oberman and Ramachandran (2009) believe that the existence of mirror neurons explains the development of self-awareness and reflection, because humans can have "meta-representations of our own earlier brain processes" (Ramachandran, 2009). The individual is looking more diffuse and more dependent on its immediate environment. Even our genetic material is more likely to be influenced by our environment than we had previously thought.
Epigenetics


Living in poor and dangerous neighborhoods has a direct effect on our hormones and stress chemicals, such as interleukin 6, which acts as both a pro-inflammatory cytokine and an anti-inflammatory myokine and indicates bodily stress. A stressful environment, such as a poor neighborhood, results in negative changes in the blood chemistry of older adults, regardless of other factors. And these chemicals initiate changes in the body that are longer lasting, because they switch the expression of some genes on and off. These epigenetic switches ("epi" meaning above the genes) can be turned on and off to help establish and maintain a consistent, optimum chemical balance within the body. Environmental factors such as mercury in water, second-hand smoke, diet (including folate), pharmaceuticals, pesticides, air pollutants, industrial chemicals, heavy metals, hormones in water, nutrition, and behavior have been shown to affect epigenetics. Furthermore, epigenetic changes are associated with specific outcomes such as cancer, diabetes, obesity, infertility, respiratory diseases, allergies, and neurodegenerative disorders such as Parkinson's and Alzheimer's diseases. Our body changes its epigenetic settings, establishing an optimum chemical balance in response to our environment, and this in turn can influence our overall health.
This accumulating evidence suggests that the body is a meeting place, a venue of interaction with the outside world: the geography, the community, and significant others. Accepting that there is not just a "me" inside us but also a "we" allows a more precise understanding of how the environment, community, family, and friends can determine our behavior and outcomes. My individuality is no longer solely about me but about my upbringing, my community, and the people around me. Eroding the exclusivity of the individual exposes the deception of elevating individualism as an ideal state.
Societal Implications
One reaction to the rise of individualism has been monasticism: living in a closed community with people who share similar beliefs. Such an experiment was started by Epicurus and later evolved into the monastic life represented today both in religious communities, such as those of monks and nuns, and in social groups such as the kibbutz, some "houses" in universities and, the largest form of monastic living, prisons. While society is moving towards a generation of younger adults who believe that individualism will bring them happiness, at the same time we are seeing groups of people being treated as less than human. Hobbes's right of nature, that man may do whatever it takes to get what he wants, might not require a compromise if only another group of people gives up its rights. While the winning group thinks like gods, another group is made to take responsibility for all the negative events that happen.
Hofstede (2001) observed that poorer countries are more likely to be collectivist whereas wealthy countries tend to be individualist. Dimensions of individualism and collectivism seem to be affected by economic factors such as wealth or poverty. Not only are there rich, individualist countries and poor, collectivist ones, but each society is itself becoming more divided. There are people behaving like gods and there are people treated as less than human. This is what the Roman historian Sallust (Gaius Sallustius Crispus, 86–35 BC) identified: "We have public poverty and private opulence." We have yet again reached a time in history where one group of people lives in poverty and a smaller group in private opulence, behaving and thinking as if they were gods.
Émile Durkheim argued that there would be a conflict between the biological and the social aspects of homo duplex, but he could not have predicted that it is our biology that makes us more collective. There might be a separation of homo duplex, where one group becomes more godlike and another falls from heaven. There must be a story there somewhere.

References
ABC NEWS (2014). She’s her own Twin.  Accessed 10/12/2015
http://abcnews.go.com/Primetime/shes-twin/story?id=2315693
Baumeister, R. F. (1998). The self. In D. T. Gilbert, S. T. Fiske, & G. Lindzey (Eds.), Handbook of social psychology (4th ed., pp. 680-740). New York: McGraw-Hill.
Berg, R. (1996). "The indigenous gastrointestinal microflora". Trends in Microbiology 4 (11): 430–5. doi:10.1016/0966-842X(96)10057-3. PMID 8950812.
Bell M.G. (2010). Agent Human: Consciousness At The Service Of The Group. Kindle edition.
Bianconi, E., Piovesan, A., Facchin, F., Beraudi, A., Casadei, R., Frabetti, F., ... & Canaider, S. (2013). An estimation of the number of cells in the human body. Annals of human biology, 40(6), 463-471.
Hofstede, G. (2001). Culture’s consequences: Comparing values, behaviors, institutions and organizations across nations (2nd ed.). Beverly Hills, CA: Sage.
Lerner, M.J. (1980). The belief in a just world: A fundamental delusion. New York: Plenum Press.
Oberman, L. & Ramachandran, V.S. (2009). "Reflections on the Mirror Neuron System: Their Evolutionary Functions Beyond Motor Representation". In Pineda, J.A. Mirror Neuron Systems: The Role of Mirroring Processes in Social Cognition. Humana Press. pp. 39–62.
Ramachandran, V.S. (2009). "Self Awareness: The Last Frontier, Edge Foundation web essay". Retrieved July 26, 2011.
Skidmore, M. J. (1996). Renaissance to millennium: Ideological insights from creative works. The European Legacy, 1(4), 1628-1633.
Triandis, H. C., & Gelfand, M. J. (2012). A theory of individualism and collectivism. Handbook of theories of social psychology, 2, 498-520.
Turner, J. C. (1982). Towards a cognitive redefinition of the social group. In H. Tajfel (ed.), Social Identity and Intergroup Relations. Cambridge: Cambridge University Press.
Twenge, J. M., & Campbell, W. K. (2009). The narcissism epidemic: Living in the age of entitlement. Simon and Schuster.

© USA Copyrighted 2014 Mario D. Garrett

Saturday, October 31, 2015

Why Death is Important

It seems that humans cannot come to grips with death. Even when someone has died, we hold on to shreds of belief about their continued existence in realms that are independent of us. This vestige of residual existence is represented throughout all religions, to varying degrees of realism and ceremony. Our present clinical age has transformed death from a natural—but incomprehensible—cycle of life to one of clinical failure. Death is a medical embarrassment.
Of all disciplines, biology is perhaps at an advantage in accepting death not only as a natural process but as a necessary one. Leonard Hayflick, the renowned biologist and gerontologist, was perhaps the most succinct, saying (paraphrasing) that "death might be detrimental to the individual but necessary for the species." Biologists understand death because they look at species and how species develop. Higher turnover (death rate) means that a species is more adaptive; species following this strategy are known as r-selected, a term coined by the ecologists Robert MacArthur and E. O. Wilson (Pianka, 1970). The alternative biological strategy is to have fewer offspring but to invest more in their nurturing (as humans do); this strategy is referred to as K-selection. Biologists are so good at dealing with death that they categorize species on the basis of their death rate.
Such an important construct as death should have more relevance to us as humans. And it does, especially when we need to understand the foundation of our sense of being, as in metaphysics, the branch of philosophy interested in the first principle of things. Metaphysics asks radical questions about abstract concepts such as being, knowing, substance, cause, identity, time, and space. How can we understand that we are not just performing actors on the stage of life, following a genetic narrative, but participating directors as well? It seems that death--or rather the expectation of death--provides us with an urgency to live. When Simon Critchley compiled the thoughts about death of more than 190 philosophers, the central theme he identified was that death provides an urgency to live in the present. Philosophers use the concept of death to define the present as the only real aspect of the passage of time. The idea of death defines our idea of the reality of the present. But death has to be more than an idea. In the early twentieth century, Sigmund Freud was the first to cast death as a drive.
Thanatos--the hypothesized death drive that pulls us toward an inanimate state--was originally proposed by Sigmund Freud in 1920 in Beyond the Pleasure Principle. Freud was trying to explain the First World War: how can virile men willingly go to their death rather than follow their true desire for sexual gratification? However, Freud's interpretation of why patients repeatedly relive a traumatic experience as if it were still happening to them now (as participants), rather than as a past and abstract experience (as observers), indicates a certain lack of understanding of the ontological question of how the self, the "I," remains constant across time. On this question Martin Heidegger offers a better interpretation of death.
Martin Heidegger's book Being and Time treats time as finite, defined at its end by our understanding of death. In our being, death provides the final full stop. To be an authentic human being, we must be aware of our ultimate death; this is what Heidegger famously calls "being-towards-death". Heidegger needed death in order for us to care. For Heidegger, caring is not being nurturing and showing empathy; caring is owning your being. To care we have to appreciate death, and because we cannot truly know and experience death we have to accept the "possibility of impossibility"--our non-existence. One cannot fully live unless one confronts one's own mortality through a courageous "anxiety" (Heidegger, 1927, p. 310). Michel de Montaigne said this much better: "The premeditation of death is the premeditation of liberty; he who has learned to die has unlearned to serve" (Montaigne, 2012, Chapter XIX). This anxiety about dying is why we care, why we feel responsible for our lives. It is the primary fulcrum that energizes human engagement in a world that we own, a world that is personal and not a backdrop for a theatrical existence.
Death is important in constructing theories about how people behave because death--and our internal appreciation of it--means that we start to care about our world, our behavior and our existence. All philosophers have discussed death, some in passing, others in more detail. Heidegger's interpretation, pinning the basis of knowing oneself on an appreciation of our ultimate non-existence, is the strongest. Freud's analysis is too specific to a wish to die, which does not translate well nowadays with our narcissistic cohorts. Heidegger's interpretation does, however, suggest a developmental process in which our appreciation of our own demise translates directly into our caring, into owning our world and doing something about it.

References
Critchley S (2009). The book of dead philosophers. Vintage Books.
Freud S (1920). Beyond the Pleasure Principle.
Heidegger M (1927) Being and Time. Reprint, New York: Harper and Row, 1962.
Montaigne deM (2012) The Essays of Montaigne. Reprinted Ebook. Accessed on 10/31/2015 from: http://www.gutenberg.org/files/3600/3600-h/3600-h.htm

Pianka ER (1970). On r and K selection. American Naturalist 104 (940): 592–597.

© USA Copyrighted 2015 Mario D. Garrett 

Wednesday, September 30, 2015

God, Mathematics and Psychology: Are they all one?

This discussion focuses on psychology and the philosophy of mathematics and will contribute nothing to mathematical thought. Its aim is to introduce mathematics as a creation of psychology: sophisticated, complex and ever evolving, but nevertheless psychology.


Mathematics translates patterns into reducible parts. These parts form theorems—incremental reasoning based on a chain of formal proofs—that conform to logic but operate beyond logic. Mathematicians argue that these patterns are universal and real and that the interconnecting system of reducible parts is what constitutes mathematics—a language of spatial positioning, geometry, numbers, volume, movement and patterns. These are complex patterns that lead to complex theorems.
Sometimes these patterns exist in reality and prove useful in predicting physical events in the universe, and sometimes they are the perfect embodiment of a cognitive world--true forms that exist primarily in our imagination, such as the perfect circle. Sometimes the theorems relate to patterns that exist solely--as far as we know, or as yet--in the imagination of a group of mathematicians. Although mathematics is not set up, by mathematicians, to explain our reality, there is nevertheless a symbiotic relationship, in that proofs can come from within the physical, experimental world.
The basis for elevating mathematics to more than just a complex system of creating theorems is the role that mathematics was given by Pythagoras (6th century BC). Pythagoras believed that numbers were not only the way to truth, but truth itself; that mathematics not only described the work of god, but was the way that god worked. This belief, that mathematics holds an intrinsic truth, remains with mathematicians today. They believe that mathematics is the language of the gods. And that is a problem if you do not believe in god, or in an overriding principle of existence--none that we can understand, anyway. Science is by definition both atheist and agnostic, whatever individual scientists believe. Most mathematicians behave as deists, who believe that god created the universe but that natural laws determine how it plays out. This is an Epicurean (341–270 BC) belief: the gods are too busy to deal with the day-to-day running of the universe, but they set it in motion using mathematics.
Mathematicians therefore argue that mathematics is a higher order that is found in reality. But there are no examples of such proofs. Mathematicians argue that they are discoverers rather than inventors, but this dichotomy also seems false; mathematicians seem to do both, most often at the same time. The British philosopher Michael Dummett suggests that mathematical theorems are prodded into existence--he uses the term probing (Dummett, 1964). He uses the analogy of the game of chess: "It is commonly supposed … that the game of chess is an abstract entity" (Dummett, 1973). But there is certainly a sense in which the game would not have existed were it not for the mental activity of human beings. It is a delusion to believe that just because we find a pleasing pattern, a game that resonates across cultures, the reason it is pleasing is that there is a god behind it. Mathematicians counter that chess, or theorems, cannot be entirely products of our minds, since there must already be something there to prod. But the obverse argument is equally true: mathematical "truths" are entirely dependent on us, since we need to prod them to bring them into existence.
The same is true of language, art, music and other "Third World" constructs: incrementally evolving systems that form one of Karl Popper's ontological categories (Carr, 1977). The Third World is where a system, once developed, exists beyond its creator. Language is an excellent example, although the Third World also includes abstract objects such as scientific theories, stories, myths, tools, social institutions, and works of art. Language is incremental and ever evolving, and is used to help us communicate reality. Within this Third World, language, as with mathematics, is also argued to be either discovered or invented.
Theory of language development has oscillated between two schools of thought. One school, the Descriptivists, argues that language is culture-bound; the other, the Generativists, promotes language as part of our biological makeup. As a Generativist, Chomsky (1980, p. 134) phrased it eloquently: "we do not really learn language; rather, grammar grows in the mind". The analogy between formal mathematical systems and human languages is not a new or novel idea. Formal language theory was established in its modern form by Noam Chomsky in an attempt to investigate systematically the computational basis of human language; it has since become applicable to a variety of rule-governed systems across multiple domains--computer programs, music, visual patterns, animal vocalizations, RNA structure and even dance (Fitch & Friederici, 2012). This symbiotic relationship exists across all Third World constructs: mathematics and music, music and art, art and language, and all other permutations. As with mathematics, we refine language over time; future generations build upon both language and mathematics, and the only constraint seems to be our psychology. The last sentence of a talk on mathematics by Kit Fine makes the point: "The only constraint is our imagination and what we find appropriate or pleasing" (Fine, 2012, p. 27). What we find appropriate and pleasing is where the psychology comes in, and it is our clue to the inception of mathematics and to the description of our psychology.
As a guide, we have to go back to earlier (and simpler) mathematics to understand this principle of "pleasing." Pythagoras and music form the basis for a convergence between mathematics and psychology. Pythagoras (6th century BC) observed that when a blacksmith struck his anvil, different notes were produced according to the weight of the hammer. He later discovered that the ratio of the lengths of two strings determines the octave: "that the chief musical intervals are expressible in simple mathematical ratios between the first four integers" (Kirk & Raven, 1964, p. 229). Thus, the "Octave=2:1, fifth=3:2, fourth=4:3" (p. 230). These ratios harmonize, meaning that they are pleasing both to the mind and to the ear. Although this mathematical system breaks down the higher we go up the scale, a solution was found by adjusting the ratio of the fifth so that it is commensurable with seven octaves; seven octaves is 128:1, or 2^7. John Stillwell (2006) argues that "equal semitones" or "equal temperament" (p. 21) was developed almost simultaneously in China and the Netherlands: by Zhu Zaiyu (Chu Tsai-yü) in 1584, during the Ming Dynasty, and by Simon Stevin in 1585 (Ross, 2011). But the point is that a mathematical rule was developed on the basis of a harmony that we humans find pleasing.
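The arithmetic behind this "breakdown" of the scale, and its equal-tempered fix, can be checked in a few lines. The sketch below (Python; the variable names are mine) stacks twelve pure 3:2 fifths against seven 2:1 octaves, exposing the small surplus known as the Pythagorean comma, and then shows how the equal-tempered semitone 2^(1/12) closes the circle by slightly flattening each fifth:

```python
from fractions import Fraction

# Pythagorean intervals as simple ratios: octave 2:1, fifth 3:2.
octave = Fraction(2, 1)
fifth = Fraction(3, 2)

# Twelve pure fifths should return us to the starting note seven octaves up,
# but they overshoot; the surplus is the Pythagorean comma.
comma = fifth ** 12 / octave ** 7
print(comma, float(comma))  # 531441/524288, about 1.0136

# Equal temperament (Zhu Zaiyu, 1584; Stevin, 1585): divide the octave into
# twelve equal semitones, flattening each fifth to 2^(7/12) so that twelve
# fifths close exactly on seven octaves.
semitone = 2 ** (1 / 12)
tempered_fifth = semitone ** 7
print(tempered_fifth)                 # about 1.4983, versus the pure 1.5
print(tempered_fifth ** 12 / 2 ** 7)  # 1.0, up to floating-point error
```

The comma is why stacking pure fifths "breaks down": no number of pure 3:2 steps ever lands exactly on a power of the 2:1 octave.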
In nature, all sounds are the same. The creator of the universe created all acoustics; all sounds are perfect. Nature cannot discriminate among them, since they are all necessary and useful. As such, selecting harmonics is psychological rather than godlike. We like the separation of scales because we can psychologically compartmentalize the sound. We are creatures of order and consistency, and prefer distinct and distinguishable sounds. In reality there is no such thing as harmonics; we look for them as humans because they are pleasing.
Such psychological preferences are automatic and require no processing or thinking on our part. This automation can easily be disrupted by playing a tone that seems to be ever increasing or decreasing without end. Such a tone was developed by Roger Shepard and consists of a superposition of sine waves separated by octaves. This creates the auditory illusion of a tone that continually ascends or descends in pitch yet, in fact, goes nowhere.
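A minimal sketch of this construction might look like the following (Python; the sample rate, base frequency, envelope, and names are illustrative choices of mine, not Shepard's original parameters). Each "frame" superposes sine partials one octave apart, weighted by a bell-shaped envelope over log-frequency so that partials fade in at the bottom of the spectrum and out at the top:

```python
import math

RATE = 22050   # samples per second (assumed)
BASE = 27.5    # lowest partial in Hz (assumed)
N_OCTAVES = 8  # octave-spaced partials per frame

def shepard_frame(pitch_class, duration=0.25):
    """One frame of a toy Shepard tone. pitch_class in [0, 1): raising it moves
    every partial up together, while the envelope keeps the spectral center --
    and hence the perceived overall height -- fixed."""
    samples = []
    for i in range(int(RATE * duration)):
        t = i / RATE
        s = 0.0
        for k in range(N_OCTAVES):
            pos = (k + pitch_class) / N_OCTAVES  # 0 = bottom, 1 = top of spectrum
            amp = math.sin(math.pi * pos) ** 2   # near-silent at both spectral edges
            freq = BASE * 2 ** (k + pitch_class)
            s += amp * math.sin(2 * math.pi * freq * t)
        samples.append(s)
    return samples

frame = shepard_frame(pitch_class=0.25)
```

Stepping pitch_class from 0 toward 1 over successive frames yields the illusion of endless ascent: the spectrum at pitch_class = 1 is identical to the spectrum at pitch_class = 0, so the cycle can repeat forever. (A faithful, click-free rendering would also keep phase continuous between frames; this sketch only illustrates the spectral trick.)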
The Shepard tone not only creates dissonance because we find it difficult to understand; this perceived auditory dissonance also causes emotional uneasiness. We become uncomfortable when we cannot pigeonhole our perception. We need sounds that are at a prescribed distance from each other, distances that make perception easier. Pythagoras defined the first mathematical rule for auditory perception, the definition of an octave that pleases our psychological need for order and form. The fact that Europeans and Chinese figured this out at roughly the same time indicates that the perception of the octave generalizes across linguistic and auditory differences (for more auditory illusions see Deutsch, 2011). These psychological requirements, codified into mathematics, also hold true for vision.
We like to see things in "chunks." Mathematics was the earliest discipline to reflect this psychological need, by inventing the number "one." This basic "entity" forms the apex of the upside-down pyramid of mathematics: without a "one" there is no mathematics. But there are problems with the number one. There are points at which a "one" cannot be defined mathematically, or where it fails to behave in some particular way, such as differentiability. This singularity--which is proving so problematic for mathematicians in explaining quantum physics, for example--is only a problem because an entity of "one" is a creation of our mind and not of nature. In fact, the only way quantum physics can explain superposition, entanglement and other quantum phenomena is by removing the "one" from the theorem. By loosening the boundary drawn around "one," quantum physics can be better explained, although we then have to readdress our psychology and its reliance on the perception of separate entities. From a psychological point of view this is more easily achieved than forcing quantum physics to conform to our psychology.
History has been here before. Pythagoras--having traced the hand of god in how music is constructed--thought that each of the seven planets produced a particular note depending on its orbit around the earth. This was Musica Mundana, and for Pythagoreans different musical modes had different effects on the person who heard them. Taking this a step further, the philosopher Boethius (480–524 AD) explained that the soul and the body are subject to the same laws of proportion that govern music and the cosmos itself. As the Italian semiotician Umberto Eco observed, we are happiest when we conform to these laws, because "we love similarity, but hate and resent dissimilarity" (Eco, 2002, p. 31).
This is not the first time that mathematicians have thought they touched the hand of god, nor will it be the last. But what Pythagoras touched is our psychology. By focusing on pleasing patterns, similarities, and order, mathematicians are exploring the foundations of our psyche. And to do this they had to build rules and "common notions" that bind all these thoughts into a coherent language that translates into mathematics. Take, for example, the five "common notions" Euclid (c. 300 BC) defined in The Elements:
  Things that are equal to the same thing are also equal to one another
  If equals are added to equals, then the wholes are equal
  If equals are subtracted from equals, then the remainders are equal
  Things that coincide with one another are equal to one another
  The whole is greater than the part.

There is an unambiguous relationship between classical Euclidean mathematics and Gestalt psychology: Gestalt psychology has rules that mirror these Euclidean common notions (Lagopoulos & Boklund-Lagopoulou, 1992). But there have been further developments. The prolific Swiss psychologist Jean Piaget (1896–1980), while investigating children's conception of space, discovered highly abstract mathematical structures in the child's primordial conception of space. He argued that the further development of geometric space should be understood not as reflecting the capacity of the child's developing physiological functions, but as a product of the child's interaction with the world. The child constantly builds up specific structures of perception and reorganizes spatial conception. Accordingly, Euclid's elements and the topological properties of shapes have their origin neither in the world nor in the history of science, but in cognitive schemes that we build up in our daily interaction with objects.
The same understanding—that there are mathematical structures embedded in our cognitive processes—precludes the need for either mathematics or language. These structures exist independently because that is how the brain is organized. A good example of this pre-mathematical and pre-linguistic ability is provided by a tribe that has no concept of numbers in its language. Dan Everett's description of the Pirahã language of the southern Amazon basin exposes the tangled relationship between mathematical constructs and our cognitive capacity (Everett, 2012). The Pirahã language has no clause subordination (e.g. after, because, if) at all; indeed it has no grammatical embedding of any kind. It has no quantifier words (e.g. many, few, none) and no number words at all (e.g. one, two, many). Yet the Pirahã can still count and perform quite complex mathematical comparisons despite the lack of linguistic structure. The main deficit is that they cannot memorize these functions, so they can perform mathematical operations only for the immediate situation. In Popperian terms, they lack a Third World construct of mathematics that would let them retain an abstract representation of numbers, which mathematicians, through their use of mathematical language, can. And mathematicians have created this language, this mathematics, in which "one" forms the foundation.
Mathematics, however, has evolved and built upon this concept of "one." It would be naïve to assume that mathematics has stood still as a discipline. The early conception of "one" was very restrictive, one in which "number" meant "natural number"; mathematics then evolved ever less restrictive conceptions in which "number" meant integer, then rational, then real, then complex. With such creations there is a more nuanced appreciation of the different interpretations of "one." In psychology we might analogously distinguish a human being (a "one"), then talk about aggregate or composite features such as family, community, or head, eyes, nose, and then about complex attributes such as becoming a millionaire, getting divorced, losing a limb, or becoming blind. Mathematics has not so much extended the domain of numbers as liberalized what we mean by "number" and, as a corollary, what we mean by "one." Our presumption that there is a single number "one," and that in extending the number system we simply add to and perform functions on the numbers that were already there, is not what mathematics has become. There are as many number "ones" as there are types of numbers. By redefining the meaning we create a new definition of "one," one that is less open to investigation and study and bears less of a relationship with anything tangible (Fine, 2012).
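Python's numeric tower offers a loose illustration of this point (the analogy is mine, not Fine's): each widening of "number" carries its own "one," and these "ones" agree in value while remaining distinct kinds of object.

```python
from fractions import Fraction

# Four "ones", one per conception of number: natural, rational, real, complex.
ones = [1, Fraction(1, 1), 1.0, 1 + 0j]

# They all compare equal in value...
assert all(x == 1 for x in ones)

# ...yet they are four distinct types of object, each behaving differently:
assert len({type(x) for x in ones}) == 4
print(1 / 3, Fraction(1, 3), (1 + 0j) / 3)  # division widens int; Fraction stays exact
```

The same operation on each "one" lands in a different number system, which is the sense in which extending "number" redefines "one" rather than merely adding new numbers around it.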
We think in very complex ways that are still not understood and continue to be misrepresented. The human brain has more synaptic connections than there are stars in our galaxy. The capacity for human thought is immense. Clues are emerging that we think in very abstract ways that mirror the development of theorems in mathematics. The holographic theory of thinking is just one crude method of representing this universe of thought. It is plausible that mathematics could be a portal to understanding our psyche, our art and our behavior. We could learn our limitations and our attributes, and allow for the exploration of a process that we do not yet know. We grow up developing our thinking as theorems--even where our language does not accommodate such thinking, we still use innate mathematics to develop our sense of numbers and patterns. Mathematics is our way of thinking. We simply grow out of it, as do brilliant mathematicians who eventually converge into cultural thought (language, roles and cultural morals). Mathematicians have a short life of brilliance because their natural thought processes are eventually taken over by pragmatic concerns. Such is the final objective of our brain: survival in the real, experiential world--a sentient world dominated by feeling and experiencing. But mathematics can form the basis of formalizing theories of our thinking processes, sensations and feelings. We need to see beyond the silos of disciplines and view our humanity as more than pitting humans against the hand of god, and simply see the hand of god as our own genius waiting to be acknowledged.

References

Carr B (1977). Popper's Third World. The Philosophical Quarterly Vol. 27, No. 108, pp. 214-226
Deutsch D (2011). Accessed 8/20/2015: http://deutsch.ucsd.edu/psychology/pages.php?i=201
Dummett M (1964) Bringing about the past. Philosophical Review 73: 338–59.
Eco U (2002). Art and beauty in the middle ages. Yale University Press.
Everett C (2012). A Closer Look at A Supposedly Anumeric Language 1. International Journal of American Linguistics, 78(4), 575-590.
Fine K (2012). Mathematics: Discovery or Invention? Think, 11, pp. 11-27.
Fitch WT & Friederici AD (2012). Artificial grammar learning meets formal language theory: an overview. Philosophical Transactions of the Royal Society B: Biological Sciences, 367(1598), 1933–1955. Accessed 8/20/2015: http://doi.org/10.1098/rstb.2012.0103
Hockenbury DH & Hockenbury SE (2006). Psychology. New York: Worth Publishers.
Kirk GS & Raven JE (1964). The Presocratic Philosophers, Cambridge University Press.
Lagopoulos, A. P., & Boklund-Lagopoulou, K. (1992). Meaning and geography: The social conception of the region in northern Greece (No. 104). Walter de Gruyter.
Ross KL (2011) Mathematics & Music, after Pythagoras. Accessed 8/20/2015: http://www.friesian.com/music.htm
Stillwell J (2006). Yearning for the impossible: The surprising truths of mathematics A. K. Peters, Ltd.

I am indebted to David Edwards, emeritus professor of mathematics at the University of Georgia, for discussing with me the subtleties of some of these thoughts. Having such a knowledgeable and challenging adversary sharpened the thinking behind this argument and produced a much clearer thesis. However, all misrepresentations, deficiencies and shortfalls are purely my responsibility.


© USA Copyrighted 2015 Mario D. Garrett