Sunday, October 15, 2017

Are Hormetins the new Fountain of Youth in Aging?

Although aging is inevitable—most likely due to the accumulation of damage at the cellular level, rather than from any one specific program—the actual rate of aging can be an adaptive feature in nature. So although we will all die, there is a certain amount of plasticity in how fast we age and therefore how early or late we die. This plasticity is likely to be controlled by relatively simple mechanisms. Aging research focusing on this plasticity has shown some encouraging results.
Hormetins—sometimes referred to as adaptogens—are agents that induce a mild stress response with long-term and broad beneficial effects. Following the dictum that what does not kill you makes you stronger, hormetins kick-start the body to respond to mild stress, and this response has broader and longer-lasting benefits, benefits that translate into living longer.
Mild stress can be induced through several methods. The easiest and most common are physical: exercise, heat, gravity, and irradiation. There is emerging interest in psychological methods such as meditation, brain exercises, juggling, and balancing. But the most convenient hormetins—the agents that trigger hormesis—come as supplements: pop a pill and let your body do the work.
Nutritional stress includes caloric restriction; anti-oxidants; polyphenols—found most commonly in fruit and vegetables, tea, red wine, coffee, chocolate, olives, and extra virgin olive oil; flavonoids—found in plants, especially parsley, onions, berries, tea, bananas, citrus fruits, red wine, and dark chocolate; and lastly micronutrients—some vitamins and trace amounts of iron, cobalt, chromium, copper, iodine, manganese, selenium, zinc, and molybdenum.
The trick is to ensure that the trauma is mild enough not to be counterproductive. With nutrients this is easier to achieve, since most of these supplements are water-soluble: if they prove ineffective, you are at worst producing expensive urine.
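To make "mild enough" concrete, consider a toy dose-response sketch (my own illustration, not a formula from the cited reviews; the parameters $\alpha$ and $\beta$ are hypothetical, scaling the adaptive response and the accumulating damage respectively). Hormesis is typically drawn as a biphasic curve, and a minimal version is

\[
B(d) = \alpha\, d\, e^{-\beta d},
\qquad
\frac{dB}{dd} = \alpha\, e^{-\beta d}\,(1 - \beta d) = 0
\;\Longrightarrow\;
d^{*} = \frac{1}{\beta}.
\]

Below the optimal dose $d^{*}$ more stress means more benefit; beyond it, the same stressor turns counterproductive. The practical problem of hormetins is locating $d^{*}$ for each stressor.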

The problem with nutrients is that everyone is trying to make a buck—not just snake-oil salesmen but also academics and researchers getting into the "business" of selling immortality and anti-aging pills. One figure in the review maps a number of nutrients—from Rhodiola down to glucosamine—onto the mechanisms they are said to engage (the smaller shaded circles), from stress resilience to tumor suppression.
We can see that although there are many potential mechanisms—nine in this review—they all feed into two main, connected pathways: anti-oxidant activity and the mimicking of caloric restriction (the large shaded circles).
Hormesis acts as a gyroscope, maintaining a balance between an individual and the environment. When a toxic chemical, event, or condition in the environment rises even slightly, the body's chemistry changes to prepare for it. But this balancing act has limits. The body's capacity for biological and chemical adjustment is finite, even if there is plasticity in this system of person–environment interaction. Nadine Saul of the Humboldt University of Berlin and her colleagues have argued that hormesis is a balance with both positive and negative outcomes: for every longevity improvement there is a reduction in the organism's capacity for growth, mobility, stress resistance, or reproduction. Saul argues (correctly, it seems) that longevity comes at a price, and although hormesis seems to promote longevity, other hormetic costs may ensue, some of them unknown and unpredictable.
The mechanism of hormesis remains an enigma, although we continue to learn more about how the body develops resilience in response to changes in the environment. In 1962 the Italian geneticist Ferruccio Ritossa discovered that heat shock proteins are produced when cells are exposed to a variety of stresses; he first identified them in fruit flies exposed to a burst of heat, which responded by producing new proteins that help cells survive. The genes responsible have been called "vitagenes," and they maintain balance within cells under stressful conditions. Like the heat shock proteins, they act as chaperones—minders that assist in establishing "proper protein behavior." Despite these terms, we do not know how this function is carried out.
Similarly, we now acknowledge that caloric restriction itself might be effective because of its hormetic quality—a shock to the body—rather than because of the diet itself. This seems likely because there are multiple ways of producing the same effect without adhering to a calorie-reduced diet. The underlying mechanism—rather than the reduction of calories—becomes what matters. And the underlying mechanism is a shock. If we accept this, then we should ask why. Why does a shock cause the body to build resilience?
The answer is both simple and radical: a shock causes the body to build resilience because the body is designed to do exactly that. Our body interacts with the environment in order to survive, and to accomplish this adaptation there must be plasticity—some wiggle room—in our capacity. Our biology is a constellation of different entities that depend on each other. How it achieves this adaptation is more enigmatic, but we now know that there are plasmids and bacteria that help address the needs of our body. These might even recombine with our own DNA to make the adaptations more permanent.
Just as the ancient Greek philosopher Thales of Miletus (624-546 BCE) created science by arguing that we should stop referring to natural phenomena as the "will of the gods," we should move away from viewing end-of-life diseases as "caused by aging" and become more appreciative of the balance we maintain with our natural world. By discarding the new mythology of aging—the immortality gurus—we can then focus on plasticity in older age. The fountain of youth might be a fountain for living well at older ages.

© USA Copyrighted 2017 Mario D. Garrett 


References

Garrett, M. (2017). Immortality: With a Lifetime Guarantee. CreateSpace, USA.

Lenart, P., & Bienertová-Vašků, J. (2017). Keeping up with the Red Queen: The pace of aging as an adaptation. Biogerontology, 18(4), 693-709.

Rattan, S. I. (2017). Hormetins as Drugs for Healthy Aging. In Anti-aging Drugs (pp. 170-180). Royal Society of Chemistry.


Sunday, September 24, 2017

Hope Versus Depression

In Hesiod's telling of the Greek myth of Pandora—the first woman on earth—Pandora is said to have opened a large jar from which all evils escaped into the world, leaving behind hope. Hope was the only thing that remained for us humans. Hope is not tangible but a state of positive expectation. Hope is an illusion—a trick of the mind—that motivates us to anticipate rewards, rewards that are themselves purely cerebral encouragement. Hope is a house of cards built on the anticipation and yearning for illusory and ephemeral rewards. When Pandora left us with hope, she left us with a whole bag of psychological tricks. Perhaps for those with depression, even hope escaped out of "Pandora's box." In reality we struggle and suffer and gain momentary pleasure and transient satisfaction until we are released from this ongoing strife by death. This is how we view the life of animals, but not how we view our own lives. This trick of psychology—Pandora's box—releases us from acknowledging our natural daily grind of survival. We have something that we do not ascribe to animals: humans have feelings, emotions, and hope.
In order to understand why we have emotions, we must grasp that humans have a very large brain. Our brain is the most complex entity we know of, and this complexity provides a clue to what it does. It represents the world—as we know it—as a model: a virtual reality machine designed to understand our environment and predict the world. It is our passport for survival, as individuals and as a species. Emotions are our transient indicators of how well we are approaching this virtual ideal; they nudge us to change towards specific expectations. Our brain is a finely balanced tool to help us improve. However, having such a complex thinking organ comes with one huge disadvantage: it also has the capacity for self-reflection. And self-reflection might be the Achilles' heel of our survival strategy.
In order for the brain to deal with this inconvenient critical contemplation, it has developed ways of handling self-reflection, the obvious daily struggle to survive, and our eventual death. Our brain has generated hope as an illusion of a utopia, a heaven—whether on earth or in the afterlife. For the long term we have hope that everything has a meaning, a purpose. We have a narrative, a story that we make our own. For this hope to feel realistic we need to think of ourselves as unique and at the center of our reality. A selfish existence—solipsism—is necessary in order for us to have hope. Without a selfish investment in the outcome we would have no interest in hope. Hope is selfish and central to being human.
In 2017 Claudia Bloeser wrote that "…almost all major philosophers acknowledge that hope plays an important role in regard to human motivation, religious belief or politics." Hope can be seen either as a way to motivate humans to do better, or as an excuse to be lazy and hope for the best. In psychology, starting with Charles Snyder's hope theory, there are two components to hope: the belief that there is a possibility of happiness in achieving goals, and a path to achieving those goals—a kind of behavioristic stepladder, with each successive step up promoted by positive reinforcement. But this interpretation changed with Ernst Bloch's three-volume work The Principle of Hope (1954-1959). Bloch shifted the aim from happiness to an ideal state: we aim to achieve our goals not because we become happier but because we will achieve our utopia. This is an important admission. For Bloch, a German Marxist, hope is not about being optimistic—some behaviorist ploy of gaining pleasure for every rewarded behavior—hope is an ambition to attain an ideal state. In this interpretation of hope there is only one alternative: if not heaven, then hell.
The psychology of hope has converged with the utopian and dystopian view of humankind, and Bloch's proposition fits with traditional religious beliefs about utopia. Bloch argues that the utopian package entails no death, no disease, no injustice—a world where everyone is equal. Richard Rorty, the American pragmatist philosopher, shared such an interpretation. Rorty further acknowledged that hopelessness is always based on the absence of a narrative of (political) progress. This lack of a (positive) narrative defines depression.
This is the triad of depression: lack of self-worth, negative evaluation of situations, and lack of optimism for the future. As the opposite of hope, depression is defined by the feeling that "there is nothing to live for"—a narrative arc that does not anticipate positive changes. Both hope and depression project into the future. The difference is that for hope to feel real, our psychology needs to get rid of the looming prospect of death, which casts a long shadow over our future. Hope cannot coexist with the acknowledgment that we will stop existing. Death is the antithesis of hope. How do we "cure" this final nothingness in our narrative arc?
One of the wrinkles in this concept of hope, however, is the fact that we all die. What is the point of everything if at the end of the journey we find that it was just a transient passage? It is like hosting a party in an airport lounge. There is something rotten at the center of hope, this forbidden fruit for the depressed. Writing in the early 1900s, the psychologist William James called this fear of death the "worm at the core" of our being: the tension between behaving as though we are at the center of a consistent universe and knowing the certainty of our death. To psychologists who now follow Terror Management Theory, this tension constitutes a fundamental quandary for humankind, affecting us as radically as nothing else does.

Our psychology came up with a more subtle solution than simply ignoring our mortality. We have learned to trick ourselves that perhaps even if we die, we don't really die: a small part of us remains (soul), or death is only temporary (reincarnation), or we live on in other dimensions (legacy), or everyone else is already dead (zombies), or this is all a dream anyway (intellectualization). Altogether these sophisticated tricks embrace hope and form a formidable barrier against accepting death.
This tension is alleviated by some sophisticated thinking strategies, and these tricks are exactly what is needed to dispel that loss of hope, that depression. But does the science support this view?
In a review of the effectiveness of therapies for depression, Andrew Butler and his colleagues reported that cognitive behavioral therapy (CBT) was better than antidepressants for depression and was effective for many other mental disorders. This is good news, since a recent study by the Canadian researcher Marta Maslej and her colleagues reported that medication for depression increases the risk of dying early, from all causes, by some 33%. If we look at the mechanisms of CBT we find some surprising insights. In their classic 1979 book on cognitive therapy, Aaron Beck and his colleagues attribute the difference to the patient's "…gross changes in his cognitive organization…" (p. 21). These cognitive deficits involve:
1.  Arbitrary inference: drawing a conclusion without supporting evidence
2.  Selective abstraction: focusing on selected negative aspects
3.  Overgeneralization: applying the lessons of an isolated incident to broader contexts
4.  Magnification and minimization: highlighting the negative and diminishing the positive
5.  Personalization: relating external events to oneself
6.  Absolutistic, dichotomous thinking: categorizing events into two extreme classes (perfect vs. broken)

But if the function of our mind is to develop a view of the world, a world that might be dangerous, then these aspects of cognition are what we do best for our survival. In a world that can and ultimately does kill you, you have to make everything personal. We decide quickly what is good or bad, enhancing our ability to protect ourselves and to anticipate future events, especially dangerous ones. That this makes us feel miserable is a separate issue. This cognitive organization is designed for survival, focused exclusively on what could harm you, and on the fact that ultimately there is no hope, since we are all mortal. This acceptance of mortality is perhaps the reason for the salience of death and of suicidal ideation, attempts, and acts in depression.

Aaron Beck and his colleagues go on to report: "A way of understanding the thinking disorder in depression is to conceptualize it in terms of 'primitive' vs. 'mature' modes of organizing reality" (p. 14). Within our line of thought, if we see depression as a natural state stripped of the tricks of hope, then we can reinterpret this description of "primitive…gross changes in [his] cognitive organization": those with depression are not failing at maturity; they are stuck without the bag of tricks everyone else uses. This is where CBT comes in. By restoring a narrative arc in which our life holds benefits, pleasure, success, and accomplishment, CBT is a way of accepting the bag of tricks that accompanies hope. To paraphrase Dan Gilbert, we manufacture happiness. The conclusion is that we accept and promote certain beliefs that round the edges off our ultimate fate: we distract ourselves from our impending death with celebratory moments, like bread crumbs on the path to nirvana.

Understanding how we maintain this delusion of hope for so long is the linchpin of human psychology. As we get older we lose the shine of hope; we face our mortality up close and personal. As a result, depression increases in older age. From the very first step we take, we strive for independence. Our brain gains mastery in predicting the environment we live in, acquiring a sense of self-mastery, even hubris. We control others when we have a positive disposition, a positive story line; our brain understands this advantage. A positive narrative arc attracts others, and with them our brain gains better mastery of the environment. This mastery is perhaps only understood at older age, when some of the social façade starts to disintegrate. The question is whether it is better to be happy and live in a delusion of hope, or to be depressed and be right. Hesiod's story of Pandora might have revealed a deeper truth.

© USA Copyrighted 2017 Mario D. Garrett 

References
Bloeser, C., & Stahl, T. (2017). Hope. In E. N. Zalta (Ed.), The Stanford Encyclopedia of Philosophy (Spring 2017 Edition). Accessed online: https://plato.stanford.edu/archives/spr2017/entries/hope/
Beck, A. T. (Ed.). (1979). Cognitive Therapy of Depression. Guilford Press.
Butler, A. C., Chapman, J. E., Forman, E. M., & Beck, A. T. (2006). The empirical status of cognitive-behavioral therapy: A review of meta-analyses. Clinical Psychology Review, 26(1), 17-31.
Crona, L., Mossberg, A., & Brådvik, L. (2013). Suicidal career in severe depression among long-term survivors: In a follow-up after 37–53 years, suicide attempts appeared to end long before depression. Depression Research and Treatment, 2013.
Gilbert, D. (2009). Stumbling on Happiness. Vintage Canada.
Maslej, M. M., Bolker, B. M., Russell, M. J., Eaton, K., Durisko, Z., Hollon, S. D., ... & Andrews, P. W. (2017). The mortality and myocardial effects of antidepressants are moderated by preexisting cardiovascular disease: A meta-analysis. Psychotherapy and Psychosomatics, 86(5), 268-282.
O'Donnell, I., Farmer, R., & Catalan, J. (1996). Explaining suicide: The views of survivors of serious suicide attempts. The British Journal of Psychiatry, 168(6), 780-786.

Wednesday, September 20, 2017

Why Does God Want to Kill Me?

We know that we will die.

Yet four out of every five younger adults aged between 18 and 29 believe in an afterlife. At the same time, fewer of them say they believe in a god. These Millennials, born around 1980-1994—despite doubting the existence of God, believing the Bible is a book of fables, not attending religious services, never praying, and reporting being "not religious at all"—still believe that they have aspects of immortality. They feel entitled enough to be saved after they die without the necessity of a god to save them. The more entitled they feel—white and middle class—the more likely they are to not believe in god while still believing that an afterlife awaits them. Minorities do not feel this entitlement to the same degree.
This resurgence in the belief in immortality without the shackles of believing in a god is new. But believers have to ask themselves why god wants to kill them in the first place. Such an entitled group needs to face this question head on, for the answer might hold a greater insight than religion. To address it we have to look at physical (biological) anthropology.
As a species, survival is our only ambition. The only way successive generations prosper is if they are a good fit in their environment and survive long enough to create a new generation. Nature has two extreme methods to achieve this single aim. One is to produce an enormous number of offspring and hope that a few survive long enough to pass on their genes. The other—the one followed by humans—involves having few children whom we nurture until adulthood. This is our survival strategy as a species. These strategies have technical names: semelparity for the "r" strategists (a large number of offspring, and then death) and iteroparity for the "K" strategists (a few offspring whom we nurture).
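A back-of-the-envelope calculation shows why both strategies can succeed (the numbers here are invented purely for illustration). What matters to the species is the expected number of offspring that survive to reproduce,

\[
E[\text{survivors}] = n \times p,
\]

where $n$ is the number of offspring and $p$ is each one's chance of reaching adulthood. A semelparous "r" strategist with $n = 1000$ and $p = 0.002$ expects 2 survivors; an iteroparous "K" strategist with $n = 2$ and $p \approx 1$ also expects 2. Both reach replacement, but the "K" route spends the parental investment on nurturing after birth rather than on sheer numbers at birth.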
Nurturing is an important—and integral—component of our survival strategy. Nurturing involves having things to teach and living long enough to teach them, which is why we live so long and have such a big brain; the two go together. Aging is not a dustbin of genetics but an integral part of our strategy for endurance as a species. Aging and having a big brain go hand in hand as nature's plan for our survival.
With aging also comes the opportunity to learn about the environment. We learn in terms of our skills and also through our biology. We develop immunity from the day we are born, and some of these biological adaptations end up in our genes through the transfer of genetic material. Our genes are more permeable than we once thought. We get genes not just from our parents but also from the environment: gene transfer from bacteria (plasmids), fungi, viruses, sometimes siblings, and mothers from their children. We are a magnet for adaptive genetic material from our environment. As we age we pick up new genetic material and modify existing genes (epigenetics) before we pass these genes on to our children. Our lives are devoted to just this aim.
Because of our big brain we need more than that: we need a more substantial meaning in life, because our brain has made us solipsists—at the center of the universe.
Our brains create virtual realities. We create a model of how the world works. For this world to have meaning to us—other than as a mental toy that helps us predict our environment—we have to be at the center and "own" this world. We therefore believe that we are unique and have free will. Our impression of reality, dictated by an image of the world that is just, fair, and constant, also requires that we do not think about our own death, or our model of the world becomes untenable. This is where our belief in immortality comes in. But god wants to kill us, because that is how our species improves: the faster the turnover, with new generations coming through, the better our species can adapt to the environment. But this clashes with our model of the world.
We want our world to stay constant so that we can retain some level of control over this finely tuned balancing trick. Anticipating our death destroys the impression that the world is orderly and just. But there is one problem with this made-up reality: we see others eventually grow old, frail, and die. We point at aging as the culprit. That is when we come to see aging as a problem to be solved rather than a survival strategy.
But if we understand aging we will understand the tricks of our psychology. Our strategy for survival—the iteroparous "K" strategy—means that we nurture the younger generation. With our increasing lifespan we have been extending this nurturing longer and longer. Could it be that we have been nurturing them too long? That this younger generation has forgotten how to be adults themselves? The younger generation's increasing sense of entitlement is but a reaction to the knowledge that they are truly one mortal link in the immortal chain of life. The shackle of religion that buffers us from this realization is no longer strong enough.
We have nurtured this new generation to the point where they feel important enough not to have to consider death. The model created by their brain includes an afterlife, to forestall the possibility that they are not at the center of the universe. Emerging generations are rejecting death, and they are also not having children. By liberating themselves from religion, norms, and expectations they are rejecting the need to have children. Children would force them to move away from being at the center of the universe.
Our aging is an integral part of survival, up to a point. Because we have extended our longevity we are nurturing our children too long. Our human psychology, which relies on our being at the center of the universe, feeds off this nurturing and becomes a prominent feature of our existence. Throughout all this, nature wants to maintain a turnover. We are meant to die; as much as it is detrimental to the individual, aging and death form our strategy as a species. Our personal salvation is that we delude ourselves about this reality, and emerging generations are doing so by avoiding god while believing in an afterlife.

  © USA Copyrighted 2017 Mario D. Garrett 

References

Garrett, M. (2017). Immortality: With a Lifetime Guarantee. CreateSpace, USA.
Harley, B., & Firebaugh, G. (1993). Americans' belief in an afterlife: Trends over the past two decades. Journal for the Scientific Study of Religion, 269-278.
Twenge, J. M., Sherman, R. A., Exline, J. J., & Grubbs, J. B. (2016). Declines in American adults' religious participation and beliefs, 1972-2014. Sage Open, 6(1), 2158244016638133.


Monday, September 11, 2017

Age Apartheid

I sometimes stray off in class. Like some of my students, I turn the classroom into my own little world of fantasy. Except, unlike my students, I am teaching the class.
Last week I was discussing how peer-ist our society is: we tend to mix only with people our own age. As I was lecturing I tried to recall the last time I held a baby in my arms, and in front of 110 students I realized that it must have been more than two years ago. I joked that I see a lot more older people because that is my job. But unless students live in an extended family—and most students in the United States do not—they are unlikely to interact with children or older adults on a consistent basis. By not engaging with older adults, my students are likely to develop negative ageist stereotypes.
In 1992 Joann Montepare and her colleagues looked at how college students spoke with their grandparents and parents on the phone. They found that with their grandparents, college students used a higher-pitched, more babyish and feminine voice, while at the same time being more deferential and congenial—different from the speech they exchanged with their parents. And this differential treatment starts much earlier than college.
Children tend to develop a negative view of older adults early on; such views seem to come naturally to young minds. For example, in 1990 Charles Perdue and Michael Gurtman asked young people to recall traits after being introduced to the person those traits described. They could recall more negative traits when their reference was an "old" person and more positive traits when it was a "young" person. Memory is already preferential: negative traits are remembered and recalled because they are already associated with older adults. The authors argue that these age biases are automatic, unintentional, and unconscious. Such discrimination is pervasive and results in negative behavior towards older adults.
In 1986, observing the behavior of children as they interacted with elderly people, Leora Isaacs and David Bearison found that children were quite discriminating. Faced with one of two study helpers—one much older, but both dressed similarly and professionally—children with the older helper sat farther away, made less eye contact, spoke less, initiated less conversation, and asked for less help. Children have already learned to keep older adults at a distance.
Could closer interaction remove these stereotypes?
One way to deal with these negative stereotypes is to develop a closer association with older adults. But the results were initially surprising. In 1987 the University of Maryland professor Carol Seefeldt found that four- and five-year-old children who had visited infirm elders in a nursing home once a week for a full year held more negative attitudes towards older adults than a similar group without this contact. The day care and nursing home staff, however, reported positive and long-lasting benefits for both the children and the elders.
I remember my children coming home from Montessori school proud to tell me that they had visited a nursing home with "old people." Knowing my interest, they expected me to ask what they had learned, and I was anticipating a positive response. "Smelly and horrible" was their response. In hindsight it should not have surprised me: if my experience of older adults were based exclusively on a nursing home, I too would have a very negative view of aging.
Which explains why the evidence that intergenerational contact influences children's attitudes is mixed. In 2002 Molly Middlecamp and Dana Gross enrolled 3- to 5-year-old children in either an intergenerational daycare program or a regular daycare program. The two groups turned out to be very similar in their attitudes to older adults. In general, children rated older adults less positively than younger adults, and believed that older adults could participate in fewer activities than children could. The take-home lesson is that not all prejudices can be overcome by knowledge, only by appropriate knowledge.
Without appropriate engagement, we get most of our information about older adults from the media, as reflected in adolescent literature. David Peterson and Elizabeth Karnes reported that in the fiction they reviewed, older persons were underdeveloped and peripheral to the major action. And there are nuances in perception determined by socio-economic context. As early as 1968, Tom Hickey and his colleagues found that among third graders, students from higher socioeconomic groups looked more favorably on older persons (although perceiving problems of loneliness), while children from poorer homes did not anticipate loneliness but expected senility and eccentric behavior. A social component to the type of stereotype is evident.
If my information comes from a negative source, then my negative views are unlikely to be assuaged, and my social class or culture might modify these stereotypes. Only by designing an appropriate intervention, in which young people interact in a meaningful way with older people, can negative views of aging be replaced with more realistic perceptions. This was the intention, and the success, of a 2002 program initiated by Eileen Schwalbach and Sharon Kiernan. The program had fourth graders visit an elder "special friend" at a nursing home every week for five months. They were primed before each visit with descriptions of some of the issues that might come up. During the course of the study, the fourth graders' attitudes toward their "special friends" were consistently positive and their empathy increased.
Milledge Murphey, Jane Myers, and Phyllis Drennan wrote a review of such effective programs. They especially focus on the seminal program begun in 1968 by Esstoya Whitley, in which, as part of their school curriculum, six- to eight-year-old students "adopted" a grandparent from among the residents of a nearby nursing home. As anticipated, the children's attitudes toward their adoptees became more positive. What was unexpected was that the children continued visiting their adopted grandparents, at least three times per week, for several years. The children gained a positive attitude toward the elderly, a more realistic view of aging, and a true relationship with their adoptees.
But perhaps the most memorable study of interaction was a recent 2017 British "factual entertainment" program—a euphemism for what in the United States is called reality TV—by Channel 4. Although such intergenerational programs have been conducted in the United States for more than half a century, this was the first one televised from the start. The older participants came from the St Monica Trust retirement community in Bristol, where once a week for six weeks a group of four-year-old kindergartners descended upon the sedentary tranquility of the nursing home and infused it with ambulant energy. The weekly television series updates viewers with funny and eccentric interactions. But in the end, what the show clearly demonstrates is how the older residents' cognition, physical ability, and mental health improve across the six weeks of interaction with the children. In turn, the children develop greater empathy for their older playmates.
The question is why we were separated in the first place. How and why did society become so age-segregated?
Looking across a sea of young faces in class, I realize that segregation starts at school, and the best place to desegregate is schools. Ivan Illich, the infamous activist of the 1960s, already covered this topic. In his 1971 book Deschooling Society, Illich discusses ways of removing the barriers to education and incorporating education into the general social network through social hubs like libraries. With the incredible amount of money that educational institutions make—especially publicly funded ones—there is no incentive to change the status quo. Until then, we have to suffer the consequences of the age apartheid we continue promoting, while feeling enriched and uplifted when we see those barriers removed, even if only for our brief viewing pleasure, albeit on television for now. In the meantime I need to get back to my age-segregated class.

 © USA Copyrighted 2017 Mario D. Garrett 

References
Atchley, R. C. (1980). Social forces in later life. Belmont, Calif.: Wadsworth.
Brubaker, T., & Powers, E. (1976). The stereotype of 'old': A review and alternative approaches. Journal of Gerontology, 31, 441-447.
Channel 4 (2017). Old People's Home for Four Year Olds. Accessed online 12/9/2017: https://www.youtube.com/watch?v=Xm2z5468htA
Duncan, R. (1976). Preface. In E. Whitley (Ed.), From time to time: A record of young children's relationships with the aged. Florida: College of Education Research Monograph No. 17, University of Florida.
Gruman, G. J. (1978). Cultural origins of present-day ageism: The modernization of the life cycle. In S. F. Spicker (Ed.), Aging and the elderly: Human perspectives in gerontology. New Jersey: Humanities Press.
Henderson, M. E., Morris, L. L., & Fitz-Gibbon, C. T. (1978). How to measure attitudes. Beverly Hills, Calif.: Sage.
Hickey, T., Hickey, L. A., & Kalish, R. A. (1968). Children's perceptions of the elderly. The Journal of Genetic Psychology, 112(2), 227-235.
Holmes, C. L. (2009). An intergenerational program with benefits. Early Childhood Education Journal, 37(2), 113-119.
Illich, I. (1973). Deschooling society (p. 46). Harmondsworth, Middlesex: Penguin.
Isaacs, L. W., & Bearison, D. J. (1986). The development of children's prejudice against the aged. The International Journal of Aging and Human Development, 23(3), 175-194.
Middlecamp, M., & Gross, D. (2002). Intergenerational daycare and preschoolers' attitudes about aging. Educational Gerontology.
Montepare, J. M., Steinberg, J., & Rosenberg, B. (1992). Characteristics of vocal communication between young adults and their parents and grandparents. Communication Research, 19(4), 479-492.
Murphey, M., & Myers, J. E. (1982). Attitudes of children toward older persons: What they are, what they can be. The School Counselor, 29(4), 281-289.
Perdue, C. W., & Gurtman, M. B. (1990). Evidence for the automaticity of ageism. Journal of Experimental Social Psychology, 26(3), 199-216.
Peterson, D. A., & Karnes, E. L. (1976). Older people in adolescent literature. The Gerontologist, 16(3), 225-231.
Robertson, J. (1976). Significance of grandparents. Gerontologist, 16, 137-140.
Schwalbach, E., & Kiernan, S. (2002). Effects of an intergenerational friendly visit program on the attitudes of fourth graders toward elders. Educational Gerontology, 28(3), 175-187.
Seefeldt, C. (1987). The effects of preschoolers' visits to a nursing home. The Gerontologist, 27(2), 228-232.
Whitley, E. (1976) From time to time: A record of young children's relationships with the aged, Florida: College of Education Research Monograph No. 17, University of Florida.


Sunday, September 10, 2017

Dawn of the Age of Parrhesia

"Parrhesia" comes from the Greek playwright (tragedian) Euripides meaning literally "to speak everything" and by extension "to speak boldly", or "boldness". It is a form of extreme candor. It implies more than just freedom of speech, but the obligation to speak the truth for the common good, even at personal risk. 

Parrhesia was a central concept for the Cynic philosophers and was later applied by the Epicureans as a mode of frank criticism. It was a common method of discourse in philosophy at the time, and was later championed by the postmodernist Michel Foucault. It stands in contrast to rhetoric, the art of effective or persuasive speaking or writing, which is designed to persuade and impress its audience and is often regarded as lacking in sincerity. In today's context some would (incorrectly) refer to it as fake news.

The parrhesiastes—the one who uses parrhesia—says everything that is on their mind, hiding nothing. By opening their heart and mind completely to other people, the speaker gives a complete and exact account of what they feel and think, unfettered by niceties or eloquence, so that the audience can comprehend exactly what the speaker thinks. One who uses parrhesia must be critical of everything, including themselves. They must not bend to popular opinion or cultural norms, even if this endangers their life. And publicly, the user of parrhesia must be subordinate to those being criticized.

In today's world we are lost between the rhetoricians and the false parrhesiastes.

Those who tell the truth—parrhesiastes—are many, but they are necessarily individuals. In a complex society, to be fully knowledgeable and honest you have to be a specialist, an expert. The non-experts are the rhetoricians: they persuade others that they are telling the truth, but they do not truly understand it. Unfortunately it is very difficult to tell the two apart, so the knee-jerk reaction is to disqualify all experts. Like a serpent that swallows its tail, we have dismissed the very individuals who can help us understand the truth.

The serpent that swallows its tail is the symbol of the alchemists, and history can teach us the fate of parrhesiastes: alchemists were another set of individual experts who were vilified throughout history. While the symbol of the Ouroboros—the snake eating its own tail—represents infinity and wholeness, it has some strange bedfellows.




Reference
Foucault, M. (Oct–Nov 1983). Discourse and Truth: The Problematization of Parrhesia (six lectures). University of California at Berkeley.