Saturday, January 27, 2018

The Religion of Aging: Finding Meaning

In a 2014 Pew Research Center study, nine out of ten adults in the United States reported believing in God, and more than half were "absolutely certain" God exists. One in five Americans pray every day, attend religious services regularly, and consider religion to be very important in their lives. Although these proportions have declined since an earlier 2007 study, religion still plays an important role in the lives of older people.
As adults get older they become more spiritual, and some become more religious. It is not only that religious or spiritual people tend to live longer (they do, for many reasons other than spirituality), but that people become more spiritual and religious as they age.
There is a great attraction to arguing for a spiritual interpretation of aging. Two religious gerontologists, Jane Marie Thibault and Richard Lyon Morgan, did just that in 2012 when they made themselves their own subject matter and wrote a book about their aging experiences. In a self-described pilgrimage into their third age, they interpret aging through religion. While we are growing up, God shows us how much he loves us by making us healthy, giving us pleasure through our bodies and through nature, and perhaps letting us experience the miracle of having children. As we age, it is time for us to show God how much we love him in return. God stops showing us how great he made us, and now it is our turn to reciprocate. In one example, through "dedicated suffering," we acknowledge our pain and dedicate it for the benefit of others. And it works: when people dedicate their suffering they report a reduction in pain. This spiritual switch—as older adults we are now responsible for the expression of gratitude—has some surprising support in the scientific field.
The Swedish sociologist Lars Tornstam developed a theory in 1989 arguing that older age brings about spiritual growth. His Gerotranscendence Theory suggests that older individuals—perhaps because of ill health—tend to experience a redefinition of the self and of their relationships with others. By redefining ourselves we become more spiritually aware. More recently, in 2009, the American Pamela Reed, developing her own Theory of Self-Transcendence, argued that individuals who face human vulnerability have an increased awareness of events greater than themselves. So is spirituality the answer to the increasing loss of control that we experience as we age?
Research tends to support this interpretation. In one review, the Portuguese researcher Lia Araújo and her colleagues report numerous studies showing that religion, spirituality, and personal meaning bring a broad range of mental and physical health benefits, including greater satisfaction with life and better coping with stress. In older age, existential issues—contemplating life and death—appear to gain increasing importance. There seems to be a growing preference for acquiring meaning from faith: the greater the challenge, the greater the religious or spiritual meaning we gain from the experience. By gaining a positive meaning of life and purpose through religion and spirituality, individuals also gain a higher level of life satisfaction. Regardless of physical health, developing a positive attitude toward life has positive outcomes. It is only when religion becomes an ineffective tool for explaining dramatic challenges that people start renouncing their religious convictions.
Christopher Ellison of the University of Texas at Austin and others have referred to this area of research as the "dark side of religion." Doubt in our beliefs can have very negative consequences. Doubt erodes one of the major functions of religion, which is to provide an explanation for why we are aging; such religious explanations are generally referred to as theodicies.
But we are always looking for a reason, a model of the world that is just, logical, and predictable. Religion has the extra facet of immortality—life in the afterworld—a comfort to those who have to confront the imminence of death. Whether we get this view of the world from religion, from science, or from intellectualizing, the overarching observation is that we need to have such a view. Everyone has an opinion on things that matter to them. Some simply don't call it religion, but having an explanation comes with the territory of being human.

© USA Copyrighted 2018 Mario D. Garrett

References
Araújo, L., Ribeiro, O., & Paúl, C. (2017). The role of existential beliefs within the relation of centenarians' health and well-being. Journal of Religion and Health, 56(4), 1111-1122.
Ellison, C. G., & Lee, J. (2010). Spiritual struggles and psychological distress: Is there a dark side to religion? Social Indicators Research, 98, 501–517.
Rogers, M. E. (1989). An Introduction to the Theoretical Basis of Nursing. Philadelphia: F. A. Davis
Rodin, J. (1986). Aging and health: Effects of the sense of control. Science, 233(4770), 1271-1276.
Thibault, J. M., & Morgan, R. L. (2012). Pilgrimage Into the Last Third of Life: 7 Gateways to Spiritual Growth. Upper Room Books.

Humility or Humiliation in Aging: It's Your Choice

It is personal when it happens to you. As much as we talk about changes in older age, they remain at a distance until they happen to you. Most of the time the loss of function happens fast, and we are unprepared. While most of us might recover from an initial loss, we soon have to face a different one. Little pieces of you are taken away, and our mind does not deal well with these losses. You did not plan for it, and even if you thought of this eventuality, when it happens to you it is different. It is personal and real.
We have a model of the world in our brain. Within this perfect heaven there is our avatar, an image of us, who we think we are. As we get older and frailer—and these usually come together—reality conflicts with the avatar that we have built. This model is important to us. Most of the time the model of the world, and the avatar representing us, functions well. We go about our daily lives without needing to be aware of this model; we behave in automatic mode most of the time. Until something goes wrong and the avatar can no longer do what it is supposed to do. The mental narrative that we have taken so long to build up suddenly needs to be rearranged and remodeled.
In aging, not long after the first such redefinition of our model—perhaps we realize that we can no longer read small print without prescription glasses—comes another onslaught of loss. The constant change and attrition requires us to repeatedly modify our model and our avatar. Aging is an existential danger to our model because it threatens how that model is supposed to function. Making these changes is difficult for everyone, since our model resists change; it has been a faithful portrayal of our reality for so long. The older we get, the more entrenched this model becomes. It is also doubly difficult in older age because there is so much variance among our peers. We delude ourselves that perhaps these attritions are only temporary and therefore we do not need to change our avatar just yet. There is always a lag between how old we are in reality and how old we see ourselves—a subjective age bias. Of course we are biased to see ourselves as younger.
Many theories exist for why we underestimate our age: we overestimate our abilities, our looks, and how satisfied we are in life, and we align our personality, attitudes, behavior, and interests with those of a much younger person. Some theories also suggest that there is an internal bias to be young. But these theories assume a conscious, if not willful, desire to stay young. Although all these theories are valid, there could be a simpler answer. There could be a lag, a time difference, between reality and how our model represents it. It takes time for us to reconcile our model with reality. The process is dynamic, and we are continuously fighting this change. This dynamic process has not gone unnoticed.
In psychology, by the 1950s Erik Erikson had developed the first personality theory that included older adults. Before then, most theories stopped at young adulthood. Erikson's eight stages of development come closest to explaining this constant fight we experience in older age. The final stage, emerging after age 65 and likely written by his wife Joan Erikson, contends that there is a fork in the road. At this fork, which Erikson called a "crisis," we either move toward ego integrity or head into despair. As dramatic as this crisis seems, it is emerging that such depictions are very close to the experience of aging.
By ego integrity Erikson means that we come to accept who we are: that we have only this life to live, and that we need to resolve our issues in order to be comfortable with where we are. Although seemingly diametrically opposed (ego versus non-ego), Lawrence Kohlberg's 1973 theory of moral development, later expanded to address older adults, includes a stage of self-transcendence, a "...contemplative experience of the nonegoistic or nonindividual variety" (pp. 500-501). Ego integration and non-ego seem to refer to the same concept, that of humility. The only salvation for older adults is becoming humble. John Cottingham in 2009 defines humility as "...a lack of anxious concern to insist on matters of status, a recognition that one is but one among many others, and that one's gifts, if such they be, are not ultimately of one's own making" (p. 153).
The alternative to humility is pride, when we are constantly fighting unresolved issues that continue to fester and create discord in our lives. Joan Erikson later formulated a ninth stage of very old age, starting in the eighties, when physical health begins to deteriorate and death becomes more real. She recognized that at this stage society similarly ups the ante: "aged individuals are often ostracized, neglected, and overlooked; elders are seen no longer as bearers of wisdom but as embodiments of shame" (p. 144). It seems that unless we submit ourselves to humility, the alternative is humiliation.
That is why it is personal. It's not just about accepting aging; it's that we have no choice. We either suck it up and become humble, or fight it and face a certain humiliation. By sucking it up we acknowledge our mortality and therefore our impermanence—our humility. If we fight it, we rally our pride and confront these changes with a certain outcome: failure and humiliation. Science tends to support this view. Neal Krause and David Hayward of the University of Michigan wrote that when it comes to humility, the people who live the longest are the ones who accept where they are in life. By becoming less of ourselves (ego-less), nature rewards us with more of ourselves (long life).
Someone has a dark sense of humor, and I hope that I live long enough to learn to appreciate it.

© USA Copyrighted 2018 Mario D. Garrett

References
Cottingham, J. (2009). Why believe? New York: Continuum.
Erikson, E. H. (1956). The problem of ego identity. Journal of the American Psychoanalytic Association, 4(1), 56-121.
Kohlberg, L. (1973). Stages and aging in moral development—some explanations. The Gerontologist, 13, 497–502.
Krause, N., & Hayward, R. D. (2012). Humility, lifetime trauma, and change in religious doubt among older adults. Journal of Religion and Health, 51(4), 1002-1016.
Teuscher, U. (2009). Subjective age bias: A motivational and information processing approach. International Journal of Behavioral Development, 33(1), 22-31.

Sunday, January 21, 2018

An apology to Paul Dirac.

The basis of mathematics is "one"
But "one" does not exist in reality
It remains a construct of the mind.
Mathematics is a product of psychology
Patterns and rhythms that seem godlike
Even to some, music to our ears
Patterns we can hear and enjoy, comfort
Comfort reflecting more what we seek
Mathematics being the sign of divinity
A divinity that we seek as comfort to our search

Perhaps physics is all wrong; as post-modernist scholars, we need to examine our psychology first before predicting the behavior of gods.

Monday, January 1, 2018

Theoretical Summation of Culture.

What is Culture? 
A cursory literature search turns up 16 different papers with exactly this title, "What is Culture?" Much more has been written about culture in general. Culture attracts interest for obvious reasons, as the concept seems to determine how we humans behave. However, we are still not sure what "culture" means. The level of confusion led Merriam-Webster to announce in 2014 that "culture" was its Word of the Year: everyone was querying the meaning of the concept. For a concept that is so important and so intuitive, it eludes concrete definition. Culture seems to have different meanings, covering a broad range of social influences that we are trying to describe.
As early as 1952, the American anthropologists Alfred Kroeber and Clyde Kluckhohn, while attempting to define culture, ended up with 164 different definitions. That was then. Nowadays everyone seems to enjoy the liberty of defining their own unique meaning of culture, and today it would be a daunting task to catalogue all the different definitions. Most definitions are unique, while others are amnesiac plagiarism.
Some of the differences in definitions emerge from different uses of the word. Taking a historical perspective, Kevin Avruch came up with three basic classes of definitions.
1. There is the culture that defines the ambitions of mankind, "high culture," in contrast to "popular culture," with popular culture seen as a failed and inferior culture emerging from the people, as opposed to high culture with its set of shared behaviors dictated by historical protocol. This has its roots in Matthew Arnold's Culture and Anarchy (1867). Having contrasting cultures inevitably leads to conflict, as captured by Antonio Gramsci's concept of hegemony between dominant and subordinate cultures. Hegemony refers to how one set of cultural rules is imposed on and accepted by another group, usually to the detriment of the second group. This gave rise to the concept of "sub" culture, as defined early on by the Chicago School, which interpreted sub-cultures as forms of deviance and delinquency. This leads back to the earlier interpretation of society by Émile Durkheim, the French sociologist who in the late 1800s described how different parts of a society have different functions—cultures—but argued that society was more than the sum of its parts.
2. In this same vein of thought, culture could also be seen as a gauge of how civilized a community is along a continuum: one measure of civilization. The American anthropologist Lewis Henry Morgan's influential scheme provided evidence for monogenesis, the theory that all human beings descend from a common source—as opposed to polygenism, which posits multiple and equally valid lines of development. In monogenesis, cultures evolve on one criterion only: from "savagery" through "barbarism" to "civilization." Such a simplistic determination was of course very popular. Similarly, Edward Tylor in Primitive Culture (1871) referred to a quality possessed by all people in all social groups, who nevertheless could be arrayed on a developmental and evolutionary continuum that assumes humankind is heading toward an ultimate sophisticated culture: a linear progression with Western culture at the pinnacle.
3. The third use of culture reacts against this monogenesis and is best exemplified by Franz Boas. Influenced by the eighteenth-century writings of Johann von Herder, Boas emphasized the uniqueness of the many and varied cultures of different peoples or societies. Boas also interjected relativity: we can only judge another culture from our own. Becoming a champion of post-modernism, which argues for the relativism of how we perceive everything, Boas undermined the idea of a linear definition of culture. We are not heading toward an ultimate goal of the "best" culture. Since cultures emerge from the uniqueness of their environments, one cannot differentiate high from low culture. Moralizing about cultures—"savagery" through "barbarism" to "civilization"—remains only one perspective, from "our" culture, and not an inherent feature of cultures. As early as 1948, Thomas Stearns (T.S.) Eliot, primarily known for his poetry, devoted a significant amount of time to defining culture. He mused that culture is attached to religion and, as a superorganic concept, evolves naturally from a community.
Others have attempted similar categorizations of the uses of culture. In 1976, the critic Raymond Williams reported that "culture is one of the two or three most complicated words in the English language." His definition of culture in Keywords is similarly based on three uses of the word: educating oneself, becoming "cultured"; culture as a group's shared way of living; and culture as an activity, as in doing something cultural. Again, the diversity of definitions is primarily based on the utility and use of the word: how the word culture is applied determines its definition.
All of these uses of the term culture refer to a common theme: culture is a way of living. There are certain values and traditions—religion, beliefs, shared ideas, habits, attitudes, expectations, norms, art, law, morals, customs—that are passed along from one generation to the next. Culture can be as broad as a language and as specific as a dialect or an inside joke. As a result, culture remains intricately tied to our place of residence. All humans express multiple cultures.
Sometimes the place we reside in and the community we share are distinct enough for a sub-culture to emerge. Culture helps community members avoid misunderstanding and minimizes conflict within that particular community. Conflict occurs when expectations are not met, as when people move into the community from outside and are unaware of the expectations dictated by the culture.
Culture is learned in a social setting. It is not inherited; it derives from one's social environment. The enigma of culture emerged when researchers attempted to segregate it as a distinct body of expectations that we can adopt. Acculturation takes a long time, if it can ever be fully complete, and only after acculturation can there be an understanding and acceptance of a different culture. Accepting and following a dominant culture allows for smoother social engagement. This is why acculturation is associated with numerous measures of psychological and physical wellbeing (e.g., increased life expectancy), but newcomers also inherit the negative aspects of the culture they adopt (e.g., obesity in the US).
But culture cannot be distinguished from human nature or from an individual's personality. Culture is what makes us human, and a particular kind of human. Feral children found living wild have all the mechanics of being human but none of the essence. Andrei Mihai reports that such children never fully integrate into society and have difficulty with basic language and civic protocol. Culture and its socialization are what make us quintessentially human, including our morals, language, and aspirations.
In a culture that honors individuality, our personality will reflect that (e.g., on a continuum from extrovert to introvert). In a culture that honors the common good, such a continuum does not make sense, and instead people lie along a different continuum (e.g., collectivism vs. individualism). In personality research based on the five dimensions (Openness, Conscientiousness, Extraversion, Agreeableness, and Neuroticism), cross-cultural variations exist. Our culture can and does determine our personality. Human nature is not purely biological but social.
How we behave is acquired through learning and interacting with other members of our culture. Even what we eat, how often, how much, and with whom is dictated by a set of rules enshrined in our culture. One extreme example is cannibalism. When Marvin Harris wrote Cannibals and Kings in 1977, the prevailing view was that culture was somehow independent of the environment. Harris made the connection through food. For example, the development of pork as a taboo food in ancient Egypt comes from the fact that pigs are poor grazers, destroy plants, and compete with humans for grain, while cattle, sheep, and many other domesticated animals consume grass without digging up the roots and also provide milk, transport, and labor. Harris notes that pigs were taboo in ancient Egypt, then among the Israelites, and continue to be forbidden by Islam. The culture that dictates what food to eat emerges from environmental considerations; making certain foods taboo enables a community to maximize its food production. Culture therefore resides as a bridge between environmental pressures and personal preferences. Culture also influences aspects of our behavior other than food. The emerging understanding is that culture allows a way of moderating environmental demands and community needs across time: a historic template, used by future generations, on how to behave. Culture succinctly encapsulates a protocol, a set of rules, transmitted to future generations in order to increase their chances of survival. These protocols are not suggestions; they dictate behavior. They are socially formed, socially acquiesced to, and socially transmitted.
Gary Ferraro in 1998 exposed the different levels of culture: national, regional, gender, generational, role, social class, employment, ethnicity, and many other spheres. Then there are the cultures among families, tribes, or clans; cultures distinguished by language, ethnicity, or religion; by social class; by political interest groups; and by elected membership (clubs). No person has a single culture. Cultures are the flippers in a pinball machine, paddles (norms) that direct the ball (behavior) into a desired place (conventional behavior). The volume and depth of these different cultures makes them unwieldy. The conclusion is that no two individuals share the same cultures. Such insight necessitates that instead of addressing cultures as distinct, we see all these different cultures as sharing a common heritage.
Let's assume that our distinction between an "I" and anything outside of me is contrived. There is a force that stops me from making this judgment: a natural force that pushes me to think in terms of "I." But even when I try to identify "me," I need a social context.
In 1982, John Turner argued that "individuals define themselves in terms of their social group memberships and that group-defined self-perception produces psychologically distinctive effects in social behavior." This socialization is what makes us distinct. If I need the social context to define "me," then culture—being the social arena where we define our norms of behavior—must be an integral aspect of who I am. My culture is both deterministic—it controls what I do—and an expression of who I am within a given environment.
Such analysis is not new. As early as the 1950s, Harry Sullivan argued that "…human beings are human animals that have been filled with culture—socialized…" (p. 323), arguing that culture is how we define ourselves as individuals. We seem to have a dual aspect of ourselves: both a social aspect and a personal self hold together my sense of self. Émile Durkheim proposed that humans are "homo duplex," with one existence rooted in biology and one in the social world of our culture. What is surprising is that our biology is also designed to integrate our social environment: there are specialized areas in our brain that "mirror" our environment.
In the 1980s, the Italian Giacomo Rizzolatti and his colleagues at the University of Parma first observed mirror neurons in monkeys. Although mirror neurons exist in most animals, in humans as much as 10 percent of neural cells are devoted to mirroring. A mirror neuron fires both when a person acts and when the person observes the same action performed by someone else. Such mirror neurons respond directly to what is observed outside: our brain responds to and mimics another person's behavior and activity. Culture is automatically transferred through our brain.
The accumulating evidence suggests that the body is a meeting place of interaction, a venue with the outside world—the geography, the community and significant others interact with the idea of self. Culture is how we explain this interaction—social influence—to ourselves.
Psychologists have long known this, especially in developmental psychology, which studies how children develop and learn.
The Russian Lev Semyonovich Vygotsky (1896-1934) founded cultural-historical psychology. He believed that children learn through play and through interacting with their environment. At the time there were three theories of how we learn: Constructivism, Behaviorism, and Gestaltism.
Constructivism: We need to mature first in order to learn; development always precedes learning. Championed by Jean Piaget, who referred to this as genetic epistemology, the theory proposes that we cannot learn unless we are developmentally ready to learn.
Behaviorism: Learning and development go hand in hand and occur simultaneously; learning is development.
Gestaltism: A symbiotic relationship between learning and development, where development influences learning and learning promotes development.
Vygotsky argued the opposite of Piaget's genetic epistemology: learning precedes development. In this sense he is closer to the Behaviorists. He argued that "we do not learn because we develop; we develop because we learn." Vygotsky's "zone of proximal development" (ZPD) describes the interaction that a child has with their culture. By interacting with their culture in the ZPD, a child learns skills that go beyond the child's actual developmental or maturational level.
Learning in the ZPD is accomplished through both informal conversations and formal schooling. Adults pass on to children ways of interpreting the world. As children and adults interact with each other—a process later defined as scaffolding, supporting children as they learn and then removing the scaffold so that they perform the skills by themselves—adults share meanings about objects, events, and human experiences. Adults are able to mediate and transmit meanings through language, math, art, music, and behavior (e.g., religion).
In The Ecology of Human Development, another Russian-born developmental psychologist, Urie Bronfenbrenner, transformed Vygotsky's views on culture into a model based on the environment. Whereas Vygotsky's ZPD is cultural, Bronfenbrenner calls his spheres environmental, and he extended their influence: his ecological model expands Vygotsky's ZPD to four spheres of influence on the child's development, taking in the child's global environment. Bronfenbrenner was the co-founder of the Head Start program, a social program based on this ecological model that provides comprehensive early childhood education, health, nutrition, and parent-involvement services to low-income children and their families.
These range from the microsystem, which comprises the family and school; to the mesosystem, which describes the interaction of the family with social structures; to the exosystem, which involves interaction with less frequent contacts such as relatives, friends, parents' work colleagues, religious leaders, and neighbors; and lastly the macrosystem, which defines the broader culture of economy, customs, and bodies of knowledge.
Bronfenbrenner argues that "no society can long sustain itself unless its members have learned the sensitivities, motivations, and skills involved in assisting and caring for other human beings" (p. 53). Both Vygotsky's and Bronfenbrenner's theories describe spheres of influence that are equally important. Culture in this context both defines us and determines how and what we learn. We, in turn, pass on this body of knowledge, this culture, to younger cohorts. There is a symbiotic relationship, and the central theme that makes culture humanistic is a caring curriculum.
Accepting that there is not just a "me" inside us but also a "we" gives us a more concise understanding of how culture determines behavior and outcomes. My individuality is no longer solely about me but about my culture. Émile Durkheim argued that there will be conflict between the biological and the cultural aspects of the homo duplex: the cultural aspect of "me" will conflict with my own impression of "self."
This is all esoteric stuff, but it leads to some very practical conclusions. We can never know the culture of another person. The culture of a group of people is tied to a time and a place, and we can never know that culture unless we have also experienced it ourselves; becoming culturally attuned remains elusive. A more radical awareness is that we learn through "scaffolding," a network of cultural interaction that may no longer be evident in the present day.
Studying culture as a psychological feature might result in a better understanding of ourselves as a product of our environment. Sometimes culture is a visible expression of that relationship, but mostly it is hidden, a historic event that cannot be traced back.
© USA Copyrighted 2018 Mario D. Garrett
References
Adler, N. (1997) International Dimensions of Organizational Behavior. 3rd ed. Ohio: South-Western College Publishing.
Apte, M. (1994) Language in sociocultural context. In: R. E. Asher (Ed.), The Encyclopedia of Language and Linguistics. Vol.4 (pp. 2000-2010). Oxford: Pergamon Press.
Avruch, K. (1998) Culture and Conflict Resolution. Washington DC: United States Institute of Peace Press.
Baumeister, R. F. (1998). The self. In D. T. Gilbert, S. T. Fiske, & G. Lindzey (Eds.), Handbook of social psychology (4th ed., pp. 680-740). New York: McGraw-Hill.
Bell M.G. (2010). Agent Human: Consciousness At The Service Of The Group. Kindle edition.
Berg, R. (1996). The indigenous gastrointestinal microflora. Trends in Microbiology, 4(11), 430–435. doi:10.1016/0966-842X(96)10057-3. PMID 8950812.
Bianconi, E., Piovesan, A., Facchin, F., Beraudi, A., Casadei, R., Frabetti, F., ... & Canaider, S. (2013). An estimation of the number of cells in the human body. Annals of human biology, 40(6), 463-471.
Bronfenbrenner, U. (2009). The ecology of human development. Harvard university press.
Eap, S., DeGarmo, D. S., Kawakami, A., Hara, S. N., Hall, G. C., & Teten, A. L. (2008). Culture and personality among European American and Asian American men. Journal of Cross-Cultural Psychology, 39(5), 630-643.
Ferraro, G. (1998) The Cultural Dimension of International Business. 3rd Edition. New Jersey: Prentice Hall.
Hofstede, G. (1991/1994) Cultures and Organizations: Software of the Mind. London: HarperCollinsBusiness.
Hofstede, G. (2001) Culture's Consequences. Comparing Values, Behaviors, Institutions, and Organizations across Nations. 2nd ed. London: Sage.
Lustig, M. W., & Koester, J. (1999) Intercultural Competence. Interpersonal Communication across Cultures. 3rd ed. New York: Longman.
Matsumoto, D. (1996) Culture and Psychology. Pacific Grove, CA: Brooks/Cole.
Saville-Troike, M. (1997) The ethnographic analysis of communicative events. In: N. Coupland and A. Jaworski (eds) Sociolinguistics. A Reader and Coursebook. pp.126–144. Basingstoke: Macmillan.
Schein, E. (1984) Coming to a new awareness of organizational culture. Sloan Management Review 25(2): 3–16.
Schein, E. (1990) Organizational culture. American Psychologist 45(2): 109–119.
Smith, P. B., & Bond, M. H. (1998) Social Psychology across Cultures. London: Prentice Hall Europe.
Spencer-Oatey, H. (2008) Culturally Speaking. Culture, Communication and Politeness Theory. 2nd edition. London: Continuum.
Triandis, H. C. (1994) Culture and Social Behavior. New York: McGraw Hill.
Trompenaars, F., & Hampden-Turner, C. (1997) Riding the Waves of Culture. Understanding Cultural Diversity in Business. 2nd ed. London: Nicholas Brealey.
Vygotsky, L. S. (1980). Mind in society: The development of higher psychological processes. Harvard university press.
Wood, D., Bruner, J. S., & Ross, G. (1976). The role of tutoring in problem solving. Journal of child psychology and psychiatry, 17(2), 89-100.

Žegarac, V. (2007). A cognitive pragmatic perspective on communication and culture. In H. Kotthoff & H. Spencer-Oatey (Eds.), Handbook of Intercultural Communication. Berlin: Walter de Gruyter, 31–53.

Sunday, December 31, 2017

How Much Does your Soul Weigh?

On 10 April 1901 Duncan MacDougall, together with four other physicians, was waiting for six people to die. In a hospital in Dorchester, Massachusetts, each patient’s entire bed was placed on an industrial-sized Fairbanks scale sensitive to within two tenths of an ounce (about 5.7 grams). After a few hours of waiting, the patients died and something strange happened.

As soon as they died, the scales dropped. The bodies lost weight. MacDougall’s conclusion was that a human soul had left the body and registered a loss of 21 grams, the weight of a mouse. Repeating the experiment with dogs resulted in no loss of weight, which to MacDougall indicated that dogs have no soul to lose.

If the soul was material, MacDougall reasoned, we should be able to measure it. Four years later the New York Times reported in a front-page story that MacDougall had tried to take X-rays of the soul escaping the body at the moment of death. MacDougall died in 1920, at the young age of 54, leaving behind many questions and many charlatans ready to capitalize on his scientific legacy.

Following the publication of these experiments—both in the popular media and in academic journals—his fellow physician Augustus Clarke criticized them. Clarke argued that the loss of 21 grams could be accounted for by expiration: at the time of death, as the lungs are no longer cooling the blood, there is a sudden rise in body temperature, causing a subsequent rise in evaporative sweating. Since dogs do not have sweat glands, and therefore cannot lose weight in this manner, Clarke argued that the experiments were flawed. There is evidence to suggest that MacDougall knew of this alternate interpretation of his experiments beforehand.

Measurement is at the heart of the scientific method. The medical historian Mirko Dražen Grmek wrote about the scientist Santorio Santorio (1561-1636), who diligently weighed and measured everything. In particular, Santorio weighed all the food and drink that he ingested. He also measured all that came out the other end, the feces and urine. Whatever loss remained after weighing himself had to be due to something else. For every eight pounds consumed, Santorio found that he excreted only three pounds. Five pounds of food and drink could not be accounted for; Santorio attributed the difference to what he called insensible perspiration.
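Santorio’s reasoning is simple mass-balance arithmetic. A minimal sketch, using the approximate figures from the paragraph above (the variable names are mine):

```python
# Santorio's mass balance: whatever is ingested but neither excreted nor
# retained as body weight must leave some other way -- the loss he
# attributed to "insensible perspiration".
intake_lb = 8.0     # food and drink, weighed before ingestion
excreted_lb = 3.0   # feces and urine, weighed after

unaccounted_lb = intake_lb - excreted_lb
fraction = unaccounted_lb / intake_lb

print(f"Unaccounted: {unaccounted_lb:.0f} lb of every {intake_lb:.0f} lb consumed")
print(f"That is {fraction:.1%} of intake")
```

By this accounting, well over half of what Santorio consumed simply vanished into the air.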

It was not until 1862 that the hygienist Max von Pettenkofer constructed an insulated room designed to measure the exact amount of evaporative sweat and heat the body generated. A hygienist promoting good sewage systems and a public health approach to medicine, Pettenkofer designed a machine—a respiration calorimeter—for measuring the heat given off by the body’s chemical reactions and physical changes while a person was at rest, standing and walking. He measured the weight of this metabolic energy use.

All the evidence was already there to suggest that our metabolism—the energy expended in maintaining bodily functions—generates evaporative loss of weight. And MacDougall knew this. In his original paper he reports: “He [the dying patient] lost weight slowly at the rate of one ounce per hour due to evaporation of moisture in respiration and evaporation of sweat.” But he also addressed this loss as an explanation for the weight change at death: “This loss of weight could not be due to evaporation of respiratory moisture and sweat, because…this loss was sudden and large…” To MacDougall, it was undeniable that something else was taking place.
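A quick back-of-the-envelope check (my arithmetic, not MacDougall’s) shows why he felt entitled to rule out evaporation for the drop at death:

```python
# MacDougall reported a gradual evaporative loss of about one ounce per hour.
OUNCE_G = 28.35                  # grams per avoirdupois ounce
evap_g_per_min = OUNCE_G / 60    # roughly 0.47 g lost per minute

sudden_drop_g = 21.0             # the drop registered at the moment of death
minutes_needed = sudden_drop_g / evap_g_per_min

print(f"At that rate, evaporation would need about {minutes_needed:.0f} minutes "
      f"to shed {sudden_drop_g:.0f} g -- not the moments MacDougall observed")
```

Of course, this arithmetic only shows that gradual evaporation cannot explain a sudden drop; it says nothing against Clarke’s rival explanation of a sudden surge in sweating.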

True science can only be conducted through experimentation. MacDougall’s theory, that there had to be “continuity” in life after death—a soul—was the incentive for his experimentation. But the theory assumes that we know when people die. As strange as that might seem, there is no easy definition of death.

Our definition of death is a legal rather than a biological one. In medicine it is a prognosis—a prediction—rather than a diagnosis—a confirmation. Having no brain or heart activity indicates that the patient is unlikely to come back alive, but it is by no means proof that the body itself is dead. Organs can still be harvested while the patient is legally dead. The legal definition of death protects surgeons from liability when they are harvesting organs for transplantation.

In 1968—a year after the South African surgeon Christiaan Barnard performed the world’s first human heart transplant—Stanford University surgeon Norman Shumway performed the first U.S. heart transplant from a brain-dead donor. These were nearly identical surgical procedures, yet whereas Barnard’s surgery was received with adulation, Shumway nearly ended up being prosecuted for his. John Hauser, the Santa Clara County coroner, met Shumway with a threat of prosecution. The alleged infringement was that the donor had not had an autopsy performed to confirm that he was dead, since an autopsy would have ruined the organs for transplantation. Surgeons were being accused of being killers. As a result of this threat of prosecution, organ donations stopped or slowed dramatically. It was like an old Perry Mason TV episode, with the prosecutor standing in front of the jury, pointing an index finger at the transplant surgeon and declaring: “Ladies and gentlemen of the jury, there is your killer. That surgeon killed my patient.”


If we are to use the Pope’s language—that death needs to involve “decomposition,” “disintegration,” and “separation”—then it would truly stop all organ transplantation. Without the legal criterion of brain death, under which the organs remain viable, there would be a dramatic deterioration in the quality of organs that can be harvested and transplanted. According to the World Health Organization, in 2014 some 120,000 solid organs were transplanted—more than 80,000 kidney, 26,000 liver and 6,500 heart transplants—across 93 countries. After Austria, the United States has the highest per capita rate of transplants. Organ transplantation extends lives for a significant number of people. But we cannot escape the fact that this is made possible by a legal definition of death and not a biological one. If organs were truly dead, they could not be harvested and brought back to life again. However, the reliance on a legal definition of death hinders a more scientific study of the biology of death. It is surprising to find how little we know about death.

The British researcher Sam Parnia argues that many people who can be classified as legally dead from heart attacks or blood loss could be resuscitated up to 24 hours after they “die.” Parnia has been studying those who have no heartbeat and no detectable brain activity for periods of time. While in this state the “dead” patients are given names of cities, and when—sometimes, if—they recover, they are asked to name cities “at random.” Parnia found that patients are more likely to choose the same cities they were exposed to while unconscious—while legally dead. It seems that when we are dead we are still aware, although not conscious.

Pozhitkov and colleagues in 2017 found that death is not just a shutting down but an orchestrated event. The authors found that mRNA transcripts of 1,063 genes became significantly more abundant after death—some even 96 hours after death. And the activity was not evenly timed: while most of it increased 30 minutes after death, some increased only a day or two later. These genes are related to stress, immunity, inflammation, apoptosis, transport, development, epigenetic regulation and cancer. We might be as ignorant of the biology of death as we are of the creation of life.


As with the MacDougall studies, there is a problem of small samples in these studies too. But such problems can eventually be overcome with better research design.

Weighing the soul might be complicated if we do not know when we actually die and the soul departs. There is increasing interest in both defining death and capturing the process. But evidence is scant, and the methods used to examine death leave room for many errors and misinterpretations. Many unsubstantiated reports exist of souls departing the body—by Konstantin Korotkov, Eugenyus Kugis, Vitaliy Khromov and others—that purport to repeat MacDougall’s findings, including photographic evidence. But none are published in scientific journals.

We have a great interest in “proving” things. The problem with science is that it is necessarily finicky with details, and the problem with belief is that it is necessarily not. Science is just a method, without an answer. We are always refining the answer, and the answer can never be completely correct. Belief, on the other hand, is an answer without a method. It is always correct because we cannot test it and improve upon it.

Whenever we mix the two together—science and belief—both sides get muddled. But this space is where real science resides: in that uncomfortable area where we do not know what the outcome might be. Within this muddled space, soul searching might attain a new meaning.

© USA Copyrighted 2017 Mario D. Garrett

References
Grmek, M. D. (1952). Santorio Santorio i njegovi aparati i instrumenti. Jugoslavenska akademija znanosti i umjetnosti.
Kuriyama, S. (2008). The forgotten fear of excrement. Journal of Medieval and Early Modern Studies, 38(3), 413-442.
MacDougall, D. (1907). Hypothesis concerning soul substance together with experimental evidence of the existence of such substance. American Medicine, April 1907.
Parnia, S., Waller, D. G., Yeates, R., & Fenwick, P. (2001). A qualitative and quantitative study of the incidence, features and aetiology of near death experiences in cardiac arrest survivors. Resuscitation, 48(2), 149-156.
Pozhitkov, A. E., Neme, R., Domazet-Lošo, T., Leroux, B. G., Soni, S., Tautz, D., & Noble, P. A. (2017). Tracing the dynamics of gene transcripts after organismal death. Open Biology, 7(1), 160267.


Sunday, December 10, 2017

Medicare Cuts in the 2018 Budget

The new tax bill Congress is passing will increase the deficit. Although this might seem antithetical to Republican doctrine, behind the apparent profligacy lurks a clever ploy: triggering an automatic program that reduces funding to most social programs, including Medicare. Known euphemistically as PAYGO, the Statutory Pay-As-You-Go Act of 2010 is a rule that requires any federal deficit increase to be paid for with spending cuts to social programs. With the exception of Social Security, unemployment benefits and food stamps, most mandatory spending programs—some 228 of them—would be cut or eliminated. Specifically, Medicare would be cut by 4 percent a year. Medicare is the most important program for older people after Social Security.
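To get a feel for the scale involved, here is a rough, illustrative calculation; the $700 billion baseline is my own round-number assumption for annual Medicare outlays, not a figure from the bill:

```python
# Illustrative sketch of a 4% PAYGO sequestration of Medicare.
# The outlay below is an assumed round number, not an official figure.
medicare_outlay_bn = 700.0   # assumed annual Medicare spending, in $ billions
paygo_cut_rate = 0.04        # PAYGO caps the annual Medicare cut at 4%

annual_cut_bn = medicare_outlay_bn * paygo_cut_rate
print(f"A 4% cut of ${medicare_outlay_bn:.0f}bn is roughly ${annual_cut_bn:.0f}bn a year")
```

On an assumption of that order, the automatic cut would run to tens of billions of dollars every year.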

We got here because people—and some gerontologists—are ignorant of what really helps older adults and of how we achieved the modicum of support they have. Without civic engagement and social protest, such laws breeze through without even a mention that Medicare is about to be cut.

Gerontology is full of experts. It is one of the richest disciplines, with academicians and researchers studying the whole spectrum from genetics to policy, from biology to geography, from architecture to neurology. They are all gerontologists. So it is common to find disagreements, but we live happily in our own silos. How do we improve aging? We try to communicate the problems associated with aging in order to bring about change.

Most communication techniques are embellishments of the 1954 Schramm Model of Communication. Wilbur Schramm defined communication as a two-way street in which sender and receiver take turns sending (encoding) and receiving (decoding) a message. We need messages that can be understood (decoded easily). And this is what eight national aging-focused organizations—AARP, American Federation for Aging Research, American Geriatrics Society, American Society on Aging, Gerontological Society of America, Grantmakers in Aging, National Council on Aging, and the National Hispanic Council on Aging—tried to do when they banded together and hired FrameWorks to create a strategy for helping the public understand aging issues. The result was a bible for an aging future. Like all bibles it is populated by don’ts:

  1. Don’t lead with the story of demographic shifts.
  2. Don’t talk about aging as a “civil rights issue.”
  3. Don’t use language that refers to older people as “other.”
  4. Don’t overdo the positivity.
  5. Don’t cross-contaminate efforts to build public will with “news you can use.”

FrameWorks simplifies scientific and societal messages to a point that the general public can understand, in order for them to act positively. The problem with such simplification is that it is false. Changing attitudes does not necessarily change behavior. We believe that communicating a good message changes attitudes and brings about concrete changes. We therefore also believe that laws are enacted as acts of benevolence. But this is misguided, as we are witnessing right now with PAYGO. “Reframing Aging” and “Disrupting Aging” are a ruse because they simplify a process that is messy and volatile, and they exclude the participation of individuals in civil disobedience. Worse still, these approaches deny social activists their true worth in our political world. What changes and improves conditions for older adults are laws that are enacted, implemented and enforced. And those steps are accomplished (or not) through civic engagement.

A livable income remains the linchpin of wellbeing among older adults. Income, especially in the United States, increases access to affordable health care, housing, transportation and food, at a minimum. And we got here through the single enactment of the 1935 Social Security Act. The act was not some kind of reframing or disrupting of aging. It was enacted because there was civil unrest and a swell of support for alternate provisions. By focusing solely on ageism and seeing the problem as a public relations issue, FrameWorks misses one of the tenets of an aging reality: heteroscedasticity. As we get older we become, as a group, more varied and different from each other—a schism as wide as that between Donald Trump and Noam Chomsky. FrameWorks remains at a loss in representing these two extremes.

Reframing, disrupting, renewing, or any other public relations exercise cannot address aging: it cannot understand the changes and needs, develop effective responses and tackle the problems associated with aging at an individual or community level. That thinking is nonsense. Neither Trump nor Chomsky complains of ageism. The obvious reason is that they are at their zenith. Their basic civic needs are provided for. Their other very vocal issues—however grave and important—have nothing to do with age. Aging becomes a policy issue ONLY when individuals are at their lowest—their nadir.

The nadir for older adults is similar to that for other ages. It includes provision for shelter, health, food and income. You cannot have other ambitions before meeting these basic requirements. Right now those basic requirements are unmet for an increasingly large minority of the older adult population. Social gerontologists focus on this vulnerable and abused group. The answer to how to help them is not a reframing of issues but blue-collar provision of services. And services are created through policy.

We have been here before. During economic failures older adults are the worst hit. The Great Depression of the 1930s followed previous economic collapses in the 1840s and again in the 1890s. Poverty among older adults grew dramatically, so that by 1934 over half of older adults in America lacked sufficient income to be self-supporting. They needed charity to survive. State welfare pensions were non-existent before 1930, and the state pensions that later developed provided only 65 cents a day to about 3% of older adults. Millions of older people were homeless, hungry and desperate. Millions more were unemployed. By some estimates more than two million adult men—referred to as hobos, travelling workers, a word likely derived from the term hoe-boy, meaning “farmhand”—wandered aimlessly around the country. Banks and businesses failed. From this morass of civil deprivation rose one of the most important pieces of legislation: the 1935 Social Security Act, which in 1965 spawned Medicaid and Medicare, and which remains the bedrock of services for older adults. No single act before or since has improved older adults’ wellbeing as much.

Social Security Act

The Social Security Act—passed by the administration of President Franklin D. Roosevelt (FDR) in 1935—created a right to a pension in old age and an insurance against unemployment. This legislation was not passed because of the benevolence of Congress, or that of FDR (who won in 1932 and 1936). The act was passed because there was civil unrest and the threat of further social upheaval.

Workers rose up, and although individual uprisings were ineffective, en masse they led even the oligarchs of the time and the Supreme Court justices to back down. There are other interpretations of history, but a strong case can be made that civil uprising created dramatic political choices at the time. The era was characterized by worldwide turmoil that gave rise to communism, anarchism, fascism and National Socialism—Hitler, Mussolini, Gandhi, Lenin/Trotsky/Stalin. Here in the United States it was federalism, as expressed through the many “alphabet agencies” created under the New Deal. Federalism emerged not in response to civic unrest but in competition with it. It managed to subdue it.

Before the Great Depression the poor had already established a precedent of marching on Washington, D.C. to express their ire. In the 1894 march of Coxey’s Army, the industrialist Jacob Coxey organized tens of thousands of the unemployed to march on Congress. Although this movement fizzled, Coxey later became an advocate of public works as a remedy for unemployment. But it was the Great Depression that awakened the masses. The story remains scattered among the literature. Six social movements have been etched in history and defined the New Deal, whether competing with it or promoting it.

1.     With the slogan “Every Man a King,” Governor and later Senator Huey Long wanted the Federal government to guarantee everyone over age 60 an old-age pension, while every family would be guaranteed an annual income of $5,000. He proposed to fund this by limiting private fortunes to $50 million, legacies to $5 million, and annual incomes to $1 million. By 1935 the movement had 27,000 local clubs with 7.7 million members.

2.     The Long Beach physician Francis E. Townsend started the Townsend Movement. Long Beach, California was considered the “geriatric capital” of the United States at the time, with over a third of its residents being elderly. After finding himself unemployed at age 67 with no savings and no prospects, Townsend proposed that the government provide a pension of $200 per month to every citizen age 60 and older, funded by a 2% national sales tax. Within a few years there were 7,000 Townsend Clubs around the country with more than 2.2 million members.

3.     The fire-and-brimstone movement takes its name from the radio preacher Father Charles E. Coughlin, who railed against the Social Security Act as he did against FDR, international bankers, communists, and labor unions. In 1936, Coughlin, along with Townsend and the remnants of Huey Long’s Share the Wealth movement, joined to form a third party to contest the presidential election in the hope of preventing President Roosevelt from being re-elected. They failed, but the preacher had some 35-40 million listeners.

4.     Upton Sinclair, a Californian novelist and social crusader, drafted a program called End Poverty in California (EPIC). Its 12-point program included a proposal to give $50-a-month pensions to all needy persons over 60 who had lived in California for at least three years. Using EPIC as his mandate, Sinclair ran as the Democratic nominee for governor in 1934, an election he lost.

5.     By 1938 there were approximately eighty different old-age welfare schemes competing for political support in California. The culmination of these economic propositions was the Ham & Eggs movement, named in response to a flippant put-down that it advocated for a common meal. Ham & Eggs was started by the radio personality Robert Noble. Based on the writings of the Yale professor Irving Fisher, the movement demanded that the state issue $25 warrants each Monday morning to every unemployed Californian over the age of fifty. With more than 300,000 members and many more supporters, it quickly grew into a movement. Although the organization was later co-opted by the Allen brothers, advocating “$30 every Thursday,” support for the program remained resilient. Even after the passage of the Social Security Act, in 1938 the successful Democratic candidate for governor, Culbert Olson, openly supported the plan, and an initiative to adopt the Ham & Eggs plan as California state policy was placed on the ballot twice (1938 and 1939). Both propositions failed.

6.     In Ohio the Bigelow Plan, named after the Reverend Herbert S. Bigelow, proposed a state amendment to guarantee an income of $50 a month ($80 for married couples living together) to the unemployed over sixty years of age. He proposed that funding come partly from increased taxes on real estate (a 2% increase on land valued at more than $20,000 an acre) and partly from an income tax equal to one-fourth the federal income tax paid by individuals and corporations. The plan garnered nearly half a million votes before it was defeated.

All of these movements at times competed against the New Deal that FDR was pushing. There remains some resilient misunderstanding of the benefits of the New Deal. Most picture it as a battle between good and evil, the benevolent against the greedy, the globalist against the small business. We have been here before. The true story was messier then, as it is now.

When Kim Phillips-Fein wrote Invisible Hands: The Businessmen’s Crusade Against the New Deal, the impression was that the New Deal was somehow transformative for the good. But at the time, the New Deal was anything but positive. Phillips-Fein has shown that unemployment during the New Deal remained high, at around 17% (1934-40), and the economy remained depressed, especially for African Americans and especially in the South. Federal income taxes were tripled, liquor taxes were raised, new payroll taxes were introduced, and farm foreclosures were high (mainly among African American farmers). With more than 3,728 Executive Orders, the New Deal has been argued to have delayed recovery. It seems that the Social Security Act kept us lingering longer in depression. Only after the Second World War did the economy and public welfare improve. Despite this background, the 1935 Social Security Act for the first time provided a national safety net for older adults and transformed how we think about aging in ways that still reverberate today.

The Social Security Act became a vehicle for social programs. In 1965, with the addition of Medicaid—health care for the poor and disabled—and Medicare—health care for older adults—the social package was complete. Although Social Security is neither exclusively a social program nor an insurance program, so far it has resisted change. Until now.

What will protect and improve these services for older adults is not a reframing exercise but a swell of civic protest and civic engagement that exposes and shames the architects of a policy that will happily sell out the future of our children (the deficit increase) and hit the poorest and most vulnerable members of our society (Medicare recipients), with only a murmur of protest from aging-focused organizations. Without protests to halt the cut to Medicare, no amount of reframing will ever reverse the damage that will start over the next few months.



© USA Copyrighted 2017 Mario D. Garrett



Resources

Carlie, M. K. (1969). The politics of age: interest group or social movement?. The Gerontologist, 9(4_Part_1), 259-263.

Cushman, B. (1994). Rethinking the New Deal Court. Virginia Law Review, 201-261.

Phillips-Fein, K. (2009). Invisible hands: The making of the conservative movement from the New Deal to Reagan. W. W. Norton & Company.

Sunday, December 3, 2017

Driving While Old

In the United States there are more older-adult drivers on the road than ever, and as a result many will end up in hospitals.
In 2015 there were more than 47.8 million licensed drivers ages 65 and older in the United States: the fastest-growing driving population. With this increase we are also seeing more accidents. That same year 6,800 older adults were killed—compared with 2,333 teens ages 16–19—and more than 260,000 were treated in emergency departments for motor vehicle crash injuries.
A quick review of the National Institute on Aging website on older drivers provides a simplistic answer. The page addressing older adults and driving includes such enlightened subheadings as: Stiff Joints and Muscles; Trouble Seeing; Trouble Hearing; Dementia; Slower Reaction Time and Reflexes; Medications. It is not surprising, therefore, to see that fatal crashes per mile traveled increase the older the driver is—particularly for males. It seems that these diminished physical capacities have direct negative consequences for driving.
Despite this obvious conclusion—that diminished physiology results in more accidents—the evidence is not so clear-cut.
A 2015 report by the Insurance Institute for Highway Safety suggests that the increased fatalities are more likely due to increased susceptibility to injury and medical complications than to an increased risk of crashing. Older people are more likely to be killed when in an accident. Frail bodies, as well as driving older and less safe cars, are to blame. There are also many older pedestrian deaths, which do not involve driving at all.
Older drivers might have impaired capabilities, but they are not all impaired drivers. In fact they are safer than some younger groups. In general, older drivers are more likely to use seat belts, tend to drive when conditions are safest, and are less likely to drive under the influence of alcohol. In comparison, teen drivers—at the zenith of their physiological prowess—have a higher rate of fatal crashes, mainly because of their immaturity, lack of skills and lack of experience. It is not all about biology.
Teenagers have taught us that driving a car requires more than just physical attributes. Even if we just focus on the most obvious, vision, the results are surprising.
Cynthia Owsley and her colleagues at the Department of Ophthalmology, University of Alabama, found that the best predictor of accidents was not visual acuity but a combination of early visual attention and mental status: drivers with these problems had 3-4 times more accidents (of any type) and 15 times more intersection accidents than those without them. Driving, it seems, primarily requires a sense of spatial awareness—knowing what is around you and predicting how objects and people are moving. This perceptual capacity is known as the “useful field of view”—the area from which you can take in information with a single glance.
The psychologist Karlene Ball and her colleagues at Western Kentucky University reported that older adults with substantial shrinkage in the useful field of view were six times more likely to have a crash. What was surprising was that eye health, visual sensory function, cognitive status, and age—although they all correlated with crashes—were poorer predictors of crash-prone older drivers. Our perception, and how well we can predict the immediate environment, matters more than having excellent vision.
Our useful field of view narrows with age. We take in less of the visual field in front of us, resulting in greater susceptibility to accidents. This is not a negative in itself, although it has negative consequences. It is the result of years of excellent driving, of training our brain that we no longer need to concern ourselves with peripheral events. We are such good drivers. As a result our peripheral view has become unimportant, and we have erroneously eliminated that aspect of driving at precisely the time it becomes important, because we have started losing other sensory sharpness.
But luckily there are ways to enhance our perception. There are good computer-based tools for improving the useful field of view and retraining our brain to drive more safely. These studies have shown that, after training, drivers make a third fewer dangerous driving maneuvers, can stop sooner when they have to, and feel greater mastery of driving in difficult conditions—such as at night, in bad weather, or in new places. All of which translates to a reduction in at-fault crash risk of nearly half. This is all good news that should ensure older drivers can keep their licenses longer and, more importantly, drive more safely despite diminished physiological capacities.


© USA Copyrighted 2017 Mario D. Garrett 

References
Ball, K. K., Roenker, D. L., Wadley, V. G., Edwards, J. D., Roth, D. L., McGwin, G., ... & Dube, T. (2006). Can High‐Risk Older Drivers Be Identified Through Performance‐Based Measures in a Department of Motor Vehicles Setting?. Journal of the American Geriatrics Society, 54(1), 77-84.
Centers for Disease Control and Prevention, National Center for Injury Prevention and Control. Web-based Injury Statistics Query and Reporting System (WISQARS). Atlanta, GA: CDC; 2017 [cited 2017 Nov 29]. Available from URL: https://www.cdc.gov/injury/wisqars/index.html
Insurance Institute for Highway Safety (IIHS). Fatality facts 2015, Older people. Arlington (VA): IIHS; November 2016. [cited 2016 Dec 21]. Available from URL: http://www.iihs.org/iihs/topics/t/older-drivers/fatalityfacts/older-people/2015
Owsley, C., Ball, K., Sloane, M. E., Roenker, D. L., & Bruni, J. R. (1991). Visual/cognitive correlates of vehicle accidents in older drivers. Psychology and aging, 6(3), 403.

