Opinion – Brain Blogger Health and Science Blog Covering Brain Topics

Is Social Media the Bad Guy? Redefining Beauty in a Digital World

We’re living in an age of hyper-connectivity where social media is being widely used by almost every age group in the world. It’s connected people from all corners of the planet and given us the opportunity to have global conversations about practically any subject, event, or news piece.

However, many mental health and behavioral experts believe that social media has had a negative impact on the psychological well-being of those who use it because it gives people the illusion of being popular based solely on how many “likes” and “friends” they have on their profiles.

Psychologists have also observed that social media exacerbates the tendency of frequent users to develop a skewed impression of the world, one which is seldom accurate or healthy. Young girls and women, for example, may develop unrealistic standards for their looks and bodies based on what they see on social media.

But instead of labeling social media as the bad guy, I see it as a double-edged sword. The eventual effect it has on your life comes down to how you use it and for what purpose. The Internet is a neutral and open platform that levels the playing field when it comes to accessing knowledge that could help us live healthier, more productive, and more fulfilling lives.

Anyone who wants to avoid the negative impact that social media could have on their self-image needs to become more conscious of their media diet. Following social media accounts and blogs run by people and institutions that are shallow and appearance-focused, such as Instagram models or celebrity fashion and gossip profiles, can do real harm to those who aren’t mindful of the probable impact, especially on a subconscious level.

The negative impact of social media can be avoided if people are guided towards adopting a more empowering and all-encompassing standard of beauty, one which includes all aspects of a person: intellect, aspirations, passions, talents, and morals.

In this way, people will be naturally drawn towards developing an identity that isn’t based solely on outward appearances but on character. This, in turn, will encourage the use of social media for noble purposes that expand the mind and provide a platform to express creative potential and make a difference. In other words, we need to take an inside-out approach when it comes to combating the potentially harmful effects of social media, or of any other form of media.

Image via pixel2013/Pixabay.

Mental Health is Not Just the Absence of Mental Illness

In an increasingly globalized and mediatized world, in which mental illness is one of society’s most discussed cultural artifacts, Colleen Patrick-Goudreau’s words ring out: “If we don’t have time to be sick, then we have to make time to be healthy.”

With the prevalence of mental health problems, it is clear why. Mental health issues are among the leading causes of the overall disease burden globally, according to the World Health Organization. One study reported that mental illness is the primary source of disability worldwide, causing over 40 million years of disability among 20- to 29-year-olds.

Compared to previous generations, far more people are affected: mental illness is now said to surpass the effects of the Black Death. The root causes of this unprecedented rise in the number of people directly affected by mental illness, and the cost of it, can be considered across at least three levels of analysis.

At the first level of analysis, the root cause of mental illness is an amalgamation of heredity, biology, environmental stressors, and psychological trauma.

Notions of specific genes being responsible for illness have been supplanted by those of genetic complexity, where various genes operate in concert with non-genetic factors to affect mental illness. That is, health-relevant biology and mental health impact each other in a complex interplay, which is inherently social.

Despite the importance of understanding the social underpinnings of biological risk factors for mental illness, there is a relative paucity of research investigating this topic. The research that does exist is nevertheless engrossing. For example, one study among many found that social isolation leads to an increased risk of coronary heart disease. Since low levels of social integration are related to higher levels of C-reactive protein, a marker of inflammation related to coronary heart disease, C-reactive protein is posited to be a biological link between social isolation and coronary heart disease.

Moreover, social support affects physical perception. In a landmark study, researchers demonstrated that people accompanied by a supportive friend, or those who merely imagined one, estimated a hill to be less steep than did people who were alone.

At the second level of analysis, the complex bio-social interplay scaffolding mental illness points to the fundamentally chemical underpinnings of human thinking and emotion.

With recent advances in neuroscience such as the CLARITY technique, we are now able to make the brain optically transparent, without having to section or reconstruct it, in order to examine neuronal networks, subcellular structures, and more. In short, we can examine mental illness from a biological perspective.

The depth and complexity of the bio-social roots of mental illness, however, paint a more nuanced picture than discussed thus far. With such pioneering work has come an increasingly popular assumption that the brain is the most important level at which to analyze human behavior.

In this vein, mental illness perpetuates itself because people often consider it to be biologically determined. In turn, a ‘trait-like’ view of mental illness establishes a status quo of mental health stigma by reducing empathy. Such explanations overemphasize constant factors such as biology and underemphasize modulating factors such as the environment.

At the third level of analysis, the obsession with seeing mental health in terms of mental illness reveals the flawed assumption that mental health is simply the absence of mental disorder. However, the problematic landscape of mental health draws on a far wider set of working assumptions. That is, mental health, like physical health, is more than the sum of functioning or malfunctioning parts. It is an overall well-being that must be considered in light of the unique differences between physical health, cognition, and emotions, which can be lost in a solely global evaluation.

So, why do we as a society ponder solving mental illness, which should have been targeted long ago, far more than we consider improving mental health? In part, because when we think of mental health, we think of raising the mean positive mental health of a population, more than closing the implementation gap between prevention, promotion, and treatment.

Cumulatively, social environments are the lubricating oil to biological predispositions, which influence mental health, such that mental health and physical health should be considered holistically. In this vein, national mental health policies should not be solely concerned with mental disorders, to the detriment of mental health promotion.

It is worth considering how mental health issues can be targeted using proactive behavioral programs. To achieve this, it is pivotal to involve all relevant government sectors such as education, labor, justice, and welfare sectors.

Among a diverse range of existing players, many nonprofits, educational institutions, and research groups contribute to the solution landscape of mental health promotion. In Ireland, for example, schools run mental health promotion activities such as breathing exercises and anger management programs. Nonprofits around the world are increasingly seeing the value of community development programs and capacity building (strengthening the skills of communities so that they can overcome the causes of their isolation). In addition, businesses are incorporating stress management into their office culture.

What joins these social ventures together is the pursuit of empowering people to help themselves. They teach us that promoting mental health works best when it is preventative, occurring before mental illness emerges, and when it is linked to practical skills within a community. Furthermore, these ventures exemplify how different types of efforts (government, nonprofit, business, and so on) cater to different populations, from children to corporate employees.

While these social ventures bring hope for the future and underscore the importance of sustainable change, there are still too few programs that effectively target people who want to build on already existing positive mental health, not just resolve or cope with mental health issues. If we take such pride in our successful finding and solving of the problems of mental illness that we ignore mental illness prevention and mental health promotion, we risk increasing the very problem we are trying to solve.

References

Heffner, K., Waring, M., Roberts, M., Eaton, C., & Gramling, R. (2011). Social isolation, C-reactive protein, and coronary heart disease mortality among community-dwelling adults. Social Science & Medicine, 72(9), 1482-1488. doi: 10.1016/j.socscimed.2011.03.016

Lozano, R., Naghavi, M., Foreman, K., Lim, S., Shibuya, K., & Aboyans, V. et al. (2012). Global and regional mortality from 235 causes of death for 20 age groups in 1990 and 2010: a systematic analysis for the Global Burden of Disease Study 2010. The Lancet, 380(9859), 2095-2128. doi: 10.1016/s0140-6736(12)61728-0

Schnall, S., Harber, K., Stefanucci, J., & Proffitt, D. (2008). Social support and the perception of geographical slant. Journal Of Experimental Social Psychology, 44(5), 1246-1255. doi: 10.1016/j.jesp.2008.04.011

Image via Wokandapix/Pixabay.

Race and Genetics

Dare I venture into this politically and emotionally charged issue? When researching questions regarding our evolutionary biology, as I do, ignoring the question of race as it relates to species would be negligent. On the other hand, trying to discuss it in a short blog post such as this could be considered foolhardy. This sounds like a lose-lose situation. I’ll let you, the reader, be the judge.

Humans have existed for about two million years. In that time, there have been many human species. Homo sapiens emerged about 300,000 years ago and we have been the only human species still living for about 37,000 years. There is little debate today among taxonomists, evolutionary biologists and all the other “ists” that have an opinion on this subject: today all seven billion plus of us belong to one species, regardless of racial, geographic, ethnic or any other classification.

Why do we say that? There are many conflicting definitions of species—referred to in the literature as the “species problem.” There might be some room to argue that the different groups of today’s Homo sapiens that we call “races” could fit one or more of the definitions of species. Even more problematic is the definition of “subspecies”. If the different races don’t qualify as separate species, could they at least qualify as subspecies?

The answer is NO and NO.

A species consists of a group of organisms with a definable set of genetic characteristics, or common gene pool, that evolves independently of all other groups of organisms. A common gene pool is not a precise nucleotide-by-nucleotide definition of a set of genes. Rather, it is a set of genes that perform all the same functions. There will be great variation within these genes among the members of the same species. The “evolving separately” component of the definition implies that there is some barrier to interbreeding between a species and other species, such that when new genetic variants enter the gene pool, they are not intermingled with other species to a large extent. This does not mean that the barrier to interbreeding is absolute. Many species today interbreed with other species to some extent, but by and large, over time, they continue to evolve independently. For example, we now know that Homo sapiens interbred in the past with at least two other human species. With today’s human mobility and facile intermixing of genes among all ethnicities and localities, there is clearly no separately evolving subgroup among us. That is particularly true of the large groupings that we call races.

The notion of subspecies is even more vague and difficult to define. The subspecies level is sometimes equated with “races”. In taxonomy, subspecies are designated with three Latin terms rather than the two that designate a species. There is only one subspecies of Homo sapiens alive today, called Homo sapiens sapiens, and it includes all present day humans. The only other subspecies of Homo sapiens, called Homo sapiens idaltu, is assigned to an extinct group of fossils thought possibly to represent the immediate precursor to today’s modern humans.

With that admittedly superficial background, let’s consider human races. If not separate species or subspecies, is there any genetic basis for categorizing people as African, Caucasian, or any other racial designation? That is, is there any genetic basis for race? One can find virtually any opinion on this subject in the legitimate scientific literature. In a publication in the New England Journal of Medicine, Robert Schwartz states that “race is a social construct, not a scientific classification” and that race is a “pseudoscience” that is “biologically meaningless.”

On the other hand, in the same journal, Neil Risch states that today’s humans cluster genetically into five continent-based groupings that are biologically and medically meaningful.

Are these two points of view really different answers to the same question about genetics and race, or are they answers to different questions? Specifically, can one state that there is no genetic basis for race and, at the same time, state that there are some genetically measurable differences between self-identified racial categories? I think the answer is yes.

Let’s take, for example, the sickle cell trait, which is much more prevalent in people who consider themselves African compared to those who consider themselves Caucasian. Yet the sickle cell trait exists in all races and one could not use it to define African vs. non-African people. In fact, when one looks at the genetic variation within any racial category, it exceeds the variation between racial categories. There is no genetic profile that can define any race.

Are there clusters of genetic traits that have higher probabilities in one race or another? Certainly. That would be true of other classifications of humans as well, such as classification by size, athleticism, or musical ability. Yes, certainly those who consider themselves African have, on average, darker skin than those who consider themselves Caucasian, but the variation in skin color is great in both groups. For example, the paleogenomic profile of one of the oldest nearly complete human skeletons found in Great Britain shows that he had dark skin, in a geographic area whose population today is primarily Caucasian.

This comes back to the question of species. Aren’t there great variations within species as well? Yes, but they are far less than the variations between species. That is, today’s genomic variation between the various racial groups is less than the variation between Homo sapiens and Homo neanderthalensis. All of today’s human races, no matter how you define them, are clearly Homo sapiens and not Homo neanderthalensis.

This brings me to one final point that can either further clarify or further muddy this entire discussion of race and genetics. Generally, when we talk about genetic comparisons, we have been talking about comparing classical “genes”, the DNA sequences that code for proteins (e.g., the hemoglobin variant responsible for the sickle cell trait). It is only in the past decade or so that we have learned that much of the 98% of the human genome that does not code for proteins has a profound effect on our phenotype. This includes the epigenome, which regulates the expression of classical genes.

One of the things we have learned about the epigenome is that it can change during the lifetime of an individual based on environmental factors such as diet, stress, and toxins. These changes do not affect the DNA sequence of genes, but they do affect the expression of those genes. More significantly, some of these epigenomic changes are passed on to offspring and can affect generations into the future.

This raises the question of environmental factors related to racial groupings and their impact on genetics. There is evidence, for example, that African-American descendants of slaves have children with lower birth weights than African-American descendants of non-slaves, perhaps related to epigenetic effects of stress and diet during slavery. One can imagine many socio-cultural factors that vary by race and could impact the epigenome. Perhaps, when we have the ability to look at the full genome variation among racial groups, our knowledge of genetics and race will change.

References

Schwartz, Racial Profiling in Medical Research, New England Journal of Medicine 344 (2001): 1392.

Burchard, E. Ziv, N. Coyle, et al., The Importance of Race and Ethnic Background in Biomedical Research and Clinical Practice, New England Journal of Medicine 348 (2003): 1170.

Lotzof, Cheddar Man: Mesolithic Britain’s Blue-eyed Boy, Natural History Museum website, Feb. 7, 2018, http://www.nhm.ac.uk/discover/cheddar-man-mesolithic-britain-blue-eyed-boy.html

M. Meloni, Race in an Epigenetic Time: Thinking Biology in the Plural, The British Journal of Sociology 68 (2017): 389.

Image via pixel2013/Pixabay.

Opportunistic Exercise: Use One Minute Exercises for Positive Aging

Would you believe me if I said that 95 percent of people are living today just as they did yesterday, a month ago, a year ago, without anything changing? Renewal doesn’t just happen; it comes to those who consciously pursue it. Just like only those who open their eyes at daybreak can see the dawn, the path to a new life will not open unless one chooses it.

As one gets older, however, choosing a new path becomes increasingly difficult. People may feel that their lives are essentially over after retirement, so they may lack the energy or motivation to make changes even as they see their bodies and minds deteriorating. However, making small, incremental changes can make a big difference. Rather than taking on a large task, altering just one small habit can empower and revitalize our lives. This is especially true if we change our level of mindful physical activity.

The great news is that exercise does not necessarily mean a long workout several times a week. What’s important is to keep moving, even if it’s just for a single minute at a time. One study at the University of California, San Diego demonstrated that frequently interrupting prolonged sitting was associated with more favorable cardio-metabolic biomarkers, such as body mass index (BMI), blood pressure, and blood lipid and glucose levels. Other studies suggest that reducing daily sedentary time to less than three hours could extend life expectancy by two years.

To increase your activity level in an easy and simple way, I suggest doing what I call “opportunistic exercise,” which is exercising from where you are when you have the opportunity.

The Power of a Single Minute
The best way of doing opportunistic exercise is One Minute Exercise: stopping to exercise for one minute once an hour. One Minute Exercise is all about doing a short burst of exercise for 60 seconds, which a study at McMaster University in Canada suggests can be as effective as 45 minutes of moderate exercise. Psychological resistance to exercise may be the culprit behind a habitually sedentary lifestyle: while putting on a sweat suit, going to the gym, and exercising for an hour is a hassle, one minute can be done anywhere, at any time.

For One Minute Exercise, I recommend doing exercises that effectively work your muscles and raise your heart rate in a short period of time. Depending on your condition, these can include push-ups, squats, sit-ups, jumping jacks, or jogging in place. In just a short period of time, these exercises increase heart rate, body temperature, lung capacity, and muscle strength. If you have limited mobility, gentler exercises such as stretching or even deep abdominal breathing can serve as replacements.

To create a habit of breaking up inactivity, try setting an alarm every hour to remind you to get up and move your body, whether you’re at the office, at home, or even outside. There’s an app called One Minute Change that helps you do this easily.

Strengthening the Mind and Body
The most effective way to see the effects of One Minute Exercise is by mindfully focusing on the feelings experienced in the body after you’ve done it. An easy way to check in on what changes have occurred in your body and mind is to take just a few deep breaths in and out with the eyes closed. You will notice even small changes such as an increase in heart rate, more relaxed breathing, or a rise in body temperature.

It’s important to reflect on your body after doing One Minute Exercise in order to strengthen your mind and body. This simple practice helps concentrate scattered thoughts, release stress, and calm emotions. It is an intentional act of being present and inwardly focused that wakes one up, bringing enhanced awareness to the body and mind. I include more details about One Minute Change in my latest book, I’ve Decided to Live 120 Years: The Ancient Secret to Longevity, Vitality, and Life Transformation.

A pounding heart, alert mind, and inward focus serve to increase passion and drive for life. Yet the best part of One Minute Exercise is the confidence you gain in how you manage your own health. You’ll realize that you can change yourself in all sorts of ways—physically, mentally, emotionally, and spiritually—and all it takes is just a little bit at a time, step by step, minute by minute.

References
Bellettiere, J., Winkler, E. A. H., Chastin, S. F. M., Kerr, J., Owen, N., Dunstan, D. W., & Healy, G. N. (2017). Associations of sitting accumulation patterns with cardio-metabolic risk biomarkers in Australian adults. PLoS ONE, 12(6), e0180119. http://doi.org/10.1371/journal.pone.0180119

Grace, F., Herbert, P., Elliott, A. D., Richards, J., Beaumont, A., & Sculthorpe, N. F. (2017). High-intensity interval training (HIIT) improves resting blood pressure, metabolic (MET) capacity and heart rate reserve without compromising cardiac function in sedentary aging men. Experimental Gerontology. https://doi.org/10.1016/j.exger.2017.05.010

Gillen JB, Martin BJ, MacInnis MJ, Skelly LE, Tarnopolsky MA, Gibala MJ. (2016). Twelve weeks of sprint interval training improves indices of cardiometabolic health similar to traditional endurance training despite a five-fold lower exercise volume and time commitment. PLoS ONE 11(4): e0154075. https://doi.org/10.1371/journal.pone.0154075

Lee, I. (2016). The Power Brain. Phoenix, AZ: Best Life Media.

Lee, I. (2017). I’ve Decided to Live 120 Years. Phoenix, AZ: Best Life Media.

Rhodes, R.E., Martin, A.D., Taunton, J.E. et al. (1999). Factors associated with exercise adherence among older adults. Sports Med 28: 397.  https://doi.org/10.2165/00007256-199928060-00003

Søgaard, D., Lund, M. T., Scheuer, C. M., Dehlbæk, M. S., Dideriksen, S. G., Abildskov, C. V., Christensen, K. K., Dohlmann, T. L., Larsen, S., Vigelsø, A. H., Dela, F. and Helge, J. W. (2017). High-intensity interval training improves insulin sensitivity in older individuals. Acta Physiol, e13009. Accepted Author Manuscript. doi:10.1111/apha.13009

Warren, T. Y., Barry, V., Hooker, S. P., Sui, X., Church, T. S., & Blair, S. N. (2010). Sedentary behaviors increase risk of cardiovascular disease mortality in men. Medicine & Science in Sports & Exercise, 42(5), 835-1038. doi: 10.1249/MSS.0b013e3181c3aa7e

Williams, Kristine N.,R.N., PhD., & Kemper, S., PhD. (2010). Interventions to reduce cognitive decline in aging. Journal of Psychosocial Nursing & Mental Health Services, 48(5), 42-51. doi:http://dx.doi.org/10.3928/02793695-20100331-03

Image via jarmoluk/Pixabay.

Artificial General Intelligence — Is the Turing Test Useless?

Artificial intelligence (AI) is all the rage today. It permeates our lives in ways obvious to us and in ways not so obvious. Some obvious ways are in our search engines, game playing, Siri, Alexa, self-driving cars, ad selection, and speech recognition. Some not-so-obvious ways are finding new patterns in big data research, solving complex mathematical equations, creating and defeating encryption methodologies, and designing next-generation weapons.

Yet AI remains artificial, not human. No AI computer has yet passed the Turing Test. AI far exceeds human intelligence in some cognitive tasks, like calculating and game playing. AI even exceeds humans in cognitive tasks requiring extensive human training, like interpreting certain x-rays and pathology slides. Generally, its achievements, while amazing, are still somewhat narrow. They are getting broader, particularly in hitherto exclusively human capabilities like facial recognition. But we have not yet achieved what is called artificial general intelligence, or AGI.

AGI is defined as the point where a computer’s intelligence is equal to and indistinguishable from human intelligence. It defines a point toward which AI is supposedly heading. There is considerable debate as to how long it will take to reach AGI and even more debate whether that will be a good thing or an existential threat to humans.

Here are my conclusions:

  1. AGI will never be achieved.
  2. The existential threat still exists.

AGI will never be achieved for two reasons. First, we will never agree on a working definition of AGI that could be measured unambiguously. Second, we don’t really want to achieve it and therefore won’t really try.

We cannot define AGI because we cannot define human intelligence—or more precisely, our definitions will leave too much room for ambiguity in measurement. Intelligence is generally defined as the ability to reason, understand and learn. AI computers already do this depending on how one defines these terms. More precise definitions attempt to identify those unique characteristics of human intelligence, including the ability to create and communicate memes, reflective consciousness, fictive thinking and communicating, common sense, and shared intentionality.

Even if we could define all of these characteristics, it seems inconceivable we will agree on a method of measuring their combined capabilities in any unambiguous manner. It is even more inconceivable that we will ever achieve all of those characteristics in a computer.

More importantly, we won’t try. Human intelligence includes many functions that don’t seem necessary to achieve the future goals of AI. The human brain has evolved over millions of years and includes functions that are tightly integrated into our cognitive behaviors that seem unnecessary, even unwanted, to build into future AI systems.

Emotions, dreams, sleep, control of breathing, heart rate, monitoring and control of hormone levels, and many other physiological functions are inextricably built into all brain activities. Do we need an angry computer? Why would we waste time trying to include those functions in future AIs? Emulating human intelligence is not the correct goal. Human intelligence makes a lot of mistakes because of human biases. Our goal is to improve on human intelligence—not emulate it.

The more likely path for future AI is NOT to fully emulate the human brain, but rather to model the brain where that is helpful—as in the parallel processing of deep neural networks and self-learning—and to create non-human, computer-based approaches to problem solving, learning, pattern recognition, and other useful functions that will assist humans. The end result will not be an AI that is indistinguishable from human intelligence by any test, yet it will still be “smarter” in many obvious and measurable ways. The Turing Test is irrelevant.

If that is true, why would AI still be an existential threat? The concern of people like Elon Musk, Stephen Hawking, Nick Bostrom, and many other eminent scientists is that there will come a time when self-learning and self-programming AI systems reach a “cross-over” point at which they rapidly exceed human intelligence and become what is called artificial superintelligence, or ASI. The fear is that we will then lose control of an ASI in unpredictable ways. One possibility is that an ASI will treat humans much as we treat other species and eliminate us, either intentionally or unintentionally, just as we eliminate thousands and even millions of other species today.

There is no reason that a future ASI must pass through an AGI stage to pose this threat. It could still be uncontrollable by us, unfriendly to us, and never have passed the Turing Test or any other measure of human intelligence.

References

P. Saygin, I. Cicekli, V. Akman, Turing Test: 50 Years Later, Minds and Machines 10:463 (2000)

Musk and Zuckerberg bicker over the future of AI (Engadget, July 25 2017)

Simborg, D. W., What Comes After Homo Sapiens? (DWS Publishing, September 2017)

N. Bostrom, Superintelligence: Paths, Dangers, Strategies (Oxford: Oxford University Press, 2014)

Musings of a Combat Professor, Former Military Medic and Psychologist

I am now retired and conducting research on killing in combat zones. I served as a medic in the Air Force at Andrews Air Force Base (AFB), in the emergency room of Malcolm Grow hospital, from Nov. 1969 to Nov. 1973. There, I was also part of a team that began processing prisoner of war (POW) airmen. I also served temporary duty (TDY) at Lackland AFB, treating airmen who were transferred out of Vietnam and presenting with substance use disorder (SUD). I never served in Nam. Upon discharge, and with the GI Bill, I obtained my advanced degrees in clinical social work and psychology.

During my 1977 through 2000 stint at the Altoona Hospital as a BHC clinician, I had the privilege of conducting psychotherapy in our outpatient clinic, where I treated some of my comrades referred to us from our local Veterans Affairs (VA) center. I did therapy with armed combatants who had killed, and with some of the field medics and corpsmen who had killed but, far more importantly, could not save some of their brothers from death.

Prior to my military service, I studied psychology in college. I paid some attention to the Vietnam War and was comforted by my deferment from the draft. The day I graduated from college, that protection ended. I did not go into teaching, which would have given me an additional deferment.

During my time in college, I had serious reservations about Vietnam and war in general. Though others protested Vietnam, I never did, out of respect for all men and women in the military. I detested those who protested, as they were directly, though perhaps not willingly, supporting our enemy. So, once I lost my deferment protection, I enlisted in the Air Force, mainly because my chances of having to kill someone were greatly reduced. After completing basic training, I enrolled in medical training at Sheppard AFB. By the grace of God, my first and only duty station was Andrews.

A number of my maternal uncles served either during WW2 or Korea. Two of my paternal cousins graduated from military academies and both made the military a career. Both were combat pilots in Vietnam. They were older than me and I really had limited contact with them.

My wife’s uncle, Mike, was killed during the Battle of the Bulge. He is buried at Arlington. My father-in-law served in the Navy during WW2 on a combat destroyer in the Pacific theater. He was part of a large flotilla the day the Japanese surrendered; he tells us he slept right through this historic event! He very rarely talked about his military service with any of us.

My older brother served in the Marine Corps, fortunately during “peacetime”, as part of a howitzer platoon. My younger brother was drafted into the Army during Vietnam in a non-combat role and did not serve “in country”.

In retirement, I now serve as a volunteer at our local VA center. Along with four other fellows, I serve as a sentry and guardian to what we refer to as “The Wall That Heals”, one of a few traveling Vietnam memorials and one that is now retired. It honors the legacy of all who served in Vietnam, including those who died, were killed, or were wounded there. Soon I will be “deployed” to our on-site branch health clinic (BHC).

I am deeply indebted to the significant work in Lt. Col. Dave Grossman’s On Killing, and I am deeply inspired by the revelations of Marine Sgt. TJ Brennan and Finbarr O’Reilly in Shooting Ghosts. Here are some of my clinical observations about the impact of combat and its ensuing trauma:

  1. War and combat are extreme expressions of insanity. Those who serve are not insane!
  2. You killed due to a deep regard for self and others. I have two published articles on Brain Blogger, one about killing in combat and the other on aggression and violence, in which I describe six types of aggression that I have encountered in my clinical work. The two that apply here are defensive and affiliative.
  3. No kind of training can prepare you for the sheer horror and terror of combat!
  4. Very few combatants enjoy killing!
  5. In combat zones you have no time to grieve your multiple losses. When you return home, you now have to face them!
  6. I am now very sure that I could kill under these extreme circumstances!

In conclusion, I am deeply moved by your experiences and don’t feel completely worthy to speak about them. To you who are Marines: in the AF we called you “Gyrenes”. We are not worthy to unfasten your combat boots, and I readily admit that I served in the “cub scouts” of the armed forces, as one ex-Marine observed when visiting The Wall!

Richard G Kensinger, MSW

Image via Pexels/Pixabay.

Brain Evolution as a Path To the Gods

In the 1890s, Paul Emil Flechsig, a German brain researcher, published a map showing which parts of the brain developed early in the course of hominin evolution and which parts developed more recently. The motor cortex, for example, evolved very early, so that a newborn baby is able to hold onto its mother’s breast and feed. Flechsig divided the brain into 45 areas based on the degree of myelination present at birth. The nine areas with the least myelination, and which thus evolved most recently, were referred to by Flechsig as “terminal zones”. They include virtually all of the brain areas that make us uniquely human, including those associated with executive functions and long-term planning.

At the time Flechsig published his findings, relatively little was known about the functions of specific brain areas. With the availability of functional neuroimaging, that has changed dramatically, and we now have a reasonably good understanding of the function of many, but not all, brain areas. Also of great importance has been the development of diffusion tensor imaging (DTI), which has allowed us to visualize the major white matter connecting tracts and to assess which tracts developed earlier in human evolution and which ones more recently. What we now have, then, is a map of the brain with timelines; it can be used to ascertain which areas, serving which specific functions and connected by which tracts, evolved early and which evolved more recently.

As a young man, Charles Darwin was deeply religious and even considered entering the ministry.  During his five-year voyage on the Beagle, during which time he formulated his theory of human evolution, he also “thought much about religion” as he recorded in his personal notebook. In his typical telegraphic writing style, Darwin wrote:

It is difficult to imagine it [belief in God] anything but structure of brain heredity…love of deity effect of [brain] organization.

Thus, many years before Darwin would publish his theory of evolution, he had already suggested that religious belief also had an evolutionary origin.

In Evolving Brains, Emerging Gods: Early Humans and the Origins of Religion, I attempted to merge Darwin’s theory with the contemporary neuroscientific evidence concerning the evolution of the human brain. There are relatively well-established theories regarding brain evolution; the uniqueness of the book is the way in which the facts are put together. The gradual development of specific cortical areas and white matter tracts made hominins progressively smarter. We then acquired self-awareness; an awareness of what others were thinking (theory of mind); introspection; and finally an autobiographical memory that allows us to project ourselves backward and forward in time. This afforded us an enormous cognitive advantage over the Neanderthals and all other hominins. However, it also enabled us, for the first time, to fully understand that we would die. Our dreams confirmed that life after death exists and that the afterworld was peopled with our ancestors. Since the ancestors could help us, they had to be honored, and eventually the most important ancestors came to be worshipped. Then, during the agricultural revolution, large numbers of people came together to form towns and then cities. The most important ancestors were arranged in a hierarchy. The greatest of them eventually rose above an invisible celestial line and came to be regarded as gods, probably between 7,000 and 8,000 years ago. The organization of the major religions followed.

E. Fuller Torrey, MD, is the author of 20 books, including Evolving Brains, Emerging Gods: Early Humans and the Origins of Religion, from which this article was adapted. He is the Associate Director for Research at the Stanley Medical Research Institute and the founder of the Treatment Advocacy Center.

References

Bailey, P. and Bonin, G. (1951). The Isocortex of Man. Urbana: Univ. of Illinois Press. p.265.

Rilling, J., Glasser, M., Preuss, T., Ma, X., Zhao, T., Hu, X. and Behrens, T. (2008). The evolution of the arcuate fasciculus revealed with comparative DTI. Nature Neuroscience, 11(4), pp. 426-428. doi:10.1038/nn2072

Mori, S. (2007). Human White Matter Atlas. American Journal of Psychiatry, 164(7), p.1005.

Torrey, EF (2017). Evolving Brains, Emerging Gods. New York: Columbia University Press.

Jung, R. and Haier, R. (2007). The Parieto-Frontal Integration Theory (P-FIT) of intelligence: Converging neuroimaging evidence. Behavioral and Brain Sciences, 30(02), p.135. doi:10.1017/S0140525X07001185

Kjaer, T. (2002). Reflective Self-Awareness and Conscious States: PET Evidence for a Common Midline Parietofrontal Core. NeuroImage, 17(2), pp.1080-1086. doi:10.1006/nimg.2002.1230

Saxe, R. and Kanwisher, N. (2003). People thinking about thinking people: The role of the temporo-parietal junction in “theory of mind”. NeuroImage, 19(4), pp.1835-1842. doi:10.1016/S1053-8119(03)00230-1

van der Meer, L., Costafreda, S., Aleman, A. and David, A. (2010). Self-reflection and the brain: A theoretical review and meta-analysis of neuroimaging studies with implications for schizophrenia. Neuroscience & Biobehavioral Reviews, 34(6), pp. 935-946. doi:10.1016/j.neubiorev.2009.12.004

Okuda, J., Fujii, T., Ohtake, H., Tsukiura, T., Tanji, K., Suzuki, K., Kawashima, R., Fukuda, H., Itoh, M. and Yamadori, A. (2003). Thinking of the future and past: the roles of the frontal pole and the medial temporal lobes. NeuroImage, 19(4), pp.1369-1380. doi:10.1016/S1053-8119(03)00179-4

Suddendorf, T. (2013). The Gap. New York: Basic Books. p. 91.

Tallis, R. (2014). The Kingdom of Infinite Space. New York: Atlantic Books Ltd. p. 249.

Croucher, K. (2012). Death and Dying in the Neolithic Near East. Oxford: Oxford University Press. p. 221.

In a World of Never Ending Tension, Seek Compassionate Neutrality

Amidst the rising tensions in the world around us, people find themselves in the unique position of having to choose between passive observation and active participation, and some are tossing their opinions into the fray of the multitude of voices speaking out in society today.

In contrast to what we have been accustomed to living and being for several decades, the pendulum of Change has begun its swing in the opposite direction. The social tension created by this shift in direction has left many of us unsettled, enough so that many of us have turned inward in order to make sense of our external reality.

The need to seek balance and stability is inherent within us as human beings. We are meant to understand the other person’s point of view and seek the middle ground in between our two views in order to create perspective. Out of that informed perspective, we make decisions on how to act, our behavior evolves from those decisions, and consequently, our lives come into being from those personal thoughts and actions. This is how human beings are naturally designed to grow, evolve, and become within a constantly ever-changing world. If we are in the right place (along the spectrum of choices), at the right time (to make a decision), then right action (behavior and action) is effortless as we pivot our way into a greater life. This is one of the key tenets of my book, How Me Found I: Mastering the Art of Pivoting Gracefully Through Life.

Compassionate Neutrality vs. “My Way or the Highway”

The word “Dual” means two, or something composed of two parts; harmony and balance are achieved in the coming together. The word “Duality” describes the two parts in opposition to each other: competitive adversaries moving away from each other towards extreme polarization.

In that context, we are naturally hardwired to seek balance, “true inner balance”. Yet our external world is filled with ego-driven polarization, “duality”, the corrupted version of what “dual” really means. Because we are naturally designed to always seek balance (whether we are aware of it or not), and because the only constant we can expect in life is Change, balance in an environment of dynamic change is achieved through a “two-party system”, a dual-system structure of compensating, complementary counterparts. As humans, our psychological dual-system structure is the Ego-Heart handshake. This uniquely human complementary relationship is inherent within what I call a “Natural Person”. To strive for continual balance is our natural state of mind in dealing with changing realities.

Unfortunately, our society, in its current presence of mind, does not recognize that the ego and the heart have a dual-system relationship, meant to counterbalance each other so that we humans can continue to evolve within an ever-changing, dynamic environment. For many of us, as we age and grow, the individual personalities that we exhibit outwardly are the reflections of our egos maturing as we learn how to adhere to the conditional social norms set forth for us to survive and operate within society. In contrast to the “natural person”, this ego-developed persona is what I call our “Conditioned Personality”.

Today, the word “dual” has become synonymous with “duality”: “It’s my way or the highway,” or “I’m right, you’re wrong.” Absent the true meaning of a two-party system of balance, we have disintegrated into a mindset where everything is seen through the corrupted filters of polarized duality. Collaborative and communal dialogue has given way to personalized monologues based on absolute judgment and opinion.

This need to convince people that there is one way, and only one way, “my way”, is the cause of the rising tensions in the world today, as evidenced in the political, socio-economic, and ethno-diverse arenas of discussion. Unaware that balance is a desire we are hardwired for, and ignorant that it is a partner to the heart, the ego interprets that innate desire as a need to convince others that its viewpoint is the right way to go, and it disregards, seeks to dominate, and even tries to eliminate the natural role the heart is meant to play. For the ego, all roads must lead to Rome, and it is all about me, only me. There can be no other.

We all have individual egos, each convinced that its way is the correct way to go. Our own heartfelt knowing and the innate need for natural equilibrium have been mutated into a drive for a dominant view that leaves no room for another view to exist. We have left the middle of the field to take up positions at either end. As it meets resistance from other egos, our ego’s need for superiority can only lead to aggressive force, more domination, and ultimately violence. We will kill to be right. We must be right at all costs. The end justifies the means.

And yet, the heart does exist and is very much a part of our physical and psychological makeup. It cannot be ignored, subjugated, or disregarded. Without the heart, we physically cease to exist. Without the heart, we have no conscience. It is through the heart that we connect to the greater wholeness of life in accordance with God, Nature, or the Universe, whichever way you choose to refer to the larger part of who we are. The heart inherently knows that balance is necessary for our very existence as a species. It defines our humanness, guides our humanitarian endeavors, and nourishes our humanity. It knows that it is the counterpart to the ego in a dual-system structure designed to move towards compassionate neutrality, thus bringing well-being into our lives. It is what allows us to respond to the environment in a nonjudgmental and loving way. It doesn’t need to be justified; it just needs us to be aware, so that it can guide the ego towards creating a better life.

Together, this heart-ego handshake is what allows us to make sense of what is happening in our extrinsic environment and to make the right intrinsic decisions, decisions that can mutually benefit ourselves and others, not only ourselves. It is this communal awareness, that we are each individually a person and yet connected to each other as part of an overall human community, that gives us comfort that we are never truly alone, unprotected, isolated, or abandoned. This awareness is the umbilical existence of our personality within the nesting-doll collective of the human species. If we take care and look out for the welfare of others, then in turn we also receive benefit ourselves. Vice versa, if we look for balance within, then our external society also receives the benefit of that internal balance, because we will emanate that behavior out into our external world.

In order to de-escalate the rising tension and violence in the external world of our society today, we, as members of humanity, need to look within and seek balance the way a natural person would. We need to reestablish the dual-system structure of the ego-heart handshake through intention, voice, and action. Because we are human and inherently designed for balance, we will naturally always seek to return to a state of compassionate neutrality, regardless of how long it takes and despite what our ego thinks. The pendulum will eventually swing from any extreme edge back to the center fulcrum.

So when we are at the extreme edge of a pendulum swing (the ego’s viewpoint), the pull from the other extreme edge (the heart’s viewpoint) will become intense enough to cause the pendulum to begin swinging back to the center, where the heart’s communal compassion and a neutral ego’s informed judgment jointly reside. It is Mother Nature’s way of ensuring survival in a dynamically changing world.

Because our egos do not understand this natural, fundamental principle of life, we resist the innate impulse to let the other viewpoint change our minds. Instead, we will literally fight to enforce our viewpoint and maintain our position. We don’t want the pendulum to swing at all. We want to remain in the status quo, defiant to the extreme, and will react violently, determined to hold our current position against all natural forces of movement. This is the cause of the rising tensions in our world today.

We All Have a Choice

As an alternative, may I suggest that compassionate neutrality is what we all seek: an aligned response to the natural forces of constant change occurring within our environment. It is what Buddhism calls “the middle way”. It is operating from the center fulcrum of the pendulum swing, the vesica piscis of creation, the ability to see both sides and optimally benefit from the combined viewpoints. It is living life from both the macro and the micro view, from the mountain peak and from the valley below. This is how we survive, evolve, and grow as humans, as citizens of humankind, and as members of the human species.

Cracking the Code–Revealing the Secret Behind Our Perception

When you’re an eye doctor, and I’ve spent my entire career as one, you learn a lot about how people use, and misuse, the sense of sight to perceive the world around them. As humans, we’re constantly interpreting and occasionally manipulating our experiences to distinguish fantasy from reality. Some people are better at this than others. Some, for example, are consistently taken in by conspiracy theories or fake news stories, whereas others can quickly sniff them out as bogus.

A few years ago, I asked myself: what’s the difference between people with keen powers of perception and those with weaker powers? Is it education? Experience? Genetics? I began researching the topic and discovered there isn’t even a term to classify our power of perception, so I coined one. I call it perceptual intelligence, and it’s the title of my new book (in bookstores this month).

Perceptual intelligence, or PI, is our ability to interpret sensory data and arrive at a decision. Just as with other forms of intelligence, some people have higher PI than others. Good decision-makers exhibit a high level of perceptual intelligence, whereas bad decision-makers demonstrate weaker PI.

PI, I learned, is an acquired skill. We can improve our PI, in fact, through awareness and practice.  You may, for instance, find yourself overreacting to certain situations or circumstances. But with proper knowledge and a different perspective, you can train yourself to arrive at a more appropriate reaction.

In this fast-paced digital age, where we’re often forced to make decisions on the fly, we often “leap before we look.” That might mean handing over your credit card number without verifying a website’s security, or trusting a news story without considering the integrity of the source. People with high PI, however, consistently “look before they leap.” Before making a decision, they ask themselves, instinctively: Am I interpreting this sensory data correctly and making the best choice?

Every millisecond, our senses take in a massive amount of information, which then travels to the brain.  The brain, in turn, is where our perceptions originate. Those perceptions may accurately reflect reality but may also derail us toward fantasy. The driving question behind my book is: Why do our perceptions sometimes clash with reality? There are many reasons, I discovered.

One is medical. For example, a condition known as synesthesia can cause a person to literally see music or taste sounds. (A second form of synesthesia connects objects such as letters and numbers with a sensory perception such as color or taste.) Even the common cold, which affects the eyes, ears, nose, and throat—not to mention the brain, when our heads fill with congestion—has been known to distort our power of perception. When we are under the weather from the flu, our power of perception might seem so foggy that we develop a pessimistic view of situations that we might otherwise view with optimism. Another medical factor influencing perception is sleep deprivation. As any insomniac or parent of a newborn will tell you, a lack of sleep can distort our perception of the world, sometimes even fogging our memory of what happened during our sleepless state.

An obvious (and sometimes deadly) influence on our power of perception is drugs and alcohol. We don’t need to review criminal cases and “beer goggle” studies to see how drugs and alcohol impair our senses and affect our judgment.

There’s also our psychology, biology, genetics, habits, cultural upbringing, and memories, all of which combine to create our unique perceptual filter, influencing our decisions, thoughts, and beliefs. The pope’s belief in life after death, for example, is diametrically opposed to that of theoretical physicist Lawrence Krauss. Yet each is convinced that his view is the correct one. Is the pope blinded by faith? Is Dr. Krauss closed to any idea that isn’t evidence-based? We all create a version of the world unlike anyone else’s. And how could it not be? It is shaped by our perceptions.

Often, we mold our perceptions like Play-Doh to suit the story we create of our lives. But sometimes our perceptions work behind the scenes, shaping our thoughts and behaviors without us realizing it. When we have a vague memory of a painful incident, what purpose does it serve? Why do we hold onto an incorrect and hurtful perception when instead we could make something good of it? People with finely tuned PI can identify and topple faulty ideas that try to sabotage them.

Part of strong perceptual intelligence is recognizing that your mind is more plastic than you think and can be molded. PI can be improved like any other skill, such as driving a car, playing a sport, or learning an instrument. Improving PI can have a profound effect on your life: better decisions can reduce the risk of financial, health, and family problems, along with other issues that arise from low perceptual intelligence. You could say, therefore, that high PI even improves happiness.

Dr. Brian Boxer Wachler, an expert in human perception, is America’s TV Eye Doctor and is internationally renowned for his expertise in keratoconus treatments, LASIK, and other vision correction procedures. His book, Perceptual Intelligence (published by New World Library), is available in bookstores and online at Amazon, Barnes & Noble, and IndieBound beginning October 17, 2017.

Image via PublicDomainPictures/Pixabay.

]]>
/2017/11/03/cracking-the-code-revealing-the-secret-behind-our-perception/feed/ 0
David And Goliath: The Art of Turning All Weaknesses Into Strengths /2017/10/27/david-and-goliath-the-art-of-turning-all-weaknesses-into-strengths/ /2017/10/27/david-and-goliath-the-art-of-turning-all-weaknesses-into-strengths/#respond Fri, 27 Oct 2017 15:00:07 +0000 /?p=22922 Hello, everyone! I’m Arda Cigin, founder of Stoic-Leaders.com and in this article, I’m about to change your whole mindset towards all “disadvantages” and “less than stellar situations”.

How?

I’ll be telling you about the battle between David and Goliath as an instructive case study to understand how advantages can actually be the source of our greatest weakness, and vice versa.

And then, I’ll give you many practical solutions and mindset shifts that you can apply to your life today to turn disadvantageous circumstances into your greatest strengths.

But before we get into the insights, for those who may not know, let’s analyze David and Goliath’s timeless story.

– Why even tell a story?

I want to tell a story because the human brain relates through stories, not facts and theories. If you truly want to take away an action plan at the end of this article, pay attention to this timeless story.

 

The Instructive Story of David and Goliath

Goliath is a giant of a man, six foot nine, wearing a bronze helmet and full body armor. He is carrying a javelin, a spear, and a sword.

Why? Because he is about to go into a fierce battle.

The giant Goliath was about to fight a fledgling shepherd boy named David.

David, a fragile young man, knew he was incomparably weaker than his opponent, yet he wanted to take a stand and confront the wrath of Goliath nonetheless.

Is David’s confidence misplaced? Maybe there is more to David than meets the eye…

Only time will tell.

Naturally, everyone judged David to have no chance against Goliath. Most people who looked at this combat duo would bet their money on Goliath.

And trust me. You would too.

Observing Goliath, we would see that he was prepared for close combat: he wore heavy armor and was armed with various spears and swords.

What many didn’t know was that the great and almighty Goliath, who was regarded as the supreme winner of this fight, had one fatal, characteristic flaw:

He had awful eyesight.

That being said, the fight started.

At the beginning of the battle, Goliath shouted the words “Come to me!”.

Yet do not mistake this for an arrogant battle cry. Goliath needed David within arm’s length so that he could see him and defeat him. It was more a desperate cry than anything else, a side effect of his weak eyesight.

If you think about it, Goliath didn’t excel in close combat just because he chose to do so. He had no other option but to excel in close combat.

Remember his awful eyesight? If he was to be a strong warrior, he could not be a long-ranged one like an archer. Combined with his bulky physique, a simple fault in eyesight turned Goliath into a wrathful close-range warrior with nearly blind eyes.

Goliath was on, what Robert Greene calls, the “death ground”.

He was trapped and had no other option. Either he was going to master close combat or he’d lead his life as a blind giant. With the help of outside pressures and internal obstacles, he became the best in his niche—ruthless close combat.

David, the fragile young man, may have been only a shepherd, but he was a smart boy. He was not going to fight Goliath in close combat. That would be foolish.

Therefore, he had prepared himself for long-ranged combat—a kind of fight Goliath was not prepared for.

As Goliath started to get agitated, David took out his trusty slingshot, swiftly positioned a rock, pulled the end of the sling and shot right at the forehead of the giant.

Goliath couldn’t even see the rock because of his faulty eyesight.

The speed at which the rock traveled was more than enough to put Goliath into a deep slumber he’d never wake up from.

And so the shepherd boy won the fight he was predestined to lose. All the cards were stacked against him, or so it appeared.

A supremely disadvantageous fragile man came to be victorious against a supremely advantageous killing machine.

Naturally, everyone was shocked. They told themselves how lucky David was.

But this has nothing to do with luck.

All the spectators were wrong. There was one thing in which David was far superior to Goliath.

It was neither his size nor his strength, but his ability to think strategically.

And this exact strategy that David had used to kill Goliath will be the topic of our discussion today.

______

Most often in life, strategic thinking is the secret ingredient to turning unfavorable situations into favorable ones.

Understand: Strategic minds will always rise victorious—whatever the circumstance, whoever the enemy.

 

What Can We Learn From The Grand Strategist, David?

1) Adaptation: make it your greatest asset

When making decisions, we tend to repeat the tactics and maneuvers we’ve familiarized ourselves with, provided they proved successful before.

Humans are innately lazy creatures, and naturally we cling to what succeeded before and expect it to keep succeeding in the future.

This move will prove ineffective in the long term.

Realize: by doing so, you only create rigid pathways, neural connections, and habits you are better off not adopting.

I want you to see life as a chess game. As long as you repeat the same moves, you are bound to lose.

Always have the flexibility to adapt to your ever-changing circumstances. If something doesn’t work (e.g., self-actualization efforts or business and career success), then change your actions and thoughts. Start thinking strategically to find options that you haven’t thought of before.

Make adaptability your greatest asset.

As Darwin pointed out,

It is neither the survival of the strongest nor smartest, but the most adaptable.

 

2) Shift the Battlefield 

Close combat? That’s what Goliath wants.

Use your wits: In this circumstance, always use the slingshot, never the sword.

Understand: Never play in a field you are oblivious to. The knowledge of the terrain will give you unimaginable and untold power.

Realize: no one can force you to play a game you suck at. If they attempt such a thing, just politely decline and, as David did, lead them into playing in your arena—a field where no one but you holds the cards. A field where you become the god and they become the puppet.

The lure of such power is undeniable, don’t you think?

 

3) The Phenomenon of the Masked Opposite

Most often in life, people tend to mistake appearance for reality. In your interactions with people, always remember the facade of appearances. No one is as they appear to be.

Everyone sees what you appear to be, few experience what you really are. – Niccolò Machiavelli

When you confront your enemies, never be intimidated by their appearances. Instead, look at the parts that make up the whole. Once you have identified the weakness, attack with all your might. They’ll surely fall swiftly, just like Goliath.

Remember: The hypocritical nature of appearances always deceives the naive.

If you see an extreme behavior pattern in someone (e.g., superiority, arrogance, extreme shyness, avoidance) you are most often confronted with the phenomenon of the masked opposite—what you see is actually the exact opposite.

For example:

If someone acts particularly arrogant, realize that they are actually trying to mask their insecurity and lack of confidence. Someone who is already confident wouldn’t need to act superior in the first place.

If someone smiles incessantly and laughs at every little thing you say, would you assume they are being natural?

No, of course not. They are only using what I’d like to call “the supreme joker mask.” No one can be extremely happy and euphoric all the time. Therefore, they are only acting happy while you are around.

Maybe they like you and want you to like them back, maybe they want to get close to you and hurt you, or maybe they just want to break the ice. Whatever the reason might be, they are wearing a mask that most definitely does not express their actual feelings.

You need to train yourself to see what is underneath the mask. Everyone you meet will wear some sort of mask. And there is a reason for that.

If we openly judged people around us, naturally we’d generate unnecessary offense and malice. Therefore, from an early age, most of us have learned to hide our real thoughts and emotions.

Otherwise, we’d be vulnerable and open to attacks. We’d be left alone and isolated. To prevent such unfavorable situations, we choose to cooperate and hide our less than favorable qualities.

This is really nothing more than our ancestral survival instinct at work.

Therefore, if one is actually insecure but masking it with arrogance, you need to get them to drop their guard.

They need to lose control. Do something that will make them panic. Anger them on purpose if necessary.

 

Final Words

As final words, I want you to remember David and Goliath’s story every time you find yourself in a less than ideal situation.

– You struggle financially but want to start your own coaching business?

Well, that’s good, because it is possible to bootstrap an online business by being creative and resourceful.

While wealthy business owners spend billions of dollars on advertisements—mistaking ad-generated customers for long-term customers—you’ll find your unique selling proposition and create loyal customers much faster thanks to your creative product, resourceful marketing, and sheer hustle.

Starting a business without capital, especially nowadays, is actually a blessing in disguise.

– You want to write a book but you are not a native speaker?

Well, that’s good. As a language learner, your humble determination to study grammar, vocabulary, and phrasing will give you a better grasp of the nuances, whereas most native writers will get overconfident and skip many important stages of becoming a writer—understanding how narratives work, how readers are captivated, and how great writers structure stories.

Your humble and hardworking attitude towards writing will enable you to progress at a faster rate than most native writers.

Can you see the power of this strategy?

Nothing can be a disadvantage for you if you are equipped with the right mindset.

Before we wrap this up, don’t forget to share this article and comment below if you’ve experienced a similar “David and Goliath” situation.

Were you in David’s or Goliath’s position? Do you have any specific stories you would like to ask me about?

I’d love to hear your story. (I reply to almost all comments)

]]>
/2017/10/27/david-and-goliath-the-art-of-turning-all-weaknesses-into-strengths/feed/ 0
Follow Me: Astrocytes in Spinal Cord Repair /2017/09/09/follow-me-astrocytes-in-spinal-cord-repair/ /2017/09/09/follow-me-astrocytes-in-spinal-cord-repair/#respond Sat, 09 Sep 2017 15:25:08 +0000 /?p=22766 You are standing in the middle of King’s Cross with a postcode in your hands, your feet trodden on by the busy crowd. So much has changed since you were last here 20 years ago, and every way you go seems to meet frowning faces set in their own path. By a lucky coincidence, somebody recognizes you and shakes your hand—they are going the same way. What are the odds! “Thank you”, you mutter, “you are a star”.

Through thistles and thorns

Spinal cord injury (SCI) leaves patients isolated from their own bodies with devastating life-long consequences. The limited nature of adult human spinal cord repair is frustrating but understandable from a developmental point of view. If the fasciculus gracilis, a tract of the spinal cord responsible for lower limb sensation that relays touch from the foot, gets traumatically disrupted, the dorsal root ganglion (DRG) cell axon would not only have to cross the site of injury, but also find the correct path all the way up—up to 50 cm in a tall individual. Even if that tremendous task is accomplished, there still remains the challenge of finding the correct second-order neuron of the nucleus gracilis that, in turn, connects to the third-order thalamic neuron transmitting signals to the part of the somatosensory cortex representing that foot. Not something that can be easily done without help.

On the way to connectivity restoration

It has been noted that:

The three main aims of [axon regeneration] research are: to initiate and maintain axonal growth and elongation; to direct regenerating axons to reconnect with their target neurons; and to reconstitute original circuitry (or the equivalent), which will ultimately lead to functional restoration. Although many interventions have now been reported to promote functional recovery after experimental SCI, none has fulfilled all three of these aims. In fact, only the first aim has been extensively studied and convincingly demonstrated.

Indeed, even though the possibility of axon regrowth in the adult mammalian nervous system has been shown, the evidence supporting neuronal connectivity restoration is rarely convincing. Without careful guidance, aberrant axonal regrowth is a serious obstacle to functional regeneration.

Interestingly, long-distance regeneration of axons is not the only mechanism through which normal spinal cord function can be restored. Injury is known to induce plasticity by several mechanisms, including unmasking of inactive spared pathways, compensatory collateral sprouting from intact pathways, or an increase in receptor number due to denervation hypersensitivity. For example, Hofstetter et al. commented that:

Excessive local sprouting in our present study might have facilitated the formation of [novel corticospinal tract] pathways, although we could not detect a correlation between the amount of local sprouting and motor recovery.

Whilst researchers are trying to find a way to artificially guide axons to their correct targets and induce plasticity, there is a cell type that routinely orchestrates these processes. Astrocytes express a range of axon-attractive and repulsive molecules that are crucial for proper development and adult nervous system plastic reorganisation. They provide a physical adherent substrate for growing neurons and secrete extracellular neuro-attractants, such as vimentin, as well as repellents, such as chondroitin sulphate proteoglycans (CSPGs) and semaphorins.

A scar or a star?

For almost a century, the neurocentric paradigm considered astrocytes to be a barrier to healing after SCI, despite the lack of evidence that purely neuron-based therapies are sufficient for full regeneration. However, if the derogatorily dubbed astrocytic scar is removed or prevented from forming in the SCI context, not only do axons fail to spontaneously regrow through the lesion, they also become unable to regrow upon delivery of stimulating growth factors that promote growth in the presence of astrocytes.

Anderson et al. show that astrocytes are not responsible for the bulk of the inhibitory CSPG production after the SCI lesion, as hypothesized previously, but instead provide crucial axon-supporting molecules. The injury environment primes reactive astrocytes to re-express developmental synaptogenic molecules such as thrombospondin-1 that result in protective neuroplasticity.

Not all are created equal

Despite these recent discoveries, skepticism towards astrocyte-based therapies prevails. To exacerbate this view, several studies that used neural stem cell (NSC) transplantation in an attempt to repair spinal cord damage observed neuropathic pain development that correlates with the astrocytic differentiation of NSCs.

Nevertheless, it is becoming clear that the astrocytic population is far from homogeneous. Subsets of astrocytes with different permissive and restrictive qualities towards the growth of specific types of axons are found within different regions of the spinal cord that guide region-specific development of sensory and motor axonal tracts. Not surprisingly, certain astrocytic types are more selective towards regeneration and plasticity of specific types of neurons.

Davies et al. have discovered that astrocytes pre-differentiated from embryonic glial-restricted precursors (GRPs) (GRP-derived astrocytes, or GDAs) are capable of promoting axonal regrowth alongside functional recovery and prevention of axotomized neuron atrophy upon transplantation in rodents with SCI, an approach that outperforms transplantation of undifferentiated neural precursors.

Importantly, the method of astrocytic differentiation of precursor cells plays a crucial role in determining their regenerative capacity. If bone morphogenic protein-4 (BMP4) is used in GDA astrogenesis, the resulting population creates a strongly supportive environment upon transplantation. In contrast, the same GRPs treated with ciliary neurotrophic factor (CNTF) have poor locomotor regenerative properties, but induce active calcitonin-gene-related peptide (CGRP)-positive nociceptive c-fiber sprouting that is associated with allodynia.

This raises the possibility that the inflammatory environment of the injured spinal cord promotes differentiation of endogenous and transplanted astrocytes into the subtype that is not optimal for rubrospinal or dorsal column tract axon restoration, but, in turn, may be selectively supportive to pain-conducting c-fibers.

In addition to transplantation strategies, modification of endogenous astrocyte function can be employed. For example, oral administration of denosomin results in functional recovery in mice after SCI through increases in astrocyte proliferation, survival, and vimentin secretion that promotes locomotor raphespinal axon outgrowth.

Learning from the experts

Finally, it is noteworthy that astrocytic subtypes that promote recovery appear to physically realign local astrocytic processes in a linear fashion. The authors speculate that this linear organization provides more straightforward routes for regenerating axons to follow. Another exciting and unexplored possibility is that these astrocytes help to restore the endogenous astrocytic network function that gets disrupted by the injury, with beneficial neuroplasticity as its natural corollary.

Stepping back from the exclusively neurocentric view of SCI may allow for unexpected advances in functional restoration. Ultimately, the bulk of research on the intricacies of axonal guidance and plastic synapse rearrangement is an attempt at recapitulation of normal astrocytic functions. When offered a helping hand, take it, and see where it leads you.

References

Bradbury EJ, McMahon SB. Spinal cord repair strategies: why do they work? Nat Rev Neurosci. 2006;7(8):644–53. doi: 10.1038/nrn1964.

Pernet V, Schwab ME. Lost in the jungle: New hurdles for optic nerve axon regeneration. Vol. 37, Trends in Neurosciences. 2014. p. 381–7. doi: 10.1016/j.tins.2014.05.002.

Smith GM, Falone AE, Frank E. Sensory axon regeneration: Rebuilding functional connections in the spinal cord. Vol. 35, Trends in Neurosciences. 2012. p. 156–63. doi: 10.1016/j.tins.2011.10.006.

Weidner N, Tuszynski MH. Spontaneous plasticity in the injured spinal cord — Implications for repair strategies. Mol Psychiatry. 2002;(7):9–11. doi: 10.1038/sj.mp.4001983.

Hofstetter CP, Holmström N a V, Lilja J a, Schweinhardt P, Hao J, Spenger C, et al. Allodynia limits the usefulness of intraspinal neural stem cell grafts; directed differentiation improves outcome. Nat Neurosci. 2005;8(3):346–53. doi: 10.1038/nn1405.

Allen NJ, Barres BA. Signaling between glia and neurons: Focus on synaptic plasticity. Vol. 15, Current Opinion in Neurobiology. 2005. p. 542–8. doi: 10.1016/j.conb.2005.08.006.

Freeman MR. Sculpting the nervous system: Glial control of neuronal development. Curr Opin Neurobiol. 2006;16(1):119–25. doi: 10.1016/j.conb.2005.12.004.

Fallon JR. Preferential outgrowth of central nervous system neurites on astrocytes and Schwann cells as compared with nonglial cells in vitro. J Cell Biol. 1985;100(1):198–207. PMID: 3880751.

Shigyo M, Tohda C. Extracellular vimentin is a novel axonal growth facilitator for functional recovery in spinal cord-injured mice. Sci Rep. 2016;6(February):28293. doi: 10.1038/srep28293.

Wang H, Katagiri Y, McCann TE, Unsworth E, Goldsmith P, Yu Z-X, et al. Chondroitin-4-sulfation negatively regulates axonal guidance and growth. J Cell Sci. 2008;121(18):3083–91. doi: 10.1242/jcs.032649.

Molofsky A V, Kelley KW, Tsai H-H, Redmond SA, Chang SM, Madireddy L, et al. Astrocyte-encoded positional cues maintain sensorimotor circuit integrity. Nature. 2014;509(7499):189–94. doi: 10.1038/nature13161.

Chu T, Zhou H, Li F, Wang T, Lu L, Feng S. Astrocyte transplantation for spinal cord injury: Current status and perspective. Vol. 107, Brain Research Bulletin. 2014. p. 18–30. doi: 10.1016/j.brainresbull.2014.05.003.

Anderson MA, Burda JE, Ren Y, Ao Y, O’Shea TM, Kawaguchi R, et al. Astrocyte scar formation aids central nervous system axon regeneration. Nature. 2016;0(1):1–20. doi: 10.1038/nature17623.

Tyzack GE, Sitnikov S, Barson D, Adams-Carr KL, Lau NK, Kwok JC, et al. Astrocyte response to motor neuron injury promotes structural synaptic plasticity via STAT3-regulated TSP-1 expression. Nat Commun. 2014;5:4294. doi: 10.1038/ncomms5294.

Macias MY, Syring MB, Pizzi MA, Crowe MJ, Alexanian AR, Kurpad SN. Pain with no gain: Allodynia following neural stem cell transplantation in spinal cord injury. Exp Neurol. 2006;201(2):335–48. doi: 10.1016/j.expneurol.2006.04.035.

Davies JE, Huang C, Proschel C, Noble M, Mayer-Proschel M, Davies SJA. Astrocytes derived from glial-restricted precursors promote spinal cord repair. J Biol. 2006;5(3):7. doi: 10.1186/jbiol35.

Davies JE, Pröschel C, Zhang N, Noble M, Mayer-Pröschel M, Davies SJA. Transplanted astrocytes derived from BMP- or CNTF-treated glial-restricted precursors have opposite effects on recovery and allodynia after spinal cord injury. J Biol. 2008;7(7):24. doi: 10.1186/jbiol85.

Teshigawara K, Kuboyama T, Shigyo M, Nagata A, Sugimoto K, Matsuya Y, et al. A novel compound, denosomin, ameliorates spinal cord injury via axonal growth associated with astrocyte-secreted vimentin. Br J Pharmacol. 2013;168(4):903–19. doi: 10.1111/j.1476-5381.2012.02211.x.

Image via StockSnap/Pixabay.

]]>
/2017/09/09/follow-me-astrocytes-in-spinal-cord-repair/feed/ 0
Projection – When Narcissists Turn the Blame on You /2016/10/16/projection-when-narcissists-turn-the-blame-on-you/ /2016/10/16/projection-when-narcissists-turn-the-blame-on-you/#respond Sun, 16 Oct 2016 11:22:38 +0000 /?p=22217 Ah, projection. The fine art of making me guilty of your vices.

Projection

No one projects better or more frequently than a narcissist. They’ve practiced, honed and refined projection to a fine art.

Whatever they’re up to, by some mental “abracadabra,” suddenly they’re innocent and you’re actually the one up to no-good.

Deep In The Race

Since Eve ate the “apple” and blamed it on the serpent, projection has been a quintessential part of the human race. Since Adam ate the “apple” and blamed it on Eve, men have been projecting onto their wives. Wives have been projecting onto their husbands. Parents have been projecting onto their children. And siblings have been projecting onto each other.

C’mon, you know you’ve done it. I certainly have. And while projection may be elevated to a high art-form by narcissists, we’ve all done it or been tempted to do it. We’ve all got a little corner of narcissism in our souls. That’s why we understand them so well.

As I see it, projection proves how deeply and profoundly all Homo sapiens, narcissists or otherwise, not only inherently know the moral code…but believe in it.

Why Project?

Guilt? Maybe.

Envy of others’ innocence? Perhaps.

Avoidance of the result of wrong-doing? Now we’re getting somewhere.

The need to be perfect! Bingo + all of the above.

Gotta protect that fragile little ol’ ego, y’know.

Equality

Projection is the great equalizer. Everyone the narcissist knows is equally guilty.

Their children, especially the one assigned the role of “scapegoat,” suffer the most from being projected upon. Trained to be humble and submissive, brainwashed to feel false guilt, they take on their elder’s vices without critical thinking. Bearing the “sins of the fathers” as a burden like Christian in Pilgrim’s Progress.

Spouses of narcissists suffer from a shit-load of projection too. They are in the unenviable position of being accused o’er and o’er of infidelity. Y’know, the infidelity the narcissist is actually engaging in.

Co-workers are also a dandy target for career projection. After all, fill-in-the-blank is never the narcissist’s fault, yet blame must be assigned.

Projection In Literature

In his charming books on veterinary practice in Yorkshire, James Herriot wrote a fascinating line about his highly, shall we say, eccentric partner, Siegfried Farnon. James is speaking to Tristan, the long-suffering younger Farnon brother.

You know the one thing I can’t stand about your brother, Tris? It’s when he gets patient with you. He gets this saintly look on his face and you know that any moment now he’s going to forgive you. For something he’s just done.

Oh, how I remember that vile, nasty saintly expression of condescending forgiveness for a wrong I didn’t commit. I also remember the hand-written notes left for me by my parent, forgiving me for the abuse I forced them to commit against me. Abuses like throwing a sheaf of my schoolwork across the room and yelling at me for half an hour. My crime: Ending a paragraph with a question, instead of a statement. Isn’t that awful of me!? (sarcasm)

Ah, but I was forgiven by the next morning.

Scapegoat

As it turns out, the concept of a “scapegoat” is thousands of years old, with a rich history.

In Leviticus 16:21 it says, “And Aaron shall lay both his hands upon the head of the live goat, and confess over him all the iniquities of the children of Israel, and all their transgressions in all their sins, putting them upon the head of the goat, and shall send him away…into the wilderness…”

Most ancient religions have some version of this. Projection of sin onto a sacrifice.

Projection makes us the scapegoat, wearing on our heads the iniquities of the narcissist. Our self-esteem and innocence are sacrificed on the altar of their ego, so they can go on their merry way.

Right Runs Deep

I would argue this shows how deeply and profoundly narcissists believe in right vs. wrong.

If they don’t know right from wrong, why project?

If they don’t care about right vs wrong, why project?

If they don’t have a functioning conscience, why project?

It could be for expediency. To avoid the ramifications of their actions. To keep the smooth sailing going.

But, how would they know what needed to be projected, if they don’t know right from wrong?

They know, oh, how they know!

And therein is their undoing.

Atonement

I dunno about you, but most of my best ideas happen in the shower. (Or bed.) I’ll never forget the “discovering fire” moment under a hot shower when something went “click.” I finally got it. Let’s hope I can articulate it to you.

The Old Testament concept of the scapegoat comes full circle in the New Testament concept of atonement.

There is one and only one setting where projection actually works! We get to project our sins onto Christ. It’s okay. Go ahead and project. And in exchange, through the shedding of His innocent blood on the Cross, His perfection becomes ours.

“For he hath made him to be sin for us, who knew no sin; that we might be made the righteousness of God in him” (II Cor. 5:21).

That’s why He came. He didn’t just come to be a great moral teacher. As C.S. Lewis wrote in Mere Christianity, in the passage now known as “Lewis’s trilemma”:

I am trying here to prevent anyone saying the really foolish thing that people often say about Him: ‘I’m ready to accept Jesus as a great moral teacher, but I don’t accept his claim to be God.’ That is the one thing we must not say.

A man who was merely a man and said the sort of things Jesus said would not be a great moral teacher. He would either be a lunatic — on the level with the man who says he is a poached egg — or else he would be the Devil of Hell. You must make your choice. Either this man was, and is, the Son of God, or else a madman or something worse. You can shut him up for a fool, you can spit at him and kill him as a demon or you can fall at his feet and call him Lord and God, but let us not come with any patronising nonsense about his being a great human teacher. He has not left that open to us. He did not intend to.

Wow!

Profundity

Pretty deep stuff, huh? The age-old battle of right vs. wrong. Mankind’s deep-seated need to be in the right, even if it means doing wrong (i.e., projection) to maintain the appearance of being in the right. And the profound paradigm of scapegoating, sacrifice, and atonement.

Like the pieces of a puzzle, it all holds together. It makes sense. As C.S. Lewis wrote in The Lion, the Witch and the Wardrobe:

“Logic!” said the Professor half to himself. “Why don’t they teach logic at these schools?”

Above all, narcissists are logical. And, in a twisted way, projection is also logical. Twisted. Sad. Wrong…but still logical.

This guest article originally appeared on PsychCentral.com: Projection: Narcissists’ Favorite Trick

Image via PublicDomainPictures / Pixabay.

]]>
/2016/10/16/projection-when-narcissists-turn-the-blame-on-you/feed/ 0
A Resident’s Reflections from within the American Board of Psychiatry & Neurology (ABPN) /2016/05/29/a-residents-reflections-from-within-the-american-board-of-psychiatry-neurology-abpn/ /2016/05/29/a-residents-reflections-from-within-the-american-board-of-psychiatry-neurology-abpn/#respond Sun, 29 May 2016 15:00:31 +0000 /?p=21698 Most residents have a very limited understanding of the American Board of Psychiatry and Neurology (ABPN), and understandably so.

I myself thought of the ABPN as a large, bureaucratic, governmental organization that spent most of its time siphoning money from hapless residents in order to administer board certification examinations. I was therefore surprised, and a bit skeptical, when my chairman asked me if I was interested in a three-month administrative fellowship at the ABPN during my final year of neurology residency. Although I had an interest in administration, I was hesitant because I was unfamiliar with the fellowship’s objectives; it would, after all, be the first year the fellowship was offered.

Three weeks before my administrative fellowship was to begin, a thick binder arrived containing a detailed, day-by-day schedule and multiple articles, including 10 Most Common Mistakes Made by Administrators and Understanding Financial Statements of Not-for-Profits. I also received three books on self-improvement: Drive, Talent is Overrated, and, ironically, Being Wrong. Reviewing the schedule, I was surprised to see that I would be spending, collectively, almost four weeks traveling (including internationally).

The binder clearly spelled out the objectives of the fellowship. I was expected to learn about the mission and structure of the ABPN as a whole, and in particular the fiduciary responsibilities of the board of directors. I was to have scheduled meetings with the senior staff to appreciate their role in the day-to-day workings of the ABPN. In addition, I was expected to complete a research project, suitable for submission for presentation and publication. Finally, I was to have weekly meetings with Dr. Larry Faulkner, the President and CEO of the ABPN. It would be these weekly meetings that I would find most useful, as they provided perhaps the greatest educational value of the entire fellowship.

About the ABPN

Prior to my arrival at the ABPN, I learned that it had been formed by psychiatrists and neurologists in 1934 in order to distinguish qualified specialists from those offering neurological or psychiatric care without adequate experience or training.

Rather than a large, bureaucratic organization, the ABPN is relatively small. It consists of fewer than 40 staff, of whom only one is a salaried physician (Dr. Faulkner). The ABPN’s sitting directors essentially volunteer their time. I quickly learned that the ABPN does not have members (unlike the American Academy of Neurology (AAN) or the American Psychiatric Association) and is an organization that is primarily responsible to the American public. Its main mission is to assure the public that ABPN diplomates are competent to practice neurology and psychiatry. It does this by first certifying candidates who have graduated from accredited residency programs and then by developing methods to assess whether practicing physicians continuously keep up with the rapid pace of medical advancement. Initial certification for neurologists and psychiatrists now consists of a computer-based examination.

Interestingly, the ABPN is also a driving force behind residency education. Recently, the Accreditation Council for Graduate Medical Education (ACGME) decided that it would not accredit additional combined training programs. Instead of dissolving these programs (in which almost 200 residents are currently enrolled), the ABPN decided to review and approve these combined training programs, which include neurology-internal medicine and neurology-psychiatry. While the ACGME establishes minimal requirements for neurology and psychiatry residency programs, the ABPN establishes the prerequisites a resident must meet in order to be eligible to become board certified. Often the ACGME follows suit. For example, initially there was no ACGME requirement that a graduating neurology resident see a single critical care patient. The ABPN determined that an intensive care unit (ICU) clinical skill examination (CSE) would be required in order to apply for an initial board certification exam. Shortly thereafter, the ACGME adopted the ICU CSE as a requirement for residency accreditation.

A recent focus of the ABPN is supporting education and research activities of academic faculty. Given the increasing clinical demands on faculty, I noted that the ABPN grants for innovative education projects placed particular emphasis on ensuring that faculty had protected time to complete those activities. The ABPN will shortly begin another grant program to support research on issues relevant to its mission. In both of the ABPN grant programs, awardees are selected by panels of neurologists and psychiatrists that include members from within the academic community with established expertise in education or research.

Crucial Issue Forums

The ABPN has also begun to host a yearly “Crucial Issue Forum”. These Forums focus on pressing issues central to the fields of neurology and psychiatry and are used to obtain feedback from professional organizations and others on those issues. Experts in the field, including program directors, department chairs, representatives of national professional organizations, residents, and fellows are invited.

The most recent Forum focused on residency education, and included a discussion about whether the process of the CSEs should be modified to produce a more meaningful educational experience. A growing body of literature has suggested that the CSEs are not as effective as they might be. These sentiments were echoed by several residents, including myself. After attending this Forum, it became clear to me how seriously the ABPN took this Forum. Had the attendees of the Forum voted for the ABPN to conduct site visits to monitor the CSEs at every institution, it is likely that we would have site visits. Conversely, if a clear consensus had been to abolish the CSEs, it is likely that they would no longer exist.

My fellowship

A requirement of the fellowship is a research project with the expectation of publication. Several opportunities exist towards this end, including use of the ABPN’s wealth of data on its initial certification examinations, maintenance of certification examinations, and CSEs. Given my preexisting interest in both headache and education, I surveyed adult neurology residency program directors and chief residents to determine their views on the appropriate amount of headache education in neurology residency. The goal of this project was to determine if headache education had significantly increased from a decade ago, when a similar survey had been done. I had the opportunity to present the results to the senior staff of the ABPN as well as at the American Headache Society Annual Scientific Meeting in June 2015. The manuscript was accepted for publication in Headache, The Journal of Head and Face Pain.

The most memorable moments of my fellowship were spent in Dr. Faulkner’s office for our weekly 10 o’clock meetings. These ‘one on one’ meetings typically lasted between 1-2 hours. Rarely was there a set agenda. We discussed everything from Dr. Faulkner’s top ten rules for financial investment, to the inexact science of hiring employees. We talked about the slim evidence base behind maintenance of certification (MOC) and the impetus to have an MOC program despite the lack of strong evidence. We explored why continuing medical education (CME) has not met the same opposition as MOC Part IV, despite the fact that CME is the most time intensive component of MOC.

Behind the backdrop of the formal curriculum, readings, and scheduled meetings, a large part of the fellowship consisted of informal education. Every moment of downtime with Dr. Faulkner was an opportunity for me to learn about the process of becoming a successful administrator. While we waited for our flights we would often talk about everything from family to how important it is to take care of oneself physically and mentally. As Dr. Faulkner put it, “If you fall apart, everything falls apart. If you’re not healthy, you won’t be able to fulfill your family, social, or work responsibilities.” He impressed upon me the importance of being on the same page as one’s spouse and family. We discussed the value of doing a few tasks, but doing them well. I understand now that the real value of this unique experience truly lay in the in-depth immersion that I had into all things administrative, from the ABPN, to academic departments, to professional organizations, and even to my family.

Finally, the fellowship gave me the opportunity to meet with some of the most influential leaders in neurology and psychiatry. It was eye-opening to see the work that goes on behind the scenes at organizations like the AAN, APA, and ACGME. Despite their different responsibilities, each of these national organizations and their respective leaders had the singular goal of furthering the fields of neurology and psychiatry through focused initiatives. I began to appreciate the extraordinary effort that went into the large annual professional meetings. I spent a day at the AAN in Minneapolis learning about their different sections and the spectrum of resources they provide for their members. It was humbling to realize that I could probably spend my whole life on the AAN website and still not be able to take advantage of all the resources they have to offer.

In the ABPN I found an organization that not only tried to uphold the standards that make our profession credible, but also an organization that was dedicated towards the advancement of neurology and psychiatry education. In Dr. Faulkner I found a leader who tried to be fair. He cultivated the potential of those around him into a kinetic energy that translated into a collective success. Much of his time was spent advocating for the best interests of neurologists and psychiatrists against those who would like to propose greater physician scrutiny and regulation.

The mounting pressures of lower reimbursement in the setting of higher patient volumes, the oft-repeated mantra of ‘Do more with less’, and the overwhelming paperwork often overshadow our initial motivation to become physicians. More than anything else, my time at the ABPN and my interaction with the leaders in neurology and psychiatry have given me hope and optimism that we can find our way through the pressured maze of bureaucracy and increasing scrutiny to an era where we will be able to provide the best care for our patients while seamlessly documenting the quality of our work. There are multiple initiatives towards this end, not the least of which is the commitment and support of leaders in neurology to the AAN Axon Registry. In summary, my experience at the ABPN taught me that our future is in our hands and that our collective involvement and effort will be crucial to effectuate the outcomes we desire.

References

ABPN Awards Program. Faculty innovation in education award. American Board of Psychiatry and Neurology website. Accessed December 21, 2015.

Aminoff, MJ. Faulkner RF. (2012). The American Board of Psychiatry and Neurology, Looking Back and Moving Ahead. Arlington, VA: American Psychiatric Publishing.

Kay, Jerald. (1999). Administrative Mistakes. Handbook of Psychiatric Education and Faculty Development. Washington, D.C.: American Psychiatric Press.

Schuh, L., London, Z., Neel, R., Brock, C., Kissela, B., Schultz, L., & Gelb, D. (2009). Education research: Bias and poor interrater reliability in evaluating the neurology clinical skills examination. Neurology, 73(11), 904–908. DOI: 10.1212/WNL.0b013e3181b35212

Image via StartupStockPhotos / Pixabay.

]]>
/2016/05/29/a-residents-reflections-from-within-the-american-board-of-psychiatry-neurology-abpn/feed/ 0
On Mass Murderers /2016/05/11/on-mass-murderers/ /2016/05/11/on-mass-murderers/#respond Wed, 11 May 2016 15:00:40 +0000 /?p=21709 We observe the modern epidemic of mass murder in this country and are shocked. We can’t understand who these (mostly young) men are who take the lives of innocents for no apparent reason. What could possibly drive them to do it?

Seeking reassurance, we search for what makes these murderers different from us. In the wake of yet another horrific mass shooting, we must reassess our understanding of the underlying cause.

We conclude that these killers are mentally ill. Legislators devise laws to prevent people who have been committed to psychiatric hospitals or otherwise judged mentally ill from owning guns. Mental health experts demand more psychiatric services to identify and treat them. Even Dear Abby writes, “The triggers that have led to the plague of mass shootings in this country are the result of individuals with severe psychosis (Bangor Daily News, 11/23/2015).” It is satisfying to us to believe that we can identify mentally deranged people who commit these crimes, and that they are not like us.

In Europe and much of the rest of the world, there is another group of slaughterers called Islamic Jihadists. When the recent events in Paris unfolded, the world watched horrified as a small cell of ISIS terrorists indiscriminately gunned down scores of random people. We see this as a political-religious act by radicalized Islamists, not a product of mental illness. But how much difference is there, really, between American mass murderers and foreign jihadist ones?

A recent article in The New Yorker by Malcolm Gladwell (“Thresholds of Violence,” 2015) analyzed the genesis of school shooters in the US. Over the past 20 years, there has been a long series of cases following a similar pattern. One or two young men go into unprotected schools and randomly start shooting unarmed students and teachers. Gladwell points out that since 2012, there have been 140 school shootings in America. Some of these young men, such as Kip Kinkel, had bizarre paranoid fantasies and can be identified as psychotic, but some, such as Eric Harris of Columbine fame, were more appropriately described as psychopaths. Some came from chaotic and violent homes, but others were loved by their families and untraumatized. Then there was Adam Lanza. What are we to make of him?

In December 2012, 20-year-old Adam Lanza shot his mother, then went to Sandy Hook Elementary School, where he murdered 20 children and six adults.

Much of what is known about his early life was reported by Adam’s father Peter to Andrew Solomon of The New Yorker. Peter described Adam as exhibiting odd behaviors such as sensory hypersensitivity and social dysfunction from an early age. When Adam was 13, a psychiatrist diagnosed him with Asperger’s syndrome and recommended he be home-schooled. In his high school years, he became increasingly isolated and distant from his parents. The only emotion he displayed to them was distress in connection with having to socially engage. Perhaps distracted by the Asperger’s diagnosis and unable to penetrate Adam’s secrecy, neither his parents nor mental health professionals were alert for signs of impending violence.

From the clues he left behind, Adam’s emotions alternated between anger and despair. Anger may have been the only social emotion he was capable of comprehending. His anger was reflected in his increasing fascination with mass murder, which he expressed only online. In his late teens, he spent much of his time editing entries on mass murderers on Wikipedia. He was aware that he was failing in life and had no future. As Solomon put it, “The more Adam hated himself, the more he hated everyone else.”

It seems reasonable to speculate that his final act was to take the life of his mother, whom he blamed for his problems, and then the lives of children who had the promise he could never realize. If we are to look for causes of Adam’s murderous behavior, they do not lie in Asperger’s or mental illness. It seems clear enough that the key to Adam and the common element behind mass murders is rage.

For the Jihadist, the rage is religious and political. The non-believer is evil and an enemy. He must be destroyed or enslaved. The reward for killing the other is a place in heaven. For a mass murderer like Adam, the rage is interpersonal. It is against an enemy who is, in some way, oppressing or preventing the killer from getting what he deserves. The reward is achievement and fame. In either case, compassion has no place.  

Gladwell’s formulation emphasizes the under-appreciated power of situational or social factors in determining our behavior. He invokes a theory of social thresholds. Each of us has a certain threshold for engaging in various actions, be they violent or benevolent. Take, for instance, a riot. One person in a mob of people who has a very low threshold (perhaps due to a high level of anger) throws the first rock, followed by someone with a slightly higher threshold. A social contagion may then set in, in which individuals who would not have considered rioting get caught up and become participants. If there is sufficient social reinforcement, some people become mass murderers.
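
To make the threshold idea concrete, here is a minimal sketch, in Python, of the kind of cascade model Gladwell borrows from sociologist Mark Granovetter. The crowd sizes and thresholds below are invented for illustration; they are not taken from Gladwell’s article.

```python
# Illustrative threshold-cascade sketch (toy numbers, not data from the article).

def run_cascade(thresholds):
    """Return how many people end up joining, where each person's threshold is
    the number of prior participants they must see before they join in."""
    joined = 0
    while True:
        # Everyone whose threshold is met by the current crowd size joins.
        new_joined = sum(1 for t in thresholds if t <= joined)
        if new_joined == joined:
            return joined
        joined = new_joined

# One instigator with threshold 0 can trigger a full cascade...
print(run_cascade(list(range(100))))     # 100: everyone eventually joins
# ...while an otherwise identical crowd missing that one person stays calm.
print(run_cascade(list(range(1, 101))))  # 0: no one makes the first move
```

The toy model makes the same point as the paragraph above: whether a cascade happens at all can hinge on a single low-threshold individual, not on anything unusual about the rest of the crowd.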

One of the most famous social psychology experiments, Zimbardo’s Stanford Prison Experiment in 1971, showed the power of social influence and unchecked authority to turn ordinary people into malevolent prison guards or victimized prisoners. Zimbardo assembled a random group of seemingly normal young men and arbitrarily assigned them to the roles of guards or prisoners. Then, in an elaborate piece of theater, he created an isolated prison environment in which the men were told to follow the rules Zimbardo created. The astounding result was that both groups did not just play-act, but actually became the roles they were simulating. As Zimbardo described it, “the power that the guards assumed each time they donned their military-style uniforms was matched by the powerlessness the prisoners felt when wearing their wrinkled smocks with ID numbers sewn on their fronts.”

Although they stopped short of actual physical abuse, the guards behaved cruelly and with little regard for their prisoner-peers’ humanity. Even the kindly doctor Zimbardo assumed the role of prison supervisor. He was blind to the abusive behavior his social experiment had created until his future wife confronted him from an outsider’s perspective.

What Zimbardo showed was that under the right social circumstances, individuals with generally high thresholds for violent action can become Nazi Gestapo or Abu Ghraib prison guards. Unfortunately, this is the dark side of human social evolution that we have seen played out throughout history. There is nothing unusual in the phenomenon of one group of humans defining outsiders as others who do not warrant compassion or even respect.

The commonality between mass murderers and Islamic Jihadists is that both groups have low thresholds for joining in on unspeakable violence. They then join or are influenced by a social group that glorifies violence. Jihadists operate in response to the social contagion of religious extremism which grows with each atrocity. School shooters and similar murderers are increasingly influenced by a virtual social group and a script laid out by their predecessors such as the Columbine killers. Adam’s social isolation and rage lowered his threshold for joining a virtual group for whom murdering innocents becomes a heroic act.

We do not need to invoke mental illness. A personal sense of rage and social contagion is explanation enough.

References

Gladwell, Malcolm, (Oct. 19, 2015). “Thresholds of violence: How school shootings catch on.” The New Yorker.

Solomon, Andrew. (Mar. 17, 2014). “The reckoning: The father of the Sandy Hook killer searches for answers.” The New Yorker.

Zimbardo, Philip. (2008). The Lucifer Effect: Understanding How Good People Turn Evil. Random House.

Image via geralt / Pixabay.

]]>
/2016/05/11/on-mass-murderers/feed/ 0
Never Say Die – SELF/LESS from Science-Fiction to -Fact /2015/07/10/never-say-die-selfless-from-science-fiction-to-fact/ /2015/07/10/never-say-die-selfless-from-science-fiction-to-fact/#respond Fri, 10 Jul 2015 10:50:01 +0000 /?p=20170 In SELF/LESS, a dying old man (Academy Award winner Ben Kingsley) transfers his consciousness to the body of a healthy young man (Ryan Reynolds). If you’re into immortality, that’s pretty good product packaging, no?

But this thought-provoking psychological thriller also raises fundamental ethical questions about extending life beyond its natural boundaries. Exploring the moral and ethical issues that surround mortality has long been a defining characteristic of many notable stories within the sci-fi genre. In fact, Mary Shelley’s age-old novel Frankenstein, while having little to no direct plot overlap with SELF/LESS, is considered by many to be among the first examples of the science fiction genre.

Screenwriters and brothers David and Alex Pastor show the timelessness of society’s fascination with immortality. However, their exploration reflects a rapidly growing deviation from the tale’s origins in traditional science fiction. This shift can be defined, on the most basic level, as the genre losing its implied fictitious base. Sure, we have yet to clone dinosaurs, but many core elements of beloved past sci-fi films are growing well within our reach, if not already part of our present, everyday lives. From Luke Skywalker’s prosthetic hand in Star Wars Episode V: The Empire Strikes Back (1980) to the Sentinels of The Matrix (1999) to Will Smith’s bionic arm in I, Robot, the visions of past science fiction films help define our current reality.

The resulting script, the Pastor brothers’ own creative take on the timeless theme, is what grabbed the industry’s attention, after first being ignored and eventually making the 2011 Black List of best unproduced screenplays.

Director Tarsem Singh had been looking tirelessly for the right thriller, and with SELF/LESS he found his match. The result of this collective vision is a great example of a genre’s journey from science-fiction to -fact.

Damian Hale (Kingsley) is a billionaire industrialist dying of cancer. Lucky for him, he hears of a procedure called “shedding,” a newfangled process by which one transfers his consciousness into a different body. Albright (Matthew Goode of THE GOOD WIFE) is the dangerously charismatic and brilliant mind behind the secret organization that, for a dozen million or so, can grant this gift of life to folks like Damian. Albright’s never-say-die motto is an offer hard to refuse, and Damian certainly does not. While touring the mysterious medical facility, Albright tells Damian he will be receiving “the very best of the human experience.” Albright describes the new body (Reynolds) as an “empty vessel,” whose sole purpose is to provide new life — to those who can afford it. Damian is sold.


Damian goes through his “shedding” procedure, which has a shockingly chilling realism, resembling a super fancy MRI machine. Upon awakening he finds himself in his new body to which he slowly adjusts, after getting over “that new body smell.”

Ryan Reynolds (in foreground) stars as Young Damian and Ben Kingsley (in background) stars as Damian Hale in Gramercy Pictures’ provocative psychological science fiction thriller Self/less, directed by Tarsem Singh and written by Alex Pastor & David Pastor. Credit: Alan Markfield / Gramercy Pictures

After a bit of time enjoying his healthy, attractive new body, Damian begins to experience what he is told is a harmless side-effect: hallucinations. What he sees in these episodes – a woman, a young girl, a home, a family – all begins to feel too real. Soon, Damian’s suspicions grow into certainty: these are not random hallucinations; they are images of a past that really happened. In other words, they are memories. But, if the new body was supposed to be an “empty vessel,” whose past is Damian remembering?

Without providing too much of a spoiler (but just in case…SPOILER ALERT!), Damian discovers that his new body was never an “empty vessel” created in a lab. In actuality, his new body had a whole life previous to the procedure. Soon the question arises, both for Damian and for the viewer: does the man who once owned all these memories, who once had a wife, a daughter, and an entire life, have the chance to regain them?

This discovery leads SELF/LESS into action-film territory, which it handles quite effectively, complete with shoot-outs, hand-to-hand combat, car chases, and yes, even explosions. Like any really good work of science fiction, SELF/LESS packages its issues in an exciting plot buoyed by plausible — albeit futuristic — science.

This is among the reasons SELF/LESS works. It brings up many meaningful issues regarding science and immortality. If people can be saved from disease, age, and death, will this only be available to those wealthy enough to afford such a procedure? Would there be a selection process where only those deemed “superior” would be given eternal life?

And what would that mean to us all as a society? If Einstein were still alive today, would he have unified gravity with the other forces, allowing us all to be traveling around in time machines by now? Would we be receiving the iPhone 44 by now if the consciousness of Steve Jobs could have been preserved in a healthy body?

How society might have been affected had anything gone differently is an impossible question to answer. However, the deeper question, I believe, does hold an answer:

Is there an alternative to recycling the genius of the past and of those we are currently familiar with? Can we allow for the possibility of a new genius, perhaps a mind that will impact society beyond any realm we can currently fathom? Essentially, can we allow new, fresh perspectives in new, never-before-worn “vessels” to make their impact, even without the assurance of progress?

For the record, I vote for the latter. While I admit that is my opinion (and I admit to believing it the only correct opinion), we must all encourage ourselves to ask these questions, and never be so presumptuous as to think we can ever be fully satisfied with the belief that we have discovered every answer or postulated every notion.

Immortality has been the stuff of dreams (and movies, books, plays, etc.) going back too far to pinpoint an exact origin. Beyond the aforementioned FRANKENSTEIN, works from DRACULA (1931) to SLEEPER (1973), and even STAR WARS EPISODE V: THE EMPIRE STRIKES BACK (1980), in which Han Solo is frozen in carbonite, all have their unique takes on this topic.

In real life, Dr. James Bedford, a psychology professor at the University of California, became the first person ever to be cryogenically preserved, on January 12, 1967. He even left money in his will for a steel capsule and liquid nitrogen.

Many of us have heard stories such as the urban legend that Walt Disney had himself frozen. (This, by the way, is only a legend, so if you were wondering, it is false.)

Beyond the famous instances of the known few, as well as the even more infamous myths, there have been far more real-life attempts at immortality than many may realize. Perhaps they have gone unnoticed because these attempts did not involve world-renowned “geniuses” or people of great wealth and fame. I cannot say anything for certain, but I can share a personal note here.

My father’s cousin Steven was cryogenically frozen back in 1968, well before I was born. Steven had been ill all of his 25 years, had found an ad for the Cryogenics Society in a science fiction magazine, and had arranged to be preserved when he died on an operating table. I won’t go into the details here, but let’s just say it did not work out (it turns out it is quite expensive to keep a human being frozen, and being frozen after you have already died rather defeats the purpose). The fact that the procedure also freezes a family’s hope that something may one day bring their loved one back can be more than a bit cruel. However, the fact that the science was applied so poorly in this case does not make the discovery behind it inherently wrong. In fact, it is because of this story that I believe, more than most, that we must engage now, ask questions now, and discuss the ethical, moral, and pragmatic ramifications now.

I had the opportunity to sit down with some of the cast and crew to discuss the film and some of the issues it raises.

David and Alex Pastor opened up about how their creative process can often be motivated by their own fears, wishes and predictions. David pointed out that the desires present in Damian are feelings that can resonate with everyone.

I feel that everybody can relate to ‘I wish I had more time’. We wanted to write about a powerful character who has everything but whose body is failing him and who then finds that his money might be able to buy him a new life.

Natalie Martinez, who plays Madeline, a character crucial to the story (sorry, can’t tell you why – you’ll have to see for yourself), told me how she enjoyed doing nearly all of her own stunts. I believed her too: she showed me her arm, pointing to the newest bruises accrued in her latest project, WARRIOR, in which she plays a female mixed martial arts fighter.

The film made a wise decision in how it represented the technology at the core of the story. This was not a film intended to provide a lesson on the technologies of the future. The filmmakers chose NOT to pack the film with elaborate, made-up scientific explication. Other films, such as 2014’s LUCY, try to prop up their premises with real-world data, which typically only betrays a lack of faith in the film’s storytelling abilities. Thankfully, SELF/LESS doesn’t fall into that trap.

SELF/LESS makes no unnecessary attempt to have its lead character, Damian, serve as an example of our collective scientific and technological potential. To do so would have been impractical, distracting, and ethically irresponsible filmmaking. When a film pretends its science is all actual right now, rather than a “science fiction” that takes off from a base of facts, it seeks to have its audience believe in its story for reasons other than filmmaking craftsmanship. That leads to serious misconceptions about science. SELF/LESS, while grounded in scientific fact, doesn’t need to pretend immortality is a current reality: you believe its story anyway.

When asked whether the concept of transferring one’s consciousness from one body to another is possible, Dr. Charles Higgins, an associate professor of neuroscience at the University of Arizona and head of the renowned Higgs Lab, replied:

It is sure to be science future [not fiction]. It’s just a question of whether it’s 30 years or 300 years.

When I asked David and Alex Pastor how they chose to balance the technological realities with their creative vision, they responded that although the “key” to the plot and story is:

A revolutionary new technology, we decided we would not get bogged down in technicalities and would keep our story as more of a fable than anything else. It was the moral consequences that interested us. The science fiction that [they] like to write explores moral and ethical issues…ideas tied in to universal themes.

I agreed and appreciated hearing that. Whenever a film can instigate thought and raise questions, the result is typically an effective film. This is even more true when a film can do so within the confines of an action-packed thriller, demanding your visceral attention as well as your active, intellectual engagement. In the end, what makes SELF/LESS a self-aware, unselfish, ethical piece of effective entertainment is that it uses action as a device to propel the moral and ethical questions.
