He isn’t being vulnerable, he’s crying

I grew up in a culture defined by rampant sexism, racism, and homophobia. While I now realize many of the people around me were gay, they were invisible to me at the time. At least, their sexuality was invisible to me. As a teenager, I made an intellectual decision that everyone had a right to equal dignity and expression. Living in a seemingly homogeneous society, though, I didn’t have the opportunity to confront my own implicit biases until later.

I strongly defended the rights of gay people to live, work, love, and express their love publicly, but my reaction to actual gay lives was untested. I was probably a bit too comfortable with myself and my choice for equality, because the first time I saw two men kissing, I was horrified to find that I looked away with feelings of discomfort and perhaps even disgust. I was then filled with shame for the latent feelings I obviously had, but I did my best not to turn away.

Over time, I was lucky enough to find many gay friends and to experience their love and affection in ways that seemed perfectly natural because they were perfectly natural. I’m sure I still have many implicit biases, and I keep trying to overcome them all, but at least now I can usually deal with people kissing with no internal conflict. (As I age, I have become painfully aware that many young people feel the same disgust when they see older people kissing.)

Unfortunately, many people react to a man crying in the same way I initially reacted to men kissing men—they turn away in discomfort or even disgust. It is widely assumed that it is men who are disgusted by other men crying (and I’m sure some are), but famed vulnerability researcher Brené Brown found that it is more often women who can’t accept men’s vulnerability. Obviously, being vulnerable means much more than just crying, but crying, I think, is the single behavior that really sets people’s stomachs churning.

We find crying so shameful, in fact, that we often call it “being vulnerable” in order to avoid saying the word “crying.” I don’t mean this to be a criticism of researchers’ use of the word “vulnerability” while they discuss men’s emotional health. Rather, I mean to suggest that the rest of us have adopted the word “vulnerability” as a way of avoiding discussion of crying. Often we will only say that a man “was vulnerable,” because to say that he was “openly sobbing” would be to rob him of his dignity and bring shame to him. Paradoxically, by trying to protect him from judgment, we reinforce the judgment that all men face for being weak, sad, or emotional.

I should qualify that last statement. We don’t judge men so much for being emotional as we judge them for what particular emotions they express. Crying is acceptable for women and girls, but anger is reserved for boys and men. If a man loses his son or father, for example, he may seek revenge in various ways, and he is often honored for doing so, especially if the death was caused by malice or negligence.

Historically, revenge frequently took the form of actual violence, and vengeful violence has certainly not disappeared, but revenge can also take the form of lawsuits, public shaming campaigns, and other legal and socially acceptable forms. But the man who falls into a deep depression or cries uncontrollably for an extended period will face criticism. I once talked to a father who was told he needed to “get it together” at his own son’s funeral.

We pretend that men aren’t in touch with their feelings or that men are incapable of expressing their feelings. If these things are true, it is only because we have conditioned men to suppress their feelings through our own reactions of disgust. Boys are taught in their first months out of the womb that crying is unacceptable. The result is that men must either destroy themselves or destroy those around them in order to process their own feelings.

The price we pay is that the men around us are emotionally drained, stressed to the breaking point, and prone to anger and destruction over empathy and connection. Of course, this is an oversimplified and exaggerated picture. We all know well-balanced men who are nurturing and emotionally connected. Some men are lucky that their lives have not burdened them with too much grief and sadness. Others have, in spite of social programming, found people who accept them and their emotions. And, finally, some men have the fortitude to find effective means of self-care.

Still, we can and should work to remove the shame and stigma from male weakness, and that begins with removing disgust from the sight of male tears. How do we do it?

  1. Don’t turn away. If a man is crying in your presence, do not avert your gaze. Continue to look at him and let him know that you are with him, free from judgment.
  2. If you are a man, openly discuss your own tears with both women and men. When we remove our own shame, the disgust of others cannot affect us.
  3. Stop saying, “boys don’t cry” to anyone, especially a child. Boys hear this almost as soon as people start talking to them. Support the full emotional range of boys.
  4. Stop mocking male tears. Some feminists seem to feel that making fun of male emotions is an acceptable response to centuries of male tyranny, but mocking male tears is a sure way to help perpetuate misogyny and the oppression of women.
  5. Create safe spaces for men. Men need opportunities to talk to other men about crying and weakness. Men need to let one another know that crying is not weakness. You can take care of your family, be a protector, or be a warrior and still take time to cry.
  6. Recognize the varied experiences of men. Adult men are often victims of childhood abuse whether it be physical, emotional, or sexual. Men are victims of domestic violence and abuse. While physical violence is a reality for many men, emotional battery is even more common. The victimization of men is not a joke, so please stop laughing at it.

Many men will reject my suggestions as absurd and will suggest I should just “man up.” I ask those men to remember those words the next time (and it will happen) they are struggling to force back the knot forming in their throats as they build a dam against the tears threatening to break forth. Whether we choke the tears back successfully or not, the damage is done. We still feel the shame and disgust. We feel devalued and demoralized by our own natural emotions. We can be free and we can be whole. We just have to come out and be honest about what and who we are.


Diogenes Versus Plato: Who will set you free?

No one can question Plato’s writing and rhetorical abilities. He was a superstar of the ancient world, and the fact that his dialogs have endured for millennia attests to the quality of his writing. Of course, Bertrand Russell found it ludicrous to praise Plato’s ideas based on the quality of his writing, saying, “That Plato’s Republic should have been admired, on its political side, by decent people, is perhaps the most astonishing example of literary snobbery in all history.” Other famous thinkers of the ancient world weren’t as lucky as Plato; although their reputations survive somewhat through the words of others, we often have no copies of their original works or just a few remaining fragments. It may be that Plato was simply such a great writer that his works were preserved while the works of others were not, or perhaps other factors played a role in which works were saved and which were lost.

According to Diogenes Laertius, the biographer of the philosophers, the Cynic philosopher Diogenes of Sinope (no relation to the biographer) also wrote a number of books.* If he actually did, none survives today. The biography is here. The Cynic is infamous for masturbating in public, going naked, eating in the market, and carrying a lamp around in the middle of the day. As we don’t have the original works of Diogenes, we can’t be sure which of these stories might be true and which are apocryphal; they reflect how others saw him, not necessarily how he presented himself. The lack of surviving texts may be down to Diogenes himself, at least partly. When Hegesias asked to read some of his writing, he reportedly replied, “You are a simpleton, Hegesias; you do not choose painted figs, but real ones; and yet you pass over the true training and would apply yourself to written rules.”

So, it seems that Diogenes, like Socrates before him, valued face-to-face interaction over the more passive learning that comes from reading. It is worth noting that Diogenes was a student of Antisthenes, who was in turn a student of Socrates. Although Antisthenes was reluctant to accept Diogenes as a student, Diogenes considered Antisthenes, not Plato, to be the true successor to Socrates.

According to Bertrand Russell’s History of Western Philosophy, Antisthenes enjoyed a comfortable and aristocratic life until the death of Socrates. After that, “He would have nothing but simple goodness. He associated with working men, and dressed as one of them. He took to open-air preaching, in a style that the uneducated could understand. All refined philosophy he held to be worthless; what could be known, could be known by the plain man.” Also, “There was to be no government, no private property, no marriage, no established religion.” Diogenes, it would seem, followed the lessons of his teacher to their logical extremes, which led Plato to describe Diogenes as “Socrates gone mad.”

When studying the history of philosophy, we generally follow the lineage from Socrates to Plato to Aristotle. We could just as easily follow it from Socrates to Antisthenes to Diogenes. With the former approach, we find justification for authoritarian rule over the ignorant unwashed masses constantly threatening the fabric of society. With the latter approach, we find a rejection not only of authority but of all the values that drive the totality of social regulation and empty social status.

It should be no surprise, then, which works were preserved. We know Socrates primarily through the works of Plato, which painted Socrates as a victim of ignorant Athenian leaders who rose to positions of power through a democratic process and not on their own merit. Threatened by the wisdom of Socrates, the thoughtless and insecure leaders sentenced Socrates to death. In response, Plato promised order could be secured under the direction of educated and dispassionate leaders who would tame the rabble, leading from their own realm outside the cave of illusion and delusion. The Cynics, on the other hand, would cause disruption, encouraging the working people to believe that they could take control over their own lives even without the aid of book learning and academic discipline. The Cynics valued reason, but not the well-heeled reason of aristocrats such as Plato and Aristotle.

Further, the Cynics encouraged citizens to question the value of everything that is supposed to motivate the working class. For Plato, workers driven by their appetitive elements would produce more goods in order to receive rewards to satisfy their hungers and desires. Diogenes rejected the value of expensive clothing, food, shelter, or anything else, and often lived off what he could get through begging. Since he had almost no possessions and no desire for more, how could anyone take control over him or threaten him with anything? When Perdiccas threatened Diogenes with death if he didn’t appear before him, Diogenes reportedly replied, “That is nothing strange, for a scorpion or a tarantula could do as much: you had better threaten me that, if I kept away, you should be very happy.” As Todd Snider said in his song, “Looking for a Job,” “Watch what you say to someone with nothing. It’s almost like having it all.”

Imagine if the working class (note: if you work for money, you are working class) now began to question the value of cars, wide-screen TVs, sports, clothing, and “good” neighborhoods. And if the poor of the world adopted Diogenes’s views on citizenship, who would fight our wars? Diogenes gets credit for coining the word “cosmopolitan,” which is usually taken to mean citizen of the world. People who travel the world, speak more than one language, eat varied cuisine, and are not, to put it simply, provincial, consider themselves cosmopolitan, but this is not what Diogenes meant. Diogenes considered himself a citizen of the universe with no political allegiance and without political rights. He was banished from his home for defacing currency or something, and he was what would now be described as a “man without a country.” Imagine everyone being that way (John Lennon thought it should be easy, if you try).

Examined rationally, as the Cynics would have us do, virtually nothing we hold dear has any intrinsic value. We spend our lives working for trifles while ignoring anything that makes us genuinely happy. When Diogenes was told it is a bad thing to live, he said, “Not to live, but to live badly.” We can live well, but we may be thought mad.

* Diogenes Laertius says, “The following books are attributed to [Diogenes of Sinope]. The dialogues entitled the Cephalion; the Icthyas; the Jackdaw; the Leopard; the People of the Athenians; the Republic; one called Moral Art; one on Wealth; one on Love; the Theodorus; the Hypsias; the Aristarchus; one on Death; a volume of Letters; seven Tragedies, the Helen, the Thyestes, the Hercules, the Achilles, the Medea, the Chrysippus, and the Oedipus.”

Illness as Financial Ruin (US only)

Every human who has drawn a breath has faced illness, injury, and death. The universal experience of illness creates vulnerability, loss of identity, anxiety, diminished autonomy, and fear. The inescapable battle between health and illness defines human experience and shapes our personalities, our worldviews, and our spiritual depth.

For most of the developed world, though, it does not mean financial ruin. In the United States, alone among developed nations, even a relatively minor injury such as a broken bone or an illness requiring a brief hospital stay can lead to economic disaster. As a result, when we in the US get sick, we don’t think about how we can recover, how we can endure the pain, or the spiritual significance of our pain; rather, we think about how we will pay our bills.

As we face our anxiety over possible diagnoses, we must constantly be prepared to battle with insurance companies, aggressive hospital billing agents, and doctors exhausted from dealing with insurance paperwork. Few things in life create as much anxiety as financial insecurity, and illness always brings the threat of insecurity to US residents. When people have serious accidents, they balk at calling an ambulance because they fear the bills—they worry over whether the ride will be covered and whether the ambulance will take them to a hospital that is in-network. As a result, many people suffering medical emergencies drive themselves to the hospital.

When it isn’t an emergency, Americans often forgo treatment altogether. A Gallup poll in 2014 found that one-third of Americans skip needed medical treatment because of cost concerns, even when they have insurance.  According to the report, “Some 34% of Americans with private health insurance say they’ve skipped out on care because it was too expensive, up from 25% last year. Additionally, 28% of households that earn $75,000 or more report that family members have delayed care, up from just 17% last year.” The Affordable Care Act succeeded in insuring more people, but it also created greater financial burdens for middle-income families through higher deductibles and co-pays. Many people who have been accustomed to being able to afford healthcare now find that it is out of reach.

While healthcare inflation has slowed a bit in recent years, catastrophic medical events still put the costs incurred out of the reach of most of us. The United States is alone in finding medical fundraisers normal and routine. According to an article in Journal News, the number of GoFundMe contributions for medical expenses “was up more than 293 percent in 2014, when more than 600,000 medical campaigns were launched, compared to just over 158,000 in 2013.” Families with or without insurance cannot afford their medical bills. A serious accident or an illness such as cancer creates an existential crisis while forcing people suffering from illness and their families to scramble to avoid destitution.

I don’t write this impersonally: my wife and I buy our insurance through the healthcare exchanges. We pay $682 per month ($8,184 per year) with a $4,000 deductible per person. The out-of-pocket limit on expenses is $13,700 per year. Balance-billed charges do not count toward the out-of-pocket limit, so there really is no upper limit to possible charges. Even ignoring balance billing, my costs could easily exceed $20,000 per year.
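For anyone who wants to check that arithmetic, here is a minimal sketch in Python using the figures above. The premium and out-of-pocket numbers are from my own plan, and the calculation deliberately ignores balance billing, which has no cap:

```python
# Worst-case annual cost under my exchange plan, ignoring
# balance-billed charges (which have no upper limit).

monthly_premium = 682         # dollars per month
out_of_pocket_limit = 13_700  # annual in-network out-of-pocket cap

annual_premiums = monthly_premium * 12              # 8,184 per year
worst_case = annual_premiums + out_of_pocket_limit

print(f"Annual premiums: ${annual_premiums:,}")               # $8,184
print(f"Worst case before balance billing: ${worst_case:,}")  # $21,884
```

A bad enough year, in other words, costs more than $21,000 before balance billing even enters the picture, which is why a comparable tax would leave me no worse off.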

I often hear the argument that universal healthcare coverage is too expensive and will require raising taxes on the middle class. As I see it, I would still benefit from a tax rise of $15,000 or even $20,000 each year. It is true that others are not in my position, but all Americans should realize they are at risk. No one stays young and healthy. Eventually, everyone will be at greater risk for catastrophic illness, but even those who are currently young and healthy can face illness and injury, though we may not like to think about it. Further, everyone’s income is subject to great variability. Those who have employer-provided health insurance may not want to pay into a national system, but employer-provided insurance is never guaranteed. Employers may cut benefits, employees may lose their jobs through layoffs or termination, and illness can end an employee’s ability to work.

The same is true for business owners. The tides of fortune shift. When the Affordable Care Act was passed, Mary Brown brought a lawsuit against it, saying she did not want to be compelled to purchase health insurance. Brown owned an auto repair shop that went under due to the pressure of the economic recession and the 2010 Gulf oil spill. Of note, her bankruptcy filing listed “among the couple’s unsecured creditors several providers of medical care – a hospital and a physician group in Florida; an anesthesiology group based in Mississippi; and an eye care center in Alabama.” https://newrepublic.com/article/98145/affordable-care-act-mandate-lawsuit-nfib-mary-brown-bankruptcy-court-standing

Like many people, when she was doing well, Mary Brown thought that guaranteed universal access to healthcare was something the government was providing to other people. It didn’t occur to her that she might ever be in a position where she could not pay for her own medical care, but that is exactly what happened. I recently had the opportunity to speak to a Swedish citizen about Sweden’s healthcare system. He was a middle-aged man who explained that healthcare was paid through higher taxes. He said he didn’t mind the taxes, though, because you never know when you will be the one needing care.

It seems many Americans are not able to make this basic calculation of risk. Most people, even those who consider themselves well off, are not immune from the financial ruin that illness and injury can bring. Once people realize their own vulnerability, they support universal coverage for healthcare. The time for a more sober and accurate assessment of risk is well past due. We must wake up to the fact that the US healthcare system is not sustainable, that it leaves us at risk of financial failure, that it makes the experience of illness exponentially more stressful, and that we can do better.

It will not be easy. The US spends far more than other developed nations on healthcare. Each excess dollar we spend is profit for an insurance company, hospital, testing facility, pharmaceutical company, biotechnology company, or other player in the healthcare industry. Many people profit from the dangerous, expensive, and inefficient system we have in the United States. Every reduction in healthcare spending will be a reduction in profit for someone, and each person (or business) facing a loss of income will argue vehemently and vociferously that such a loss of income is a horrible tragedy and an impossible feat.

We will be told that reducing healthcare spending will reduce the quality of care. We will be told it will reduce our choices and control. We will be told it is impossible. We already have little choice or control, and we already have higher mortality rates than the rest of the industrialized world, so we have nothing to lose and everything to gain. We have plenty of ideas on how to improve the system. What we lack is political will, but I think the will is growing. If we want universal coverage, we must demand it, and the time to demand it is now.


Stop infantilizing old people, please

As I write this, I am 55 years old. Like most people my age, I like to think I am a “young 55” or that I look good “for my age.” As I get older, I think I have become a little more patient, more accepting, less doctrinaire, and, yes, sadder and wiser. However, I have not become more adorable, precious, charming, or sweet.

Although I am not yet extremely old, I’ve already noticed that younger people I hardly know sometimes refer to me as “sweetheart” or “sweetie.” This seems to be a particular problem in healthcare settings. Some call it “elderspeak,” which is characterized by treating older people more as children than as fully functioning adults (I personally feel this demeaning language is often inappropriate for children as well, but I will take one thing at a time). For some reason, when people talk to older patients, they tend to slow their speech, raise the volume, and sing their sentences. In addition, every statement seems to become a question, and second-person pronouns are replaced with first-person plural pronouns (e.g., “you” becomes “we”). You can read more about this phenomenon here. At a time when nursing home workers are sharing explicit photos and videos of older adults on social media, complaining about “sweetheart” seems almost quaint, but both the diminutive terms and the more extreme demeaning media rob patients of their dignity and personhood.

Other people seem to think they are honoring older adults by treating them as mascots. Many videos on social media feature adults who are “adorable” or “precious” dancing, singing, or doing other activities they have no doubt done for their entire lives. The videos are presented with exactly the same attitude as videos of kittens, puppies, and babies. Samuel Johnson once said, “A woman’s preaching is like a dog’s walking on his hind legs. It is not done well; but you are surprised to find it done at all.” Videos of the elderly take the same attitude: it is amazing that older people might still do the things they love. If they make the attempt to engage in the activities that make them happy, they are “so cute.”

The assumption that adults become children once again in later life can have serious consequences. For instance, healthcare providers often ignore the sexual health of older patients. As this article states, “prevailing misconceptions among healthcare providers regarding a lack of sexual activity in older adults contribute to making elders an extremely vulnerable population.” The result of this ignorance is that STD rates among the elderly are increasing at an alarming rate. Although about 80 percent of adults aged 50 to 90 years old are sexually active, they are infrequently screened for STDs.

I am more concerned, though, about the basic harm of a society that treats its elders as mascots for amusement. As we age we lose the respect of our fellow beings and we lose our status as persons. For the most part, younger people don’t mean any harm, even if they are doing harm; they are acting out of ignorance. That being the case, I am here to help. The following are things you should know about your elders:

  1. They have and talk about sex. In a movie, it is always easy to get a good laugh by having an old person, especially an old woman, make any kind of statement that indicates she knows what sex is. Apparently, many young people believe that when you hit a certain age you become an innocent and naïve virgin, completely unaware of how people reproduce.
  2. They curse. This is related to the first point, but it is slightly different. If you curse now, you will probably curse in 10 or 30 years. At what point do you think it should become funny or cute? Old people have the same right to words that everyone else has. Language is a human right.
  3. They still know how to do things. It isn’t amazing that someone who has danced since he was seven still likes to cut the rug when he is 80. Our abilities may diminish over time (some do and some don’t), but we don’t suddenly forget everything we’ve learned over a lifetime.
  4. They are still rational and intelligent. I realize we all suffer some cognitive decline as we age and some are affected by diseases that accelerate or accentuate that decline, but young people also suffer brain injury, disease, and other limitations on cognitive ability. Age is not a sufficient reason to believe someone is stupid.
  5. They’ve won the battles you are fighting. Somehow, your elders have survived. If you can manage the same, you should be honored, as you should honor them now. Any old person can tell you it isn’t easy growing old. Someone who has survived had the wits and strength to overcome many adversities. They could teach you a thing or two.
  6. They are persons. Here, I am using the word “persons” in a philosophical sense of someone who bears human dignity and value. It does not diminish as you age. If anyone has value, you do.

In case you haven’t seen any of the videos I described above, here is an example:
[youtube https://www.youtube.com/watch?v=R7Br3-5L6hM]

Sparkle, Autonomy, and the Right to Die

Recently a woman in the UK known only as C won the right to effectively end her life by refusing dialysis treatment. Owen Bowcott, writing for The Guardian, described it as a “highly unusual judgment,” but, in making the decision, the judge said, “This position reflects the value that society places on personal autonomy in matters of medical treatment and the very long established right of the patient to choose to accept or refuse medical treatment from his or her doctor.”

The judge is correct; the right to refuse treatment is one of the bedrock principles of medical ethics. In most medical decisions, autonomy trumps all other considerations, including efficacy of possible treatment. In other words, you are not obligated to accept treatment simply because it will prolong your life. This is the way things work in the world of medicine, but there could be other approaches.

Given the facts of this case, it seems a suicidal person sort of “lucks out” when an unrelated medical issue arises. Unlike C, not everyone seeking death is able to find a legal way out. Those who are so physically incapacitated that they cannot possibly end their lives without help often find too many roadblocks to death to ever carry it out. Even when healthy people try to commit suicide, the rest of us are obligated to prevent it when possible. If we find someone who has taken a drug overdose, for example, we try to save him or her. If someone is trying to jump off a bridge, we try to prevent it. And if someone asks for drugs to commit suicide, only a few places in the world allow them to be prescribed.

It is clear that we do not always respect the autonomy of suicidal individuals. Even in the case of C, the judge said, “My decision that C has capacity to decide whether or not to accept dialysis does not, and should not prevent her treating doctors from continuing to seek to engage with C in an effort to persuade her of the benefits of receiving life-saving treatment in accordance with their duty to C as their patient.” The judge seems to feel that the doctors ought to continue trying to save C, even while recognizing that she has the right to refuse treatment.

Clearly, the law in this case is built around autonomy, but perhaps it shouldn’t be. Autonomy assumes a rational and unimpaired person making a fully informed decision. The judge notes that C is fully functional and has no cognitive impairments. At the same time, though, C is facing a diagnosis of breast cancer and a severely damaged self-image. It isn’t clear that she wouldn’t modify her view with a little time and, perhaps, psychotherapy.

If her mental health is impaired, she may not be fully autonomous in the first place. If she isn’t, then perhaps she needs care more than freedom. An Ethics of Care might guide us to respect her wishes as well as her needs. A little more time may be needed to assess whether her decision, which is not reversible, is truly the decision she wants to make. With a little time and support, she may come to believe that sparkle is still possible for her.

I also think a focus on capabilities might be relevant. An ethics focused on capabilities would try to enable her to have a fulfilling life by maximizing the abilities she still has. Care and capabilities both emerged as feminist approaches to ethics and justice. On the surface, this may not seem to be a feminist issue, but the judge also said, “It is clear that during her life C has placed a significant premium on youth and beauty and on living a life that, in C’s words, ‘sparkles’.”

It is clear that C has operated under rather sexist values for most of her life. That is her choice, to be sure, but it might be possible to find new values. Many who have experienced crippling injuries have sought suicide only to later find their lives are valuable and meaningful even without the activities and relationships they once held dear.

Book Review: The Experiment Must Continue by Melissa Graboyes

We all have a complicated relationship with medical research. We know that every effective treatment or therapy that exists was once an experimental treatment or therapy. We know that some drugs have been so effective that they eradicated various diseases completely, and we also know that someone had to be the first one to try all those new drugs. On the other hand, most new drugs don’t work out. Some are simply not effective, some are effective but have serious side effects that make them all but useless, and others turn out to be deadly.

Medical research is plagued with problems related to consent, coercion, therapeutic misconception, benefit, and access. All these problems exist in North America and Europe with both well-educated, affluent populations and with so-called “vulnerable” populations.

Informed consent is an example. Virtually everyone agrees that patients who participate in medical research should know about and agree to their own participation. Ethics committees, lawyers, and bioethicists have gone to great pains to develop proper informed consent procedures. Sadly, too many people talk to their doctors about treatment options, hear about ongoing research, and sign consent forms without realizing they have agreed to participate in a medical experiment. Despite the best intentions of everyone involved, patients believe they are receiving treatment that is expected to help them (therapeutic misconception).

I sometimes use the HBO film adaptation of Margaret Edson’s play, W;t, in my classes. The main character in the play agrees to experimental treatment, is informed of the side effects and goals of the research, and then goes on to suffer tremendously for her decision. When I have my students write about the movie, more than half of them still believe the doctors were trying to cure the main character’s cancer. Despite all the frank discussions of the research, they don’t understand that the protagonist was never expected to benefit from the treatments she was receiving. Furthermore, the character herself never seemed to fully realize that her participation was never expected to benefit her in any way.

If these kinds of misunderstandings happen between researchers and research participants from the same culture speaking the same language, the problems are sure to be compounded by cross-cultural communication. In her book, The Experiment Must Continue: Medical Research and Ethics in East Africa, 1940–2014, Melissa Graboyes explores ethical challenges and lapses in numerous studies conducted in East Africa. Her book is a refreshing attempt to shed “conventional wisdom” about research in Africa.

For example, I think anyone who has studied research ethics has heard that African chiefs would sometimes provide consent for all the people in a village to participate in research projects. Graboyes says she could find no evidence that anyone in any of the locations under study ever recognized the right of anyone to give collective consent for a group of people. Further, many describe African research participants as “vulnerable” populations with little to no agency. In the sense that many people lack adequate medical care, they are vulnerable, but Graboyes challenges the notion that they lack agency and gives several examples of Africans responding actively and rationally to both exploitative research and beneficial research. In short, she shows that they are actually persons with wills, minds, autonomy, and awareness.

Another common theme for those studying research ethics is the use of coercion to get people to enroll in trials. Many wring their hands worrying over whether offering payment or gifts might unduly coerce potential participants whose desperate poverty might drive them to enroll. Those who did enroll, however, were more concerned about inadequate compensation than undue coercion. Participants realized that others would benefit from research carried out on their bodies or in their homes. In exchange for participating, they felt some reasonable benefit was due, whether it be in the form of cash, medicine, or health services.

One possible benefit, of course, is access to medicines; researchers commonly advertise that participants will receive a new treatment at no charge. Many African participants assumed they were trading their blood for research and in turn would receive medicines that would benefit them. In some cases, participants did receive helpful medications, but those medicines were then withheld from them at the end of the research, even when the treatment proved effective. Researchers say it isn’t their responsibility to provide the medications, which may or may not be expensive, but leaving people with the knowledge that an effective treatment exists without making it available seems to me a particularly cruel kind of harm.

In the United States, people also expect access to new medications. When people find they have a terminal illness, they will often (I want to say usually) demand to receive experimental medicines. In the 1980s, AIDS activists in the US demanded that experimental treatments be distributed to HIV-positive individuals, and demands for quick approval for experimental drugs have become routine. In this sense, medical research may be a victim of its own success. Most people in either America or Africa fail to appreciate the risk they take with unproven medicines.

Many researchers view Africa as a fertile field for research (many describe Africans as “walking pathological museums”) because of the abundance of diseases present and the relatively low costs involved compared to research conducted in Europe and North America. Graboyes describes both successes and failures in East Africa, but the failures can be depressing. In some cases the research never got off the ground, in some it never produced usable results, and in some it made conditions much worse.

Is it unethical to conduct research in Africa? Graboyes doesn’t think it is necessarily unethical to conduct research in East Africa, but she does feel some of the research has been unethical, some simply misguided, and some poorly designed. Many Africans do not trust researchers, which is frustrating to researchers who feel they are on a noble quest to end disease, but many of them fail to realize how many researchers have told outright and deliberate lies in East Africa. People do not forget so easily.

I don’t want to give away too many details of the book, as it can become something of a page-turner. One last thing I will mention, though, is the fact that Graboyes was aware that she was another researcher visiting East Africa asking for cooperation. Although she wasn’t taking blood, spraying insecticides, or injecting treatments, she still needed to ensure that she was proceeding ethically and had the trust of the people she was interviewing. Her efforts are admirable but remind us that any reporting of facts is a matter of interpretation and may be subject to modification.

This book is admirable and compelling, especially for those interested in the ethics of international research. In addition, her insights might help to develop better ethical practices for domestic research, as many of the issues are the same.

Reid Ewing and the Failure of Autonomy in Bioethics

Reid Ewing of Modern Family fame recently wrote publicly about his struggle with body dysmorphia in a personal essay on the Huffington Post. Ewing revealed that his dysmorphia led him to seek and receive several surgeries. He feels his surgeons should have recognized his mental illness and refused to perform surgery. He wrote, “Of the four doctors who worked on me, not one had mental health screenings in place for their patients, except for asking if I had a history of depression.”

The principle of autonomy is by far the most discussed principle of bioethics. Discussions typically focus on the rights of patients to refuse treatments, not to seek them. On either side, the issues can be thorny. If a depressed and suicidal patient refuses life-prolonging treatment, is it ethical to respect the patient’s autonomy or should mental health services be provided first? As in Ewing’s case, the ethical problem arises from the claim that the decision is driven by mental illness and not reason. If someone is mentally ill, they are not fully autonomous agents as they are not fully rational.

This is a problem with autonomy in general. Our ideas of autonomy come largely from Immanuel Kant, who claimed that all rational beings, operating under full autonomy, would choose the same universal moral laws. If someone thinks it is okay to kill or lie, the person is either not rational or lacks a good will. How do we determine whether someone is rational? Usually, most of us assume people who agree with our decisions are rational and those who do not are not rational. If they are not rational, they are not autonomous, so it is ethical to intervene to care for and protect them.

Earlier this year, a woman named Jewel Shuping claimed a psychologist helped her blind herself. She says she has always suffered from Body Integrity Identity Disorder (although able-bodied, she identified as a person with a disability). Most doctors, understandably, refuse to help people damage their healthy bodies to become disabled, which can lead clients to desperate measures to destroy limbs or other body parts, sometimes possibly endangering others.

Jewel Shuping never named the psychologist who may have helped her, so it is impossible to check the story. It is possible to imagine, however, that some doctors would help someone with BIID in the hopes of preventing further damage to themselves or others. Shuping says she feels she should be living as a blind person, and she appreciates the help she received to become blind. In contrast, Ewing feels he should have undergone a mental health screening before he was able to obtain his surgery and that his wishes should not have been respected.

Plastic surgeons are often vilified as greedy and unscrupulous doctors who will destroy clients’ self-esteem only to profit from their self-loathing. On the other hand, these same plastic surgeons are hailed as heroes when they are able to restore beauty to someone who has been disfigured in an accident or by disease. Unfortunately, we do not have bright lines to separate needless surgery that enhances someone’s self-image from restorative surgery that spares someone a life of social isolation and shame. Some would argue the decision should not be up to the doctors in the first place but should be left in the autonomous hands of clients.

Many have similarly argued that doctors should refuse gender confirmation surgery to transgender men and women. As with BIID, many assume that transgender individuals are mentally ill and should see a mental health professional, not a surgeon. Transgender activists (and I) argue that transgender individuals need empowerment to live as the gender that best fits what they actually are. If surgery helps them along that path, they should have access.

All this leaves us with the question of when to respect autonomy and when to take the role of caregiver, which may involve a degree of paternalism (or maternalism, for that matter). Is it more important for doctors to ensure patients’ rights to seek whatever treatment they see fit, or is it more important to provide a caring and guiding hand to resolve underlying mental health issues before offering any treatment at all?

One of Ewing’s complaints is that he was offered plastic surgery on demand with no screening at all. The process for people seeking gender confirmation surgery, by contrast, is arduous. Before surgery, transgender people go through counseling and live as their true gender for an extended period of time. At the far end of the spectrum, people with BIID rarely find doctors willing to help them destroy parts of their bodies and resort to self-harm. These three cases are not the same, but they make similar demands on the distinction between respect for autonomy and a commitment to compassionate care.

It seems reasonable to accept Ewing’s claim that mental health screenings should be a part of body modification surgery, especially when someone has no obvious flaws that need to be repaired. In all these cases (dysmorphia, gender identity, and BIID), mental health support is necessary. In each case, patients describe depression, emotional turmoil, and, too often, suicidal thoughts or attempts. Mental health care does not require a violation of autonomy, but it may help a person’s autonomous decisions form more clearly from deliberation and not desperation.


Tom Digby on Militarism, Sexuality, and Romance

In a post on how men can be better feminist allies, Emma Cueto advises men to avoid the temptation to put men’s issues first. She sums up the problem of “toxic masculinity” by noting that it “is not fun for anyone and often limits men’s choices in terms of interests or self-expression, and it means that many men are never really given the tools to properly deal with their own emotions.” She goes on to say that men are not sexually assaulted at the same rate as women, are not victims of domestic violence as often as women, are not victims of pay disparities or sexual discrimination as often as women, and aren’t confronted by laws designed to control their bodies. She is right on all counts, but Tom Digby’s book, Love and War: How Militarism Shapes Sexuality and Romance, helps show why it is impossible to separate culturally programmed masculinity from sexual assault, reproductive regulation, domestic violence, and job discrimination and why feminists must deal with how sexism affects both men and women simultaneously.

His thesis is that militaristic societies establish values and goals that require men to cut off their feelings of care for others and for themselves, see women’s freedom as a threat, and rely on violence to solve their problems. In order to achieve military objectives, such societies subject both boys and girls to intense cultural programming from birth to encourage strength in boys and passivity in girls. With this thesis, he flips the script from what many assume: that men are violent and cut off from their feelings by biological programming. Early in the book, he offers two pieces of evidence that this assumption is faulty. First, men and women in some societies do not show the differences that are so prevalent in militaristic societies. Second, he shows that men often fight against their own biology to retain the appearance of stoicism. Indeed, almost all men have been cruelly taunted for their failure to maintain their composure (choking back tears) even before reaching adolescence. If biology prevented boys from crying, no one would have to keep telling boys not to cry. The conditioning is relentless and severe.

War-dependent societies must maintain ample supplies of expendable men as well as childbearing women who will provide future generations of warriors. This requires shutting down empathy in men, glorifying risk and violence, and valuing women according to sexual availability and passivity. To the extent that maintaining near-constant war was the goal, this model worked for centuries, but things have changed. I wish I could say we are no longer reliant on war, but that is sadly not what is driving the change. Digby points out that while war is still with us, the need for individual warriors who engage in one-on-one combat, relying on brute strength, has greatly diminished. Combat is now highly mechanized, and what physical differences may exist between men and women often offer no benefit to either side or may even give an advantage to women (he notes the case of jet fighters).

As a result, most men do not experience direct combat, or any kind of combat, in their lives. Our warriors must find other outlets for their masculinity. They may do it through aggressive sports, war games such as paintball, or even through violent video games. Digby points out that while women may be attracted to warriors, the guy who dominates video games doesn’t get quite the accolades of war combatants.

Another change is the material relationship between men and women. In the past, women were materially dependent on men and would comply with men’s wishes in order to avoid poverty. As women have entered the workforce, many are now the primary wage earners for their families. As women earn college degrees and professional credentials at higher rates than men, it is inevitable that men will become increasingly dependent on women for material support. These social changes leave our masculine warrior with an identity crisis. One option is for him to change his identity, which requires becoming more dependent and empathetic. This would be to become more “feminine” (a horror to the warrior). Or, the second option is for him to become more strident and militant, which may account for increased attacks against feminism and women these days.

When we observe the vitriol in attacks against feminist women online, graphic violence against women in video games and movies, and actual physical brutality and murder of women, it is easy to see the desperation of the warriors who refuse to go down without a fight. The fact that their opponents wish them no real harm seems to be of no consolation. It took me a while to read this book because I assumed I would agree with it, and I did. I already knew that men were programmed to cut off their empathy, to expect women to be passive, to have the greatest disdain for “feminine” men, and so on. This book does bring a new analysis to these facts, though. It gives a new understanding of how things have gotten where they are and how they may be different.

I have only one minor quibble with one claim in the book. In chapter two, Digby quotes Sandra Bartky to explain the transactional nature of heterosexual relationships. He quotes Bartky as saying, “He shows his love for her by bringing home the bacon, she by securing for him a certain quality of nurturance and concern.” The claim is that men are emotionally unavailable or unsuited for empathy and emotional nurturance. On the other hand, women are expected to provide comfort and emotional support for men. I do think it is true that men are more likely to seek emotional support from women than from men, but I do not think this transaction is so readily accepted in heterosexual relationships.

I’ve spent quite a bit of time talking to both men and women in grief. Many men are so conditioned to “be strong” that they will never ask for support from the women in their lives for fear of appearing weak. Also, many feel they must suppress their emotional needs for the good of the family. Because they succeed in appearing strong, the women around them believe they are strong and do not need emotional support. As a result, men too often face grief and depression in complete isolation. When they finally crumble under the pressure, many will say, “I had no idea things were so bad.” This may help explain why men commit suicide at higher rates than women. Sadly, I’ve heard too many women say that they, also, do not feel supported by other women. Increasingly, at least in the United States, I feel grief is becoming a solitary activity for both men and women.

I hope we can all begin to support one another by offering each other protection, emotional support, material support, and just human kindness.

Suffragette, Slavery, and the Appropriation of Suffering

Controversy erupted recently over a photo shoot in which the stars of the movie Suffragette wore t-shirts that said, “I’d rather be a rebel than a slave.” A group of white women wearing a shirt with a message comparing themselves to slaves was a problem to begin with, and for people familiar with the fact that southern defenders of slavery in the US are known as Rebels, it was even worse.

Defenders of the movie, the photo shoot, and the quote said the outrage was based on a misunderstanding of the quote, which comes from a speech by the British suffragette Emmeline Pankhurst rallying women to free themselves from the oppression of patriarchy. In the United States, abolitionists and suffragettes were sometimes, though not nearly always, the same people. The comparison of slavery to women’s oppression was noted by many, including the former slave Frederick Douglass, who wrote, “In respect to political rights, we hold woman to be justly entitled to all we claim for man. We go farther, and express our conviction that all political rights which it is expedient for man to exercise, it is equally so for women.”

In the UK, people are less sensitive to comments about slavery and rebels. Some have suggested that the UK did not have slaves and that the quote is therefore not offensive. Time Out London, which published the photos, said in a statement: “Time Out published the original feature online and in print in the UK a week ago. The context of the photoshoot and the feature were absolutely clear to readers who read the piece. It has been read by at least half a million people in the UK and we have received no complaints.”

The UK does have a history with slavery, though. Unlike the US, Britain did not have a large workforce of slaves, but that doesn’t mean the UK had no involvement in slavery. Slavery was abolished in the UK in 1833 by the Slavery Abolition Act, which ended slavery throughout the British Empire with the exception of territories under control of the East India Company, Ceylon, and the island of Saint Helena. The exceptions were eliminated in 1843. In the US, President Lincoln issued the Emancipation Proclamation in 1863.

Having been neither a woman nor a slave, I hesitate to comment on the controversy over the use of the Emmeline Pankhurst quote, but it turns out that philosopher Elizabeth Spelman made an insightful and relevant commentary on the issue in her 1997 book, Fruits of Sorrow: Framing Our Attention to Suffering. In the first place, she points out that phrases such as “women and minorities” exclude and ignore the existence of minority women. Comparisons to slavery are a case in point. She says, “Consider the talk about women being treated like slaves. Whenever we talk that way we are not only making clear that the ‘women’ we’re referring to aren’t themselves slaves; we’re making it impossible to talk about how the women who weren’t slaves treated those who were.” When a white woman suffragette declared her preference for rebellion over slavery, was she honoring the suffering of slave women or, indeed, setting herself apart from them?

Drawing on the work of Jean Fagan Yellin, Spelman continues, “The female slave is made to disappear from view. Although presumably it was the female slave’s experience that originally was the focus of concern, the other women’s experiences were made the focus.” Somehow, white women made use of the suffering of slaves without experiencing the actual realities of slavery. Even if the oppression of white women was intolerable, it was not an experience shared with actual slave women.

When this relationship between white suffragettes and slaves is exposed and analyzed, of course white women will want to deny their privilege and insist that they were only honoring their sisters. They can say this with great honesty, because they are not aware of their privileged status. Further, Spelman says, “The deeper privilege goes, the less self-conscious people are of the extent to which their being who they are, in their own eyes as well as the eyes of others, is dependent upon the exploitation or degradation or disadvantage of others.”

When privilege is pointed out, it makes us uncomfortable. As a result, our reaction is motivated by shame. Self-awareness is necessary to effect change, but it is also painful. Spelman says, “Seeing oneself as deeply disfigured by privilege, and desiring to do something about it, may be impossible without feeling shame.” The shame provokes a defensive reaction, but it can also help to facilitate healing and solidarity–in some cases, anyway.

With the Emmeline Pankhurst quote used by the magazine, we can see the defensive reaction. Many people defended the quote as being taken out of context, as being somehow separate from slavery because it was British, or being a victim of PC culture gone mad. In the end, though, the outrage at the use of the quote helped spark a conversation about the suffragette movement, Britain’s role in slavery, and sensitivity to women whose experiences lie outside the realm of so-called “white feminism.”

Ethics of Grief: Profiting from the Pain of Others

Imagine you and a friend go to see a documentary (or even fictional film) about the plight of victims of famine, war, disease, or oppression, and you bawl uncontrollably throughout the film as your friend sits next to you unmoved and indifferent to everything happening on the screen. You think anyone who isn’t moved by the extreme suffering you’ve just seen must be some kind of monster (or a sociopath at the least). You feel, in short, that crying is more moral than just sitting there.

You will admit, of course, that your crying through the movie didn’t help the victims any and your friend’s indifference didn’t really hurt anyone. Still, it seems that a moral person should have feelings for those who are suffering, even if these strong feelings bring no real benefit to strangers who will never know of your tears, heartfelt as they are.

In fact, your friend might point out that you are getting all worked up for no reason, and it might be better to keep your emotions in check. Your wailing for these strangers won’t change anything for them, but it might impair your ability to attend to problems you can change. What good are you to your children, for example, if your mind is on the poor souls in some far corner of the world? You should get your head together, friend, and get on with the business of life.

But, you counter, if you learn to be indifferent and unmoved by the pain of strangers, you may become indifferent to the pain of others, including friends and, yes, your own children. You don’t want to become the kind of monster you now suspect your friend of being. You want to be the kind of person who is moved by the suffering of others. You may not be able to help in every situation, but you do not want to become callous and cold. You want to be a caring individual. It isn’t about what you can do but about what you are.

And now your friend points out that not only did you cry during the movie, but you seemed, in some sense, to enjoy it. In fact, you apparently went to the movie with the prior intention of being moved to tears. You chose the movie because it was described as “moving” and “emotionally riveting.” Will you be happy when your children fall ill because it will satisfy your need to “let it all out”? Perhaps you are the monster, after all?

You didn’t enjoy the pain, you object, but you enjoyed the high quality of the film and its ability to elicit the pain. It was beautiful in its ability to enlarge compassion and trigger a caring response. If nothing else, the film will help audiences develop a greater sense of concern for others, even if it doesn’t affect everyone (with a sly and disapproving nod to your friend).

And your friend now points out that people had to suffer in order to expand compassion and develop a greater caring response, so the suffering of others is used as a means to your own ends. You are actually acting selfishly after all, and the filmmakers are also exploiting the suffering of these people in order to teach a moral lesson and even to make a profit and perhaps sit in the spotlight after receiving coveted awards. You can just imagine the director’s teary expressions of gratitude and exhortations for more acts of compassion at the ceremony.

In 2012, comedian Anthony Griffith told the story of his daughter’s cancer in a moving performance for The Moth. The video quickly went viral. You can see the video here:

The video on YouTube now has more than 1.8 million views. It is almost impossible to watch the video without sobbing, and people shared it with warnings that anyone watching should have some tissues on hand. For reasons that aren’t entirely clear, we enjoy experiencing his grief with him. It might be objected that we are emotional voyeurs watching a sort of grief porn. By watching, we are not helping his daughter, we are not preventing future cancer deaths, we are not improving medical care, and it isn’t clear how we might be improving ourselves.

Paradoxically, we simultaneously want to avoid our own pain but glom onto the pain of others. Watching the story enables us to experience the pain without having to actually experience the loss of a child. Doing this while watching a fictional account of loss seems justifiable in many ways, but to seek out a chance to cry and experience this kind of pseudo-grief provided by the actual grief of another person certainly raises an ethical concern.

We might say that Anthony Griffith needed to talk about his loss, and we are providing him with an audience. We are doing him a great favor by listening. We are honoring his loss. And he may agree with us. In this case, he is using us to help him along his healing journey, but this doesn’t seem to be what is going on. We want to see and hear his story. We want to be part of his grief story without having to do any heavy lifting ourselves. We watch the video, feel emotional excitement, hug our loved ones because one never knows when they will be gone, and then we are done with it.

We might say that we want to hear the story because it is well written and well performed. Griffith is extremely talented as a storyteller, and we appreciate his talent and courage in sharing such a personal story. When we watch the video, we are paying tribute to his writing and his acting. The only problem is that he really doesn’t seem to be acting. He has merely put his pain on view for the world. He is certainly talented, and the story is well written, but most people will be moved by anyone’s story of a lost child. It is relatively easy to evoke strong emotions with a story of intense pain and grief.

It may be that we want to hear his story so we can prepare ourselves for the times our story might be the main event. Someday we will have to do the heavy lifting. If we can live through Griffith’s pain, maybe we can face our own. By experiencing Griffith’s grief, we see that we can also face it and live through it just as he has done. We finish the video feeling somehow more prepared.

Or we may be drawn to the stories of others because doing so provides an evolutionary advantage. By hearing the stories of others, we develop compassion and care. Other than providing an audience, we may not be helping Griffith directly, but we may be better able to empathize with others in the future. We are preparing not only to face our own struggles but to help others through theirs. If this is true, then we are actually doing something noble and beneficial by watching such videos.

Or, maybe we are just seeking the thrill of an emotional roller coaster ride.

Comments are welcome below. I appreciate corrections to typos and so forth (randall@ethicsbeyondcompliance.com).