The U.S. healthcare system is a bit strange compared to the rest of the world. In almost every aspect, our focus is on direct treatment. While one can argue that this is a result of economic or legal incentives and free-market innovation, ultimately, I think it is about cultural attitude. Americans value choice above all: the freedom to make their own decisions about what happens to them.
The result is a lack of preventive medicine, to the detriment of the population at large. Fewer and fewer physicians are going into primary care, instead choosing to pursue more lucrative specialties—and who can blame them? That’s what society seems to want.
A story I read in the New York Times illustrates this difference in perception:
I recently had a hysterectomy here in Munich, where we moved from California four years ago for my husband’s job. [. . .]
I brought up the subject of painkillers with my gynecologist weeks before my surgery. She said that I would be given ibuprofen. “Is that it?” I asked. “That’s what I take if I have a headache. The removal of an organ certainly deserves more.”
“That’s all you will need,” she said, with the body confidence that comes from a lifetime of skiing in crisp, Alpine air.
I decided to pursue the topic with the surgeon.
He said the same thing. He was sure that the removal of my uterus would not require narcotics afterward. I didn’t want him to think I was a drug addict, but I wanted a prescription for something that would knock me out for the first few nights, and maybe half the day.
[. . .]
“. . . but I am concerned about pain management. I won’t be able to sleep. I know I can have ibuprofen, but can I have two or three pills with codeine for the first few nights? Let me remind you that I am getting an entire organ removed.”
The anesthesiologist explained that during surgery and recovery I would be given strong painkillers, but once I got home the pain would not require narcotics. To paraphrase him, he said: “Pain is a part of life. We cannot eliminate it nor do we want to. The pain will guide you. You will know when to rest more; you will know when you are healing. If I give you Vicodin, you will no longer feel the pain, yes, but you will no longer know what your body is telling you. You might overexert yourself because you are no longer feeling the pain signals. All you need is rest. And please be careful with ibuprofen. It’s not good for your kidneys. Only take it if you must. Your body will heal itself with rest.”
I didn’t mention that I use ibuprofen like candy. Why else do they come in such jumbo sizes at American warehouse stores? Instead, I thought about his poetic explanation of pain as my guide, although his mention of “just resting” was disturbing. What exactly is resting?
I know how to sleep but resting is an in-between space I do not inhabit. It’s like an ambiguous place that can be reached only by walking into a magic closet and emerging on the other side to find a dense forest and a talking lion, a lion who can guide me toward the owl who supplies the forest with pain pills.
[. . .]
Come to think of it, I bring a lot of medicine with me from the United States, all over the counter, all intended to take away discomfort. The German doctors were telling me that being uncomfortable is O.K.
[. . .]
After a week, I took the tram to the doctor’s office to have my stitches removed. My doctor, with her usual cup of chamomile tea in hand, remarked on my progress. “I rested,” I told her. Normally, I would have said, “I did nothing,” but I didn’t say that. I had been healing, and that’s something.
Heuristics are incredibly useful to help us make decisions quickly and effectively. But, by definition, heuristics do not take into account the nuances and complexities of the real world.
One of the most widely known heuristics, especially in retail, is the left-digit bias. This describes a situation in which consumers notice the first digit, i.e. the left-most digit in the Arabic numeral system, and put more weight on its value than on the rest of the number. In practice, this means that consumers might see a product priced at $6.99 as substantially cheaper than a product priced at $7.00, despite there being only a one-cent difference. In fact, research from the University of Chicago shows that consumers perceive this difference as being worth almost 25 cents. Quite a lot more than most expect!
Image courtesy of Chicago Booth, 2019
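To make the effect concrete, here is a toy model of left-digit price perception, written in Python. The weighting scheme and the 0.24 parameter are my own assumptions, calibrated only so that the $6.99-versus-$7.00 gap lands near the roughly 25 cents reported above; this is a sketch, not the Chicago Booth study’s actual model.

```python
# Toy model of left-digit bias in price perception.
# Assumption: a biased consumer takes the dollar digit at face value but
# partially discounts the trailing cents. The 0.24 discount is hypothetical,
# chosen to reproduce the ~25-cent perceived gap mentioned above.

def perceived_price(price: float, discount: float = 0.24) -> float:
    """Return the price a left-digit-biased consumer 'feels'."""
    dollars = int(price)      # left-most (dollar) part, e.g. 6 for $6.99
    cents = price - dollars   # remainder, e.g. 0.99
    return dollars + (1 - discount) * cents

gap = perceived_price(7.00) - perceived_price(6.99)
print(f"perceived gap: ${gap:.2f}")  # ~$0.25 for a real 1-cent difference
```

Under this toy parameterization, ignoring about a quarter of the cents’ value is enough to make a one-cent price difference feel like a quarter of a dollar whenever it crosses a dollar boundary.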
Such biases are especially frightening in medicine, where we expect our physicians and caretakers to treat us without regard to anything else (including cost in America, though that is a separate issue).
In this article in the New York Times, Jena and Olenski describe a few cases in which cognitive heuristics can have adverse effects on patients. For example, they show: (1) “that when patients experienced an unlikely adverse side effect of a drug, their doctor was less likely to order that same drug for the next patient whose condition might call for it, even though the efficacy and appropriateness of the drug had not changed” and (2) “that when mothers giving birth experienced an adverse event, their obstetrician was more likely to switch delivery modes for the next patient (C-section vs. vaginal delivery), regardless of the appropriateness for that next patient.”
In their own study, Jena and Olenski report the impact of left-digit bias:
This is the bias that explains why many goods are priced at $4.99 instead of $5, as consumers’ minds round down to the left-most digit of $4.
We hypothesized that doctors may be overly sensitive to the left-most digit of a patient’s age when recommending treatment, and indeed, in cardiac surgery they appear to be. When comparing patients who had a heart attack in the weeks leading up to their 80th birthdays with those who’d recently had an 80th birthday, we found that physicians were significantly less likely to perform a coronary artery bypass surgery for the “older” patients. The doctors might have perceived them to be “in their 80s” rather than “in their 70s.” This behavior seems to have translated into meaningful differences for patients. The slightly younger patients, more likely to undergo surgery, were less likely to die within 30 days.
Anupam Jena and Andrew Olenski
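The comparison design is worth dwelling on: patients a few weeks on either side of their 80th birthday are medically near-identical, so any jump in treatment rates at the boundary points to the left digit itself. Below is a minimal simulation of that idea; the baseline rate and the penalty are invented numbers for illustration, and the actual study analyzed patient records rather than a simulation.

```python
import random

random.seed(0)

def recommends_surgery(age_years: float,
                       base_rate: float = 0.30,       # hypothetical baseline
                       penalty: float = 0.10) -> bool:  # assumed bias size
    """Simulate one decision under a hypothetical left-digit penalty:
    once the age's left digit flips from 7 to 8, the chance of
    recommending bypass surgery drops by `penalty`."""
    rate = base_rate - penalty if int(age_years) >= 80 else base_rate
    return random.random() < rate

# Near-identical patients just before vs. just after turning 80:
n = 100_000
under = sum(recommends_surgery(79.9) for _ in range(n)) / n
over = sum(recommends_surgery(80.1) for _ in range(n)) / n
print(f"just under 80: {under:.2%}, just over 80: {over:.2%}")
# Expect roughly 30% vs. 20%: a discontinuity with no medical cause.
```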
In retrospect, these findings might not be that surprising: physicians are human, just like anyone else.
The civil liberties granted by the Constitution are neither unlimited nor universal. This is clearly visible in the differing police reactions to the armed protests in Michigan against the COVID-19 lockdowns and, ironically, the forceful police response to protests against police violence in the Black community.
Constitutional law scholars have shown time and time again that the individual rights of citizens must be balanced with the interests of society as a whole. This may mean infringing on the rights of individuals, or at least appearing to do so from certain perspectives. The specific virtues of doing so have been debated endlessly throughout the history of the nation. Likely, this debate will continue, especially in light of the underlying social issues that the COVID-19 pandemic and the killing of George Floyd have surfaced. Regardless, we are lucky to be in a society where, despite such infringements, we can expect these rights to be quickly restored once a situation is defused.
It is interesting to look at the different responses to protests in order to glean a more comprehensive understanding of the rhetorical warrants that underlie the arguments presented by citizens, the current Administration, and scholars.
In response to the lockdowns during the COVID-19 pandemic, armed protesters with semi-automatic rifles assembled at the Michigan Capitol building. These protesters framed their methods as a way to uphold their right to bear arms. While that framing is not inherently inimical, looking at the impact of their tactics can help elucidate their true intentions. As a result of the armed protests, the Legislature was adjourned. Weiner reframes their tactics, stating that “Those who [use] weapons to inhibit the business of government are better understood as armed rebels.”
Even so, the armed rebels expect that their rights will not be infringed upon. They expect Constitutional protection despite actions that are directly opposed to the very purpose of the Constitution: to “enable the peaceful resolution of disputes” and “prevent a resort to violence in politics” (Weiner, 2020).
“There is no reasonable claim that their weapons were necessary for self-defense — unless, that is, they planned to use them against law officers. The only reason to stand over a session of the State Senate wielding military-grade weapons is to intimidate its members, a goal in which the rebels succeeded” (Weiner, 2020). It is clear that their intentions ran counter to the country and the Constitution.
If the armed rebels were truly concerned about an abusive government policy, there are Constitutional methods by which to remedy such abuse. The tactic used by the armed rebels instead “inherently entails exiting the constitutional order, not claiming its protections” (Weiner, 2020).
Locke argues that “the majority have a right to act and conclude the rest.” This means that when the majority agrees, the government has the authority to make, and implement, decisions that all must follow. It is important to note, however, that these decisions may not always be “right.” When this happens, Weiner points out that, “in the end, the choices were to accept the constituted authority or to rebel against it.”
The question in Michigan and other scenes of armed protests against coronavirus restrictions is not whether states have struck the proper balance between public health and other considerations. Nor is it even whether governments have exceeded their legitimate authority. The question the would-be rebels must answer is whether social-distancing measures are so tyrannical that they are willing to take the extraconstitutional step of rebellion. They can either claim or relinquish the Constitution’s protections. They cannot have both.
Greg Weiner
In this respect, they differ in both motive and objective from the protesters raging against police violence in Minneapolis and elsewhere. The tactics of both groups have unraveled. But the complaint against police violence is against officers of the state breaking the law, not lawfully making policy in the first place.
Amidst the pandemic, there’s renewed interest in removing the SAT and ACT from the college admissions process. There are certainly valid reasons why such an argument makes sense. But Riley makes a compelling counterpoint to one common refrain: that the SAT is itself biased.
Ultimately, the SAT is discriminatory by definition: it is designed to discriminate based on intellectual background and test-taking ability. It is true, however, that people from poorer backgrounds have a worse educational system to begin with, so it makes sense that they would perform worse on the SAT than their wealthier peers. This performance disparity is in fact quite staggering, and I’ve written about the College Board’s effort to alleviate it by creating an adversity index. That effort falls flat in many ways, but it does help demonstrate that a lot of work remains to be done.
Despite all of the problems that the SAT does have (including the College Board’s monopoly over college admissions testing), there is one upside: a test taken by a large swath of the population provides significant insight into the education system.
I would argue that the SAT does a lot to show the disparity in a concrete and near-universal way. Riley argues the same: “Given [the] vast differences in upbringings, habits, attitudes and priorities across various groups, why would we expect to see anything approaching racial or ethnic parity in SAT scores? These disparities may become more apparent when we look at the test results, but that doesn’t mean the test is causing the results. And it doesn’t follow that scrapping the test will do anything to resolve the underlying disparities” (2020). Likewise, the SAT correlates well with college outcomes.
Putting more focus on the educational systems that are at the heart of the problem, though substantially more difficult and outside the control of colleges, will be a much more effective solution. As Riley concludes, “getting rid of the SAT will only obscure where they are, not change the discomfiting reality” (2020).
How many of the experiences that shaped your character will be passed on to your children? Will they have to learn the same lessons you did the hard way, i.e. by experience, or can they learn them the easy way, i.e. from your teaching?
I think this is a struggle that every family in America must contend with, and it is especially true of immigrant families, as Seema Jilani describes. In America, there is at once a sense of the nation as a “cultural melting pot,” yet within those groups there is a rising tide of resistance: their original cultures should not be amalgamated entirely, their ideals erased forever.
Jilani offers a powerful testament to her own upbringing in America, using the metaphor of the hyphen:
At the dinner table, my father once coached us, “When people ask you where you’re from, what do you say?” I guessed, “Pakistani-American?”
“Wrong. You are American. Period. Lose the hyphen.”
That hyphen held our traditions, our dichotomies, our complexities, our spicy food and an even spicier culture, rich with tradition. That hyphen was the bridge to our past.
Seema Jilani
I think that there are two meanings to this. The hyphen is important; it contains the history of a family and the culture and values they bring with them. Those should not be forgotten. But ultimately, we are Americans first.
We all have a hyphen; it may not be another country, but that hyphen is by definition a part of America, and that should not be forgotten either.
The appetite for written work has diminished. People don’t read nearly as much as they used to, choosing to watch or listen instead. Ironically, in an attempt to be more productive, people end up being less focused and therefore less engaged.
EDIT: I wrote this well before the COVID-19 pandemic came to full force. I think it is even more relevant now that we have been confined to our homes, left to ponder. The downtime has led people to recognize that having purpose in life is critically important to sanity. While being “productive” in the common sense of the word is not necessary, it can imbue meaning. However, just relaxing during this time is perfectly fine. Just because you didn’t learn a new skill or read more books does not mean that you wasted your time.
It also leaves little time for thought. So many fill their downtime—during a commute, for example—with podcasts and audiobooks. With the advent of waterproof phones, even the once pristine shower has not been spared. We no longer have any sanctuaries of thought; productivity seems to have taken over. This is why it can be important to cast productivity aside, slow down, and be alone with your thoughts.
This is where music is useful. People listen to music all the time, but often only when doing something else mundane. And I think that robs music of its purpose. Music takes you on a journey of emotions. It allows you to feel emotion. Therefore it is valuable to listen to music and do nothing else. Allow your thoughts to wander and roam, guided along an open field by the music.
A few conflicting trends can be seen in society today. Over the past few decades, we have been part of an unprecedented acceleration: of technology, productivity and output, material consumption, pace of life, depression, and loneliness. Is all of that “forward” movement good? Or is it even movement at all?
Ross Douthat, in “The Age of Decadence,” argues that “the feeling of acceleration is an illusion, conjured by our expectations of perpetual progress and exaggerated by the distorting filter of the internet[.]” This American ideal that the future will always be better is actually quite recent, and unique.
The peasantry of medieval Europe held no such beliefs, instead looking back to a past in which Christ’s redeeming qualities were more than the abstractions described by the clergy. Even Renaissance writers exalted the classics of Ancient Greece and Rome, believing that their contemporary work would never match them. In that age, ironically compared to today, only in the Islamic world did the future look bright, with scientific and literary advancements coming at a breathtaking pace.
When the transition from a hunter-gatherer society to an agrarian one occurred, common wisdom says that the shift heralded a new age in which humanity had finally begun conquering the world. But it really wasn’t so for the average human alive at the time. In fact, historians might argue that civilization conquered humanity.
This transition from a scavenger way of life instead resulted in a dramatic reduction in overall quality of life. While at times there was a surplus of food, farming became a monoculture: people survived on only one or two staples, and their diets lost their diversity. People became stuck—attached to their land—and with that, imprisoned by the very crops they sought to domesticate. This lack of freedom became suffocating to culture and community.
Our modern culture may have removed the shackles of agriculture, but in that process, it put on the yoke of technology and in recent years tightened the reins.
Technological progress has continued to accelerate, or has it? In recent years it seems that progress has slowed; we’ve reached the peak, arriving at an unbreakable glass ceiling. Moore’s Law has broken down, and every new innovation seems only the tiniest bit of change from the previous generation. No lone individual can innovate within a field; it instead takes an interdisciplinary team to do so, as we’ve exhausted the possibilities and reached the limits of individual human expertise.
But there is still hope. Marques Brownlee compares this to cars, asking the question, “Are we at peak car?” Answering this shines a glimmer of light on a future where, despite stagnation, society can continue to grow.
All of this, taken in the context of society at large, has thus far resulted in a breakdown of our outlook on the future: we have moved from imagining the future as utopia to imagining it as dystopia. The magnitude of this change in outlook is reflected in our popular literature.
[W]e are aging, comfortable and stuck, cut off from the past and no longer optimistic about the future, spurning both memory and ambition while we await some saving innovation or revelation, growing old unhappily together in the light of tiny screens.
Ross Douthat
It is that gloomy light that now illuminates our thoughts as we drift off to a spiteful sleep. In the elite ranks, even the notion of sleep is frowned upon; after all, it only detracts from productivity and constant output. Ironically, the elite work longer hours than ever before, and it is now the middle class that works less and less. This isn’t laziness; rather, they can’t work more. The elite have broken middle-class jobs into component pieces and stripped away any semblance of skill, leaving little behind. What scraps are left are handed out to the cheapest labor, and the remaining high-level management is reserved for the elite. The mundane jobs that remain therefore offer no path forward, no future in which those employed can work their way up the corporate ladder.
It is because of this economic stagnation that Douthat argues that society today has entered into a period of decadence.
The word “decadence” is used promiscuously but rarely precisely. In political debates, it’s associated with a lack of resolution in the face of threats. . . . In the popular imagination, it’s associated with . . . gluttony, . . . and chocolate strawberries. Aesthetically and intellectually it hints at exhaustion, finality — “the feeling, at once oppressive and exalting, of being the last in a series,” in the words of the Russian poet Vyacheslav Ivanov.
But it’s possible to distill a useful definition from all these associations. Following in the footsteps of the great cultural critic Jacques Barzun, we can say that decadence refers to economic stagnation, institutional decay and cultural and intellectual exhaustion at a high level of material prosperity and technological development. Under decadence, Barzun wrote, “The forms of art as of life seem exhausted, the stages of development have been run through. Institutions function painfully. Repetition and frustration are the intolerable result.” He added, “When people accept futility and the absurd as normal, the culture is decadent.” And crucially, the stagnation is often a consequence of previous development: The decadent society is, by definition, a victim of its own success.
Ross Douthat
Key to Douthat’s argument is that “decadence is a comfortable disease.” Society is doing fine.
With this stagnation comes social torpor. America is a more peaceable country than it was in 1970 or 1990, with lower crime rates and safer streets and better-behaved kids. But it’s also a country where that supposedly most American of qualities, wanderlust, has markedly declined: Americans no longer “go west” (or east or north or south) in search of opportunity the way they did 50 years ago; the rate at which people move between states has fallen from 3.5 percent in the early 1970s to 1.4 percent in 2010. Nor do Americans change jobs as often as they once did. For all the boosterish talk about retraining and self-employment, all the fears of a precarious job market, Americans are less likely to switch employers than they were a generation ago.
Meanwhile, those well-behaved young people are more depressed than prior cohorts, less likely to drive drunk or get pregnant but more tempted toward self-harm. They are also the most medicated generation in history, from the drugs prescribed for A.D.H.D. to the antidepressants offered to anxious teens, and most of the medications are designed to be calming, offering a smoothed-out experience rather than a spiky high. For adults, the increasingly legal drug of choice is marijuana, whose prototypical user is a relaxed and harmless figure — comfortably numb, experiencing stagnation as a chill good time.
Ross Douthat
To me, it seems that the averages are deceiving. Younger people are increasingly mobile and less likely to set down roots in one area and stay long term, even if that is what they desire; the elite are constantly in search of better opportunities. It is the middle class that is kept stuck by “forced leisure,” i.e. unemployment resulting from the increasingly elite skills that jobs now demand.
Yet recently, a countermovement has emerged: a resurgence of age-old ideals of humanity. It seems to be something that only the elites embrace, the middle class once again left behind. One can argue that that embrace is only superficial. Ultimately, the race is ongoing; people aren’t willing to step out of it so much as to slow down and relax, in effect preparing for a final sprint that might never come.
A century from today, what will this age be remembered as? An age of decadence? The precursor to dystopia? Or the beginning of a long and comfortable decline, a society ultimately on its way to a gradual death?
[T]rue dystopias are distinguished, in part, by the fact that many people inside them don’t realize that they’re living in one, because human beings are adaptable enough to take even absurd and inhuman premises for granted.
Ross Douthat
I can’t help but think: how close are we to “The Hunger Games”? But Douthat counters: decadence doesn’t necessarily lead to dystopia.
[Decadence], to be clear, [is] hardly the worst fate imaginable. Complaining about decadence is a luxury good — a feature of societies where the mail is delivered, the crime rate is relatively low, and there is plenty of entertainment at your fingertips. Human beings can still live vigorously amid a general stagnation, be fruitful amid sterility, be creative amid repetition. And the decadent society, unlike the full dystopia, allows those signs of contradictions to exist, which means that it’s always possible to imagine and work toward renewal and renaissance.
I’m not sure that I can fathom what 11,000 square miles looks like, let alone being the only physician serving that entire area. But Saslow does an amazing job conveying the sense of emptiness that Garner and Cummings might feel as the only physicians available in a vast and desolate region of Texas. On a side note, I still don’t understand how their economy works.
“In the medical desert that has become rural America, nothing is more basic or more essential than access to doctors, but they are increasingly difficult to find. The federal government now designates nearly 80 percent of rural America as “medically underserved.” It is home to 20 percent of the U.S. population but fewer than 10 percent of its doctors, and that ratio is worsening each year because of what health experts refer to as “the gray wave.” Rural doctors are three years older than urban doctors on average, with half over 50 and more than a quarter beyond 60. Health officials predict the number of rural doctors will decline by 23 percent over the next decade as the number of urban doctors remains flat.
In Texas alone, 159 of the state’s 254 counties have no general surgeons, 121 counties have no medical specialists, and 35 counties have no doctors at all [emphasis mine]. Thirty more counties are each forced to rely on just a single doctor, like Garner, a family physician by training who by necessity has become so much else…”