Visual Rhetoric

The modern world hinges on the ability to create better user experiences through carefully crafted design.

How Identity is a Powerful Tool in Medicine

The role of a physician is a strange one: they diagnose, they teach, they advise, and they support, all at once. At times, however, these roles can conflict with one another, and the task of the physician becomes balancing the effects of each action with the ultimate goal of acting in the patient’s best interest. It’s not always clear exactly how to do that, and there isn’t always a single right way.

Image courtesy of NBC News, 2019

One of the most important steps is to cultivate a good patient-physician relationship. First and foremost, this requires developing rapport with the patient. Analyzing this relationship through the lens of a rhetorician is invaluable—after all, this relationship is built on effective communication. In general, we can consider the physician as the rhetorical orator and the patient as the rhetorical audience.

Today, the power of identity has become well known, especially as it relates to group identity and social psychology. Rhetoricians have a name for this same concept: ethos. It is the foundation of effective communication. Before any persuasive argument can work, a physician must first establish a level of ethos, or trust, to build credibility with the patient. There are a number of ways to do so: speaking with a patient in a style that resonates with them, for example, or pointing out connections or similar experiences. In fact, building ethos is exactly what YouTubers and social media influencers do: ethos is the product they sell to advertisers, and it is why their techniques are so valuable to brands.

As a result, their actions and outcomes become a natural manifestation of their identity.

In medicine, the patient-physician relationship is uniquely infused with institutional ethos: society places its trust in physicians to take care of patients as a result of their expertise and the backing of the universities, medical schools, and accreditation boards that trained and certified them.

Once this baseline of trust is established, patients feel comfortable sharing important and sometimes private details of their health with their physician, and physicians are able to instruct their patients on how to improve their health with confidence that they will follow through. Studying the foundations of this relationship allows us to glean insight into what is required to build better patient-physician relationships and to help patients more effectively, in ways conducive to their goals.

And this is key: “conducive to the patient’s goals.” In recent times, health has become more objective and metric-driven, often to great effect. But in chasing objective metrics, the patient’s own goals and needs can be forgotten in the pursuit of an “ideal health.” One of the most salient examples of this is the treatment of obesity.

Physicians tend to triage and prioritize conditions based on their likelihood of harming the patient. This is not an exact science. In fact, a number of moral and ethical dilemmas arise from this line of thinking, but they have been discussed ad nauseam, so I will refrain from rehashing them here. It has been shown time and time again that obesity is associated with other comorbidities and exacerbates the harmful effects of other conditions. But not all patients have a desire to treat obesity, nor is it always necessary to encourage patients to change.

I am fat. I am a fat activist. Like a lot of larger-bodied people, I have embraced the word fat. Doing so allows me to buy clothes that fit, rather than those that could fit if I changed. It gives me permission to go to spin classes (in pre-pandemic days) and worry only about trying to beat other people’s scores. It allows me to exist. The word fat, I have made clear to those around me and to myself, is not associated with a moral or intellectual or health failing. It is a descriptor. I have brown hair, blue eyes, excellent taste in caftans. I’m 5’9”, and I am fat.

Emily Duke

One example that I think is similar, but does not carry the same associations as obesity, is mental health and mental illness. Medicine—and society as a whole—has moved towards treating mental illness in the same way as physical illness. The stigma surrounding mental health, while still present, is diminishing. However, even this progress, which is objectively a good thing, has adverse effects and unintended consequences that physicians must reckon with when diagnosing patients with mental conditions, especially depression and anxiety. There are physiological reasons why a person may present with sustained, long-term depressive symptoms; these can often be treated with medications and lifestyle modifications to great effect. The improvements to a patient’s quality of life can be astounding when treatment works.

But alas, there is one other level of symptoms to contend with: long-term depressive symptoms that are not sustained, that is, symptoms that are not always present. This is problematic because medication is often not necessary or even ideal. Patients may present with these symptoms and think to themselves, “Oh yes, I have depression now, so I need these medications to treat it and improve myself.” They see just how effective treatment can be for some people and believe that those effects will readily translate to themselves.

The apparent decline in childhood mental health is itself depressing. I suspect, however, that one of the factors driving it is increased diagnosis due precisely to the increasing prevalence of mental-health services. It is one thing to detect diseases with well-established biological bases; early detection of cancer has saved many lives. It is quite another to detect diseases on the basis of a rough group of ill-defined symptoms.

Crispin Sartwell

The unfortunate consequence of this is deeply embedded in the concept of identity. Patients believe that because they have depressive symptoms, the best option is to treat them with medications. This is misguided. There are absolutely times in our lives when we may be sad or down on ourselves. We may have depressive symptoms as well. It may even take a long time for these to resolve. But that does not mean that we have depression. And therein lies the dilemma that physicians must grapple with. Patients who think this way identify themselves as patients with chronic depression needing treatment, even when that is not the case. The concern is that they become their own roadblock to improving their health. They go down the rabbit hole of thinking that there is nothing they can do to help themselves overcome their symptoms. As a result, their actions and outcomes become a natural manifestation of their identity. They take on the mindset that their improvement lies in external factors beyond their control, such as medications, when in fact there are a number of concrete steps they can take: increasing exercise, better sleep schedules and routines, healthy coping mechanisms, and stress-relieving activities, among others. When implemented, these can help patients drastically improve their health.

Of course, doing so is extremely challenging. Sometimes patients need an external factor to help them. There are absolutely times when treatment will be very effective and will allow patients to more easily begin and maintain lifestyle modifications. Ultimately, many might even be able to come off medications because their overall health has so significantly improved. I must reiterate the importance of reducing the stigma of treating mental illness; there is an immense benefit to extending treatment to those who need it. However, I do think it is valuable to highlight some unintended consequences of this major societal push, especially as people in mainstream society begin to explore this concept of identity and recognize the power it can hold.

Sometimes, it can be best to eschew metrics altogether and work directly with the patient to understand their personal health goals.

Larger-bodied people are often told their weight is the result of their mind: a lack of willpower, a lack of knowledge about nutrition, or in my case, an emotional weakness. Following my parents’ divorce at 5 years old, my pediatrician took a look at my BMI, told me that I was an emotional eater, and subsequently put me in therapy. Because I was a literal kindergartner, my food intake was largely controlled by the adults in my life. Being told my emotional issues were at fault for my size didn’t make a ton of sense, but I also didn’t have much of a choice at the time in my health decisions, so to therapy I went. In between playing board games with me, my Upper West Side shrink drew a body diagram with a jail located in the torso. The jail was full of anthropomorphized feelings labeled “anger,” “sadness,” “guilt,” and it was guarded by pieces of pizza and cake. From my understanding, childhood therapy is just playing Chutes and Ladders with a menopausal divorcée until you agree to stop eating carbs.

[. . .]

Getting to “healthy” involved a lot of shotgunning Diet Cokes to feel full. After I caved to hunger pangs and snuck a slice of Healthy Choice turkey at 1 a.m., I journaled that I was a “stupid fat bitch” under the glow of a clip-on reading light. This classic healthy behavior got me to a BMI in the dead center of the “healthy” range, and a LOT of compliments. No one questioned how it was happening until the numbers kept going down. Disordered eating affects people all across the size spectrum, and it has dangerous health consequences even for those “with obesity.” The only hint I had on this was a single Grey’s Anatomy episode in which Meredith punctured a heart during surgery—the patient was obese but had perilously thin organ walls because of undiagnosed anorexia.

No one noticed mine until I was Olsen twin–thin. Why was subsisting exclusively on carrot sticks and cherry tomatoes dipped in ketchup OK when I was 180 pounds but not when I was 103 pounds? The BMI chart reared its ugly head again, and I had to gain weight or I couldn’t play sports, or stay in school. This did not involve getting “healthy,” just continued attempts to game the chart. Instead of fasting and taking diuretics before stepping on a scale, I was chugging gallons of Poland Spring in the bathroom of my doctor’s office before a mandatory weekly weigh-in.

[. . .]

As far as I’m concerned, I was fat with BMIs ranging from 14 to 40. No matter how thin I got, there was always a larger-bodied person trying to get out. I can own being fat, because I have internalized that regardless of my weight, it is my chosen identity, an identity that ultimately gives me more freedom than it takes away. Obese is not an identity I give myself: It is a label put on me by others that I have no control over. But if briefly calling myself obese would make me safer, it was worth it. And more importantly, I realized it’s something I am capable of doing.

Emily Duke

The utmost task of the physician is to deeply understand these nuances, balance these conflicts, and make a determination as to what is in the best interest of the patient. Physicians must understand the goals of the patient in order to make recommendations that are appropriate. Having a rhetorical understanding of the patient-physician relationship allows communication to be more effective in both directions and will ultimately lead to better health outcomes for all patients.


“I Am Fat. To Get the Vaccine, I Had to Say I Am ‘Obese.’” by Emily Duke, March 2, 2021.

“When doctors fat-shame their patients, everybody loses,” by Kunal Sindhu and Pranav Reddy, August 24, 2019.

“Do Psychologists Cause ‘Mental Illness’?” by Crispin Sartwell, May 20, 2021.

The Meaning of Cinematic

Throughout the internet, or perhaps just in the spaces I regularly visit, it seems that the word “cinematic” has become à la mode. Every year, new trends show up on YouTube where people explain how to create the most cinematic visuals: at one point it was drone shots, at another, slow motion, and at another, a widescreen aspect ratio.

I have always subscribed to the ethos that cinematic means “like the movies.” And if you pay attention to movies—or at least the ones that were actually great—you might find that very few used the cinematic tricks that YouTubers emulate and often overuse. What made a movie cinematic was not a gimmick; it was the story.

In “Show Your Work!,” Austin Kleon writes:

The most important part of a story is its structure. A good story structure is tidy, sturdy, and logical. Unfortunately, most of life is messy, uncertain, and illogical. A lot of our raw experiences don’t fit neatly into a traditional fairy tale or a Hollywood plot. Sometimes we have to do a lot of cropping and editing to fit our lives into something that resembles a story. If you study the structure of stories, you start to see how they work, and once you know how they work, you can then start stealing story structures and filling them in with characters, situations, and settings from your own life.

Austin Kleon

So far, this is the clearest definition of the word “cinematic” that I’ve come across, especially from the perspective of a writer or storyteller. Cinema is supposed to explore a heightened sense of reality.

This is the job of the filmmaker or the writer—to take their own experiences, their own inspirations, their own dots that they’ve stumbled across, and put them together into a cohesive narrative. It involves cleaning up the story, removing extraneous information, and focusing heavily on the emotional journey told through the experiences of characters, settings, and plots.

Image courtesy of Trent Palmer, 2018

The best authors often incorporate elements of their own real life into their work. But they don’t simply retell events or use characters from the real world, they take their experiences and heighten them. Ultimately, the point of a film is to show something that is greater than reality—a little bit better and more perfect than what we might experience in the real world.


“Show Your Work!” by Austin Kleon, 2014.

“The Basis of Modern Innovation,” by Sahil Nawab, January 24, 2021.

The Basis of Modern Innovation

I’ve written before that one of Leonardo da Vinci’s fatal flaws was his extremely broad range of interests. He was so compelled by new subjects that he often failed to finish what he started. Still, I think he was the embodiment of the couplet, “Jack of all trades, master of none; though oftentimes better than the master of one.”

The School of Athens, by Raphael, 1511. Image courtesy of Wikipedia, 2007

This concept of the “Renaissance Man” fascinated me during our discussion in history class. It was born from the philosophy of Renaissance humanism: that humans have a limitless capacity for development. It became set in stone in the early 16th century, when Baldassare Castiglione wrote The Book of the Courtier, describing the concept of the Renaissance Man and the ideals that make the perfect courtier. Key was the concept of sprezzatura, which Castiglione defined as “a certain nonchalance, so as to conceal all art and make whatever one does or says appear to be without effort and almost without any thought about it.”

During the Renaissance, through careful and dedicated study, one person was all that was needed to know the entire breadth and depth of a subject and advance it further. Today, however, this advancement has become the domain of specialists. It takes significant knowledge and training to be at the forefront of a particular field. This is where modern innovation differs vastly from that of antiquity.

Recently, I came across a video from Thomas Frank that I found insightful, where he discusses a quote from an interview Steve Jobs did in 1996:

Creativity is just connecting things. When you ask creative people how they did something, they feel a little guilty because they didn’t really do it, they just saw something. It seemed obvious to them after a while. That’s because they’re able to connect experiences they’ve had and synthesize new things. And the reason they were able to do that was that they’ve had more experiences or that they have thought more about their experiences than other people.

Steve Jobs

Jobs goes on to explain that many people do not have enough “dots to connect.” In other words, their experiences do not have a sufficient breadth to innovate creatively and see the bigger picture. In Generating the Dots vs. Connecting the Dots, I discussed this very idea, echoing the argument of Vikram Mansharamani, who explains that “breadth of perspective and the ability to connect the proverbial dots (the domain of generalists) is likely to be as important as depth of expertise and the ability to generate dots (the domain of specialists).”

With breadth of perspective, seeing connections, finding connections, and even making connections becomes natural. However, none of this is to say that being creative is easy. In fact, it’s quite the opposite. Putting a system in place to continuously explore the world, gain new experiences, and become well-versed in a variety of disciplines requires commitment and dedication in its own right. It requires a willingness to be uncomfortable and to push boundaries. This act of collecting dots by exploring the world is the fundamental basis of creativity.

This reminded me of the Renaissance Man, someone who has profound knowledge in a variety of diverse disciplines. This broad base of knowledge allows for creative problem solving and for examining issues from a variety of perspectives to understand in a way that no one else can. Ultimately, Frank argues, this comes from building the habit of exploring the world.

In his book Atomic Habits, James Clear mentions that he “felt like an imposter” when he began writing about habits. By his own admission, he was not an expert on the topic in the traditional sense. He did not have a PhD in psychology and had no formal authority to write one of the defining books on habit building; rather, he learned through personal experimentation and experience. Over time, however, he did “become known as an expert on habits.”

Adopting this label, he says, was “uncomfortable,” but this concept of identity became a key principle in his book. One example he mentions is the idea of using identity instead of goal setting to facilitate building effective habits. Instead of setting a goal of, say, reading two books per month, it is more useful to take on the identity of a “reader.” A reader would, instead of watching TV or a movie at night, open up a book and read before going to sleep. Thinking of habits in the context of identity is an incredibly powerful psychological tool. At the same time, it seems odd that all of this advice is not necessarily scientifically backed, but rather is the opinion of one random person, however effective it may be.

I like to contrast James Clear with Adam Savage. In most regards, Adam Savage considers himself a generalist, someone who dabbles in all sorts of different disciplines but is not necessarily a master of any one in particular. He takes great pains to say, for example, that he is not a machinist, preferring to be called a “machine operator.” I think this is a valid position, one that shows humility and self-awareness. It shows respect for the immense work and journey that talented people have undertaken to gain their expertise. I appreciate that.

Thomas Frank provides an excellent rebuttal to Adam Savage’s line of thinking: it’s okay to take on other labels and identities, because doing so provides an opportunity to think with the mindset of that identity.

And you might feel this kind of pressure as well, maybe from several labels. Your profession, your gender, your age. In one way or another, each one of these whispers, “Stay in your lane.” Well, we might feel safe if we stay in our lane, but we sure as heck are not going to feel inspired. So as you explore and as you pursue things that interest you, don’t allow labels to give you tunnel vision. Don’t let a label push you away from something that interests you. A label can be useful as a communication device. It can help guide your audience, potential clients, potential employers to you, it can help you to find your niche, but it can also put your mind in a box if you let it. So if you can avoid doing that, if you can keep pursuing the things that interest you, if you can keep broadening your array of experiences and learning sources, you’re going to find the quality of your creative output going up. But moreover, you’re going to find yourself more inspired and more often getting lost in the work that you do.

Thomas Frank

This was an internal dilemma for me as well. I always debated whether or not to call myself a filmmaker, a writer, a storyteller. I never really felt up to the mark, so to speak. However, taking on those labels was an important step in reminding myself to post on this blog more frequently and to not be afraid to put my thoughts on the page. Over time, it actually became a reality, especially as I’ve built a body of work on this blog.

Ultimately, taking on an identity is a useful tactic to help build the habits that are necessary to actually achieve that identity.


“Generating the Dots vs. Connecting the Dots,” by Sahil Nawab, July 28, 2020.

“Jack of All Trades,” by Sahil Nawab, May 31, 2019.

“Harvard lecturer: ‘No specific skill will get you ahead in the future’—but this ‘way of thinking’ will,” by Vikram Mansharamani, June 15, 2020.

“Atomic Habits,” by James Clear, October 16, 2018.

What is the Value of Academic Titles?

The recent op-ed by Joseph Epstein deriding Jill Biden’s credentials and her right to use the title “Doctor” rekindled the debate surrounding the use of such titles by holders of doctorates and medical degrees alike. His article is “ostensibly a foray into an ongoing debate over whether only medical doctors can claim the title.” This is certainly an interesting question, especially from a journalistic perspective, but on further reading it becomes clear that the piece is not a logical argument but a personal tirade against the ways academia has modernized and progressed over the years.

Putting aside the misogynistic and elitist views for a moment, Epstein complains that “getting a doctorate was then an arduous proceeding” where a secretary would sit outside the room with a pitcher of water “for candidates who fainted.” He goes on to say that dissertation defenses are now akin to friendly, social gatherings. The fact that academia in days past was lauded for putting students through abuse, mistreatment, and elitism for the purpose of “building character” shows just how much it has changed.

This antiquated view on the education system is partly to blame for the continued challenges that women and people of color face in academia. For them, “an academic title can be a tool to remind others of their expertise in a world that often undermines it.”

Weirdly, rather than constructing a sound argument predicated on logic and reason, Epstein simply regresses into attacking Biden’s credentials. Instead of engaging in a debate on the merits of using academic titles, perhaps arguing that doing so might seem arrogant, he dismisses her degree and dissertation outright.

On the first day of class, Debbie Gale Mitchell, a chemistry professor at the University of Denver, introduced herself to her students, telling them about her Ph.D. and her research. She told her students they could call her either “Dr. Mitchell” or “Debbie.” A male colleague had told her that he went by his first name and that students were friendlier as a result, so Mitchell decided to try it. Many students chose to call her “Debbie.”

Then one day a student asked if she thought she’d ever get a Ph.D.

“I discovered that for me, the use of my title is VITAL to remind students that I am qualified to be their professor,” Mitchell wrote on Twitter.

Allie Weill


“Is There a Doctor in the White House? Not if You Need an M.D.” by Joseph Epstein, December 11, 2020.

“A Wall Street Journal op-ed about Jill Biden pairs virulent sexism with academic elitism,” by Cameron Peters, December 12, 2020.

“Professor FLOTUS: How Jill Biden would redefine what it means to be first lady,” by Kate Andersen Brower, November 7, 2020.

“Whom does The New York Times consider a doctor?” by R.J. Lehmann, October 27, 2015.

“Is There a Doctor in the House?” by Mariana Grohowski, March 26, 2018.

“Should All Ph.D.’s Be Called ‘Doctor’? Female Academics Say Yes,” by Allie Weill.

“Today’s College Classroom Is a Therapy Session,” by Joseph Epstein, August 28, 2020.

India is Losing its Soul

The astonishing diversity of people and cultures in India is one of the defining characteristics of the nation. People with various cultural backgrounds, including language, customs, music, dance, food, and even religion, all coalesce together in big cities. They attend one another’s weddings, celebrate one another’s achievements, grieve with one another during funerals, and support one another through hardships. The fact that people from many different cultures live together creates a sense of appreciation for others and their stories. This, I believe, is the soul of India.

This sentiment comes to me through the stories that my parents told me of their upbringing in Bombay. Those stories greatly informed my own outlook, especially in light of the vitriolic rhetoric we face in both the U.S. and India. There are many similarities between the two countries, and while they are certainly not perfectly analogous, I think it is worthwhile to compare and contrast them to glean insight into how each grapples with issues of diversity and discrimination in all aspects of society.

Image courtesy of Wikipedia, 2007

There are clearly a number of parallels between the two nations. Both were once British colonies that have since developed into secular, constitutional democracies. Although 170 years separate their independence, the histories of each were shaped by one another. The secular ideals of the U.S. Constitution heavily influenced its Indian counterpart. Civil disobedience and the non-violent protests first championed by Mahatma Gandhi were instrumental to Martin Luther King Jr.’s vision of the civil rights movement.

These shared ties originate because both nations are built from a mosaic of diverse cultural heritages, with significant minority populations. While some today celebrate this diversity, others use it to sow discord, magnifying the differences between people to turn them against one another. Catalyzing turmoil is a tactic used by tyrants to advance their own agendas at the expense of society. Pitting friends against one another allows tyrants to hide their true intent, and dehumanizing the “other side” lends their actions a veneer of legitimacy, letting them act with impunity.

Whether through policies such as the Citizenship Amendment Act and the National Register of Citizens, or by revoking the statehood of Muslim-majority Kashmir and Jammu, it is increasingly clear that Modi and the BJP thrive on pitting Hindus and Muslims against one another to rally support for their own personal agendas.

However, the Hindu-Muslim rivalry was not always this fervent. For the majority of Indians, religion was not nearly as defining a characteristic as language, customs, music, or food. India was a mosaic of “dizzyingly diverse” and multicultural communities. Ignoring this conception of an Indian identity entirely, the British hastily decided to separate the country into two, sowing the seeds of discord for decades to come.

This brutal process, called the Partition, separated families and communities on the basis of religion, which had little to do with how people distinguished themselves. Two families from across the border may have more in common with one another—from language to customs to food—than one family from Eastern India and another from Western India. This is especially true between North India and South India, where languages and customs are markedly more important to identity.

Many Muslim families split over whether to leave for this imagined separate homeland or to remain in India, where, despite the brutality of partition, the ardently secular Nehru reassured them that they had a home. He articulated his ideal of a composite Indian citizen, who was enriched and shaped by all the heritages that flowed through the world’s most diverse society.

As a child of the neighboring Islamic republic (and a steady consumer of Indian popular culture), I grew up admiring that multilingual, kaleidoscopic country. Later, I pursued my education at American universities, in classrooms led by the children of Nehruvian India, and my professors’ stories of religious coexistence inspired me to want to visit that alternative version of South Asia. From afar, India always seemed to be a symphonic banquet of possibilities, in contrast with the monochromatic vision of Pakistan’s religious leaders.

Bilal Qureshi

This is the same vision of India that I grew up with. The stories my parents told me of their own upbringing in Mumbai, where people of all different cultures, languages, and backgrounds mix, were instrumental in shaping my own understanding of the “melting pot” culture of the United States. Looking at the U.S. through this lens, multiculturalism and tolerance seemed to be a big component of American growth and its rise to power. It was fascinating to live at the intersection of many different cultures: growing up in a mostly white, Protestant suburban town and attending public school, then Catholic school for two years, then a public STEM school, all while having friends from throughout India and Pakistan.

This perspective was invaluable to my understanding of the world. It was important to me that I connect to people who are different from me and learn from them and their stories. Whether that difference was in culture, religion, or viewpoints on political issues, speaking with other people taught me the immense nuance that is required to effectively develop an opinion on any given topic.

Creating a society where people are tolerant of one another is an important part of this process. The history of India has shown just how successful a government that encourages multiculturalism and social discourse can be. Today, a secular government that protects the civil liberties of its people is absolutely critical.

[In] recent years Prime Minister Narendra Modi and his Bharatiya Janata Party (BJP) have systematically dismantled Nehru’s vision for India. This month, India’s Parliament passed a new bill that enshrines in law a religiously inflected definition of who belongs in India. The Citizenship Amendment Bill provides a path to citizenship for undocumented migrants from Bangladesh, Afghanistan and Pakistan who are Hindus, Christians, Jains, Parsees and Buddhists, but explicitly excludes Muslims.

[. . .]

The law is more than just the latest in a series of Hindu-nationalist and Islamophobic policies. In a nation that is home to the largest Muslim population outside Muslim-majority countries, the bill extends an ideological project that breaks the very promise of India.

Bilal Qureshi

The very promise of India is written in the preamble of its Constitution: like the U.S., India is a secular democratic republic. Unlike the U.S., however, there is no explicit guarantee of a separation of “church and state.” It is therefore paramount that Indian citizens continue their fight against an oppressive government that insists on a religious basis for citizenship. Consider other religious states—Saudi Arabia, Israel, Iran, even Pakistan—and the serious flaws of state-sanctioned religion become obvious.

The Constitution is meant to be a protector of rights and a protector of people. When first introduced, the Indian Constitution was “regarded by some as an elite document drafted in an alien language.” While that was true, having been written in English and not comprehensible to the vast majority of Indians, its promise became a beacon of hope for countless Indians looking for an escape from the brutality of the Partition, and it helped solidify a multicultural and tolerant society.

Today, Modi is laying siege to the Constitution. Rohit De reminds us that “the Constitution was ‘not just dull, lifeless words . . . but living flames intended to give life to a great nation, . . . tongues of dynamic fire, potent to mold the future.'”

Modi’s sudden takeover in Kashmir is the fulfillment of a long ideological yearning to make a predominantly Muslim population surrender to his vision of a homogeneous Hindu nation. It is also a way of conveying to the rest of India — a union of dizzyingly diverse states — that no one is exempt from the Hindu-power paradise he wants to build on the subcontinent. Kashmir is both a warning and a template: Any state that deviates from this vision can be brought under Delhi’s thumb in the name of “unity.”

Those who believe that such a day will never come — that India’s democratic institutions and minority protections will assert themselves — also never thought that someone like Modi would one day lead the country. Modi once seemed destined to disappear into history as a fanatical curio. As the newly appointed chief minister of Gujarat, he presided over the worst communal bloodletting in India’s recent history in 2002, when 1,000 Muslims, by a conservative estimate, were slaughtered by sword-wielding Hindus in his state over several weeks. Some accused Modi of abetting the mobs; others said he turned a blind eye to them. The carnage made Modi a pariah: Liberal Indians likened him to Hitler, the United States denied him a visa, and Britain and the European Union boycotted him.

Kapil Komireddi

The erosion of norms by both Trump and Modi is a stark reminder that despite the protections afforded by the Constitution or by political norms, it is up to the people to maintain their own government.

Muslims in India “have faced lynchings, lethal riots, and social and political disenfranchisement,” especially in recent years. The responses of Black Americans during the civil rights movement and of Muslims in India today have been similar: a strengthened commitment to the ideals enshrined in their respective constitutions.

When minorities are pushed to such walls, they may retreat into a siege mentality that breeds radicalization. But India’s Muslims have not come up with calls for violent jihad, nor chants for Shariah law. Instead, they have embraced and emphasized the blessings of liberal democracy by placing their faith in the Constitution of India and insisting on their constitutional rights as citizens.

[. . .]

The B.J.P.’s propaganda machine depicted Muslim protesters as “traitors” and “anti-nationals,” but they were wearing headbands saying, “I love India,” waving Indian flags, and repeatedly singing the national anthem.

Mustafa Akyol and Swaminathan S. Anklesaria Aiyar

It is important that protesters not fall into the trap of violence. Relying on civil disobedience may not seem to work, especially in the face of tyrants who thrive on bigotry and division. But the struggles, and victories, of both Gandhi and King showed that non-violence and unity create a force for good that simply cannot be reckoned with in the long term.

This is fundamentally why both the Indian independence movement and the U.S. civil rights movement were so successful.

BJP leaders, in addition to marginalizing religious minorities, are erasing Nehru’s secular vision. They have crafted an alternative national narrative that recasts the country’s Hindu majority as victims and its era of Muslim empires as one of loss and shame. In the words of historian Sunil Khilnani, they have “weaponized history,” rewriting a period of composite Muslim dynasties such as the Mughals, who built the Taj Mahal and governed with multicultural courts, as a time of conquest by outsiders.

Bilal Qureshi

The whole point of India is that it is a secular nation. I find it quite odd that the Hindu nationalist government is instituting religious ideals in its policies. Isn’t that what Pakistan is for? Modi’s policies and vision are turning a once-secular democracy into a theocratic, “ethno-religious state.”

India’s story could hold lessons for Muslims elsewhere. Across the border, Pakistan long ago established what India’s B.J.P. seeks: an ethno-religious state dominated by the majority. In Pakistan’s case, this means the hegemony of Sunni Muslims at the expense of minorities such as Shiite Muslims, Ahmadis or Christians.

Mustafa Akyol and Swaminathan S. Anklesaria Aiyar

If India continues on this trajectory, Modi will have turned India into the equivalent of Pakistan. India will then have lost its soul.


“Why India’s Muslims Reach for Liberalism,” by Mustafa Akyol and Swaminathan S. Anklesaria Aiyar, October 30, 2020.

“India once welcomed Muslims like me. Under Modi, it rejects us as invaders.” by Bilal Qureshi, December 17, 2019.

“The Kashmir crisis isn’t about territory. It’s about a Hindu victory over Islam” by Kapil Komireddi, August 16, 2019.

“We Are Witnessing a Rediscovery of India’s Republic,” by Rohit De and Surabhi Ranganathan, December 27, 2019.

“What is Article 370, and Why Does It Matter in Kashmir?” by Vindu Goel, August 5, 2019.

“India’s Muslims: An Increasingly Marginalized Population,” by Lindsay Maizland, August 20, 2020.

The Value of Good Air Traffic Controllers

I can only say that I am incredibly impressed by the controller and his professionalism, guidance, and calmness. Wow!

What does the Future of Primary Care Look Like?

Given the extreme shortage of primary care physicians in the United States, it’s no surprise that patients are increasingly being seen by nurse practitioners or physician assistants. While virtual visits and telehealth were already slowly rising, the COVID-19 pandemic greatly accelerated their acceptance by patients and healthcare providers alike. The primary care office, therefore, is ripe for disruption and change. Laura Landro invites us to imagine a new future of primary care:

After uploading data from your home blood pressure monitor and electronic scale, you get a call from your health coach to talk about getting more exercise. To help with anxiety issues, you schedule a virtual visit with your mental-health social worker. When it’s time for an in-person checkup, you head to the clinic to be evaluated by the nurse practitioner or physician assistant. And it’s all covered by your health plan.

The traditional experience of getting health care is shifting away from the solo doctor with limited time to spend with each patient and few incentives to promote wellness. Instead, in the future, patients will be more likely to see a team of health-care professionals whose compensation is linked to keeping patients healthy. That team may be led by a doctor, but with a growing shortage of physicians, a nurse practitioner is increasingly likely to be in charge. Patients will also receive more care virtually and in nontraditional settings such as drugstore clinics.

Laura Landro

At first glance, it may seem a pleasant future—one without the long wait times and the hassle of trying to cram a year’s worth of ailments into a 20-minute visit with a primary care physician. On deeper thought, however, it becomes clear that this imagined future not only requires patients to continue coordinating their own care among an even greater number of offices, but also requires them to see a multitude of different providers and specialists.

Caring for patients where each aspect of the body is treated completely separately often leads to miscommunication between providers, incomplete and ineffective care, and overlooked concerns. It also requires more travel, more coordination, and more collaboration between disparate teams with incompatible systems.

Imagining a future in which technology “optimizes” healthcare into an assembly line where patients are shuffled from one area to another is not at all appealing. However, there is still some hope. Many providers are experimenting with other approaches to healthcare.

Although a majority of primary-care doctors work for large health systems, independent doctors are forming their own networks or testing new approaches to offering care. Some are creating so-called direct primary care practices that bypass insurers and charge patients a monthly fee—a more affordable version of concierge medicine. Doctors are also linking up with retail clinics. Over the next five years, Walgreens Boots Alliance and VillageMD plan to open 500 to 700 physician-led clinics attached to Walgreens drugstores in 30-plus markets. Their teams will include pharmacists, nurse practitioners and physician assistants; patients will get custom care plans, annual wellness visits and 24/7 access to providers via telemedicine.


Letting patients be seen at different times and places according to their own schedules may add some convenience, but it is not conducive to effective healthcare. The underlying issue should not be addressed by technology that lets people come in whenever they want, but rather by better and more flexible workplace policies that make it easier for employees to visit healthcare providers and care for their children and families, and by better transportation systems that let people easily get wherever they need to go. Solving these root problems will immediately improve the healthcare of the entire population.

Medicine is about treating patients as human beings. Let’s bring that attitude into every aspect of policy-making.

The key is to get away from a system of paying providers a fee for each service, and only for in-person visits. The federal Centers for Medicare and Medicaid Services and private insurers are moving closer to value-based purchasing such as paying providers a fixed monthly fee per patient for a range of services, often with incentives for better managing diabetes, heart disease, asthma and other chronic ailments.


This method, when used carefully, can help bring down the administrative costs of healthcare. But it can also be used to game the system, as I’ve written about in the past and thought about extensively while doing research into developing metrics for free clinics in Worcester.

A future in which primary care is provided in non-traditional settings, for example at walk-in retail clinics, urgent care centers, and virtual appointments, is disjointed. It invites important information to go unnoticed. Disjointed care increases the administrative costs, as each individual care center must maintain some amount of overhead. Instead of consistent staffing, patients bounce back and forth between providers. Patients don’t benefit, but insurance and healthcare companies do.

Modern medicine has become so large in scope that it is simply impractical for a single stand-alone family medicine physician to be adequately trained to treat the immense variety of symptoms and diseases that patients present with. Specialists are therefore necessary.

My work in producing short films for the Admissions Department taught me the importance of relying on the expertise of others, and I think there are many parallels between the film industry and medicine. I found that just as a director leads a filmmaking team of camera operators, lighting specialists, sound specialists, editors, and actors, a physician leads a healthcare team of specialists, nurses, and other providers. Both are responsible for guiding their teams toward a shared vision: for one, a completed film; for the other, the complete care of a patient. There is a need for a physician to guide patient care.

Physician groups have pushed back against removing restrictions on nurse practitioners and physician assistants. The American Academy of Family Physicians, for example, contends that there is no equivalency between a doctor and someone who isn’t one, and that patient safety requires doctors to be in the lead in medical teams, to step in if patients have complex problems or there is uncertainty over treatment.


Given the 14-year training process for physicians, it is clear that “there is no equivalency between a doctor and someone who isn’t one.” NP and PA training is more limited in scope and is focused on treating patients rather than on acutely understanding disease processes and the science behind them. Physicians’ training is built on the old ideal of nosology and is founded on the idea that physicians diagnose and nurses carry out the physicians’ orders. This certainly isn’t meant to disparage nurse practitioners or physician assistants, many of whom perform some of the same roles as physicians. In fact, most tasks that physicians perform do not require their level of training, especially given the increased administrative burden of modern medical practice.

I envision two potential scenarios for the future of medical education.

The first is that, given the severe shortage of primary care physicians, medical schools and residency programs will begin to open more seats and expand. In fact, this is already happening. The Association of American Medical Colleges reports that since 2002, allopathic medical schools have increased enrollment by 31%. When combined with osteopathic medical schools, which have been growing astonishingly quickly, overall medical school enrollment is up 52% over the same period.

A lot has been written about the importance of the physician’s “touch.” Humans need physical contact in order to comfort one another. Touch is an incredibly important way to understand the body, and physicians use touch often to help diagnose problems and comfort patients. Unfortunately, with the rise of diagnostic imaging, we are moving away from this. Virtual healthcare makes this shift even more pronounced.

This was a constant discussion point in my human factors of medicine and medical writing courses. Modern medical students are more reluctant to touch the patient. This inhibits creating an effective patient-physician relationship.

Image courtesy of Proto Magazine, 2020

The second potential future, which I think is far more likely, is that primary care physicians will be replaced by primary care nurse practitioners and physician assistants. Many states already allow both nurse practitioners and physician assistants substantial autonomy in treating patients without the supervision of a physician. More states are likely to follow.

As medical schools become more and more competitive, and as specialist salaries rise while primary care physicians’ salaries remain stagnant, the individuals motivated to push through medical school will almost all specialize. Medical school applicants will therefore self-select toward becoming specialists. Those who do want to practice primary care will look elsewhere: rather than earning an M.D. and becoming a “doctor,” they will pursue other healthcare pathways and become nurse practitioners or physician assistants.

This future will mean that medical school will continue to become more and more competitive, with increasing MCAT scores, GPAs, and extracurriculars. It will completely shift the way that medicine is practiced and doctors will be even more focused on diagnosing diseases and less on treating the patient as a person.

The problem is that we are creating jobs that overlap significantly in their roles but have different training requirements and standards.

If someone wants to become a nurse, a separate profession focused on patient care, they should be able to do so; it is an important profession with its own specific requirements. If someone wants to become a physician, a profession focused on the treatment and management of disease, they should be able to do so without artificial barriers standing in the way. And someone who wants to do the work of a physician should not first need to become a nurse and then complete additional training in order to practice like a physician without the full responsibility of one.

Imagining a future in which physicians look at patients as puzzles to be solved and disjointed technology replaces a physician’s visit is not at all appealing. Instead, it pushes doctors even further away from primary care, because they are left with the task of administration rather than patient care. Most physicians go into medicine precisely because medicine is about treating patients as human beings.

Ultimately, both solutions are stopgap measures. The actual solution is to increase the number of residency positions available, reduce administrative costs and overhead, and proportionally increase the compensation of primary care physicians so that it becomes more attractive than other specialties.

The solution is to make medical education more inclusive and less competitive, and to remove unnecessary hurdles like the MCAT and VITA, or at least make them less needlessly difficult for the purpose of “weeding people out.” Allow the same people who would otherwise become nurse practitioners to instead become physicians and gain equivalent training and a greater breadth of practice.

Altogether, these solutions allow more people to become primary care physicians. We all benefit from that.


“The New Doctor’s Appointment,” by Laura Landro, September 9, 2020.

“U.S. Medical School Enrollment Surpasses Expansion Goal,” July 25, 2019.

“Where Can Nurse Practitioners Work Without Physician Supervision?” October 26, 2016.

“The Primary Problem,” by Linda Keslar, January 27, 2020.


© 2017 - 2021 | Sapience Laboratories