Visual Rhetoric

The modern world hinges on the ability to create a better user experience through carefully crafted design.

The Expensive Intersection of Aviation and Medicine

The Basis of Modern Innovation

I’ve written before that one of Leonardo da Vinci’s fatal flaws was his extremely broad range of interests. He was so compelled by new subjects that he often failed to finish what he started. Still, I think he was the embodiment of the couplet, “Jack of all trades, master of none; though oftentimes better than the master of one.”

The School of Athens, by Raphael, 1511. Image courtesy of Wikipedia, 2007

This concept of the “Renaissance Man” fascinated me during our discussion in history class. It was born from the philosophy of Renaissance humanism, which held that humans have a limitless capacity for development. It became set in stone in the early 16th century when Baldassare Castiglione wrote The Book of the Courtier, where he describes the concept of the Renaissance Man and the ideals that make the perfect courtier. Key was the concept of sprezzatura, which Castiglione defined as “a certain nonchalance, so as to conceal all art and make whatever one does or says appear to be without effort and almost without any thought about it.”

During the Renaissance, through careful and dedicated study, a single person could know the entire breadth and depth of a subject and advance it further. Today, however, advancement has become the domain of specialists; it takes significant knowledge and training to reach the forefront of a particular field. This is where modern innovation differs vastly from that of antiquity.

Recently, I came across a video from Thomas Frank that I found insightful, where he discusses a quote from an interview Steve Jobs did in 1996:

Creativity is just connecting things. When you ask creative people how they did something, they feel a little guilty because they didn’t really do it, they just saw something. It seemed obvious to them after a while. That’s because they’re able to connect experiences they’ve had and synthesize new things. And the reason they were able to do that was that they’ve had more experiences or that they have thought more about their experiences than other people.

Steve Jobs

Jobs goes on to explain that many people do not have enough “dots to connect.” In other words, their experiences do not have a sufficient breadth to innovate creatively and see the bigger picture. In Generating the Dots vs. Connecting the Dots, I discussed this very idea, echoing the argument of Vikram Mansharamani, who explains that “breadth of perspective and the ability to connect the proverbial dots (the domain of generalists) is likely to be as important as depth of expertise and the ability to generate dots (the domain of specialists).”

With breadth of perspective, seeing, finding, and even making connections becomes obvious. None of this is to say that being creative is easy, however. In fact, it’s quite the opposite. Putting a system in place to continuously explore the world, gain new experiences, and become well-versed in a variety of different disciplines requires commitment and dedication in its own right. It requires a willingness to be uncomfortable and to push boundaries. This act of collecting dots by exploring the world is the fundamental basis of creativity.

This reminded me of the Renaissance Man, someone who has profound knowledge in a variety of diverse disciplines. This broad base of knowledge allows for creative problem solving and for examining issues from a variety of perspectives, to understand them in a way that no one else can. Ultimately, Frank argues, this comes from building the habit of exploring the world.

In his book Atomic Habits, James Clear mentions that he “felt like an imposter” when he began writing about habits. By his own admission, he was not an expert on the topic in the traditional sense. He did not have a PhD in psychology and had no formal authority to write one of the defining books on habit building; rather, he learned through personal experimentation and experience. Over time, however, he did “become known as an expert on habits.”

Adopting this label, he says, was “uncomfortable,” but this concept of identity became a key principle in his book. One example he mentions is the idea of using identity instead of goal setting to facilitate building effective habits. Instead of setting a goal of, say, reading two books per month, it is more useful to take on the identity of a “reader.” A reader would, instead of watching TV or a movie at night, open up a book and read before going to sleep. Thinking of habits in the context of identity is an incredibly powerful psychological tool. At the same time, it seems odd that all of this advice is not necessarily scientifically backed, but rather is the opinion of one random person, however effective it may be.

I like to contrast James Clear with Adam Savage. In most regards, Adam Savage considers himself a generalist, someone who dabbles in all sorts of different disciplines but is not necessarily a master of any one in particular. He takes great pains to say, for example, that he is not a machinist, preferring to be called a “machine operator.” I think this is a valid argument, one that shows a level of humility and self-awareness. It shows respect for the immense work and journey that talented people have undertaken to gain their expertise. I appreciate that.

Thomas Frank provides an excellent rebuttal to Adam Savage’s line of thinking. It’s okay to take on other labels and identities. It provides an opportunity to think with the mindset of that identity.

And you might feel this kind of pressure as well, maybe from several labels. Your profession, your gender, your age. In one way or another, each one of these whispers, “Stay in your lane.” Well, we might feel safe if we stay in our lane, but we sure as heck are not going to feel inspired. So as you explore and as you pursue things that interest you, don’t allow labels to give you tunnel vision. Don’t let a label push you away from something that interests you. A label can be useful as a communication device. It can help guide your audience, potential clients, potential employers to you, it can help you to find your niche, but it can also put your mind in a box if you let it. So if you can avoid doing that, if you can keep pursuing the things that interest you, if you can keep broadening your array of experiences and learning sources, you’re going to find the quality of your creative output going up. But moreover, you’re going to find yourself more inspired and more often getting lost in the work that you do.

Thomas Frank

This was an internal dilemma for me as well. For example, I always debated whether or not to call myself a filmmaker, a writer, or a storyteller. I never really felt up to the mark, so to speak. However, taking on those labels was an important step in reminding myself to post on this blog more frequently and to not be afraid to put my thoughts on the page. Over time, it actually became a reality, especially as I’ve built a repertoire on this blog.

Ultimately, taking on an identity is a useful tactic to help build the habits that are necessary to actually achieve that identity.


“Generating the Dots vs. Connecting the Dots,” by Sahil Nawab, July 28, 2020.

“Jack of All Trades,” by Sahil Nawab, May 31, 2019.

“Harvard lecturer: ‘No specific skill will get you ahead in the future’—but this ‘way of thinking’ will,” by Vikram Mansharamani, June 15, 2020.

“Atomic Habits,” by James Clear, October 16, 2018.

What is the Value of Academic Titles?

The recent op-ed by Joseph Epstein deriding Jill Biden’s credentials and her right to use the title “Doctor” rekindled the debate surrounding the use of such titles for holders of doctorates and medical doctors alike. His article is “ostensibly a foray into an ongoing debate over whether only medical doctors can claim the title.” This is certainly an interesting question, especially from a journalistic perspective, but upon further reading, it becomes clear that the piece is not a logical argument but a personal tirade against the way that academia has modernized and progressed over the years.

Putting aside the misogynistic and elitist views for a moment, Epstein complains that “getting a doctorate was then an arduous proceeding” where a secretary would sit outside the room with a pitcher of water “for candidates who fainted.” He goes on to say that dissertation defenses are now akin to friendly, social gatherings. The fact that academia in days past was lauded for putting students through abuse, mistreatment, and elitism for the purpose of “building character” shows just how much it has changed.

This antiquated view on the education system is partly to blame for the continued challenges that women and people of color face in academia. For them, “an academic title can be a tool to remind others of their expertise in a world that often undermines it.”

Weirdly, rather than creating a sound argument predicated on logic and reason, Epstein simply regresses into attacking Biden’s credentials. Instead of engaging in debate on the merits of using academic titles, perhaps arguing that it might seem arrogant, he dismisses her degree and dissertation.

On the first day of class, Debbie Gale Mitchell, a chemistry professor at the University of Denver, introduced herself to her students, telling them about her Ph.D. and her research. She told her students they could call her either “Dr. Mitchell” or “Debbie.” A male colleague had told her that he went by his first name and that students were friendlier as a result, so Mitchell decided to try it. Many students chose to call her “Debbie.”

Then one day a student asked if she thought she’d ever get a Ph.D.

“I discovered that for me, the use of my title is VITAL to remind students that I am qualified to be their professor,” Mitchell wrote on Twitter.

Allie Weill


“Is There a Doctor in the White House? Not if You Need an M.D.” by Joseph Epstein, December 11, 2020.

“A Wall Street Journal op-ed about Jill Biden pairs virulent sexism with academic elitism,” by Cameron Peters, December 12, 2020.

“Professor FLOTUS: How Jill Biden would redefine what it means to be first lady,” by Kate Andersen Brower, November 7, 2020.

“Whom does The New York Times consider a doctor?” by R.J. Lehmann, October 27, 2015.

“Is There a Doctor in the House?” by Mariana Grohowski, March 26, 2018.

“Should All Ph.D.’s Be Called ‘Doctor’? Female Academics Say Yes,” by Allie Weill.

“Today’s College Classroom Is a Therapy Session,” by Joseph Epstein, August 28, 2020.

India is Losing its Soul

The astonishing diversity of people and cultures in India is one of the defining characteristics of the nation. People with varied cultural backgrounds, including language, customs, music, dance, food, and even religion, all coalesce in big cities. They attend one another’s weddings, celebrate one another’s achievements, grieve with one another at funerals, and support one another through hardships. Living alongside people from so many different cultures creates a sense of appreciation for others and their stories. This, I believe, is the soul of India.

This sentiment comes to me through the stories that my parents told of their upbringing in Bombay. These stories greatly informed my own outlook, especially in light of the vitriolic rhetoric we face in both the U.S. and India. There are many similarities between the two countries, and while the two are certainly not perfectly analogous, I think it is worthwhile to compare and contrast them to glean insight into how each grapples with the issues surrounding diversity and discrimination in all aspects of society.


There are clearly a number of parallels between the two nations. Both were once British colonies that have since developed into secular, constitutional democracies. Although 170 years separate their independence, the histories of each were shaped by one another. The secular ideals of the U.S. Constitution heavily influenced its Indian counterpart. Civil disobedience and the non-violent protests first championed by Mahatma Gandhi were instrumental to Martin Luther King Jr.’s vision of the civil rights movement.

These shared ties originate because both nations are built from a mosaic of diverse cultural heritages, with significant minority populations. While some celebrate this diversity today, others use it to sow discord into society, magnifying the differences between people to turn them against one another. Catalyzing turmoil is a tactic used by tyrants to advance their own agendas at the expense of society. Pitting friends against one another allows them to hide their true intent, and dehumanizing the “other side” lends them the credence to act with impunity.

Whether through policies such as the Citizenship Amendment Act and the National Register of Citizens, or by revoking the statehood of Muslim-majority Kashmir and Jammu, it is increasingly clear that Modi and the BJP thrive on pitting Hindus and Muslims against one another to rally support for their own personal agendas.

However, the Hindu-Muslim rivalry was not always this fervent. For the majority of Indians, religion was not nearly as much a defining characteristic as other factors such as language, customs, music, and food. India was a mosaic of “dizzyingly diverse” and multicultural communities. Completely ignoring this concept of an Indian identity, the British hastily decided to split the country in two, sowing the seeds of discord for decades to come.

This brutal process, called the Partition, separated families and communities on the basis of religion, which had little to do with how people distinguished themselves. Two families on opposite sides of the border may have more in common with one another—from language to customs to food—than a family from Eastern India has with one from Western India. This is especially true between North India and South India, where language and customs are markedly more important to identity.

Many Muslim families split over whether to leave for this imagined separate homeland or to remain in India, where, despite the brutality of partition, the ardently secular Nehru reassured them that they had a home. He articulated his ideal of a composite Indian citizen, who was enriched and shaped by all the heritages that flowed through the world’s most diverse society.

As a child of the neighboring Islamic republic (and a steady consumer of Indian popular culture), I grew up admiring that multilingual, kaleidoscopic country. Later, I pursued my education at American universities, in classrooms led by the children of Nehruvian India, and my professors’ stories of religious coexistence inspired me to want to visit that alternative version of South Asia. From afar, India always seemed to be a symphonic banquet of possibilities, in contrast with the monochromatic vision of Pakistan’s religious leaders.

Bilal Qureshi

This is the same vision of India that I grew up with. The stories that my parents told me of their own upbringing in Mumbai, where people of all different cultures, languages, and backgrounds mix, were instrumental in shaping my own understanding of the “melting pot” culture of the United States. Looking at the U.S. through this lens, multiculturalism and tolerance seemed to be a big component of America’s growth and rise to power. It was fascinating to live at the intersection of many different cultures: growing up in a mostly white, Protestant suburban town and attending public school, then going to Catholic school for two years, then a public STEM school, and having friends from throughout India and Pakistan.

This perspective was invaluable to my understanding of the world. It was important to me that I connect to people who are different from me and learn from them and their stories. Whether that difference was in culture, religion, or viewpoints on political issues, speaking with other people taught me the immense nuance that is required to effectively develop an opinion on any given topic.

Creating a society where people are tolerant of one another is an important part of this process. The history of India has shown just how successful a government that encourages multiculturalism and social discourse can be. Today, a secular government that protects the civil liberties of its people is absolutely critical.

[In] recent years Prime Minister Narendra Modi and his Bharatiya Janata Party (BJP) have systematically dismantled Nehru’s vision for India. This month, India’s Parliament passed a new bill that enshrines in law a religiously inflected definition of who belongs in India. The Citizenship Amendment Bill provides a path to citizenship for undocumented migrants from Bangladesh, Afghanistan and Pakistan who are Hindus, Christians, Jains, Parsees and Buddhists, but explicitly excludes Muslims.

[. . .]

The law is more than just the latest in a series of Hindu-nationalist and Islamophobic policies. In a nation that is home to the largest Muslim population outside Muslim-majority countries, the bill extends an ideological project that breaks the very promise of India.

Bilal Qureshi

The very promise of India is written in the preamble of its Constitution: like the U.S., it is a secular democratic republic. However, unlike the U.S., there are no explicit guarantees of a separation of “church and state.” It is therefore paramount that Indian citizens continue their fight against an oppressive government that insists on a religious basis for citizenship. Consider other religious states—Saudi Arabia, Israel, Iran, even Pakistan—and the serious flaws of state-sanctioned religion become obvious.

The Constitution is meant to be a protector of rights and a protector of people. When first introduced, the Indian Constitution was “regarded by some as an elite document drafted in an alien language.” While that was true, having been written in English and incomprehensible to the vast majority of Indians, its promise became a beacon of hope for countless Indians looking for an escape from the brutality of the Partition, and it helped solidify a multicultural and tolerant society.

Today, Modi is laying siege to the Constitution. Rohit De reminds us that “the Constitution was ‘not just dull, lifeless words . . . but living flames intended to give life to a great nation, . . . tongues of dynamic fire, potent to mold the future.'”

Modi’s sudden takeover in Kashmir is the fulfillment of a long ideological yearning to make a predominantly Muslim population surrender to his vision of a homogeneous Hindu nation. It is also a way of conveying to the rest of India — a union of dizzyingly diverse states — that no one is exempt from the Hindu-power paradise he wants to build on the subcontinent. Kashmir is both a warning and a template: Any state that deviates from this vision can be brought under Delhi’s thumb in the name of “unity.”

Those who believe that such a day will never come — that India’s democratic institutions and minority protections will assert themselves — also never thought that someone like Modi would one day lead the country. Modi once seemed destined to disappear into history as a fanatical curio. As the newly appointed chief minister of Gujarat, he presided over the worst communal bloodletting in India’s recent history in 2002, when 1,000 Muslims, by a conservative estimate, were slaughtered by sword-wielding Hindus in his state over several weeks. Some accused Modi of abetting the mobs; others said he turned a blind eye to them. The carnage made Modi a pariah: Liberal Indians likened him to Hitler, the United States denied him a visa, and Britain and the European Union boycotted him.

Kapil Komireddi

The erosion of norms by both Trump and Modi is a stark reminder that despite the protections afforded by the Constitution or by political norms, it is up to the people to maintain their own government.

Muslims in India “have faced lynchings, lethal riots, and social and political disenfranchisement,” especially in recent years. The responses of Black Americans during the civil rights movement and of Muslims in India today have been similar: a strengthened commitment to the ideals held in their respective constitutions.

When minorities are pushed to such walls, they may retreat into a siege mentality that breeds radicalization. But India’s Muslims have not come up with calls for violent jihad, nor chants for Shariah law. Instead, they have embraced and emphasized the blessings of liberal democracy by placing their faith in the Constitution of India and insisting on their constitutional rights as citizens.

[. . .]

The B.J.P.’s propaganda machine depicted Muslim protesters as “traitors” and “anti-nationals,” but they were wearing headbands saying, “I love India,” waving Indian flags, and repeatedly singing the national anthem.

Mustafa Akyol and Swaminathan S. Anklesaria Aiyar

It is important that protesters not fall into the trap of violence. Relying on civil disobedience may not seem to work, especially in the face of tyrants who thrive on bigotry and division. But the struggles, and victories, of both Gandhi and King showed that non-violence and unity create a force for good that simply cannot be stopped in the long term.

This is fundamentally why both the Indian independence movement and the U.S. civil rights movement were so successful.

BJP leaders, in addition to marginalizing religious minorities, are erasing Nehru’s secular vision. They have crafted an alternative national narrative that recasts the country’s Hindu majority as victims and its era of Muslim empires as one of loss and shame. In the words of historian Sunil Khilnani, they have “weaponized history,” rewriting a period of composite Muslim dynasties such as the Mughals, who built the Taj Mahal and governed with multicultural courts, as a time of conquest by outsiders.

Bilal Qureshi

The whole point of India is that it is a secular nation. I find it quite odd that the Hindu nationalist government is instituting religious ideals in its policies. Isn’t that what Pakistan is for? Modi’s policies and vision for India are turning a once secular democracy into a plutocratic, “ethno-religious state.”

India’s story could hold lessons for Muslims elsewhere. Across the border, Pakistan long ago established what India’s B.J.P. seeks: an ethno-religious state dominated by the majority. In Pakistan’s case, this means the hegemony of Sunni Muslims at the expense of minorities such as Shiite Muslims, Ahmadis or Christians.

Mustafa Akyol and Swaminathan S. Anklesaria Aiyar

If India continues on this trajectory, Modi will have devolved India into the equivalent of Pakistan. India will then have lost its soul.


“Why India’s Muslims Reach for Liberalism,” by Mustafa Akyol and Swaminathan S. Anklesaria Aiyar, October 30, 2020.

“India once welcomed Muslims like me. Under Modi, it rejects us as invaders.” by Bilal Qureshi, December 17, 2019.

“The Kashmir crisis isn’t about territory. It’s about a Hindu victory over Islam” by Kapil Komireddi, August 16, 2019.

“We are Witnessing a Rediscovery of India’s Republic,” by Rohit De and Surabhi Ranganathan, December 27, 2019.

“What is Article 370, and Why Does It Matter in Kashmir?” by Vindu Goel, August 5, 2019.

“India’s Muslims: An Increasingly Marginalized Population,” by Lindsay Maizland, August 20, 2020.

The Value of Good Air Traffic Controllers

I can only say that I am incredibly impressed by the controller and his professionalism, guidance, and calmness. Wow!

What does the Future of Primary Care Look Like?

Given the extreme shortage of primary care physicians in the United States, it’s no surprise that patients are increasingly being seen by nurse practitioners and physician assistants. While virtual visits and telehealth were already slowly on the rise, the COVID-19 pandemic greatly accelerated their acceptance by patients and healthcare providers alike. The primary care office, therefore, is ripe for disruption and change. Laura Landro invites us to imagine a new future of primary care:

After uploading data from your home blood pressure monitor and electronic scale, you get a call from your health coach to talk about getting more exercise. To help with anxiety issues, you schedule a virtual visit with your mental-health social worker. When it’s time for an in-person checkup, you head to the clinic to be evaluated by the nurse practitioner or physician assistant. And it’s all covered by your health plan.

The traditional experience of getting health care is shifting away from the solo doctor with limited time to spend with each patient and few incentives to promote wellness. Instead, in the future, patients will be more likely to see a team of health-care professionals whose compensation is linked to keeping patients healthy. That team may be led by a doctor, but with a growing shortage of physicians, a nurse practitioner is increasingly likely to be in charge. Patients will also receive more care virtually and in nontraditional settings such as drugstore clinics.

Laura Landro

At first glance, it may seem a pleasant future—one without the long wait times and the hassle of trying to cram a year’s worth of ailments into a 20-minute visit with a primary care physician. Upon deeper thought, however, it becomes clear that this imagined future requires patients not only to keep coordinating their own care across an even greater number of offices, but also to see a multitude of different providers and specialists.

Caring for patients where each aspect of the body is treated completely separately often leads to miscommunication between providers, incomplete and ineffective care, and overlooked concerns. It also requires more travel, more coordination, and more collaboration between disparate teams with incompatible systems.

Imagining a future in which technology “optimizes” healthcare into an assembly line where patients are shuffled from one area to another is not at all appealing. However, there is still some hope. Many providers are experimenting with other approaches to healthcare.

Although a majority of primary-care doctors work for large health systems, independent doctors are forming their own networks or testing new approaches to offering care. Some are creating so-called direct primary care practices that bypass insurers and charge patients a monthly fee—a more affordable version of concierge medicine. Doctors are also linking up with retail clinics. Over the next five years, Walgreens Boots Alliance and VillageMD plan to open 500 to 700 physician-led clinics attached to Walgreens drugstores in 30-plus markets. Their teams will include pharmacists, nurse practitioners and physician assistants; patients will get custom care plans, annual wellness visits and 24/7 access to providers via telemedicine.


While it may add some convenience for patients to be seen at different times and places according to their own schedules, this is not conducive to effective healthcare. The issue should not be addressed by technology that lets people come in whenever they want, but rather by better and more flexible workplace policies that allow employees to visit healthcare providers and care for their children and families, and by better transportation systems that let people easily get wherever they need to go. Solving these root problems would immediately improve the healthcare of the entire population.

Medicine is about treating patients as human beings. Let’s bring that attitude into every aspect of policy-making.

The key is to get away from a system of paying providers a fee for each service, and only for in-person visits. The federal Centers for Medicare and Medicaid Services and private insurers are moving closer to value-based purchasing such as paying providers a fixed monthly fee per patient for a range of services, often with incentives for better managing diabetes, heart disease, asthma and other chronic ailments.


This method, when used carefully, can help bring down the administrative costs of healthcare. But it can also be used to game the system, as I’ve written about in the past and thought about extensively while doing research into developing metrics for free clinics in Worcester.

A future in which primary care is provided in non-traditional settings, such as walk-in retail clinics, urgent care centers, and virtual appointments, is disjointed. It allows important information to go unnoticed. Disjointed care also increases administrative costs, as each individual care center must maintain some amount of overhead. Instead of consistent staffing, patients bounce back and forth between providers. Patients don’t benefit, but insurance and healthcare companies do.

Modern medicine has become so large in scope that it is simply impractical for a single family medicine physician to be adequately trained to treat the immense variety of symptoms and diseases that patients present with. Therefore, specialists are necessary.

My work producing short films for the Admissions Department taught me the importance of relying on the expertise of others, and I think there are many parallels between the film industry and medicine. Just as a director leads a filmmaking team of camera operators, lighting specialists, sound specialists, editors, and actors, a physician leads a healthcare team of specialists, nurses, and other providers. Both are responsible for guiding their teams toward a shared vision: one a completed film, the other the complete care of a patient. There is a need for a physician to guide patient care.

Physician groups have pushed back against removing restrictions on nurse practitioners and physician assistants. The American Academy of Family Physicians, for example, contends that there is no equivalency between a doctor and someone who isn’t one, and that patient safety requires doctors to be in the lead in medical teams, to step in if patients have complex problems or there is uncertainty over treatment.


Given the 14-year training process for physicians, it is clear that “there is no equivalency between a doctor and someone who isn’t one.” NP and PA training is more limited in scope and is focused on treating patients rather than acutely understanding disease processes and the science behind them. Physicians’ training is built on the old ideal of nosology, founded on the idea that physicians diagnose and nurses carry out the physicians’ orders. This certainly isn’t meant to disparage nurse practitioners or physician assistants, many of whom perform some of the same roles as physicians. In fact, most tasks that physicians perform do not require their level of training, especially with the increased administrative burden of modern medical practice.

I envision two potential scenarios for the future of medical education.

The first is that, given the severe shortage of primary care physicians, medical schools and residency programs will begin to open up more seats and expand. In fact, this is already happening. The Association of American Medical Colleges reports that since 2002, allopathic medical schools have increased enrollment by 31 percent. When combined with osteopathic medical schools, which have been growing astonishingly quickly, overall medical school enrollment is up 52 percent over the same period.

A lot has been written about the importance of the physician’s “touch.” Humans need physical contact in order to comfort one another. Touch is an incredibly important way to understand the body, and physicians use touch often to help diagnose problems and comfort patients. Unfortunately, with the rise of diagnostic imaging, we are moving away from this. Virtual healthcare makes this shift even more pronounced.

This was a constant discussion point in my human factors of medicine and medical writing courses. Modern medical students are more reluctant to touch the patient. This inhibits creating an effective patient-physician relationship.

Image courtesy of Proto Magazine, 2020

The second potential future, which I think is far more likely, is that primary care physicians will be replaced by primary care nurse practitioners and physician assistants. Many states already allow both nurse practitioners and physician assistants substantial autonomy in treating patients without the supervision of a physician. More states are likely to follow.

As medical schools become more and more competitive and specialist salaries increase while primary care physicians’ salaries remain stagnant, the individuals who are motivated to push through medical school will almost all specialize. Medical school applicants will therefore self-select to become specialists. Those who do want to practice primary care will look elsewhere. Rather than becoming a “doctor” with an M.D. degree, they will instead pursue other healthcare pathways and become nurse practitioners or physician assistants.

This future will mean that medical school will continue to become more and more competitive, with increasing MCAT scores, GPAs, and extracurriculars. It will completely shift the way that medicine is practiced, and doctors will be even more focused on diagnosing diseases and less on treating the patient as a person.

Imagining a future in which physicians look at patients as puzzles to be solved, and in which disjointed technology replaces a physician’s visit, is not at all appealing. This trajectory pushes doctors even further away from primary care, leaving them with the task of administration rather than patient care. The reason that most physicians go into medicine is that medicine is about treating patients as human beings.

Ultimately, both solutions are stopgap measures. The actual solution is to increase the number of residency positions available, reduce administrative costs and overhead, and proportionally increase the compensation of primary care physicians so that it becomes more attractive than other specialties.

The solution is to make medical education more inclusive and less competitive, and to remove unnecessary steps like the MCAT and VITA, or at least make them less unnecessarily difficult for the purpose of “weeding people out.” Allow the same people who would otherwise become nurse practitioners to instead become physicians and gain even more training and a greater breadth of practice.

Altogether, these solutions would allow more people to become primary care physicians. We all benefit from that.


“The New Doctor’s Appointment,” by Laura Leandro, September 9, 2020.

“U.S. Medical School Enrollment Surpasses Expansion Goal,” July 25, 2019.

“Where Can Nurse Practitioners Work Without Physician Supervision?” October 26, 2016.

“The Primary Problem,” by Linda Keslar, January 27, 2020.


Are Healthcare Metrics only for Billing?

Health policy is an important way of effecting change in society at scale. When acting at scale, it is necessary to make assumptions and generalizations that do not take into account all of the nuances and complexities of a society that is not inherently equal and just. As a consequence, health policy that may seem effective at first glance may actually be complicit in creating inequities that perpetuate society’s injustice.

Today, the care of patients in hospitals and outpatient clinics, like anything else, is a business. Businesses are focused on efficiency and the optimization of profits. It therefore follows that almost all metrics in healthcare are designed from an administrative or billing perspective. However, the institutional bias that results from policy, though far more insidious, can be addressed through careful design of metrics and by separating compensation from outcomes (at least until a more equitable society is formed).

Navathe and Schmidt show one of the pitfalls of using metrics where the perceived rhetorical purpose to outsiders, and perhaps even to the designers, is improving care for all patients, while the practical purpose to hospitals and clinics is optimizing profits. For metrics to be effective, it is necessary to consider their possible rhetorical uses while designing them.

When metrics are used to determine compensation, people will always attempt to game the system. Instead of allowing physicians to be rewarded for treating patients, they are incentivized to treat their “score.” It would be naïve to assume otherwise. Unfortunately, tying healthcare outcomes to compensation is fundamentally flawed in a society where hospitals can “refuse” to see patients in implicit, less obvious ways.

The byproduct of tying outcomes to compensation and using metrics to quantify outcomes results in a situation where hospital administrators limit “unprofitable services like psychiatry wards either by keeping only a small number of spots for patients or by simply not offering a dedicated psychiatry ward at all.” These metrics “create incentives for hospitals to avoid patients from these groups” because patients in minority populations are “economically unattractive to hospitals.” Chronically understaffing preventative care offices has deep repercussions in the form of worse patient care and increased overall costs over a patient’s lifetime.

With each of these types of payment models, the initial intention regarding social justice may be unclear, unknown or even aimed at promoting it. A value-based payment reform model seems as innocent as a daisy and worlds apart from the most overt forms of structural racism, such as segregated transportation or drinking fountains. Yet, far too often, such models share the consequence of systematically disadvantaging some groups, whether as a result of the design of policies or culturally ingrained behavioral patterns.

Amol S. Navathe and Harold Schmidt

In today’s data-focused society, healthcare metrics are indeed critical tools for assessing the functioning of our healthcare system. However, it is important to keep in mind that they have limitations and can easily be flawed. Ultimately, metrics must be used with extreme care to ensure that their unintended consequences are fully thought through.


“Why a Hospital Might Shun a Black Patient,” by Amol Navathe and Harold Schmidt, October 6, 2020.


The Rich vs. The Elite

Popular media, in whatever form it takes, is an incredible resource for analyzing the aspirations and goals of a society. It serves as a mirror for its intended audience to look into, and it can offer a glimpse through a window into another world for those at whom it is not directly aimed.

For Rob Henderson, this reflection was done through television. He spent his early life shuffling between foster homes, eventually joined the military, went to Yale, and is now a doctoral student at Cambridge University. He was able to directly experience the vast differences between social classes in the United States and beyond. This perspective is invaluable.

Like many people, he initially thought that social class was dependent on money. But, “‘The Fresh Prince of Bel-Air’ taught me that it wasn’t,” he posits. This insight into the differences between monetary wealth and “cultural capital,” as Elizabeth Currid-Halkett discusses, shows that each broadly defined social class has its own set of values that are often reflected in the popular media. The fictional stories that Henderson refers to are perhaps less fictional than they might at first seem.

Early on, I thought of television as a window into another world. I would watch it to escape the one I was in, and to learn more about others. Later, though, it became more like a mirror. The more I saw, the more I learned what I wanted; the shows I chose to watch, in turn, reflected my desire to build a better life for myself, and I took my cues from them on how to construct it. Either stay like this, I thought, as I gazed at the TV, or try to live like that.

Rob Henderson

In fact, both Henderson and Currid-Halkett have discussed similar themes in the past.

Strangely enough, as we ascend that ladder, we encounter a fork: the definition of “elite” becomes bifurcated by wealth and education. Currid-Halkett attributes this to the idea of “cultural capital,” a form of wealth separate from monetary capital. This is where education plays a significant role.

Paul Fussell argues that the criteria we use to define the tiers of the social hierarchy are in fact indicative of our social class. For people near the bottom, social class is defined by money — in this regard, I was right in line with my peers when I was growing up. The middle class, though, doesn’t just value money; equally important is education.

Rob Henderson


“Everything I Know About Elite America I Learned From ‘Fresh Prince’ and ‘West Wing,’” by Rob Henderson, October 10, 2020.


© 2017 - 2021 | Sapience Laboratories