The Eternal Sick Day
Sickness has become characteristic of both impunity and plausible deniability.
Children born in the United States learn an interesting lesson very early in life: if you are found to be sick, you can stay home from school, eat ice cream, and watch TV, regardless of the degree of your illness. Very soon after a first experience of being sick, or of witnessing a sibling stay home sick, it becomes obvious that you can merely state that you do not feel well and, if the claim is convincing enough, you “get to” miss school and enjoy the lazy pleasures of staying home.
This knowledge forms the basis of an ironclad social contract that extends into adulthood. It also opens a responsibility loophole that very few individuals are above exploiting, at least once in a while. If you don’t want to do something (including your job), “calling in sick” is your get-out-of-jail-free card, seen not just as a human right but also as a moral duty to not spread sickness. Therefore, in contemporary society, minor illness has major purchase: it gets you a day off for free. Provided you don’t sick-out beyond what your cohort considers reasonable – and provided you are not an on-the-clock worker who lacks the luxury of playing sick without a financial penalty – claiming sickness will relieve you of many obligations without costing you pay, credibility, or social standing. This makes nearly all claims of minor illness both morally unquestionable and deeply suspect.
But how did this happen? How did sickness become so characteristic of both impunity and plausible deniability? The germ or pathogen theory of disease arose in the nineteenth century with French chemist Louis Pasteur, beating out the lesser-known “terrain” theory proposed by another French chemist, Antoine Béchamp.
For Pasteur, sickness came from without, in the form of germs invading and infecting the systems of larger animals. Béchamp’s terrain theory, on the other hand, held that microbes turn harmful only as the health of the body they inhabit – their “terrain” – declines. Perhaps because germ theory coincided with the rise of measurable scientific data as the arbiter of truth, it became the accepted model, relegating the relatively immeasurable terrain theory to the status of pseudoscience.
Composed of the alien unseen, germ theory is imbued with an air of mystery, often translated allegorically as an outside force taking control: grippe, the old-fashioned term for influenza, literally means to seize or hook. And perhaps because we have chosen, in illness, to disidentify with our bodies, our terrain, the medical industry has been given near-complete authority over our diagnostic conceptions of both health and illness; in granting it that authority, we have also submitted to its control. Thus, our sneakiness around sick days dovetails with the fact that the medical establishment has taken up agentic residence in our collective psyches to such a degree that we no longer feel capable of assessing our own health for ourselves. We might know what we want to watch on TV or eat for dinner, we might proclaim whom we find attractive and where we want to live, but we cannot discern on our own whether we are sick or well. We need a test for that.
Because those in the medical field have been granted and have assumed this authoritative role, the average American has largely abandoned the view that assessing, upholding, and maintaining their mental and physical health is their personal responsibility. Instead, many view maladies as events that simply happen to them, something for a doctor to figure out and fix, in the same manner that the mechanically challenged bring their car to the shop when it’s making that funny noise again. The average Westerner feels helpless before the vicissitudes of disease, and can do nothing but call the doctor, get a diagnosis, pick up a prescription, or schedule a procedure. This has resulted in a new phenomenon: a population that views health not as the general set point of many if not most minds and bodies, but as something external that must be prescribed and consumed.
In grand nursing theory, self-care describes the responsibility an individual has for the management and maintenance of their own health. It is essentially self-doctoring that takes individual responsibility, along with body sovereignty, as a given. This theory also puts forth the idea that a patient should not become overly dependent on the medical system. Instead, patients must practice self-care by looking after themselves until a “self-care deficit” occurs – and this is the point at which a nurse should step in. This belief structure has been endemic to nursing since its inception. In her 1859 book Notes on Nursing: What It Is, and What It Is Not, Florence Nightingale states:
It is often thought that medicine is the curative process. It is no such thing; medicine is the surgery of functions, as surgery proper is that of limbs and organs. Neither can do anything but remove obstructions; neither can cure; nature alone cures. Surgery removes the bullet out of the limb, which is an obstruction to cure, but nature heals the wound. So it is with medicine; the function of an organ becomes obstructed; medicine, so far as we know, assists nature to remove the obstruction, but does nothing more. And what nursing has to do in either case, is to put the patient in the best condition for nature to act upon him. Generally, just the contrary is done.
Since 1986, it has been a legally defined aspect of American social values that the ill must be stabilized and cared for to some degree, whether the patient can afford this care or not. Yet, where once medicine was regarded – by Nightingale and others responsible for formulating the foundational theories of care – as an assistant to health, it is now typically thought of as health, full stop. Thus, it has become an obligatory rite of passage for young people to receive their first identity-defining diagnosis – ADHD, depression, anxiety – and to set out on a life’s journey of consumer-based health care composed of prescriptions and check-ups.
To avoid this diagnostic christening would be seen as negligence, possibly even child abuse. And while technology’s manifest destiny has resulted in an abundance of innovative medical “cures,” we cannot deny that, given the significant amount of medical and pharmaceutical treatment the average American consumes, we are not a people who are overwhelmingly “healthy” on our own – we are often simply medicated into living longer lives of mediocre health. The result of medical imperialism is the diagnosis of ever-increasing existential threats; illness is the marauder who knocks upon the door at midnight. Industrial medicine heroically fights this enemy through endless courses of treatment designed to conquer and control looming health catastrophes, rarely considering that reliance on this form of care might be the problem in the first place.
There is something under the surface here, related to body sovereignty, physical ownership, and the fact that nine-to-five jobs often pay a person not for their products, performance, or ideas, but for the seconds removed from the duration of their lifespan. In a sense, your job owns your body. At least sometimes. And health insurance – which is almost ubiquitously tied to employment in the United States – ensures that a company’s investment is well tended to. As philosopher Ivan Illich posited in his incisive critique of the medical industry, Medical Nemesis, “The higher the salary the company pays, the higher the rank of an apparatchik, the more will be spent to keep the valuable cog well oiled. Maintenance costs for highly capitalized manpower are the new measure of status for those on the upper rungs. People keep up with the Joneses by emulating their ‘check-ups’...”
But health insurance is a relatively new, twentieth-century invention, and its alliance with employment began as a perk rather than as a necessity or a human right. As reported by Alex Blumberg on NPR,
By the late 1920s, hospitals noticed most of their beds were going empty every night. They wanted to get people who weren't deathly ill to start coming in. An official at Baylor University Hospital in Dallas noticed that Americans, on average, were spending more on cosmetics than on medical care. “We spend a dollar or so at a time for cosmetics and do not notice the high cost,” he said. “The ribbon-counter clerk can pay 50 cents, 75 cents or $1 a month, yet it would take about 20 years to set aside [money for] a large hospital bill.”
So the entrepreneurial Baylor hospital started to offer a payment structure familiar to the everyman of a consumer society: a fifty-cent-a-month plan aimed at Dallas-area teachers. This modest plan ended up benefiting Baylor in a time of economic catastrophe, helping it avoid the decline in patient load that the Great Depression inflicted on hospitals across the country. As a result, Baylor’s plan became immensely popular and expanded into what is now Blue Cross.
Blue Cross proved to be the blueprint for the employer-based health insurance system, but, writes Blumberg,
[During World War II], factory owners needed a way to lure employees, so owners turned to fringe benefits, offering more and more generous health plans. The next big step in the evolution of health care was also an accident. In 1943, the Internal Revenue Service ruled that employer-based health care should be tax-free. A second law, in 1954, made the tax advantages even more attractive. By the 1960s, seventy percent of the population was covered by some kind of private, voluntary health insurance plan. By [then] Americans started to see that system – in which people with good jobs get health care through work and almost everyone else looks to government – as if it were the natural order of things.
The paternalistic “naturalness” of employer-provided health care, which almost always takes the form of health insurance, has created a society that believes it must be kept healthy by extrinsic forces like “a great health care package” (complete with unlimited sick days to get us out of doing a job we’ve likely taken to acquire health care in the first place). We are not a society that wants the ultimate responsibility of keeping itself healthy. The original concept of “self-care” – which, notably, bears little resemblance to the contemporary misuse of the term as something closer to pampering and cosmetic maintenance – has been replaced with a dependency-based mindset that passively refuses to take responsibility for its own physical, mental, and emotional health, and refuses to believe, as Nightingale states, that healing is a process performed by the body rather than by the medicine or the doctor’s procedure.
As a consumerist society, we have decided that life itself must be consumed, rather than created. We have outsourced our health, and created a system wherein we must buy it back via the time we spend on the clock at work. Or steal it back, through sick days.