Ban Bossy: When Good Liberals Go Bad

I understand that our culture still has residual messaging that favors boys in leadership roles. I don’t worry about this so much in surface cultural productions, which overcompensate to the point that every action movie must have the requisite 110-pound female who destroys linebacker males in hand-to-hand combat. But it’s still in the cultural unconscious, in the brainstem, in the fairy tales and playground games that persist for centuries below the radar of surface cultural changes.

Liberals do well to focus attention on cultural formations that should encourage girls and boys alike toward leadership roles. The “Ban Bossy” campaign led by Sheryl Sandberg and others, however, seems the wrong way to go about it. My experience is that bossiness is evenly distributed, and equally undesirable, in both genders. An effort to provide more narrative and real-life models of female leadership (and I don’t mean the female action roles cited above) would be a liberal agenda that makes sense. Banning the word “bossy” in application to girls, less so. It merely suggests that since women are underrepresented in positions of power, we should hereby allow girls (and not boys) to be bossy with impunity.

The tack of granting special linguistic privileges to a protected class of people, albeit well received by those listeners already in the choir, is not going to win new friends for liberalism. It’s in a way similar (although I don’t claim the analogy holds in every respect) to the common liberal position that one race (black), due to historical conditions, should be allowed to use elements of the language, or make fashion choices, that are off limits to other races (e.g., white). I understand the historical reasons that certain language is particularly offensive coming from certain races, but you always lose more than you gain when you put regulatory language walls between people based on race or gender (not to mention it’s anathema to my anarchist tendencies, which push the margin of error toward freedom of expression rather than toward restricted expression). You’re basically throwing in the towel, saying that our shared humanness is trumped by our demographic differences and so we may as well lock in those differences by linguistic fiat. In the case of black and white, you are probably reifying racial differences between two groups, whereas in reality race is a continuum, if not entirely a social construct. A better case could be made for two genders (or at least two primary sexes), but still, banning the adjective “bossy” as a descriptor for women will appear to many potential allies as promotion of a double standard based on gender, and smacks a little too much of the “separate but equal” strategy used by segregationists in the Civil Rights era.

Again, I’m not saying that the problem doesn’t exist. I think Sandberg rightly spotlights the dearth of women in leadership roles. From what I’ve heard of the Lean In movement she has championed, I support many of the same goals. I believe that our culture, at least deep down in its brainstem if not in its current best practices, does encourage leadership in boys more than in girls. But if we’re going to come at the formations of leadership through the access point of “bossiness,” I’d prefer that we teach both genders the difference between bossiness and real leadership. This might save the next generation – male and female alike – a lot of grief that my generation had to put up with from boorish faux leaders in hierarchical places.

Two Critiques of Materialism

1. What are you most certain of and how do you know it?  Let’s say you’re pretty sure about many things but most certain about mathematical truths (e.g., an equilateral triangle has three 60-degree angles).  Let’s say to prove this you draw a triangle on the chalkboard.  But the triangle on the chalkboard is imperfect: the lines are grainy and not quite straight, etc.  In fact, every physically produced triangle will fall short of the perfect triangle.  But all of your mathematical knowledge about triangles is based on the perfect triangle—the one in which the lines are perfectly straight and the angles exactly what they should be.  That ideal triangle, the one that doesn’t exist in the material world, is the only one you really know anything about, but luckily that knowledge carries over, albeit imperfectly, to the material world.  And mathematical knowledge, being the most certain, is a model for other kinds of knowledge.  So according to this theory, the material world does exist, but for us to have any real knowledge about it requires the assumption of an ideal world that transcends this or that material object.
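To see the kind of knowledge the critique has in mind, here is the 60-degree claim worked out as a minimal sketch, assuming only the standard Euclidean angle-sum theorem (a textbook result, not something argued in the passage itself):

\[
\alpha + \beta + \gamma = 180^\circ \quad\text{and}\quad \alpha = \beta = \gamma \;\Longrightarrow\; \alpha = \frac{180^\circ}{3} = 60^\circ .
\]

Nothing in the derivation refers to any drawn figure; the proof concerns the ideal triangle only, which is precisely the critique’s point.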

2. The second critique disagrees with the first, and assumes that knowledge is sensory-based.  All knowledge begins with the five senses.  But what knowledge do you gain from the five senses?  Knowledge about the material world?  Not at all.  “The pillow is red” is not a proposition about the pillow itself, but about how the pillow registers in the retina.  “The pillow is soft” says nothing about the pillow in itself but only addresses how the pillow registers tactile sensations in neurons under the skin.  In fact, we may agree that the pillow is largely empty space with tiny atoms bombarding each other. But this knowledge too has been derived from empirical studies that rely in the first instance on sensory data.  So all we know about are the imprints made upon our own body’s sensory registers.  If there are material things out in space which exist independently of our sensory judgments, we can know nothing about them in themselves—we can only know about them post-processed, as it were.  So according to this theory, the material world may exist, but we’ll never know, because all we know about are the products of our own subjective processing plant.

———————————

The first critique was favored in the 4th century BC by Plato, who deliberately rejected empiricism (and the Greeks had already theorized that the world was nothing more than billions of tiny particles called atoms bombarding one another and creating the appearance of solid shapes—a conclusion they drew through philosophical inquiry without any need for scientific instruments, and a materialist conclusion Plato had studied and rejected on the grounds that it could only give us low-level knowledge accessible to the senses, knowledge of material reality, which is only a shadow cast by a more ideal reality accessible to reason).

The second critique was favored by the empiricist David Hume, who was presumably following the empiricism of 18th-century scientists to its logical conclusion—that the material world was either non-existent or utterly unknowable.  (There are counterarguments and rebuttals; Kant, e.g., would pick up exactly where Hume left off. But it’s interesting to see how the ultimate empiricist, Hume, and the ultimate anti-empiricist, the rationalist Plato, both conclude that a purely materialist world view is untenable.)


Without God

Without God, all things are permissible.

This is a recurring idea in Dostoevsky’s Brothers Karamazov and it gets a lot of airplay today in social media and popular culture. The problem with this supposed argument for Christianity is that it seems uniquely designed to make the humanist look better than the Christian. If you tell me that without God we would have no reason to behave morally, aren’t you telling me that in your heart you would just as soon screw me but refrain only out of fear of God? Doesn’t this make the humanist — who is nice to others because of a heartfelt recognition of the intrinsic value of being nice — more admirable, more noble, more trustworthy in a pinch? (This is not an attack on Christianity en masse, just on a troubling line of reasoning employed by some Christians.)

Interestingly, Nietzsche may make an equal and opposite mistake. (Perhaps my readers who are more up on Nietzsche can affirm or deny or elaborate.) Nietzsche also suggests that God’s demise puts us “beyond good and evil,” but unlike our Christian interlocutor, he sees this as a good thing, a liberation of the human spirit from dogmatic ethical constraints (at least for those who are strong enough to handle the implications). The problem is that Nietzsche’s conclusion, like the Christian’s, rests on the premise that morality (or systems of good and evil) founders, or devolves into purely personal prejudices, in the absence of God.

Our humanist combatant of the first paragraph disputes this premise, arguing that rational principles and intuitive sympathies can provide a basis for ethics equal to or stronger than any God-based ethic. I can’t say for sure that our humanist is correct, but I can say that in times of moral crisis I’d rather have her at my side than Nietzsche or the Christian of our example (although I might opt for Nietzsche when a crisis of wit is at hand).

The Cat Who Wore a Cactus for a Hat

Here’s a read-aloud for my 2- to 6-year-old audience*

There once was a cat who wore a cactus for a hat.

This cat’s name was Jack.

When Jack marched down the alley, rats, lizards, birds, and bats would clear out.

This made Jack feel important.

It also made Jack sad because no other animals would come near him.

“Stay back,” they would say when they saw Jack.

One day Jack was marching past the bakery.

The baker was loading loaves of bread on a rack.

Jack reached in between two shelves, pulled out a loaf, and ran.

But what is missing?

Jack’s cactus hat got stuck in a rack.

The baker scratched his head when he saw a cactus stuck to his bread.

Jack sat down to eat on the steps of his favorite shack.

He ate without his hat. Poor Jack.

He was shocked to see a rat on his lap.

The rat looked at the bread and looked at Jack.

Jack shared some bread with the little rat.

Then came a possum, a pigeon, and a turtle with a teapot.

All the friends had bread and tea.

Jack liked having friends at the shack better than he liked his cactus hat.

So he left his hat on the baker’s rack.

I hope no one ate it for a snack.

* But especially for my little friend Uma in Playa del Carmen, with hopes for her second birthday :)

Tristram Shandy’s Faux Postmodernism

From time to time, my literati friends put forth Laurence Sterne’s Tristram Shandy (1759) as if it were a postmodernist novel 200 years ahead of its time. This is understandable, considering all the reflexivity and discontinuities built into the structure of the text. But on the core issue of human identity, Sterne is no postmodernist.

Human identity, to the postmodern, is essentially fragmented, incoherent, all colliding and discordant surfaces without any stabilizing interior or deep anchor. Sterne may superficially anticipate these postmodern preferences, insofar as human identity in Shandy, more so than in other writers of the period from Henry Fielding to Jane Austen, is whimsically determined by each character’s hobbyhorse. Sterne, however, is a man of sentiment, and his sentimental world view is at odds with postmodernism’s intellectually austere view of human identity. In Sterne’s case, the sentimental side wins. Although human identity in Shandy may seem random, even infinitely displaced by hobbyhorsical identity, this arbitrariness is underwritten in the text by a sense of private identity. (And insofar as private identity is perhaps more “private” in Sterne than in other writers of his age, he may anticipate Freud more than he anticipates postmodernism per se.)  It may be true that once we get Uncle Toby’s “military apparatus out of the way . . . the world can have no idea how he will act,” but Tristram suggests that the reader has a clearer vision than “the world”: “You have seen enough of my Uncle Toby” to know his “singleness of heart . . . plainness and simplicity” (italics mine).  For those who would appropriate Sterne for postmodernism, it may be tempting to see no stable identity behind the hobbyhorse.  (Compare to my snippets on Gertrude Stein or Robbe-Grillet.) A careful reading, however, suggests that there is private human identity, and that it is urgent that we recognize it as such, despite appearances, for this private identity is the real locus of the sympathetic passions at the heart of the 18th-century Cult of Sensibility, of which we might call Sterne a charter member.

Hobby Lobby’s Day in Court

I think it’s today that Hobby Lobby makes its case at the U.S. Supreme Court. I don’t know all the details, but from what I hear it seems pretty clear. Obamacare mandates that health care plans meet certain minimum benchmarks, including contraception coverage. Hobby Lobby does not want to provide contraception coverage, which the owners consider immoral. They claim that this requirement violates their religious freedom. But if Hobby Lobby wants to play in the marketplace, they should meet the same minimum benefit benchmarks as everyone else. Do we really want our insurance coverage to change every time we change jobs, subject to the various moral preferences of our employers? Should we really be under pressure to conform to our employer’s morality when it comes to our own health care?

This is not about religious freedom. Churches already have an exemption from the law for core-mission employees. If Hobby Lobby had the least interest in religious freedom, they would provide the same options as everyone else and let every woman make her own reproductive health decisions based on her own faith and conscience. They would not be imposing their religious preference onto their employees without regard to employee beliefs. They would not be claiming “religious freedom” to justify religious compulsion.

Bottom line: The owners of Hobby Lobby are fully entitled to practice their religious beliefs in whatever legal manner they choose; they are not entitled to make that call for every individual they hire.

Professionalism and Alienation

I recently heard (or perhaps instigated) someone at work talking about how proper attire promotes professionalism. My faithful readers will recall that I, as a fashion anarchist, have commented on Jeffrey Tucker’s suggestion that people should dress properly at work (Bourbon for Breakfast, Chapter 37).

Now to tackle the tangential idea that a dress code promotes professionalism. First, if professionalism is meant in the narrow sense of an individual’s competence to complete the tasks at hand with rigor, efficiency, and integrity, the fashion anarchist wins this one easily. Obviously, my engineering or accounting or design skills are not affected every time I change clothes.

If professionalism is meant in the general sense – the sense that it is generally easier to maintain professional relations where people are dressed professionally – this is a little trickier. On this level, I say good riddance to professionalism, which has been a scourge on human contact for some 300 years.

The Age of Bourgeois Capitalism, which began in roughly the 18th century, could also be called the Age of Professionalism.  In the previous age, the frame of reference for human relations was the landed hierarchy of commoners, gentry, aristocracy, and various subsets. Doctors and lawyers and such were generally commoners, subject to much mirth and ridicule in the literatures of the day. Even where respected, their professions (or one might call them “occupations” in that pre-professional age) conferred no class status. As bourgeois capitalism replaced landed hierarchies as the defining scaffold of power, the “professions” came to confer the kind of class status we see today, with grandmas encouraging grandkids to grow up to be doctors or lawyers (and not, on the good authority of Waylon and Willie, cowboys, those residual personae of the land). The old frame of reference for human relations in the landed order – things like de facto respect for those above you in the hierarchy and generosity toward those below – was replaced by the public-sphere imperative to “behave professionally.”

“Professional behavior” presupposes human connections that are less vertical and more horizontal/democratic, and that may well be a step forward toward the ideal of a human community of mutual fulfillment, but it comes at a cost. The cost is alienation. Human relations become the “business of human relations.” When Karl Marx says that under capitalism “human relations take on the fantastic form of relations between things” (Capital, Vol. 1), this can be applied on the social as well as the economic level.  Human relations become a little bit icier. The other person is objectified, which enables us to treat him or her as an object in some market-driven game and not as a concrete human being. One scene in The Godfather (dir. Francis Ford Coppola, screenplay Coppola and Mario Puzo) nicely encapsulates human relations in the Age of Professionalism. Tessio has betrayed Michael and now realizes that Michael has discovered the deed and set him up to be killed. Tessio, knowing the end is near, tells Tom: “Tell Mike it was only business. I always liked him.” Tom replies with some pathos, “He understands that,” and then goes forward with the hit. Lift the veil on professionalism’s polite exterior, and this is the model of human relations you have underneath. It brings everyone one step closer to the version of human identity manifested in the “officials” of Kafka’s novels, who epitomize ad absurdum the sloughing off of all human responsibility in the execution of the office.

The alienation that takes place in the Age of Professionalism indeed gives us another reason to look to the Luddite/technophobe point of view. In particular, the technophobe distrust of mechanization may raise valid points about the impact of technology not just on labor markets but on human relations generally. If professionalism takes a subjective toll on the fullness of human relations, new technologies, without moral steerage, can give a kind of exoskeleton to the process of alienation, abstracting us from the human warmth and human consequences of our actions. The person who pushes a button in Nevada to launch a drone strike on a Pakistani village and then stops by Walmart on the way home probably does not see his actions the same way as one who had to stand toe to toe and push the steel blade into his opponent’s belly.

Now for the optimistic conclusion: In our collective reach for higher ideals, professionalism has served its purpose, weaning us away from hierarchies that were antithetical to the fullest form of human relations and giving us a basis for something more democratic and fully reciprocal. But we have paid a cost in terms of the objectification of, and alienation from, our fellows. It’s time to take the next turn, put professionalism to bed, and reinvest full humanness into our relationships, even into our relationships in the workplace and with remote clients and customers. And one way to start that slow tectonic shift is to gently undermine the professionalism paradigm by bringing, so far as we can manage it, a little fashion anarchy into the workplace.  It might look funny, but it beats becoming characters in a Kafka novel.