Sticks, Stones, and the Effect of Language

About a million years ago, a proto-human picked up a stick and bludgeoned a deer with it, and voilà! dinner. And ever since, that has been the way of mankind. Rather than talk things through, we communicated with sticks, some pointy, some thick and club-like with stones lashed to them. Later, we developed language, a way of describing all those sticks. We needed words as a delivery system for our more complex thoughts. And eventually we would develop insults, and later, passive-aggressive tones. Hooray!

About a million years after this stick incident, we were taught a pearl of wisdom that said words used against us could not harm us. This “sticks and stones” maxim reminded us that insults were the last refuge of the ignorant, and that no words could injure us, no matter how hateful. It was a way of fending off bullies by equating them with knuckle-dragging, thick-skulled cavemen. The thing is, those insults and verbal jabs do take their toll. I would argue that being punched does less harm in most cases, especially when, as in my middle-school days, the assailant is pint-sized.

But words are sometimes used as ammunition by people other than our classmates and peers. Teachers and parents were capable of doing much more harm. I never would have believed that grown-ups could inflict such cruelty, but I was twelve, and I grew up believing that adults knew what was best for us. In fact, they were as clueless as any 30-something today, perhaps even more so. At least now, people have a wealth of information at their disposal, practically the entire repository of human knowledge by way of the internet. You might expect there could be no excuse for being ignorant, and yet many of us are. We should know better, but we don’t.

Back to that sticks-and-stones analogy. Physical injuries tend to heal completely, though there are of course cases where lasting damage occurs. Broken bones, like those in the little phrase, may heal, but the break might affect the way you move later on. I broke my thumb when I was 14 (I was practicing throwing punches after being tripped earlier that day, and my thumb caught the edge of a chair and went “crack!”). That hurt like hell, and I felt really, really stupid. But the physical pain went away after a while, and my body “forgot” it. Decades later, when the barometer falls or when geese migrate, my thumb gets stiff or a little sore. It’s not my body “remembering” the original injury; it’s the lasting result of it. Be that as it may, this injury troubles me a lot less than some of the things people have said to me over the years. Even though the words dissipated into the atmosphere just after being spoken, they still echo in my mind to this day.

You see, words can indeed cause long-term emotional pain, far beyond what a physical injury might inflict. We must therefore be extremely careful when choosing our words. We might start by developing effective feedback skills, one of the most important parts of being a manager or any kind of leader. Saying the wrong thing can create problems further down the road, and it is very difficult to undo the damage once it is done. You cannot un-say the wrong thing. Positive yet constructive criticism is like a precious resource, because we rarely receive it, and not very many people know how to deliver it. Try saying, “I’d like to tell you where I see your strengths,” rather than, “Do you know what your problem is?”

Once we have delivered valuable feedback, we can encourage others to learn this skill as they begin to appreciate its worth. It will be like currency in a world where good communication is rare yet valuable. Right now, I fear, it is rare but unappreciated. Eventually, as we mature, we do see its value, but by then a subsequent generation has already taken the helm of our society. It’s important for teachers to develop this skill and pass it on. It needs to begin early in a child’s development, earlier than the old way of thinking assumed. Back then, children were not regarded as contributors to our society, but our understanding of the brain’s development has improved, and we know better now. Or so we tell ourselves.

Words have amazing power. We use them to inspire one another, to incite crowds, to soothe, and to charm. The right words are absolutely necessary for certain occasions, like toasting the bride and groom, delivering a eulogy, or giving someone bad news. Our words can injure. Words can be a weapon. Words can even heal. With the right words, a skilled negotiator can change the world more than any number of rockets and tanks. And a simple “whatever” can stop some of us in our tracks. Yes, sticks and stones may break your bones, but words can injure you for a lifetime. So be careful what you say.

Violent

Plenty has been published in literature and produced in film and television to depict (or predict) a world where violent behavior has all but been eliminated by a draconian system of justice, where even petty theft or vandalism can result in severe penalties. It goes without saying this is the West’s impression of the Democratic People’s Republic of Korea (North Korea), true or not. In the 1980s we were instructed on what life in the Soviet Union was like, how the Russian people had no freedoms, no choices. The indoctrination of American youth during the Cold War could have been as oppressive as any Communist regime we imagined.

Singapore is famous for administering harsh punishments for seemingly insignificant offenses like littering or vandalism. Singaporeans are very proud of their low crime rate, and it should be obvious why. So one might ask, why don’t more countries do this? Before attempting an answer, I am reminded of something that puzzled me for years. Japan has very strict gun laws, severely restricting ownership, limiting sales, and granting the government shockingly sweeping authority over firearms, at least by American standards (though the US has relaxed gun laws compared to most of the world). In Japan, perhaps as a result of these policies, gun violence has been nearly eliminated. The big question is whether restricting gun ownership reduced gun-related incidents, or whether something else was at work.

I posed this question to someone who lived in Japan, and he told me something I did not expect. He is a gun rights advocate and, like me, has had experience with firearms from an early age. Despite this, he and I don’t agree on every aspect of gun control. That said, he told me that the reason there is almost no gun violence in Japan is not that guns are hard to get hold of, but that Japanese culture figures significantly into the equation. Of course there are guns in Japan. But even the Yakuza rarely use guns to commit crimes. Consequently, homicides in Japan are quite uncommon. It’s worth noting that gun violence in Canada is also rare, even with dramatically fewer restrictions on gun ownership. What, then, is the explanation?

As a means to deter crime, I suppose courts in the US could throw people in prison for nearly anything, like spitting on the sidewalk or jaywalking. Many municipalities have passed crazy laws that stay on the books, but we typically don’t incarcerate people for overdue library books. (I’m reminded that I need to write about Emmett Till, the black teen who was brutally murdered for allegedly making sexual advances toward a white woman, Carolyn Bryant Donham. In the Jim Crow South, and elsewhere in the US, simply being black was treated as a crime.) Looking at our past, one must conclude that America possesses a most violent culture, one that can barely be contained. The US is chock-full of guns. Our art is violent. We have been in a continual state of war since the beginning of the 21st century. We are constantly exposed to violence through video games, television, and film. Simply put, we are a violent people.

Could we eliminate crime by making punishments so severe that they serve as a deterrent? Many states still administer the death penalty, and yet capital crimes are still committed. That doesn’t appear to be the solution. One thing is certain: violence tends to beget more violence. I admit I have thought about making certain individuals wish they’d never been born. I won’t go into details. Is this something in the human genome? Are we taught to be violent? Can we unlearn the tendency? It may not be something we can overcome in the next 100,000 years. If that depletes your last reserves of hope, do not despair. Humanity should be able to progress if we don’t destroy ourselves first; Carl Sagan was confident we could reach the stars, with that caveat in mind. We are continually evolving, but evolution takes time, and our evolutionary gains have not kept pace with our technological advances. In other words, we’ve become efficient killers with our advanced weapons, but we haven’t developed the ability to conquer our base instincts. Until that happens, we are dangerous animals.

Disconnected

I just got back from an epic road trip halfway across the North American continent. Unfortunately, we drove across several southern states where everything is deep-fried. Oh well, it was only 10 days. But in that time we witnessed a total solar eclipse, took part in Cherokee rituals, and saw elk, a bent tree, and many other strange and beautiful wonders.

During this time, I realized the 21st century has a stranglehold on us. We are constantly connected to our world via mobile devices and wifi. For most of us, this is a relatively new phenomenon; many of us were born before the web was fully realized, and we can remember when instant messaging meant passing notes in class. But by the mid-90s, things were changing quickly. The generations that followed may not feel the change, like the proverbial frog in the pot of slowly heating water. Anyone born in the 1990s expects information to be perpetually within reach, and just as we modern, post-industrial, space-age humans have never known a world without electricity, there is no going back. At least not willingly.

Deliberately ditching your mobile for a week is harder than you think. Being in the various parts of Appalachia (the Great Smoky Mountains, the Blue Ridge, Pisgah, and so on), where wireless coverage is spotty at best, makes it easier to keep one’s resolve to remain disconnected. I must admit, I failed to maintain absolute isolation; my phone would find a signal every other day or so, and a deluge of messages would drain the battery, forcing me to scramble for my charging cable. Eventually, when I could not find the cable, I simply turned off the device – yes, it is possible. Problem solved: no signal, no phone. The device was reduced to a pocket calculator and a low-resolution digital camera.

The idea that we must be in continual contact with the rest of the world is, to me, a little absurd. Bear in mind, I remember a time when leaving the house meant you might be genuinely unreachable. Before we all had mobile internet in our pockets, going out into the world untethered was not as scary as it might seem to some of you. Pay phones were ubiquitous, and you always carried some change in case you needed to call someone to check in or ask for a ride. By the way, I saw more than a few pay phones in Appalachian North Carolina; apparently, this is still a viable way to connect. Wifi was available in our motel, and I took advantage of it to plan a route back home. I felt a little guilty doing this, even though we really needed help finding our way out of the mountains. Like I said, I wasn’t perfect.

Chimney Rock viewed from Lake Lure, North Carolina

I have to recommend trying this, for a few days at least. Go to the Smoky Mountains or Chimney Rock or any of the small, isolated communities surrounded by peaks, and when you realize maintaining a connection is pointless, simply turn off the phone. After a day or two you may see things differently. I am not saying these devices are inherently evil, although some have gone as far as to blame mobile phone use for an increase in brain cancer. But maybe we are too dependent on them. It seems tragic that we have forgotten how to follow a map using a compass. Maybe we have devolved a bit by losing certain skills. Without our phones, what skills do we truly have?

Most striking, I found that without my connection to the internet, and thus no ability to instantly share my experiences, I enjoyed savoring the moments in real time. The pictures I snapped would simply have to wait until I returned. The stories, updates, comments – everything – were stored mentally. The experience was just mine. Naturally, I shared the moments with my wife, and the eclipse was a mass event, so that was pretty cool. We also rode the Great Smoky Mountains Railroad and listened to stories from the people with us on the train. These moments are what life’s all about. They can be documented digitally, but the record is planar, two-dimensional, less than an echo, and the experience cannot be transferred with the same fidelity as it was first acquired. In other words, you had to be there.

I have been converted. I am a believer now, sold on the notion of unplugging, of disconnecting if only for a few hours. I was fortunate to have been compelled into isolation; that made it impossible to cheat, at least for a while. But now a larger question looms: if being disconnected makes life a little better for a short time, should that be our natural state? I spend upwards of 50 weeks a year getting stressed out, then take off a few days here and there to “unwind.” Why would I not want to live my life unwound? Well, some of us have to work for a living. But it does seem a shame to put off living until retirement.

How to Eat Breakfast

One summer ago, we had our roof re-shingled. Some people call it having a new roof installed. I think that’s a strange saying, because I envision a crew removing the rafters, the physical framework of the upper part of my house. But in this case, it simply means that the shingles and the underlying protective layer are being replaced. Here in Texas we have extremes in weather: intense sun and heat, high winds, and hail. These elements really do a number on asphalt shingles. We hired a small crew to install the new roof, and they arrived every morning for four days, shortly before sun-up. As soon as there was a hint of daylight, several men, and one woman, were on our roof, stomping around, dragging bundles of shingles and tools across its surface. There was no way to sleep through this.

I was never what you would call a “morning person.” I typically spend late nights working on little projects, writing, sometimes playing video games. Occasionally I stay up late with work. But I’ve always found something to keep me from going to bed at a decent hour. Then came these roofers, plodding riotously just above my head. Since there is a logical flow of events beginning with the emergence of daylight and culminating in the clamor of office work – phones ringing, chatter, and the tell-tale nervous laughter of hyperextended workaholics – once awake, I needed to get up. That time in between, the morning Thoreau spoke of, is meant to be relished, accepted with joy and, dare I say, exhilaration, because morning is truly inspiring. Just ask all those dead poets and philosophers. Yeah, I thought so.

Inasmuch as I am a night owl, mornings do hold a certain mystique that I am still learning to appreciate. Things happen in the morning that you cannot reenact. One of these is breakfast. Breakfast – from the late Middle English, literally to break one’s fast, a meal following a fasting period, albeit only 10 hours or so – is truly intended for mornings. I’ve had breakfast foods – omelettes, waffles, and so on – at various times of the day and night. Yes, night. Something about IHOP at 11:30 pm is just kind of cool. Or dorky.

My wife and I, therefore, were compelled to have breakfast together each morning. And even though the clamor of this rooftop ballet lasted only a few days, we have continued to make and eat breakfast together every morning since. Breakfast in the US usually consists of eggs and bacon or ham; some prefer pancakes. Our regimen includes oatmeal with fruit, coffee, and grapefruit juice. I prefer steel-cut oats, but they take 30 minutes to cook. We sit at the kitchen table and actually talk about things – the expectations of the impending day, weird dreams we might have had, stuff we want to share – and we eat said breakfast.

I used to say that I didn’t have time for this, even though the idea that breakfast is the most important meal of the day has been drilled into my consciousness for decades. Whether or not that is true, the ritual of sharing a morning meal has enriched my life. We carry it into the weekend, where additions are afforded, like sausage and eggs and, on rare occasions, waffles. Each morning, preparations are made, and time is carved out for the spectacle. We talk about what’s going on with us and what plans we’ve made for the day. We compare schedules and talk about upcoming events. Then we quickly clean up, and I get ready to leave. But I’m not in a hurry, because I’ve carved out this time. It’s our time, not theirs. And that’s the beauty of breakfast.

I know very few people who have this luxury, but I see it as a necessity. Not the food, but the time spent relaxing and enjoying it; the ritual, the act of breaking bread. My perspective has in turn made it less of a luxury and more of a right. I feel entitled to having a meal. I mean, food is a human necessity. Why do we feel we have to defend ourselves for making time to eat? I see my coworkers actually skipping lunch because of work. They say they have no time to take a lunch break. Not only is this absurd, but in many states it actually runs afoul of labor laws requiring meal breaks. There’s that precious time, that elusive time, the subject of many poems and songs. Why do we deny ourselves what is our fundamental right?

I still don’t think of myself as fully a morning person. Caffeine is the main source of my morning energy. But I have become something of a creature of the morning. The night still calls me, but lately I’ve found I actually look forward to sleep, and to the following morning with its reward of coffee and an English muffin. Suddenly, the night has less appeal. It’s strange to see such a change in oneself. But these things happen. And I don’t lament saying goodnight to my old ways.

In Search of the Walking (not Dead)

Summer began abruptly this week in Texas. Later in the week it was spring again. It has been said that if not for air conditioning, the population of Dallas would be much smaller. The population of Plano, Texas in 1960 was 3,695. By 1970, it had increased almost fivefold. (Latest estimates are now between 260,000 and 278,000.) If you drive through Plano you will notice a couple of things:

  1. Most of the city was designed around the automobile.
  2. There is no central district; “Downtown” Plano is actually a revived, gentrified area on the east side, filled with trendy bars and restaurants, as well as several novelty shops.

One of the most frustrating aspects of cities like Plano is that they are laid out in such a way as to make walking from place to place nearly impossible; it seems cities make a concerted effort to discourage it. Pedestrians are seldom spotted along roads like Spring Creek Parkway, for instance. (By sharp contrast, people in Washington, DC are often seen walking along crowded sidewalks.)

If you live in a city that was built up before 1950, you probably haven’t seen the kind of urban sprawl found in cities like Plano or Phoenix, AZ. After the end of WWII, especially during the prosperous decade of the 1950s, cities were transformed, and with low gasoline prices, owning a car shifted from luxury to necessity, especially as urban planning encouraged people to live in the suburbs, farther from the city center. Eventually, businesses would move out of the city to the ’burbs, triggering further expansion – read “white flight.” All the while, this pattern made walking to work something of a quaint oddity. Nowadays, everyone must have a car. Larger cities have public transportation, but riding a bus is seen as an indication of lower economic status. Walking is worse. If you are on foot in certain communities – and not wearing activewear – one might assume you are homeless.

In my neighborhood, I do see people on foot a little more than elsewhere. It’s kind of encouraging, and I don’t know exactly what to make of it. I see people of various ostensible means, young and mature, walking along certain streets, apparently to and from the shops nearby. Well, the big-box stores, anyway. But it’s a start. On that note, my version of a perfect world may be unwelcome to the next person. I might like to have shops within walking distance of my front door. The downside is that you must then live close to where many people congregate. There would be noise at all hours, and there might be an increase in crime, given the temptation of so many people with money to spend. This is what city living is supposed to be, and suburbs have tried to reconcile the dichotomy of urban life and country living.

Cities need to step up efforts to encourage fitness and community among their citizenry. Constructing sidewalks and installing drinking fountains are a good start. Walking is one of the best forms of exercise available to everyone. It doesn’t require special equipment other than decent shoes, and it costs absolutely nothing to participate. Perhaps walking is not so popular by design: fitness centers would not be making money if everyone knew they could get the same results for free. But walking outdoors has hazards. The sun can be harsh (especially here in Texas), and there is the rain (which we don’t see much of). Traffic can make walking a risky activity. My advice: leave the headphones at home. You need to be able to hear what’s going on around you.

When you lace up your walking shoes and head outside for a stroll, remember that people have been doing this for hundreds of thousands of years. It is the original means of transportation. We were meant to walk. Not walking is in fact bizarre and unnatural. You don’t have to be in a hurry. You can walk as quickly – or as slowly – as you wish. And there is no clock or finish line. Protect your skin from direct sunlight as much as possible, and drink plenty of water. And if you come to Texas, be prepared for some heat, especially during summer. Well, my Fitbit is telling me to get off my ass. Ciao!


Is it Safe?

I was in a restaurant the other day when I caught a whiff of ammonia as one of the employees liberally sprayed Windex on tables and other surfaces to clean them after diners left. The whole place smelled of ammonia, and the fumes irritated my eyes and throat. I mentioned it to a friend, who told me it wasn’t such a big deal, and that they needed to disinfect the tables after people ate there. I reminded my friend that you can disinfect using distilled vinegar. He said he didn’t like the smell. Okay, but the “smell” is not a toxic compound produced by chemical giants like P&G or Dow. White or distilled vinegar, among other varieties, is not only nontoxic; you can actually ingest it in small quantities without any harmful reaction. The fact is, I make glass cleaner from an ingredient I could use in salad dressing. It has been shown to be an effective disinfectant. Plus, it’s cheaper.

Chlorine is also widely used in restaurants as a cheap disinfectant. I admit it is quite effective in preventing the spread of bacteria like salmonella. For the kitchen and restrooms, this is perfectly acceptable in protecting the public from harmful pathogens, and restaurant staff should take such measures after the establishment is closed for the night. Exposing patrons to ammonia or chlorine is potentially problematic, and if the two chemicals are combined, they produce toxic chloramine fumes, so the combination should be avoided in all circumstances. I think it’s fine to mop the kitchen and dining room with a bleach-and-water solution after closing time; a little chlorine goes a long way. Ammonia as a glass cleaner is not absolutely necessary. See this California Childcare Health Program article for more information.

I routinely clean my house with non-toxic solutions. I make a glass and surface cleaner from a mixture of distilled vinegar, water, and a drop or two of mild dish soap. This is surprisingly effective at cleaning dirt and residue from surfaces. I use other less-toxic solutions for disinfecting, and chlorine-based cleaners for sanitizing the bathroom fixtures and the kitchen sink. I’m kind of a stickler about what can be called “clean.” I eat off dishes that I consider clean, and I generally do not use bleach to get to that level of cleanliness. But if I were to eat mac & cheese off my kitchen floor, you’d better believe I’m going to scrub that son of a bitch down first. Is it largely psychological that my dishes are not nearly as clean as my bleached floor would be, and yet I find it repugnant to eat off the floor? Yes, I’m sure of it. I will not be dining dal pavimento anytime soon.

In the meantime, I’m comfortable cleaning with my vinegar solution. Ammonia is overkill, and it makes my eyes and throat sting. Oh, did I mention that my wife has multiple chemical sensitivity? Some people don’t believe this condition is real, but whatever doubts people may have, there is no denying that chemicals are being used in increasing quantities and concentrations. The unfortunate side effect is that the public becomes desensitized to these harmful agents, except for the growing number who, for unexplained reasons, become more sensitive to them. Living in a toxin-free environment (or as close to one as I can get in the 21st century) has made me more aware of the onslaught of chemicals encountered in the supermarket. I don’t think I was aware how noxious the detergent aisle was until recently. Meanwhile, vinegar doesn’t bother me at all.

Some of my ancestors lived beyond 105 years, and that was before anyone knew about microorganisms. They did not have modern cleaning products in the 18th century, and yet they lived ostensibly healthy lives. Of course, this is not to say that people in the 18th century didn’t contract illnesses due to bacterial infections. But maybe people had higher resistance to germs because they didn’t use hand sanitizer every fifteen minutes. I think we are so afraid of getting sick that, in guarding ourselves so thoroughly against every bug, we risk weakening our own defenses. Perhaps we can embrace a little exposure. Just don’t get too complacent.

So for the time being, I hope restaurants will at least stop exposing diners to harmful chemicals while they eat. You can still douse the tables and booths with super-concentrated Clorox after everyone has left. Just use the buddy system in case you get a little too much of a good thing. Or better yet, think of alternative cleaning methods.


Perseverance

I was raking leaves in my front yard one day when I stopped to notice the bustle on my neighborhood street. Cars drove by, and people waved as they passed my house. Kids on bicycles and skateboards drifted along, while others played basketball in the street, occasionally interrupted by a passing car. I started thinking about how idyllic the scene was, yet surely not everyone would share my joy for what I took as the perfect day. While I felt hope, perhaps another felt despair. I relished the simple joys of the perpetual struggle against the cycle of nature, while someone else might perceive it as eventual defeat. Nature always wins.

Must we always think of things in terms of succeeding or failing? I thought of the saying, “slow and steady wins the race.” But what race? When shall we say, “I have won”? Naturally, there are moments when we do compete: when interviewing for a job, in a debate, or playing a sport. You can be declared a winner in many situations, but oftentimes there is nothing to win. Take gardening, for instance. As I raked the leaves, or as I pulled weeds and grass out of flowerbeds today in preparation for planting, it occurred to me that the work will never end. As long as I want to have a garden, I must work to keep nature from taking over. Year in and year out, I return to the flower beds, get down on my knees, and toil. All summer, too, I struggle to keep the unwanted plants out while fighting to maintain the ones I want. I clip and prune, mow and mulch. Slow and steady, yes. But winning is not possible.

Some things don’t seem worth the trouble. When I see the results of my determination, however, I realize giving up was not an option. All summer I get to enjoy the flowers and watch the bees and butterflies hop from one to the next, rejoicing in the richness of the array of beauty.

IMG_9251_lg

In a few months it would all fade away, and I would be faced with the task of preparing for the next season. The show was fantastic, and the denouement deflating. But I convince myself to start again from scratch each year, knowing I won’t “win.”

Looking at the picture above, I am inspired again. It amazes me what can result from simply planting seeds smaller than the tip of a pencil. But gardening is not an activity for the slacker. It requires dedication and perseverance. You must keep at it; otherwise your beds will be overrun by invasive roots, vines, weeds, and ants. Pretty soon, you have anarchy.

I often like to use this as an analogy for working hard in spite of the obstacles, but sometimes a flower bed is just a flower bed. And I’m losing daylight.