Is Telekinesis Real?

I think I’ve been considering whether telekinesis is real ever since I saw “Escape to Witch Mountain”, which came out in 1975. In the movie, two children who possess remarkable mental powers are pursued by nefarious grown-ups who probably want to dissect them, for science. The children can move objects with their minds, thus defeating the bad guys, as I remember it. But the question shouldn’t be whether telekinesis is real; rather, we should ask whether it is at least possible. It may not be scientifically responsible, but I’m going with it.

The popular explanation for such powers is that they come from the mind, from brain waves. This has been repeatedly disproven as the source of any Yoda-like power to move objects using only thought or “the force”. Purely brain-based manipulation does not appear to be at work here, if anything is really happening at all. So far, there’s no real evidence.

A TED presentation I watched made a strong case for the non-existence of psychokinetic ability in humans. There is simply no credible evidence for it. But maybe it’s possible. The reason I say this is that we are just now beginning to unravel the mystery of the human brain. This complicated organ has been beyond the realm of understanding for most of human history. Only recently has any real progress been made toward a breakthrough, yet we still aren’t sure what’s behind diseases like Alzheimer’s and Parkinson’s.

The brain is certainly amazing and misunderstood. But it is not likely that the human mind can control objects directly, as with telekinesis; brain waves have never been shown to extend beyond the human body. But what if it’s not the brain at work here, at least not directly? Consider how much sensory input leads to brain activity: the taste of an orange, the ability to tell when milk has gone sour, the subtle textures of parchment paper or a worn-out dollar bill. These sensations are transmitted via nerve endings in the fingertips, olfactory tissues, and taste buds, and yes, the brain deciphers the input. But what carries these signals? Electrons.

We are all powered by electricity. Our brains, nerves, muscles, everything: they all receive electrochemical signals, which are essentially charged particles moving through the body. We are made up of atoms, and electrons are among the tiny particles that move around the universe. According to quantum theory, a particle can effectively be in two places at once, an idea Einstein famously wrestled with. Naturally, this sounds like science fiction, but physicists have demonstrated it experimentally. Therefore, if it is possible for part of your physical being to extend beyond your immediate perimeter, that is, farther than your reach, why is it so unlikely that telekinesis and psychokinesis could be a reality?

If such a thing is truly possible, how would we control it? This is where the idea of mental ability comes in. Quantum states are not likely regulated by brain waves, but perhaps there are things we do not yet understand about how the brain works. We’ve already accepted how little we know when it comes to brain diseases. And mental illness is not only misunderstood; its treatment is still in the dark ages, relatively speaking.

A study in the 1980s did confirm that Tibetan monks were capable of raising their body temperatures through meditation. Was this the result of disciplined manipulation of the blood vessels? If that’s all it was (and that’s no small feat), it could be possible to control other physical aspects, like how much electrical energy emanates from the body. Far-fetched though this may be, we simply do not know what we are not capable of at this point in our evolution. And isn’t that a wonderful and terrifying place to be?

I think the most exciting part of this quest is the unknown. A hundred years ago, transmitting images through the air was unthinkable. Now television is starting to become obsolete. Change is fast and unpredictable. We’re making new discoveries frequently, and they often shatter our preconceptions about what we thought we knew.

Okay, sleep well.

Is it Safe?

I was in a restaurant the other day when I caught a whiff of ammonia as one of the employees was liberally spraying Windex on tables and other surfaces to clean them after diners left. The whole place smelled of ammonia, and the fumes irritated my eyes and my throat. I mentioned it to a friend, who told me it wasn’t such a big deal, and that they needed to disinfect the tables after people ate there. I reminded my friend that you can disinfect using distilled vinegar. He said he didn’t like the smell. Okay, but the “smell” is not a toxic compound produced by chemical giants like P&G or Dow. White or distilled vinegar, among other varieties, is not only nontoxic, but you can actually ingest it in small quantities without any harmful reaction. The fact is, I make glass cleaner from an ingredient I could use in salad dressing. And it has been shown to be an effective disinfectant. Plus, it’s cheaper.

Chlorine is also widely used in restaurants as a cheap disinfectant. I admit it is quite effective in preventing the spread of bacteria like salmonella. For the kitchen and restrooms, this is a perfectly acceptable way to protect the public from harmful pathogens, and restaurant staff should take such measures after the establishment is closed for the night. Exposing patrons to ammonia or chlorine is potentially problematic on its own, and if the two chemicals are combined, the result (toxic chloramine gas) is far worse; the combination should be avoided in all circumstances. I think it’s fine to mop the kitchen and dining room with a bleach-water solution after closing time. A little chlorine goes a long way. Ammonia as a glass cleaner is not absolutely necessary. See the California Childcare Health Program article on this topic for more information.

I routinely clean my house with nontoxic solutions. I make a glass and surface cleaner from a mixture of distilled vinegar, water, and a drop or two of mild dish soap. This is surprisingly effective at cleaning dirt and residue from surfaces. I use other less-toxic solutions for disinfecting, and I use chlorine-based cleaners for sanitizing the bathroom fixtures and the kitchen sink. I’m kind of a stickler about what can be called “clean”. I eat off dishes that I consider clean, and I generally do not use bleach to get to that level of cleanliness. But if I were to eat mac & cheese off my kitchen floor, you’d better believe I’m going to scrub that son of a bitch down first. Is it largely psychological that my dishes are not nearly as clean as my floor, and yet I find it repugnant to eat off the floor? Yes, I’m sure of it. I will not be dining dal pavimento anytime soon.

In the meantime, I’m comfortable cleaning with my vinegar solution. Ammonia is overkill, and it makes my eyes and throat sting. Oh, did I mention that my wife has multiple chemical sensitivity? Some people don’t believe this condition is real, but whatever doubts they may have, there is no denying that chemicals are used in ever-increasing quantities and concentrations. The unfortunate side effect is that the public becomes desensitized to these harmful agents, except for the growing number of people who, for unexplained reasons, become more sensitive to them. Living in a toxin-free environment (or as close to one as I can get in the 21st century) has made me more aware of the onslaught of chemicals encountered in the supermarket. I don’t think I was aware of how noxious the detergent aisle was until recently. Meanwhile, vinegar doesn’t bother me at all.

Some of my ancestors lived beyond 105 years, and that was before anyone knew about microorganisms. They did not have modern cleaning products in the 18th century, and yet they lived ostensibly healthy lives. Of course, this is not to say that people in the 18th century didn’t contract illnesses due to bacterial infections. But maybe people had higher resistance to germs because they didn’t use hand sanitizer every fifteen minutes. I think we are so afraid of getting sick that, by guarding ourselves against every bug, we are in danger of weakening our own resistance. Perhaps we can embrace a little exposure. Just don’t get too complacent.

So for the time being, I hope restaurants will at least stop exposing people who are trying to eat to harmful chemicals. You can still douse the tables and booths with super-concentrated Clorox after everyone has left. Just use the buddy system in case you get a little too much of a good thing. Or better yet, think of alternative cleaning methods.

CUI BONO?

I’m fortunate to be the recipient of a liberal arts education. This might seem like a contradiction in terms, since I did not receive specific job training from my university studies, aside from the credentials to teach literature, or from having seemingly scattered reference points on the map of human history. Part of my studies was in pursuit of the natural sciences, specifically human biology, at which I excelled. Ironically, I work in the field of information technology, which I came into purely by happy accident. So I am particularly blessed to have a good job in spite of my area of study.

College may not be for everyone. There are many good-paying careers that do not require a college degree, at least not in the traditional sense. Electricians, plumbers, and welders, to name a few, while perhaps benefiting from the study of a foreign language and some advanced math, can find work after a one- or two-year course of study. Culinary arts and other fields promise similar results, perhaps with another year of study. But the traditional four-year degree may not be necessary or economically feasible.

When I was an undergraduate back in the 1980s, attending a school in the state university system, my tuition per semester amounted to about 8 weeks’ salary, based on the minimum wage (then $3.35 an hour) at 20 hours a week. Of course, there was also room and board, books, meals, and sundries, but I’m just talking about tuition. Here in 2017, that same state college tuition, based on today’s minimum wage of $7.25 an hour, will take you at least 60 weeks to pay off. It’s not unheard of for a college grad to be in hock for $100,000 or more in student debt. And if you are the parent of one of these students, you would pray that they have some career lined up, so they can start repaying their debt as soon as possible.
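
For the curious, here is a minimal back-of-the-envelope sketch of that comparison in Python. The wage rates and weeks of work are the figures quoted above; the implied dollar amounts are my own arithmetic, not actual tuition bills.

```python
# Rough check of the tuition-vs-minimum-wage figures above.
# Wages and weeks of work come from the post; the dollar amounts
# are back-calculated from them, not quoted tuition bills.

HOURS_PER_WEEK = 20  # a typical part-time student job

def weeks_to_earn(tuition, hourly_wage, hours_per_week=HOURS_PER_WEEK):
    """Weeks of part-time work needed to cover one semester's tuition."""
    return tuition / (hourly_wage * hours_per_week)

# 1980s: ~8 weeks at $3.35/hr implies roughly $536 per semester.
tuition_then = 8 * 3.35 * HOURS_PER_WEEK
# 2017: ~60 weeks at $7.25/hr implies roughly $8,700 per semester.
tuition_now = 60 * 7.25 * HOURS_PER_WEEK

print(f"Implied 1980s tuition: ${tuition_then:,.0f} "
      f"({weeks_to_earn(tuition_then, 3.35):.0f} weeks of work)")
print(f"Implied 2017 tuition:  ${tuition_now:,.0f} "
      f"({weeks_to_earn(tuition_now, 7.25):.0f} weeks of work)")
print(f"Increase, measured in work-weeks: {60 / 8:.1f}x")
```

Measured in work-weeks, that is a 7.5-fold increase, which is where the “more than 7 times” figure later in this post comes from.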

So I was fortunate. I did have to take out student loans, but not for much, and I would gladly pay it all over again (provided I was paying in 1980s dollars). Still, reliving those years would offer no guarantee that things would work out the way they had. (Of course, things might have been better.) But was it worth it? Who benefited? (Cui bono?) What did I really get with my degree? It didn’t provide any training germane to my current career. In fact, client-server software development didn’t really exist then as we know it, not that anyone truly understands it now. (Incidentally, I met my wife at college.) The skills needed to work in today’s IT world can be obtained from a local community college certificate program. Yet many companies still look for at least a bachelor’s degree (or equivalent work experience) from their candidates. Equivalent work experience? Abraham Lincoln was self-educated, and many leaders in their fields are self-taught.

But I would recommend the university experience for some. That experience is unique, and the memories last a lifetime. You may never apply the knowledge gained in that one semester of poli-sci, or remember the French you studied, but you will have benefited from them. Will that experience be worth the thousands of dollars you will eventually have to pay? That may depend on what happens in the future. As I said, looking back, it seems worthwhile to me. But that was a different time, I suppose. It seems that colleges and universities are not what they used to be, academically speaking. Students may not wish to study literature, and they may see no value in analyzing Othello for hidden meaning.

It’s too bad you can’t simply certify yourself as self-taught. It worked for Lincoln. Why can’t a person study law on their own and attempt the bar exam? What about medicine? Well, some areas of study really do need to be at the university level. In the future, a four-year degree might cost more than a house; I think we’re starting to see that now. It’s shocking how much tuition has increased over the years. As I mentioned above, calculated in weeks’ worth of salary, it’s gone up by more than 7 times in 30 years. Is the answer increasing the minimum wage? Should tuition be regulated? Is Bernie Sanders’ plan feasible? Could the US pay for anyone who wants a college education to receive one?

In the meantime, certain skills are hard to come by. Even someone with a master’s degree is not automatically qualified. On the other hand, I have a friend who has never set foot on a college campus and excels in the field of technology. But even then, education is the key. Education takes many forms. It can come through diligent observation of the world around us. It can come through books, an extension of the great minds of the past. It can come through experience. Education is crucial.

And for you lawyers out there, cui bono does have a specific legal definition, but I am thinking of the broader meaning. Thanks for noticing.

An Arm and a Leg

A report published this week by the National Academies of Sciences, Engineering, and Medicine has received less attention than perhaps it deserves. The report, titled “Human Genome Editing: Science, Ethics, and Governance”, explores the emerging reality of a not-so-distant future in which certain human diseases are addressed by editing specific genes in human embryos and in egg and sperm cells. This level of medicine has heretofore been left to the imaginations of science fiction writers. But now it looks like we are peering over the edge of that boundary between imagination and stark reality, and our notions of what is ethical and “right” might get shaken up just a bit.

What’s truly significant here is not only the ethical consideration but also the vision we form from our daydreams and projections of our own future, like the distorted albeit detailed view through the peephole in a front door. Predictions may or may not come to fruition, but they will surely fuel the debate about humanity’s path, if only by fleshing out our nightmares. The first thing one might conjure up is basically the plot of the 1997 film Gattaca, in which designer babies can be ordered the way you would order a pizza, customizing your offspring to be taller, smarter, and stronger. This is the primary concern of those who believe we are looking in the face of pure eugenics, a pseudoscientific project intent on reshaping the human race, or segments of it, into an ideal species, one not only disease-free but perhaps also free of any tendencies toward obesity or depression. A “perfect” human, if you will.

If scientists were to, say, focus their energy on eliminating AIDS and malaria, populations in Africa would be the first to benefit. But something tells me altruism will lose out to economics, and companies will work to attract the rich, who will be more than willing to pay any amount to “build” a new generation of super-humans. With the rich relatively free of diseases like cancer and Parkinson’s, which used to be equalizers, only the poor will get sick. Optimists among you might see possibilities, but a world where you can guarantee your children and their children will never suffer from devastating diseases is sure to produce a class society, one where you can identify the second class by their raspy cough or their hair loss from chemotherapy.

Because, you see, if only poor people suffer from human frailty, where is the incentive for drug companies to do anything about their plight? Today even the wealthy can suffer from schizophrenia or rheumatoid arthritis. But pharma can make a pill for what ails you, and people like Martin Shkreli can capitalize on the remedy, marking up the price of a life-saving drug by 5,000%. Not only would the poor be further marginalized, but even non-GMO humans who are not sick could be discriminated against. If nearsightedness were eliminated, the world might become harder to navigate for the normally sighted, as text becomes smaller and sight requirements become more stringent. Could we design a dynasty of athletes? Is tweaking the genes that control memory like cheating on a test?

The gene or gene cluster responsible for addictive tendencies might be switched off in a family with a history of alcoholism. That is not to say that no one in that family would ever develop a drinking habit; we simply don’t know enough at this stage. The medical ethics community has strongly emphasized that genetic manipulation would only be acceptable for preventing devastating and untreatable illness, as a quality-of-life issue, or for humanitarian interests. The ability to pick and choose the attributes of future generations is strongly frowned upon, but who polices the world of genetic research?

I fear a future where someone like me, myopic with a slight attention problem, would be shunned by society, left to exist on this Island of Misfit Toys we call “normal”. But if you were to eliminate aberrations from the future gene pool, the Stephen Hawkings and Franklin Roosevelts of the world might never materialize. Some of the greatest examples of humanity have been flawed, frail individuals. Should we abandon that possibility for the hope of eliminating those frailties? Don’t my nearsightedness and my ADHD make me a better person precisely because of those flaws? What sort of character would I possess if I never had to struggle?

Editing genes might look very attractive when you are faced with the seemingly insurmountable hurdle of finding a cure for cancer. Don’t get me wrong; I would be the first to congratulate the scientist who announces that he or she has accomplished it. Get rid of heart disease and diabetes, by all means. But take it one step at a time. Once we have “cured” something, let us take stock of it and all its ramifications. Maybe start with AIDS, then cancer, followed by heart disease. (Some would argue that heart disease kills more people, but it is preventable in most cases.) It worries me that gene editing meant to prevent one disease might make a super-infectious pathogen possible. I expect there have been many lab trials, and any human trials might need to be quarantined just to be safe. In any case, it’s scary as hell, but people are dying. And this is not so far in our future. I predict that within the next ten years a child will be born who possesses altered genes. This person will look like any one of us, maybe a little closer to perfect. Then it begins.

Read the NPR story for more.