“Is Technology Killing Us?” Its Dark Side Is…

The quoted part of this piece’s title comes from a recent article of the same name written by Kelly Sheridan at the “IT Life” site (Kelly Sheridan Article).

Ms. Sheridan does a good job of highlighting the many drawbacks of living in a society that is overwhelmed with technologies on a daily basis. However, the article, like many currently appearing on the subject, reads more as a “puff” piece in line with the way such serious material is commonly presented, as if there were no real seriousness to the issue.

For example, if you go to the site to read the article you are immediately confronted with what now passes for serious discussion in the United States. Surrounding the article’s placement on the page are the typical eyesores of advertising, along with the usual social media thumbnails encouraging you to “one up” the article on the corresponding sites. At the bottom you are at least presented with the most recent comments on the article, but most of them display the lackadaisical style commonly found in such responses, as if the commenters were too busy to really care.

None of this is Kelly’s fault, as she is merely the author. However, the article’s placement amid such a clutter of disorganized bids for the reader’s attention quickly demonstrates the unnatural and cluttered world Human brains are forced to inhabit on a regular basis. Just attempting to read anything amid such clutter could give anyone a migraine…

Nonetheless, this is just the very small tip of an ever-growing iceberg.

Modern technology does have a dangerous “Dark Side” that is most often encountered through its abuse. Most people experience this abuse at an increasingly subconscious level while also committing it themselves, which in the end undermines their own well-being in the long term.

Technology, no matter how good it may be, is only as good as the context it was realistically designed to operate in. For example, the first new technology to draw the wrath of sociological study was email. As most sociologists found, email came to be considered one of the worst things to happen to people in modern working environments as well as in their daily lives. Why?

Let’s begin with the ancestor of email: word-processing. Within its context, word-processing was a boon to serious writers. People trained in the arts of proofing and editing found that features such as spell-checking and thesaurus lookup were added benefits, but hardly the necessary crutches that professional, well-trained writers really needed, except when under tight deadlines to get work completed.

So the context of word-processing was, and is, to avoid having to re-type prose of all kinds on paper from the beginning as many times as necessary, or to make messy corrections that might not fit within the space allotted to the typed line. Word-processing software eliminated this waste and allowed writers to concentrate on writing while editing their work in a more refined way.

Had word-processing been limited to those who understood how to use their native languages properly, the software would have remained within its own context, safely corralling its murky “Dark Side”. This was not to happen, however; the average person was given the opportunity to acquire such software, and word-processing’s “Dark Side” was set loose upon the world at large.

Prior to the release of such software to the world at large, people were trained in depth in how to spell properly. They not only memorized large caches of words in such training but also learned how to make a “best guess” at a spelling, when unsure of it, by using the tools of language structure. As word-processing became increasingly prevalent in society, this earlier form of training was eventually side-stepped for the advantage of letting software correct misspellings, weakening the brain-power required to inherently know and understand the native language being used.

Enter email, the natural progression of word-processing software. As other areas of technology, such as the Internet, were refined, anyone with a connection could now send messages of any length to friends and co-workers.

What happened with this new form of writing? Well, to begin with, the art of letter-writing faded into oblivion, meaning that the careful composition of one’s thoughts to a personal friend, lover, or relative was lost, along with the privacy of such compositions. Formal writing to representatives of institutions, or on behalf of such organizations, was diluted to the point that, in all fairness, it too has become a lost skill.

Email turned proper written language into nothing more than a casual affair in which, with few exceptions, such messaging is now littered with grammatical mistakes of all kinds, to say nothing of a complete deterioration in the understanding of proper sentence structure. Most such prose is now the equivalent of juvenile drivel.

Added to this was the increasing volume of messages people received with email technology, which in and of itself fostered an ever more casual approach to writing, to the point that on smart devices many people now use cryptic symbols to represent words and sentences. How can one learn or retain the art of composition without writing? And who even writes with the style or flair that old-fashioned penmanship helped to develop?

Our software development profession has not been able to maintain such literary standards in the face of this technological onslaught on language. Go to any forum, technical news site, or technical support site and simply note some of the poorly developed comments in reply to the corresponding articles. Our profession has become equally littered with “language dunces” who cannot communicate their thoughts clearly.

Not to be ignored, any news site carrying a controversial article will be littered with the foul stench of uncontrolled dislike for the author; such is the poor quality of the respondents’ language that it is a wonder so many were even capable of producing written words at all.

Not all of this is the fault of technology; some of it had a political, and thus sociological, underpinning in the election of Ronald Reagan to the presidency of the United States. Of course this wasn’t Reagan’s fault alone, but the type of societal change he ushered into American society was as much his as it was his Republican party’s, which today, as the original trends have continued, is seen as a party of the dim-witted American citizen (not that Democrats are any better, but they have always had a penchant for explaining themselves in greater detail).

As the 1980s and the Reagan presidency progressed, a stark development in the selection of college curricula was noted by observers in a major news magazine at the time. Until then, US college students had prided themselves on taking studies in the sciences. Those who didn’t, opting instead for liberal arts, teaching, or business (the last considered at the time the lowest of the low), weren’t even regarded as serious students by college campus populations across the nation. Now, however, for the first time a change had occurred in course selection: more students were opting for studies in business than in the sciences. American society, with the help of the new Republican dominance in politics, was changing, and real intellectual pursuits were increasingly seen as something to be derided.

Mix this detrimental trend with a serious development in technology targeting the use of language (i.e., word-processing and email) and the results are not pretty. It would have taken a very determined society to avoid the many detriments that both of these developments harbored. Unfortunately, the US citizenry does not have such fortitude, a fortitude more apparent in Europe.

By the time Apple introduced its iPhone in 2007, the foundation had been laid for a massive deterioration not only in the health of the average American but in their intelligence levels as well. The iPhone, on top of burgeoning Internet usage, heralded an age of smart devices and the intertwining of technology with everyone’s daily lives and even their most intimate thoughts.

The growth of smart-device usage, commingled with underlying American narcissism, exploded onto the world stage with developing social media technologies. These allowed the nascent alienating tendencies of existing technologies to be magnified beyond imagination, as tiny, individual-use devices let people create their own fantasy worlds, cutting many off from even basic social interaction, let alone the actual reality surrounding them. It is now not uncommon to find two people sitting at a table “texting” each other.

Everyone now has such devices, and the result is the constant use of these instruments to promote nothing more than gibberish, grammatically deficient language, new forms of abbreviation, and a narcissism not seen in the world before, which social media technologies, with their promotion of “I am important!” sociological constructs, have done the most to encourage.

The most dangerous health aberration from the use of such individualized devices is the continuing bursts of low-level radiation one receives when such a device is placed near the body, especially the ear when making phone calls. Such exposure is cumulative, and every dose near the brain increases the chance that the user will experience devastating brain cancer in the future.

Just as detrimental, such technologies provide capabilities so powerful that many now use them as substitutes for Human functionality that the brain once commonly performed on its own. Such use has become so widespread and continuous that recent scientific studies on the consequences of this technological reliance have, over the past 10 to 15 years, shown a major decrease in the average intelligence of most people who rely on such devices.

However, such deterioration actually began with the basic use of the Internet for research and information lookup. Even then, gone were the days when students would learn how to do basic in-depth research. Why should they? Everything was now on the Internet.

The situation became so chronic at the time that many professors and teachers in university environments began to see a severe increase in patterns of cheating, plagiarism, and simply lazy work devoid of any substance.

“The Atlantic” provided some in-depth insight into this issue with the publication of its article, “Is Google Making Us Stupid?”.

More recently, sociologists researching the chronic use of smart devices have found that such reliance does in fact destroy one’s ability to understand the information one is looking for, since all of it is now presented in “sound bite” form. Because information can be so easily looked up when needed, people no longer actually learn it; such researchers describe this as acquiring information “out of context” from the relevant subject matter.

Together, all these seemingly simple, harmless actions break down not only the individual’s capability to think critically; with such widespread usage of devices in this manner, they infect an entire society, dramatically reducing the intelligence of society at large.

Is it any wonder, then, that university students and young adults are no longer able to perform mental tasks that years ago were expected of people? Is it any wonder that so many of the elected leaders in the United States have personalities and intelligence levels that mirror nothing more than those of school-yard bullies, no longer capable of rational thought on the momentous issues of our time? Such low intelligence in the US military has actually sparked a new debate over the possibility of a “winnable nuclear war”, when in reality there never has been and never will be any such possibility.

With such device usage, many people have come to believe that they can actually multi-task like a computer, which is an entirely false perception. Most consumer and business computers are not designed with “parallelism” and thus only appear to be capable of multi-tasking; in reality these machines are merely task-switching in a sequential fashion. Humans are very similar to such machines in that the Human brain can perform conscious thinking only in sequential order, though it appears that we are doing many things at once. Only machines designed with inherent parallelism can actually do multiple things at the same time, and even they require multiple cores to do so. For Humans to achieve this level of capability, there would have to be a significant change in the way our brains are constructed.
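A minimal Python sketch may help make this distinction concrete (the code and its names are my own illustration, not drawn from the article or any particular study): the first runner interleaves time-slices of several jobs on a single instruction stream, which is task-switching; the second hands the same jobs to separate processes on separate cores, which is true parallelism.

```python
import multiprocessing as mp
import time


def sum_of_squares(n):
    """CPU-bound work: the sum of i*i for all i below n."""
    total = 0
    for i in range(n):
        total += i * i
    return total


def sum_of_squares_in_slices(n, slice_size):
    """The same work, but yielding control every `slice_size` steps so a
    scheduler can switch to another task, as an operating system does."""
    total, i = 0, 0
    while i < n:
        for _ in range(min(slice_size, n - i)):
            total += i * i
            i += 1
        yield  # hand control back to the scheduler
    return total


def round_robin(tasks):
    """One 'core' switching rapidly between tasks. The tasks appear
    simultaneous, but only one instruction stream ever runs at a time."""
    pending = dict(enumerate(tasks))
    results = {}
    while pending:
        for tid, task in list(pending.items()):
            try:
                next(task)  # run one time-slice of this task
            except StopIteration as done:
                results[tid] = done.value
                del pending[tid]
    return [results[tid] for tid in sorted(results)]


if __name__ == "__main__":
    n, jobs = 2_000_000, 4

    start = time.perf_counter()
    switched = round_robin(sum_of_squares_in_slices(n, 10_000) for _ in range(jobs))
    print(f"task-switching (one core): {time.perf_counter() - start:.2f}s")

    start = time.perf_counter()
    with mp.Pool(jobs) as pool:  # separate processes on separate cores
        parallel = pool.map(sum_of_squares, [n] * jobs)
    print(f"true parallelism:          {time.perf_counter() - start:.2f}s")

    assert switched == parallel  # same answers, very different clocks
```

Run on a multi-core machine, the task-switched version takes roughly the combined time of all four jobs, while the parallel version divides that time across the cores; switching quickly between things is not the same as doing several things at once, whether for machines or for brains.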

A recent article in the UK’s Guardian daily describes this fallacy in Human perception and the dangers it portends for us as a species (Why the modern world is bad for your brain).

Today the modern world has become nothing more than a technological morass that is chipping away at years of developed mental skills, a weakening that portends serious and dangerous possibilities for societies dependent on highly complex and detailed infrastructures. For example, climate scientists have categorically proved that the current, radical shifts in the Earth’s weather are directly attributable to the toxins and gases that industrialized societies are pouring into our atmosphere. Yet political leaders around the globe contend that nothing of the sort is happening, when events show they are demonstrably wrong. In other words, such people are either no longer capable of understanding what is actually happening in their own environments, or they feel that their profit-oriented agendas matter more to their own interests than anything else. They have thus lost the basic abilities required for mere survival.

Like the short-burst radiation that cell-phone users experience on a regular basis, the weakening of Human intelligence is a cumulative process. The idea that the Human species can ignore the acquisition of basic mental skills and rely on smart devices to make up for them is nothing more than marketing hype provided by the snake-oil salesmen of the technology industries and by futurists who have no conception of how sinister technology can be.

The brain is a muscle like any other in the Human body: it can be strengthened or weakened like any such organ. The present trends, however, are towards atrophy, and once that sets in there is no way to reverse it…

About Steve Naidamast
Steve Naidamast is a senior software engineer and military historian. As a software engineer, he has developed applications across a wide spectrum of platforms for over 42 years. As a military historian, he first became an expert on WWI combat aviation but later moved his studies to the causes of conflict, concentrating predominantly on the WWI era.
