A recent piece in CIO Magazine laments the growing communication chasm between Information Technology professionals and their non-technical counterparts. The article can be read at the following link… CIO Magazine article
It is a perceptive piece of writing that should be read by every technical professional. The article does not detail the history that led up to this situation; instead, it provides insight into the consequences of a long history of terrible mismanagement, in which many technical innovations were produced but poorly implemented.
What the article does document are the increasing tensions between Information Technology (IT) and non-IT personnel in US corporations, and their repeated failures to align themselves with the best interests and goals of their organizations.
This situation did not occur overnight, and it is certainly not entirely the fault of the younger millennial generation now taking the reins in much of corporate software development, though they bear a bit of the blame as well. Their misguided enthusiasm for new products and technologies is certainly a factor, but even here the lack of guidance is a distinct issue resulting from poor leadership.
There is certainly a division of thought between IT and non-IT personnel, which seriously affects how both parties communicate with each other. However, this lack of understanding resides more in the non-IT arena, since it is this arena that requires a greater understanding of Information Technology. The IT profession has always been an area of contention and suspicion among those who do not work in the field. As a result, it is easy for consumers of technology, especially those in business, to commoditize it into something they can understand. Unfortunately, this has only widened the communication gap over the years.
Information Technology is a field littered with its own slang and terminology, in addition to a set of thought processes alien to those outside of it. It is, after all, a technical profession imbued with both a science and an art to its practice, as any creative technical profession would be. However, instead of trying to understand this difference with sophistication and acceptance, business leaders and managers, along with much of IT technical management, have attempted to mold the profession into something a business person could understand, without any credence given to the software development profession's inherent characteristics and differences. The result is, in essence, the catastrophic crisis between the two that the CIO article details.
During the mainframe era of computing, IT was known as MIS, or Management Information Systems. When it came to all things computing, MIS ruled with an iron fist, since few outside of it understood the ramifications of the decisions made within it. "Big Iron", as the mainframes were called, was something completely alien to the business environments implementing it at the time. This is not to say that MIS never took its role too far. Its most egregious mistake was that it made no effort as a professional community to train non-technical business personnel so that they could appreciate the complexities of the profession and the costs to business of mistakes made with such capabilities.
With the advent of microcomputer technologies in business, this was about to change, along with the rising business interest in globalization made possible by the new capabilities of the Internet. All of these areas of change and innovation would converge on the IT profession with a ferocity that would change the landscape of business.
Microcomputer technologies were seen as finally giving businesses the flexibility they never had with the mainframes, and once environments for the new software development efforts became the norm in the 1990s, business personnel were finally able to take advantage of this newfound flexibility.
Along with the rise of these new technologies, software development environments had to keep pace, and one elemental innovation was a new style of programming, "Rapid Application Development" (RAD), whose concepts allowed new applications to be developed far more quickly than had been possible on the mainframes. It was a natural evolution, in that these more nimble technologies made development in general far easier to pursue on a creative level as well as a qualitative one.
Unfortunately, RAD was also seen as a way to foster development on a competitive basis between businesses. The very flexibility that distinguished it also allowed it to be radically misused and mismanaged, as the old software development paradigms slowly came to be seen as unnecessary, antiquated, and inefficient. No one really thought about the development process itself, or about the fact that it did not matter what type of machine development was being done against; standardized practices were still required for quality, which in and of itself was a prime factor in keeping costs manageable.
Without such consideration, both business and technical managers rushed into the development of projects at scales not possible in the mainframe days. This lack of regard for practical software development standards allowed management to develop unrealistic perceptions of these new technological innovations. Even newer, smoother development processes backed by quality engineering were simply not enough to satiate the appetite for fast application delivery.
As the economic forces of globalization took hold in the US business world with the advent of the modern Internet, business leaders began to adopt corporate doctrines that would transform their organizations. Much was written and documented about such transformative changes in the work environments of US corporations, but little was done to rectify them beyond lip service to certain legal niceties. Business would simply become a brutal form of economic warfare that tore down the original MIS infrastructures as too costly, transforming those environments into soulless, competitive entities.
This process saw the dissolution of the formalized MIS structures that had been successfully built during the mainframe era. Subsequently, the IT profession saw the removal of business analysts, systems analysts, architects, designers, database administrators, project leaders, and so on down the IT hierarchy, leaving only the absolute minimum personnel necessary.
While this devolutionary process was ongoing, management developed the entirely unfounded notion that a single developer could perform requirements and systems analysis, functional specification development, implementation, and delivery, with the further expectation that such a professional would also accrue the necessary business knowledge for the applications he or she was developing.
With the former paths to credible communication between business and technical divisions now gone, all such endeavors fell to the minimized software staffs and their growing contingents of outsourced personnel, consultants, and freelancers.
The push to outsource technical work and staff to foreign nations, at the same time as younger, more radical, deadline-driven foreign managers were being hired in the continuous search for lower business costs, eventually broke the once strong bonds of trust between technical professionals and their employers.
The business effort to transform IT departments into lean operating cost centers came with significant negative trends that increased the underlying confusion, resentment, and tension. It is this last that has brought a long process to a head in the very issues the CIO Magazine article now depicts.
Combined with the importance of the Internet in corporate development strategies, this perspective has played a substantial role in alienating technical personnel. As businesses demanded faster development processes, US technical vendors complied by adding ever more sophistication to their products while promising ever faster deliverables.
This combination of events produced its own aberrations that drew software developers away from historical, standardized techniques of application development, beginning with concepts such as "Extreme Programming" (XP), which was ultimately a complete failure. Out of this failure came the "Agile Manifesto" and the subsequent "Agile" development paradigm, which has also proven unable to move the project failure rate below its consistent average of 70%. Nonetheless, both concepts promoted the ability to deliver software faster than before by eliminating critical design components from the software development process. Unfortunately, many technical professionals were not aware that the new "Agile" concept was merely a re-tooled version of the "incremental" and "evolutionary" software engineering cycles, stripped of the vital necessities that made those two principles successful.
Many organizations, anxious to promote their own efficacy in light of decreasing resources, used the ideas of “Agile Development” merely as cover for a “get it done now” organizational culture or what had been known in the mainframe era as “guerrilla programming”.
Technical acumen in the new innovative technologies built upon the Java and .NET development environments gave way to technical fads and whims as mobile computing and social media began to have their negative sociological effects on society.
In addition, a new idea was being floated that business requirements were changing in the 21st century (though no one has ever adequately described what exactly these changes were) and that what was now needed were tools that could change with them. Translated, business wanted software development environments that could be molded as business demands supposedly warranted.
Based on these changing sociologies and this new business imperative, new techniques were constantly sought to ramp up development speed. This fomented a shift in software technologies from well-founded tools to newer sets of technologies that their promoters touted as more beneficial to developers than the older, "legacy" application development tools. The surprise, however, is that these newer tools actually exhibited qualities of far older tools that had, in their own time, also been described as aging and inefficient for their tasks.
Since so much development was being done for the Internet, the most transformative of these new technologies would come with the rise of an aging design-pattern called MVC for “Model View Controller”. Where N-tiered development standards had done a credible job of formulating many application designs, MVC would now take center stage in a software technology environment that was becoming increasingly fragmented by both newer business demands as well as the vendor developments of new tools.
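For readers unfamiliar with the pattern, the separation MVC prescribes can be sketched in a few lines. The following is an illustrative, framework-free sketch in Python, not tied to ASP.NET MVC or any Java framework; all class and method names here are invented for illustration:

```python
# A minimal, hypothetical MVC sketch: the model owns data, the view renders,
# and the controller mediates between user actions and the other two layers.

class TaskModel:
    """Model: owns the application data and business rules."""
    def __init__(self):
        self._tasks = []

    def add(self, title):
        self._tasks.append(title)

    def all(self):
        # Return a copy so callers cannot mutate internal state directly.
        return list(self._tasks)


class TaskView:
    """View: renders model data; knows nothing about storage or input."""
    def render(self, tasks):
        return "\n".join(f"- {t}" for t in tasks)


class TaskController:
    """Controller: translates user actions into model updates, then
    selects a view to produce the response."""
    def __init__(self, model, view):
        self.model = model
        self.view = view

    def handle_add(self, title):
        self.model.add(title)                       # mutate state via the model only
        return self.view.render(self.model.all())   # respond with a rendered view


controller = TaskController(TaskModel(), TaskView())
print(controller.handle_add("write report"))  # -> - write report
```

The point of the pattern, whatever the platform, is that the view never touches storage and the model never formats output; the controller is the only layer that knows about both.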
Though the MVC paradigm had been around for quite a while and was a mainstay in large-scale Java environments, Microsoft's ASP.NET technologies still relied on software developers to make efficient design decisions without the constraints that the design pattern would impose. Yet much was made of the inefficiency of ASP.NET and the sloppiness of its self-contained paradigm, which was more the result of poor development quality than of anything having to do with Microsoft's original, innovative Internet design technologies.
Because MVC was thought to be something new, a new generation of professional developers promoted it as such. With MVC came yet another round of tools, technologies, and development paradigms, all mixing with stripped-down development environments that had long ago forgotten what quality development was about.
Surprisingly, few technical professionals today understand the sociological tremor that the rising popularity of ASP.NET MVC has caused in the IT profession, a tremor far greater than any Java development produced, since Java eventually found itself relegated to large-scale enterprise development as a result of its early focus on distributed systems. ASP.NET, however, had infiltrated businesses and their development organizations at every level; and so too now would its new incarnation with the MVC paradigm attached.
As for MVC being any more efficient than older methodologies, that is highly subjective. The performance of an Internet application is governed primarily by hardware and concurrency, and at large levels of concurrency, code efficiency will always be a secondary factor. As a result, there is very little that code, as long as it is clean, can do for the overall performance of a web-based application.
Further, MVC is a rather difficult and complex paradigm to implement efficiently, as is easily demonstrated by attempting it in a large-scale implementation. It requires much more knowledge than the original ASP.NET ever did, along with more complex tools and an emphasis on the client that entails loading far more supporting technologies for complex web pages, making front-end processing less efficient.
However, the popularization of MVC and its implementation on a variety of platforms has answered business's seemingly unbridled need for everything now, at the lowest cost. Its rise in popularity exemplifies the confusion now found in US software development organizations, all of which have become fractured by increasing techno-babble and self-congratulatory insistence that they are using the latest and coolest in technological innovation. In reality, they have sent the development world back to the days of DOS in its development style.
This underlying dissonance in IT organizations, exacerbated by the erosion of basic skills through over-reliance on word processors, social media, mobile computing devices, and other techno-whiz gadgets, provides insight into what is hampering communication between people in general. The concerns raised by the CIO piece are merely a reflection of the broader sociological landscape.
As business management tore up the traditional Information Technology landscape for its own agendas, the people making up those organizations were left to their own devices to attempt to satisfy ongoing, unreasonable demands, which fomented negative reactions of their own. This process has made concrete, substantial communication between the parties as much a thing of the past as the technologies that have been maligned in the effort to simply move ahead under such pressures…