…when everything becomes a computer, there will be no more computers
Starting Premise. 1 Let us agree, for the sake of a thought experiment, that a Human represents the ultimate Machine potential. In other words, every Machine that a Human creates is a subset of Human capabilities. Yes, a Machine may be stronger, faster, lighter, more durable, et cetera than a Human; however, the greatest capabilities with which a Human will be able to endow a Machine are those of the Human. For the reader who wants to debate Human flight, consider that the Human is capable of flight, albeit over extremely short survivable distances.
The Design Optimization complement to Human aspiration is a profoundly symbiotic form of Artificial Intelligence.
Back to the Future Past. In 1867, American inventor Christopher Latham Sholes struggled with improving the performance of the typewriter. If commonly used letter pairings such as S-T were struck too quickly, the mechanical linkages that transmitted the letter key selection to the striking of that letter on paper would get jammed. The typist had to stop Work, turn attention away from the source material, turn attention to the typewriter, untangle the jammed keys, return attention to the typewriter keys, reset their fingers, check that the typewriter carriage was in the proper spot, turn their attention back to the source material, and resume Work.
The speed of the typewriter (a Machine) was inhibited by the faster speed of the Worker. Sholes’ solution to this problem was to separate the commonly used letter pairings and place the most frequently used keys on the left side of the layout. This solution slowed the Worker so that the Machine could perform at its best. Unfortunately, Human obsession with technology enables Machines to transform and evolve much faster than the Human being itself evolves. Thus, the speed of the Machine today is inhibited by the slower speed of the Worker.
“Those who cannot remember the past are condemned to repeat it.” Reason in Common Sense – George Santayana, 1905. In 2021, while computing Machines operate at speeds greater than that of Human thought, we must revisit System Solution design to avoid unintended consequences arising from lack of foresight.
CognitiveVirtual by SwissCognitive
Global Online AI Event Series
07. April 2021
Event Recording – Panel Discussion – With Input from Stewart Skomra
“User” Anomaly. Except with respect to most electronic Machines (e.g., computers, tablets, smartphones, et cetera), the User role does not occur in Humanity (except for references to consumers of recreational drugs). Drivers drive cars. Passengers ride in cars, in buses, on boats, on planes. People wear clothes, sleep on mattresses, swing golf clubs, hit golf balls, swim in pools, ride bicycles, enjoy entertainment, et cetera. Children play games, play sports, learn math, learn spelling, compose essays, read books, et cetera.
It is natural to discuss a homemaker using a stove, pots, and pans to make a meal and downright awkward to think of a User using a stove, pots, and pans… “User”, while necessary at the outset of the computer industry, should have become passé not long after Donald A. Norman’s 1999 publication of The Invisible Computer.
Please take note: When your child or grandchild asks for the iPad, they say: “May I play with/watch the iPad?” They do not say “May I Use it,” and (most important) they will grow to never Use computers in the future. Just as, if everything were the color blue, there would be no color blue, when everything becomes a computer, there will be no more computers. As goes the computer, so goes the User.
…when everything becomes a computer, there will be no computers.
In the context of Machines as subsets of the Human, it is fair to accept that most everything a Human may want to accomplish with a computing Machine was described by Dr. Vannevar Bush in his July 1945 Atlantic Monthly article As We May Think. One can make the case that Humanity has only recently, in the year 2021, delivered on the vision of Dr. Bush. Further progress may be hampered by Humanity getting in its own way.
As We May Do. ‘Humanity getting in its own way’ means mistakenly accepting the User role as fundamental – as if the ‘User’ were doing something [as in the performance of Work = (Force x Distance) + (Thought x Time)] more than retrieving data from one Machine process and feeding it into another Machine process.
Computers are Machines. A Machine is an invention created by Humans to make work easier by multiplying the effect of Human effort. When creating a system solution design comprising Machines that execute management science algorithms (e.g., Double-Entry Bookkeeping, Linear Programming, Off-Set Leadtime Planning, et cetera), placing the non-Value-Add User role at its center ensures the automation will never run faster than the slowest User role being fulfilled by a Human.
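The throughput argument above can be sketched in a few lines: a serial pipeline runs no faster than its slowest stage, so a Human fulfilling the User role between two Machine processes caps the whole system. The stage names and rates below are hypothetical illustrations.

```python
# A serial pipeline can run no faster than its slowest stage.
# Stage names and rates are hypothetical, in transactions per hour.
stages = {
    "machine_process_a": 1_000_000,   # automated upstream process
    "human_user_rekeying": 60,        # Human fulfilling the User role
    "machine_process_b": 1_000_000,   # automated downstream process
}

# End-to-end throughput is the minimum stage rate.
throughput = min(stages.values())
print(throughput)  # the Human User caps the system at 60/hour
```

No matter how fast the Machine stages become, the minimum stays wherever the Human User sits.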
Unfortunately, as Humanity continues relentlessly advancing technology, what many Users today actually do is stare at refracted light (sometimes reflected, as in the case of reflective display technology) or listen to synthesized speech and populate data (through myriad input mechanisms) that cause a Machine process to take the next programmed step in emitting refracted light and/or synthesizing speech.2
What is fundamental is that which a Worker does, as in Work = (Force x Distance) + (Thought x Time). It is what a Worker does, and not how they spend time, that is fundamental and needs to be elevated in this revisiting.
System Solutions must be revisited knowing that the same Management Science and Physical Science has been automated upstream, downstream, and throughout extended enterprise system solutions. It comes as no surprise that the “Outputs” of one enterprise look a lot like the expected “Inputs” of another enterprise. What becomes a surprise is when there is a tremendous mismatch between enterprise-to-enterprise systems, especially in the post-Y2K era. Chances are good that an IT vendor and/or IT Employee convinced one or more Workers in one or both enterprises that they are so special that unique software must be written for them.
Back to the Future. Enabling Machines to automatically create Machine automation (i.e., computers programming computers) may require a breakthrough approach such as the movement from 3rd-Person Design for 1st-Person Execution to 2nd-Person Design + Execution. This new design perspective may then enable the conceptualization of the “Mind” of the first Machine interacting within itself and its environment, conceptualizing and realizing a “Mind” of a second Machine, and so on.
Time out & Back up. Before attempting to wrap your head around the above paragraph, let us agree that, for Machine automation to achieve its potential, there would need to be a change; that something is broken and needs to be fixed. The old saw says, ‘The first step in a cure is accepting that you are sick’. So, what is wrong with what we are doing with computers today that precludes advancement, why is it that way, and what do we need to do differently?
Admitting Sickness. Anecdotally, while information technology has advanced exponentially, the increase in marginal productivity from the investment in information technology has been in decline. In other words, the same or more investment of Time + Talent + Treasure into exponentially advancing information technology is met with smaller and smaller improvements in outcome.
In 2021, every Design Engineer has multiple Computers (e.g., design workstation, notebook computer, tablet, smartphone, et cetera) at their disposal, each of which is over 1,000,000 times the power of the single computer they had in 2001. New product development success has not advanced at the rate of the empowering underlying Machine automation.
With an overabundance of computers, most product development continues progressing incrementally, with the success of new product development initiatives being not that much greater than the success of a Startup. It is no surprise that bright, energetic, talented engineers abandon what truly are exciting projects within large enterprises, throw caution to the wind, and become entrepreneurs.
Let us assume this flagging growth in productivity in the face of exponentially growing technology resources is true and that it is a problem. What is the Root Cause of this paradox? Simply stated, the Root Cause of today’s declining marginal returns in productivity and operational efficiency from investment in Information System Solutions is the failure of Computer Science to implement Information Science for Optimization of how we Live – Work – Play. This failure will be overcome by establishing and working from a new perspective.
The rise of User. Fifty years is not a very long time in the history of Humanity. In the early 1970s, long before any talk of computer User Experience or the Human-Machine Interface (HMI), most all HMI was punched-card and paper-tape data input resulting in the delivery of fan-folded, green-bar printed paper information output.
Green bar report production, sorting, delivery, storage, and archiving became an around-the-clock operation within each company that implemented computer automation earning itself a newly established functional department named Data Processing or DP for short.
DP initially served business functions directly responsible for: keeping track of money (Accounting), spending money on Workers (Payroll), and spending money on things (Materials).
Accounting, Payroll, and Materials Workers became the primary consumers of the output. While these Workers certainly used the DP department output, their primary role remained Accounting, Payroll, and Materials. None of these Workers was a ‘computer user’ – or ‘User’ for brevity.
Humans continued advancing computing Machines, enabling the concept of online transaction processing (OLTP) or being “online” with the computer. Through typewriter-style keyboards, stylus, light pen, microphone, camera, mouse, joystick, et cetera, humans fed data input to the Machines. Through video display terminals, speakers, printers, and plotters, humans consumed the information output from the Machines. Data Processing or DP evolved to become Management Information Systems or MIS.
The field of Management Science (initially pioneered in The Principles of Scientific Management – 1911, by Frederick Winslow Taylor) reached its zenith in the mid-1960s with the work of George Plossl and Oliver Wight, memorialized in their seminal 1967 work: Production and Inventory Control – Principles and Techniques. Combined with Marks’ Standard Handbook for Mechanical Engineers – 1916-2016, we have algorithms for the molding of natural resources (“Physical Science”) to produce tools and Machines and the principles and techniques to plan, schedule, produce, and deliver (“Management Science”) volumes of these tools and Machines. Imbuing Machines with these algorithms, principles, and techniques is the province of Information Science.
Information Science took significant steps forward when talented IBM Information Science professionals in 1973 published IBM COPICS – Communications Oriented Production Information and Control System, serving as a Storyboard, External Design, and Information Model to realize automation of Plossl & Wight Management Science.
Proliferating User. IBM lore holds that the founders of SAP SE were authors of essential COPICS system modules. These IBM Germany professionals saw the creation of Financial Accounting and Back-Office automation programs 3 as an opportunity that they could not pass up. Unfortunately, COPICS and most every other attempt to apply Computer Science to Management Science to create Information System Solutions were products of systems engineers steeped in 1970s foresight. COPICS and all other seminal work in OLTP did not imagine that computing technology would ever evolve to enable the actual product being produced through a manufacturing process to monitor and report on its own state and condition.
In working to realize IBM COPICS and myriad other OLTP system solution offerings, wherever the system designers encountered a capability that could not be performed by a Machine, they inserted a Human Worker in the User role. The User role itself is a non-Value-Add role (versus the Value-Add or Cost-Add roles of Value Chain definitions), created because Humanity did not yet know what it did not know in the continuously advancing Computer Science & Information Science symbiotic relationship. Sadly, in most cases in the year 2021, the User role persists as a vestige of early Computer Science challenges in enabling one Machine (i.e., Computer) to communicate with another Machine.
Throughout the 1980s-1990s, Computer Science continued its advancement, producing reentrant, multitasking OLTP offerings built on multi-threaded, concurrent, multiple input-output, multi-programming models that would facilitate the realization of COPICS and similar pioneering efforts of the period.
MIS proliferated throughout most all businesses with green-bar reports and OLTP as its products. MIS employees would show up at the Accounting, Payroll, and Materials offices with a video display terminal (VDT) and keyboard, plop it on the Worker’s desk, and say: “Use this and you will not have to wait as long for your reports!”
A very subtle transformation began with the introduction of an additional role being filled by the Accounting, Payroll, and Materials Workers: User. MIS continued to advance with capabilities enabled for most every functional role within a company. Most every Worker began to take on this additional User role.
Humanity almost imperceptibly assumed this computer-related additional User responsibility, taking on the role of feeding the Machines and distributing the Machine output that had formerly been held by MIS. Most every Worker became a User. Since not every Worker is a manager, MIS renamed itself with the more universal designation of Information Technology or IT.
Systems Solution Perspective. Complete end-to-end IT system solutions were developed, IT careers were created, and books were written. Academia created undergraduate, graduate, and post-graduate curriculum feeding what became known as the IT Industry with workshops, seminars, and conferences. Throughout all this growth and evolution, attention transitioned from the Value-Add Worker role to the non-Value-Add User role to the point that the IT Industry primarily monetizes its offerings based on the number of Users versus value delivered.
Unfortunately, few readers of Donald A. Norman’s contributions to the 1986 User Centered System Design: New Perspectives on Human-Computer Interaction may have read Norman’s complete works on design (not least of which is the 1999 The Invisible Computer). It is likely that only a very small minority read any more than the first chapter of any of Norman’s works; anecdotal estimates are that less than 10% of non-fiction readers continue past the first chapter.
Unintended Consequence of User-Centered Systems Design. With a Systems Solution perspective and armed with seminal works such as User Centered System Design: New Perspectives on Human-Computer Interaction (Donald A. Norman & Stephen W. Draper, 1986), emphasis was thrust upon “User Experience”, with IT Employees altogether bypassing methodical, rational product planning and instead rushing out to the User (‘Worker’) and asking them what they need.
The Worker never considers themselves to be a User. Workers are hired to fulfill a Role and, in doing so, earn and are paid wages – in other words, to do a job. What the Worker needs is #1 to be left alone to do their job, #2 a sense of security in their job, and #3 opportunity for growth. Since the IT Employee has already violated #1, the Worker focuses attention on #2 while hoping for #3. The result of the IT Employee-to-Worker interaction is that the Worker/User answers the IT professional’s request by telling them that they need something new and different from what they have.
This begins mutually beneficial co-dependent relationships between individual IT Employees and individual Workers that, unfortunately, serve as a detriment to the overall good of the company. To satisfy one another’s personal professional interests, they convince one another that what the User is doing is so special and so unique that it requires a special and unique IT solution.
IT vendors (the ones selling off-the-shelf System Solutions) thrive off this IT Employee-Worker dynamic, as it drives customization of their off-the-shelf offerings, making it very difficult to ever replace the solution. The IT Employee, the User (i.e., Worker), and the IT vendor thereby create a dependence upon themselves, individually and collectively, going forward. Unfortunately, IT vendors, IT Employees, and Workers retire and expire, and the company suffers what could be operationally catastrophic disruption.
The IT vendor, IT Employee and Worker co-dependency holds the system solution potential hostage until one or more of these actors retires and/or expires.
2000s – Today
Fear is a Powerful Motivator. In anticipation of “unknown unknowns” wreaking havoc throughout Humanity at the turn of the last millennium, beginning in the late 1990s, most all company IT systems were wholesale replaced with alternatives free of the Y2K 4 problem. Myriad legacy systems held hostage by the IT Employee – Worker co-dependence were simply tossed out and replaced with new IT vendor offerings immediately seeking to institute the IT vendor – IT Employee – Worker co-dependency. Unfortunately, many succeeded.
Variants of Sameness. OLTP System Solutions share the same lineage. Because of the Y2K wholesale replacement of the legacy OLTP offerings with a few IT vendor offerings, the common lineage is now much more highly concentrated. All the OLTP System Solutions share the same Management Science and Physical Science heritage, and today most all share the same or similar IT software base. With respect to these systems, 15th-century double-entry bookkeeping is the backbone – there is a place for everything, and everything needs to be put in its place – wash, rinse, repeat. It should come as no surprise that Machines are able to learn a process and then repeat the process of how companies operate.
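The double-entry backbone described above can be sketched in a few lines: every transaction posts equal and opposite entries, so the ledger’s defining invariant holds after every posting – wash, rinse, repeat. The account names below are hypothetical illustrations.

```python
class Ledger:
    """Minimal double-entry ledger: there is a place for everything,
    and every posting keeps total debits equal to total credits."""

    def __init__(self):
        self.balances = {}  # account name -> signed balance

    def post(self, debit_account, credit_account, amount):
        # Each transaction touches two accounts with equal, opposite entries.
        self.balances[debit_account] = self.balances.get(debit_account, 0) + amount
        self.balances[credit_account] = self.balances.get(credit_account, 0) - amount

    def is_balanced(self):
        # The defining invariant of double-entry bookkeeping.
        return sum(self.balances.values()) == 0

ledger = Ledger()
ledger.post("Inventory", "Accounts Payable", 500)   # buy materials on credit
ledger.post("Accounts Payable", "Cash", 500)        # pay the supplier
print(ledger.is_balanced())  # True after every posting, by construction
```

Because the invariant is purely mechanical, it is exactly the kind of process a Machine can learn and repeat without a User in the loop.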
RPA: The New QWERTY. In the year 2021, Robotic Process Automation (RPA) is on the rise within the field of Artificial Intelligence (AI). A simple definition of RPA is having a computer learn basic, repetitive tasks performed by a User so that the computer (a Machine) can perform the same User task. After nearly 50 years of obsession over “User”, this non-Value-Add role will be subsumed within Machine automation. It does seem fitting that the Machine should assimilate the function of exchanging data between Machines. Are we, though, creating another QWERTY keyboard? Since the system solution was purportedly developed as “User Centric”, would it not be prudent to step back and revisit system solutions, making them Work and Worker Centric?
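RPA as defined here – a Machine learning a User’s repetitive task and replaying it – can be sketched as a record-and-replay loop. The screens, field names, and bot below are hypothetical illustrations, not any vendor’s API.

```python
class RpaBot:
    """Sketch of record-and-replay RPA: watch a User copy fields
    from one system's screen into another's, then repeat it unattended."""

    def __init__(self):
        self.recorded_steps = []  # list of (source_field, target_field)

    def record(self, source_field, target_field):
        # "Learning" here is simply remembering the User's field mappings.
        self.recorded_steps.append((source_field, target_field))

    def replay(self, source_screen, target_screen):
        # Perform the same non-Value-Add transfer the User once did.
        for src, dst in self.recorded_steps:
            target_screen[dst] = source_screen[src]
        return target_screen

bot = RpaBot()
bot.record("invoice_total", "amount_due")   # observed once from the User
bot.record("vendor_name", "payee")

erp_screen = {"invoice_total": 1250, "vendor_name": "Acme"}
payment_screen = bot.replay(erp_screen, {})
print(payment_screen)  # {'amount_due': 1250, 'payee': 'Acme'}
```

Note that the bot merely automates the existing User role; like QWERTY, it optimizes around the constraint rather than removing it.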
Solution: Move to the 2nd Person. The cure – the way to break the Information Technology Industry out of its malaise – is simple. A “User”-centric approach of Information System Solution design has the design of a solution accomplished in the 3rd Person with the intent of a Worker employing the product in the 1st Person. Worker-Centric design, where the Machine performs as an enhancement to the Worker must be accomplished from a 2nd Person perspective (in the spirit of Free Indirect Speech style).
There are few noteworthy novelists of acclaim writing in the Free Indirect Speech style, the most notable being James Joyce, with the easiest work to consume being his 1916 A Portrait of the Artist as a Young Man.
“History, Stephen said, is a nightmare from which I am trying to awake.”
A Portrait of the Artist as a Young Man,
James Joyce, 1916
2nd Person perspective is that wherein the Designer/Developer is immersed directly within the context of the Information System Solution Execution. Unlike the traditional history-biased 3rd Person perspective, from which a Worker employs a product in the 1st Person perspective, the 2nd Person focus is on the present and the future. The 3rd Person perspective enjoys the benefit of hindsight – learning from experience. Unfortunately, too many IT Workers and IT vendors confuse prediction (what the future will be) with forecasting (what the future should be). It is this “rearview-mirror forecasting” that results in tremendous misses in prediction. Leaning so heavily on historical data for predicting future-history – as most everyone does – is equivalent to declaring: “My, what a wonderful future you have behind you!”
Design Optimization is Amid the 2nd Person Vanguard. Machines assisting Humans in design – formulating, executing, and learning from executed plan results – is the nature of AI – Artificial Intelligence realized through Engineering Design Optimization. Computational algorithms and methodologies applied to design enable engineers to feed their designs to Machines executing powerful optimization engines. Engineering Design Optimization frees the engineer to focus on design needs and desired outcomes.
Through Engineering Design Optimization, Machines – computers and algorithms – enable Workers to create more economically and ecologically responsible designs, all while consuming fewer resources. This virtuous cycle can be extended from optimized design to optimized planning to optimized execution, and it demands we revisit Enterprise System Solutions and the non-Value-Add User role.
The Design Optimization complement to Human aspiration is a profoundly symbiotic form of Artificial Intelligence.
1 For the purpose of this narrative, we will not consider the post-Singularity Machine-Enhanced Human of Raymond Kurzweil fame.
2 Yes, the reader may take exception to the above and possibly cite fully immersive Virtual/Augmented/Mixed-Reality (VR/AR/MR) as examples – in 2021, these are predominantly point solutions and not System Solutions, or (in their dominant forms) serve Gamers, not Users.
3 General Ledger, Accounts Payable, Payroll (GLAPPR) & Billing, Inventory Control, Accounts Receivable, Sales Analysis (BICARSA)
4 For those unfamiliar, IT software developed prior to the Year 2000 (Y2K) in many cases represented the year as two digits (e.g., 1999 = ‘99’). Ambiguity arose when the year turned to 2000 as the Machine (i.e., Computer) did not know whether ‘00’ represented 1900, 2000, 1800, 1700, et cetera. This Y2K “glitch” required software to be rewritten to resolve the ambiguity and similar issues.
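One common repair for the ambiguity described in this footnote was “windowing”: choosing a pivot so that two-digit years map into a fixed 100-year window. The pivot value below is an illustrative assumption; real systems chose their own.

```python
def expand_two_digit_year(yy, pivot=70):
    """Resolve a two-digit year via a fixed 100-year window.
    With pivot=70, values 70-99 map to 1970-1999 and 00-69 to 2000-2069."""
    if not 0 <= yy <= 99:
        raise ValueError("two-digit year must be 0-99")
    return 1900 + yy if yy >= pivot else 2000 + yy

print(expand_two_digit_year(99))  # 1999
print(expand_two_digit_year(0))   # 2000
```

Windowing only defers the ambiguity to the edge of the chosen window, which is why fully rewriting software to four-digit years was the durable fix.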
About the Author:
Stewart Skomra is CEO of OmniQuest™. For over 35 years Stewart has driven New Product and New Market Development in Computer-Aided Design & Computer-Aided Manufacturing, Machine-to-Machine/IoT – Internet-of-Things, Supply-Chain Management, Auto-ID, and Wireless Technologies. From Blue-Chips including IBM, Intel, Qualcomm, and Trimble Navigation through multiple startups, he has led development initiatives serving industries including manufacturing, construction, distribution, transportation & logistics, wholesale & retail, consumer packaged goods, finance, insurance, healthcare, and multiple energy fields.
CognitiveVirtuals are regular worldwide-reaching online events bringing dozens of global AI leaders and experts together to share their views, experiences, and expertise in the development of AI to the benefit of business and society. These 3-hour-long events transparently address the development of cognitive technologies – including successes and challenges – while reaching and connecting a global online community of over ½ million followers.
All the sessions and formats are strictly content-driven with a non-sales approach, allowing focused and open discussions with content only. These events provide not only a platform to brainstorm and network but also to position experts, leaders, organisations, research developments, and the current status and future outlook of AI.