Programmers/Software developers are dead


So I’m a programmer (software developer/engineer/architect/CTO, etc.), having worn many hats over many years.  Why are we dead?  We are the middleman.  We sit between a customer using technology and the creator of that technology.  We are currently needed to convert the creator's vision into lines of code/logic that orchestrate visual, audio, and perhaps tactile feedback so the customer can accomplish something.


If the creator could do this, THEY WOULD.  They don’t want to spec out requirements in what is usually an abstract way to explain to an architect/programmer how to accomplish their tasks.  They don’t want to spend the $$, time, and frustration dealing with various people, along with the normal miscommunication that occurs when more people are involved in the process.  They want to focus on their idea and their process and have it just happen.  Who wouldn’t!  But today it’s extremely difficult, so the programmers step in, convert the creator's thought to code, and bridge the idea to the customer in a real way.


Enter natural language processing (NLP) to solve the creator-to-customer gap… or better yet, enter NLP to change the game so the creator (or customer!) can solve the problem differently and more naturally.


Take the NEST thermostat product as an example.  In reality, customers just want to say “keep the house 72 degrees, except at night make it 66,” or “when I’m away, have the temperature stay at 55 minimum and 82 maximum.”  Customers neither need nor want a device/interface that is hard to understand.  Programmers really don’t want to implement a difficult or complex interface.  Customers don’t care how the problem is solved; they just want it to work.  The old Honeywell-type thermostats were/are awful.  They had the cool/heat switch, plus other switches and buttons nobody knew what to do with or why they were there.


Who cares!!! Just keep it 72!   Hot or cold.


Customers ‘may’ want the current temp displayed, ‘may’, but that’s about it.  If the NEST thermostat had a voice interface, this would be closer to what a human actually wants.  The device could be much less expensive as well, needing only a sensor, Wi-Fi module, microphone, and speaker.  There would be a huge reduction in work for the creator:


-fewer requirements specifications and less documentation

-less user interface design/coding

-minimal firmware (no rotating knobs to support)

-fewer meetings and less component sourcing



Once NEST or whoever builds the NLP language support for ‘CONTROLLING TEMPERATURE’, that language interface code would/should be the SAME CODE and language for EVERYBODY who makes thermostat controllers.  You don’t need to re-implement the language portion of controlling the temperature again.  The language a human uses, in say English, to control the temperature is finite and small.  The language intent can be shared by everyone interested in temperature control.  Low-level device hardware/firmware would still differ, but it is simple and minimal compared to user interface elements.
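To make that split concrete, here is a minimal Python sketch of the idea.  Every name in it (TemperatureIntent, ThermostatDriver, NestLikeDriver) is hypothetical: the shared language layer produces one vendor-neutral intent object, and only a small vendor-specific driver touches the hardware.

```python
from dataclasses import dataclass

@dataclass
class TemperatureIntent:
    """Shared, vendor-neutral result of understanding a temperature request."""
    target_f: int             # desired temperature in Fahrenheit
    schedule: str = "always"  # e.g. "always", "night", "away"

class ThermostatDriver:
    """Vendor-specific part: only the low-level hardware call differs."""
    def apply(self, intent: TemperatureIntent) -> str:
        raise NotImplementedError

class NestLikeDriver(ThermostatDriver):
    def apply(self, intent: TemperatureIntent) -> str:
        # ...talk to this vendor's firmware here...
        return f"set {intent.target_f}F ({intent.schedule})"

# The shared language layer hands the SAME intent to every vendor's driver;
# the driver is the only code each thermostat maker still writes.
intent = TemperatureIntent(target_f=72)
print(NestLikeDriver().apply(intent))
```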


Today NEST, Westinghouse, Ecobee, etc. do NOT SHARE THE SAME code for devices controlling temperature.  EACH is individually implemented again and again and again.  This is true for millions of other common language concepts (volume, change channel, lights on/off, …) across all computing devices.  The amount of time/money/effort spent RE-DOING concepts is enormous.  It is not shared because they are all using their OWN particular set of buttons/knobs/interfaces, etc.  They are not leveraging language/voice.  Each unique implementation requires many, many people, from programmers to architects to people managing the process.  Fewer embedded developers.


With a voice/NLP-based interactive implementation, there would be significantly less need for the middlemen (programmers, managers, etc.).  There would be a greater amount of shared language ‘code’ which could be leveraged across companies.  Like Linux and other open source solutions, it will eventually exist for language en masse.


With NLP, the number of tasks customers/users could access or perform would expand millions of times over.


“Millions”?!  That may sound grandiose, but it’s probably even more than millions.


Humans have evolved to be near-perfect language ‘machines’, able to carry on a dialog to exchange ideas, get and share information, etc., all with ease.  Other than the common language ‘challenge’ we all face communicating with each other in person, there are no interfaces/widgets we need to figure out when using language.  We simply say or ask.  Fewer interface developers.

We interact by saying what we want and hearing what’s possible and how to accomplish it.  This is UNLIKE the user interface screens we deal with today, which are really unnatural and extremely limiting.


The creator will then focus on the language area they need and the functions to be performed.  The future NEST thermostat ;-) creator will define the abilities of the device along with the language needed for them, and perhaps a display (temp?), and that’s it.  The new profession of “AI/language architect” (a type of programmer) will collect existing language solutions for temperature, humidity, etc. and use them in the product.  Of course, programmers still exist; we are not completely dead.  Implementing the firmware of the NEST 8.0 device and connecting it to the ‘intent’ of the NLP for temperature will still be required.  The language intent would go through a common format, deciphered down from the natural language.  This ‘common format’ is independent of the hundreds of actual human languages.  The implementation controlling the low-level temperature, etc. is still most likely different on different devices.
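As an illustration of that ‘common format’, here is a toy Python sketch.  The regex rules stand in for a real NLP model, and the format itself is invented for illustration; the point is that utterances in different human languages decipher down to the same language-independent intent.

```python
import re

def parse_utterance(text: str) -> dict:
    """Toy rules standing in for a real NLP model (illustration only)."""
    patterns = [
        r"keep (?:the house|it) (\d+)",  # English
        r"mant[eé]n la casa a (\d+)",    # Spanish
    ]
    for pat in patterns:
        m = re.search(pat, text.lower())
        if m:
            # The common, language-independent format the device receives:
            return {"intent": "SET_TEMPERATURE", "degrees": int(m.group(1))}
    return {"intent": "UNKNOWN"}

# Two human languages, one common format:
print(parse_utterance("Keep the house 72 degrees"))
print(parse_utterance("Mantén la casa a 72 grados"))
```

Each vendor’s firmware would then only need to act on the common format, not on the hundreds of human languages behind it.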


So the tons of programmers/architects, etc. required today can now do more AI/language work, along with whatever screen/data display work is still necessary.  The need to build torturous ‘icon’ interfaces, especially like NEST’s, will be all but eliminated.  From a developer's point of view, spending time on features like language, which people will use seamlessly, versus trying to create ‘tough’ icons/interface designs people won’t understand or even try, is very exciting, both personally and empathetically.


So, not all programmers are dead…  But like many other industries, as NLP takes over, they/we need to adapt or disappear.  We need to work on more AI/language-based approaches so future products are truly seamless and better for customers.  Fewer developers.


