Von Neumann told Shannon to call his measure entropy, since “no one really knows what entropy is, so in a debate you will always have the advantage.”

Software wants to be simple. Software moves towards higher abstractions, not lower. Any attempt to reverse this may work in the short run but won't last in the long run. Abstraction is a defining trait of modern human behavior, and it shines in computer science.

Just observe the kinds of programming engineers do today. Most aren't writing low-level assembly or managing memory by hand. Few are rewriting the TCP/IP stack. Instead, we use human-readable programming languages and higher-order data structures.

Yet simplicity is valuable, and manufactured complexity is one attempt to capture that value. Managed services are an example of manufactured complexity: a company, usually the creator of or an expert in an open-source project, sells a hosted version of the software, handling updates, bug fixes, and uptime. The managed version often includes proprietary add-ons that ease maintenance: Neo4j offers clustering and backups, Confluent offers elastic storage and scalability, Redis Labs offers additional functionality and an SLA. It's a constant battle on two fronts: too little freemium functionality blunts bottom-up adoption and conversion; too much proprietary functionality creates an opening for open-source clones.

In the long run, simplicity always wins out.


As an aside, most of what you've read about entropy from writers and philosophers is wrong. There is no physical basis for computer systems to tend to become more complex and disorderly. Thermodynamic entropy is not disorder. Instead, entropy measures the dispersal of energy. It shouldn't be confused with information entropy, a completely unrelated topic.
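For reference, information entropy has a precise definition: for a discrete distribution it's the expected surprise, H = -Σ p·log₂(p), measured in bits. A minimal sketch in Python (the helper name is my own):

```python
from collections import Counter
from math import log2

def shannon_entropy(data):
    """Shannon entropy, in bits per symbol, of a sequence of symbols."""
    counts = Counter(data)
    total = len(data)
    return -sum((c / total) * log2(c / total) for c in counts.values())

# A fair coin carries one bit of entropy per flip.
print(shannon_entropy("HTHT"))  # 1.0
# A constant sequence carries no information at all: its entropy is 0.
```

Nothing here says anything about disorder or decay; it's just a measure of how unpredictable a source of symbols is.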