From Hardware to Software: The Rise of Programmers and the Evolution of the Software Industry
The article traces the shift from hardware‑centric profit models to the dominance of free and open‑source software, highlighting how programmers emerged as a vital profession in the 1980s and how software now underpins virtually every aspect of modern life.
From Hardware to Software
Hardware used to be a cash cow until free software appeared.
People have constantly tried to lower the difficulty of programming, yet we still rely on software developers.
Hardware is a fickle business. Decades ago, assembling, building, and selling computers was a cash cow, but profit margins have since collapsed: Dell went private, and Gateway was acquired by Acer. These leading hardware firms historically steered clear of software, shipping PCs pre-installed with Microsoft Windows and bolting on subscription services to pad their revenue.
This bred frustration: users who had paid over $1,000 for a computer were then nagged by antivirus vendors for recurring payments.
When Microsoft still ruled the computing world, Steve Ballmer famously chanted "Developers! Developers! Developers!" to a packed arena, declaring his love for the company.
The argument was that selling software means selling something that can be reproduced infinitely, a "nothing" with enormous profit potential, until someone else offers it cheaper or for free. That is exactly what happened: free-software systems such as Linux ate into the server market, and free web-based applications such as Google Apps gradually displaced desktop software.
Expectations for software have evolved constantly. IBM unbundled software from hardware in the late 1960s so it could charge for each separately; Microsoft bundled Internet Explorer with Windows and faced an antitrust lawsuit in 1998; Apple initially barred third-party apps from the iPhone in 2007, then launched the App Store, spawning a massive commercial ecosystem exemplified by games like Angry Birds. Today most devices ship with software already on board – a PC arrives with an operating system containing hundreds of programs – yet users still download or purchase additional applications.
For decades, there has been a promise that programming could become as easy as writing plain English, dragging icons, or listing rules so that even a smart executive or a child could code. Despite many attempts, we still depend on developers.
Out of this need, a craft and a profession emerged. It appeared in the early 1950s and truly took off in the 1980s, when a small group of specialists learned to command machines to satisfy everyday human needs – scheduling flights, delivering mail, even "killing zombies." These programmers turned keystrokes and numbers in memory into endlessly reproducible digital instructions we call "software," which now manages the world's economic infrastructure.
Most software written today is not as famous as Microsoft Word, yet it is everywhere, from custom-built systems to standardized, component-based products. Programmers build on the work of their predecessors and improve it. Whether a set-top box displays the program guide, an ATM dispenses cash, an elevator carries you to the fifth floor, or Facebook serves a billion users, software is at work.
In short, the programmer community rose in the 1980s, gained the ability to command machines to meet human needs, and now creates systems that run the world’s economic foundation.
Source: Business Weekly
Qunar Tech Salon
Qunar Tech Salon is a learning and exchange platform for Qunar engineers and industry peers. We share cutting-edge technology trends and topics, providing a free platform for mid-to-senior technical professionals to exchange and learn.