
The Evolution of the Developer

(Image: screenshot from Apple WWDC 2013)

There’s an ongoing conversation about the shifting role of developers and what it means for IT departments. Developers now shape product and user experience so directly that businesses must understand their influence in order to succeed.

For a great read on the rise of the developer, check out Stephen O’Grady’s book, “The New Kingmakers: How Developers Conquered the World.” In it, O’Grady traces the history of developers, the circumstances that expanded their resources, and how they became the ones calling the shots. He points out that even Apple, one of the most successful tech companies in the world, dedicated its homepage to thanking developers in March 2012:

(Image: Apple’s homepage thanking developers, March 2012)

It wasn’t that long ago, however, that enterprise buyers with deep pockets were the primary consumers of technology. Hardware and software were far more expensive than they are today, and even the foundational tools for a simple website were available only under commercial license: operating systems, web servers, development tools. The reality was that developers were at the mercy of their employer’s capital.

But that has changed. I’m hard-pressed to think of any paradigm shift, in terms of personnel, as drastic and rapid as the new role of the developer. With software free and readily available, the last ball and chain shackled to developers’ ankles was hardware; with the rise of the cloud market, they now have a newfound stray-dog freedom.

And so we’ve seen cloud computing disrupt the industry’s power structure. A dedicated server used to be essentially unaffordable for an individual developer, and shared hosting was the only viable alternative. With cloud technology, hardware is virtualized and managed by a hypervisor, which administers servers and partitions CPU, memory, storage and network among them. Each customer gets a private virtual server instance with its own reserved resources, so there is no competition among users and the instance looks and behaves like a dedicated server. This allows for great flexibility and near-limitless options, which will undoubtedly improve the agility of businesses willing to embrace cloud infrastructure.
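The partitioning idea described above can be sketched in a few lines of Python. This is a toy model for illustration only, with made-up names and capacity numbers, not any real hypervisor’s API: the hypervisor reserves a fixed slice of the host for each tenant, which is why each virtual server behaves like a dedicated machine.

```python
from dataclasses import dataclass


@dataclass
class PhysicalServer:
    """The real hardware the hypervisor manages (illustrative numbers)."""
    cpu_cores: int = 32
    memory_gb: int = 128


@dataclass
class VirtualServer:
    """A tenant's slice: to its owner, it looks like a dedicated server."""
    owner: str
    cpu_cores: int
    memory_gb: int


class Hypervisor:
    """Toy model: partitions one host into isolated virtual servers."""

    def __init__(self, host: PhysicalServer):
        self.host = host
        self.guests: list[VirtualServer] = []

    def free_cpu(self) -> int:
        return self.host.cpu_cores - sum(g.cpu_cores for g in self.guests)

    def free_memory(self) -> int:
        return self.host.memory_gb - sum(g.memory_gb for g in self.guests)

    def create_instance(self, owner: str, cpu: int, mem: int) -> VirtualServer:
        # Each customer gets a fixed, reserved slice of CPU and memory,
        # so tenants never compete for the same resources.
        if cpu > self.free_cpu() or mem > self.free_memory():
            raise RuntimeError("host capacity exhausted")
        guest = VirtualServer(owner, cpu, mem)
        self.guests.append(guest)
        return guest


hv = Hypervisor(PhysicalServer())
a = hv.create_instance("alice", cpu=4, mem=16)
b = hv.create_instance("bob", cpu=8, mem=32)
print(hv.free_cpu(), hv.free_memory())  # prints: 20 80
```

Real hypervisors also virtualize storage and network and schedule CPU time dynamically, but the economic point is the same: many paying tenants share one physical machine, each behaving as if it were theirs alone.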

Other significant factors mattered too, particularly the Internet, which let developers interact and collaborate directly; the full breadth of that topic is beyond the scope of this piece. But as important as the dot-com boom was for developers, open source played the truly critical role in their empowerment. The availability of countless software projects, free of cost, changed the IT industry in a way that can never be reversed.

Not only do developers control the code and applications, they control the infrastructure underneath, which demands the integration of software development and information technology. It won’t be long before IT departments revolve around developers, accelerating the already rapid pace of innovation in cloud technology. Moreover, it’s only a matter of time before Fortune 500 companies move their IT entirely to the cloud, and when they do, it will be developers and operators, not the CTO or CIO, making those decisions.

The main deterrents for businesses considering cloud-based solutions are still privacy and data security. They view the cloud as risky because they don’t control and manage the security of the servers holding their data. But as Rick Spickelmier, CTO of Birst, points out, “organizations actually have far less control over their in-house data than they might believe.” Weigh the two approaches, and it becomes apparent that many organizations would be better off putting their data in the cloud.

That’s because, as Spickelmier says, cloud hosting providers invest “far more in physical and digital security infrastructure than most corporations because of economies of scale. Unless your company is willing to secure its datacenter with biometric scanning and advanced surveillance systems; or invest in encryption methods, third-party certifications, and regular testing against attacks; you cannot provide the security that top cloud providers offer. In many cases, companies who move to the cloud get security they didn’t have and never knew they needed.”

It’s simple: Businesses agile enough to embrace cloud infrastructure will have an advantage over those that resist it. And employers willing to accept the developer’s newfound prominence will fare better than those slow to adjust to this new reality. As O’Grady puts it in his book:

“Developers are now the real decision makers in technology. Learning how to best negotiate with these New Kingmakers, therefore, could mean the difference between success and failure.”

Ben Uretsky is co-founder and CEO of DigitalOcean. Reach him @benuretsky.
