Wednesday, November 28, 2012

Lowering the Startup Barrier to Disruption Through De-Centralization

Today we are honored to have Yermo Lamers, noted software developer and tech leader, as a guest blogger. Yermo has developed many industry-leading software applications and has a deep background in communications software, data platforms, and command and control systems. In his "spare time" Yermo designed and developed a software platform for hosting socially aware portals, including his own at http://miles-by-motorcycle.com where you can follow his passion for motorcycling, and http://a-software-guy.com/ where he maintains a technology-centric discussion board.
 

By: Yermo Lamers


 

We have a strong natural bias to keep using existing ideas that have served us well. Once established, changing our thinking is difficult. But the world around us changes relentlessly. Herein are sown the seeds of disruption.

Oftentimes, incremental and seemingly insignificant changes in technology have huge effects that are not immediately apparent. Network speeds, which have been incrementally improving for decades, are such a change. Sure, we can do things faster, but what does it mean?

Human beings making a request of an information system will typically start to get bored after a few seconds and frustrated after a few more. That's the benchmark we look for in getting a response. In the good ol' days, network speeds were only fast enough to transmit a few raw characters over any distance in that timeframe. I remember as a little kid playing Zork after hours on a Silent 700 thermal-paper terminal connected by an acoustic coupler modem to a minicomputer at NASA. I think it could only transmit at 110 baud. You typed in a line, “Pick up sword”. You waited a second or two, and then the thing started whirring a response back at you. That system could transmit only one line of characters at a time “fast enough” to match human expectations. It was a natural consequence of these slow connection speeds that the world was ruled by dumb terminals connected to centrally located minicomputers and, on the high end, mainframes. Huge proprietary businesses with wide moats were built, protected by this centralization.
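To make that concrete, here's the rough arithmetic. The framing overhead and line length below are assumptions for illustration; only the 110 baud figure comes from my (possibly faulty) memory:

```python
# Back-of-the-envelope: how fast is a 110 baud terminal line?
# Assumes classic serial framing of roughly 11 bits per
# character (start bit + data bits + stop bits) -- an assumption.
baud = 110                      # signal changes per second
bits_per_char = 11              # assumed framing overhead
chars_per_sec = baud / bits_per_char

line_length = 40                # a typical short Zork response, assumed
seconds_per_line = line_length / chars_per_sec

print(f"{chars_per_sec:.0f} characters/second")    # -> 10
print(f"{seconds_per_line:.0f} seconds per line")  # -> 4
```

Ten characters per second puts a single short line right at the edge of the boredom threshold, which is exactly why nothing richer than line-at-a-time terminals made sense.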

Then one day, I heard about this impossibly fast new technology called Ethernet. “10 Mbit/s?” There are those who say the PC alone brought about the end of the minicomputer and mainframe era. I would disagree and suggest that it was, in fact, Ethernet that was the key to disrupting their world. Ethernet was fast. But what did it mean? Ethernet meant you could now inexpensively hook commodity machines together to distribute data quickly in a way that was not possible before. Expensive minicomputers, which used to be data store and computational powerhouse combined, could now be replaced by commodity machines that acted as essentially nothing but a data store. Computation was offloaded to relatively inexpensive workstations. The world of client/server was born, and a whole new industry came along to disrupt the one before. Importantly, economies of scale made the knowledge to run these machines a commodity, which lowered one of the big barriers to starting new businesses: talent was now available. In a way, it was increases in local area network speeds, more than the PC itself, that enabled the rise of Microsoft. Microsoft was able to see ways to exploit the new context with fresh eyes, at the expense of IBM, which was still caught in thinking that the centralized models that had worked before would continue to be competitive.

Microsoft built an empire on the local area network. They controlled the server and the workstation. What I did not see at the time was how Microsoft's moat was tied to a particular Goldilocks zone of wide area network speeds.

It started sometime in '93. I remember getting a US Robotics Dual Standard modem. It could talk to another modem of the same type at 14,400 bps. What did this mean? It meant I could download a complete distribution of Linux in some reasonable time. As modem speeds increased, it became easier and easier for programmers to start distributing what they had written. The free software movement had been around for quite some time, but it was the advent of the high speed modem that, in my opinion, was key to its rise. Just as I did, Microsoft failed to notice that this was a harbinger of things to come. Linux was not the threat; not by itself. Network speed was the real threat to its business model.
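How reasonable was "reasonable"? A rough sketch, assuming 8N1 serial framing and an early distribution spanning a couple dozen 1.44 MB floppy images (both figures are assumptions for scale, not exact history):

```python
# Rough illustration: downloading an early Linux distribution
# over a 14,400 bps modem. Figures are assumptions for scale.
bps = 14_400                      # modem line rate
bytes_per_sec = bps / 10          # ~10 bits/byte with 8N1 framing, assumed

floppy_bytes = 1_440_000          # one 1.44 MB floppy image
minutes_per_floppy = floppy_bytes / bytes_per_sec / 60

disks = 25                        # assumed size of an early distribution
total_hours = disks * minutes_per_floppy / 60

print(f"~{minutes_per_floppy:.0f} minutes per floppy image")  # ~17
print(f"~{total_hours:.0f} hours for the whole set")          # ~7
```

An overnight download had become a realistic proposition; at 2,400 bps the same transfer would have been a multi-day ordeal.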

As long as broadband speeds remained low and it was impractical to distribute truly large quantities of data quickly, Microsoft's position was defensible. They still controlled the client and the server on the LAN. However, at some point, wide area network speeds became fast enough to disrupt Microsoft's stranglehold on business processes. Mired in old ideas, one might think this is because it became easier to distribute large quantities of software, thus threatening Microsoft's hold on distribution. I would argue instead that once network speeds became fast enough, sometime in the early 2000s, they enabled the ascendancy of business models that could not have succeeded before. The era of Google, Facebook, and other Web 2.0 companies was upon us, and Microsoft's model of services centralized on the LAN was suddenly feeling antiquated. What did these new high speed network connections mean? Third parties somewhere out on the vast internet could now deliver experiences to users, within that critical couple-of-seconds window, that rivaled the experiences delivered by local desktop software. The browser became the platform.

This is actually what killed my little stock market software company. It turns out users hate installing software. They hate updating it. They hate not being able to use it wherever and whenever they want. With functionality delivered by Web 2.0, there's nothing for the user to install or update. They just log in and use the service on whatever device they want.

The availability of these Web 2.0 services also had a positive effect on small business. It meant that the cost to start a new business had once again been reduced. A business could now start out without needing to run its own administrative servers or even its own network: just use Google Docs, an online payroll service, and an accounting service to get off the ground. There's little need to fund an expensive IT staff for internal operations anymore, let alone pay Microsoft's licensing fees. Additionally, these services enabled a more mobile and distributed workforce. Most small businesses can't afford the infrastructure costs to make their internal, Microsoft-dominated networks available to a mobile workforce. With Web 2.0, they get a mobile-enabled workforce for free. This in turn lets the business look for talent wherever it happens to be, whether local, across the country, or around the world. The local area network that Microsoft dominated was another kind of centralized model disrupted by network speed increases.

Network speeds continue to increase, and there's a new, potentially larger disruption of a centralized model in the works. Network speeds are starting to go beyond “human response time” and are beginning to reach what we can call “machine speed”. Reaching a speed where a rich experience can be delivered to a user within their boredom threshold was the catalyst that disrupted one of the most successful businesses in history. What effect would it have if network speed increased to the point where in-machine and inter-machine data transfer rates became less distinguishable?

A core assumption in the last 40+ years of operating system design is that communicating with the outside world is slow. Machines are distinct, and services are centralized on the machine. I have my machine and you have yours, and they run distinct operating systems. Even if I have a datacenter, each machine runs its own OS.
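To put rough numbers on that assumption: the gap between “inside the machine” and “across the network” has historically been enormous, though physics puts a hard floor under how far it can close. The latency figures below are order-of-magnitude illustrations, not measurements:

```python
# Order-of-magnitude latency ladder; figures are illustrative
# assumptions, not benchmarks.
ns = 1e-9
latencies = {
    "main memory access":       100 * ns,
    "LAN round trip":           500_000 * ns,     # ~0.5 ms
    "cross-country round trip": 70_000_000 * ns,  # ~70 ms
}
for what, seconds in latencies.items():
    print(f"{what:<28} {seconds * 1e3:>10.4f} ms")

# Physics sets a floor: light in fiber travels ~200,000 km/s,
# so a ~4,000 km one-way path costs at least ~40 ms round trip
# no matter how fast the links themselves get.
fiber_speed_km_s = 200_000
distance_km = 4_000
floor_ms = 2 * distance_km / fiber_speed_km_s * 1000
print(f"speed-of-light floor: ~{floor_ms:.0f} ms round trip")
```

The interesting question is what happens as the middle rungs of that ladder keep sliding down toward the top one.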

If the network becomes so fast that I can call services between machines at something close to “machine speed” across the country or even around the world, the paradigm of single, distinct machines starts to look dated. What new disruptive decentralization might occur?

Web 2.0 companies brought zero-install, zero-maintenance services to the people side of the business. But for many companies, especially heavily online enterprises, there are still very significant costs associated with providing the core services of the business to its users. There are servers to configure. There are programs to write and third party components to integrate. There's database administration to do and patches to apply. There's also excess capacity that has to be built, maintained, paid for, and left idle to handle occasional unexpected surges in usage. Then there are all the very expensive salaries to pay. I've heard many entrepreneurs bemoan how difficult and expensive it is to find talent. Imagine being able to launch a new initiative faster, with less overhead, less staff, and less need for development and administration.

This new disruptive force is called the “cloud”. And by the cloud, I do not mean just virtual servers that you still have to administer. I mean the ability to distribute and auto-scale the components that have traditionally made up online software systems. This is called Platform as a Service and it does for software what Web 2.0 did for people. It radically decentralizes it.

With “Platform as a Service” (PaaS) models, you no longer run any kind of server at all. You develop your application, which represents your unique value proposition, and then push it out to your PaaS cloud vendor. The vendor runs your application as just another process in their distributed network of machines. There's no administration, and there is built-in scalability. If you happen to get favorable press and suddenly need more capacity, there's no need to wake up your IT staff, since you don't have one. You simply open your control panel and spool up additional instances of your application to match demand. There's no excess capacity for you to carry and pay for unnecessarily. Your operational costs are reduced. Your ability to respond is increased.
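As a sketch of what that looks like in practice; the endpoint, token, and payload here are entirely hypothetical, standing in for whatever API a given PaaS vendor actually exposes:

```python
# Hypothetical sketch: scaling a PaaS-hosted app through the
# vendor's HTTP API. The URL, token, and payload are invented
# for illustration; real vendors each have their own equivalent.
import json
import urllib.request

API = "https://api.example-paas.com/apps/my-app/scale"  # hypothetical
TOKEN = "s3cr3t-api-token"                               # hypothetical

def scale_app(instances):
    """Ask the platform to run `instances` copies of the app."""
    body = json.dumps({"instances": instances}).encode("utf-8")
    req = urllib.request.Request(
        API,
        data=body,
        headers={
            "Authorization": "Bearer " + TOKEN,
            "Content-Type": "application/json",
        },
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

# Favorable press hits: go from a couple of instances to twenty
# with one call, instead of waking up an IT staff you don't have.
print(scale_app(20))
```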

But more importantly, an ecosystem of third party online service components is rapidly developing, one that is likely to shorten online software development cycles. There are features almost every online application requires, such as image manipulation, databases, messaging, e-commerce, tracking, reporting, and alerting, among many others. In the past, these services had to be developed, or at least downloaded and installed on some server. Then they had to be maintained, patched, and upgraded. Imagine if such services were simply available online, something you could sign up for and, for a nominal incremental cost, hook into your application and start using. Imagine that these components could scale based on demand. Sure, resizing a couple hundred images is no problem. But what if you hit hockey-stick adoption and need to resize a million tomorrow? Imagine how the act of building applications could become nothing more than wiring together distributed, pre-built, scalable, zero-administration services.
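For instance, take a hypothetical hosted image-resizing service; the URL and parameters below are invented for illustration, but the shape of the code is the point: no install, no server, no capacity planning.

```python
# Hypothetical sketch: offloading image resizing to a hosted,
# auto-scaling service instead of running the code yourself.
# The service URL and parameters are invented for illustration.
import json
import urllib.parse
import urllib.request

RESIZE_API = "https://api.example-images.com/resize"  # hypothetical

def resize(image_url, width):
    """Ask the hosted service to resize one image; returns the
    URL of the resized result."""
    query = urllib.parse.urlencode({"src": image_url, "width": width})
    with urllib.request.urlopen(RESIZE_API + "?" + query) as resp:
        return json.load(resp)["url"]

# Two hundred images or a million: the loop looks the same, and
# scaling is the service's problem, not yours.
thumbnails = [
    resize("https://example.com/photos/%d.jpg" % i, 200)
    for i in range(200)
]
```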

The context has changed. Old ideas based on centralization are a competitive disadvantage. In this new context of faster network speeds, the cloud and Platform as a Service models, new initiatives can be built with less funding, less staff, less infrastructure and with a much shorter time to market while being more scalable and vastly more distributed.

What will it mean when networks become faster still? What kinds of hidden centralization that we don't even question now will be disrupted?