Serverless computing: Mapping out the possibilities (Part 2)

Now that serverless computing is here, learn how to move the virtual ball forward.


Imagine a map of your business. On this map, plot an 'x' for each user, representing a smartphone, tablet, or laptop where the user interface resides. Then plot an 'o' at the data center location for each service people use from their device, whether it's corporate email or Netflix. It won't take long before you realize a third element is required: a line connecting each 'x' to its corresponding 'o' to keep track of the relationship binding the two. What you end up with is a disordered mess, a highly random distribution of x's connecting, via lines so dense they'd block out the sun, to concentrated amalgamations of o's. Now ask: How does this model, where every 'x' terminates in an 'o,' enable interaction between two people standing next to each other? How does this model make sense in the burgeoning Internet of Things (IoT), where a single system, such as an intelligent vehicle, interacts with multiple other systems? And finally ask: Is there enough time to coordinate and move the data if it has to travel halfway across a state, or country, to a common meeting place?

Private and public clouds today may be highly virtualized, but they are not distributed. In reality, clouds are concentrated in a small number of high-density mega data centers dotted around the globe. These data centers tend to be built at the intersection of high telco bandwidth availability and low-cost power, not near urban centers where the compute needs are the greatest. As a result, the 'o's are distant from the 'x's unless you happen to be near one of those mega data centers. The architectural reality underpinning the topology of the cloud is based on our preconceived focus on servers and data centers.

However, don't confuse today's architecture with tomorrow's opportunity. Remember, cloud computing is about having the right resources in the right place at the right time. The focus of cloud should not be on creating greater compute mass via centralization, but rather on distributing that mass as far as possible. Meeting the needs of a shrinking world that creates ever more data demands a new model of distributed computing, one that moves beyond mega data centers, closer to the endpoint. Maybe the personal computing trend of the 1970s wasn't such a bad idea after all.

Computing power needs to be somewhere, so if not in centralized data centers, where? Taking a page from the grid computing handbook, consider the staggering amount of processing power wasted each day on smartphones, tablets, and PCs. In fact, there's so much processing power available that Intel recently announced compute performance would actually be scaled back in future processors to refocus efforts on energy efficiency. Nobody seemed to care. Why? Experts know most of that power goes to waste, while power consumption is rapidly becoming the limiting factor on cost. The truth is we are living in the age of the Network Computer, the 1990s prediction of computing nirvana in which very little work is done at the endpoint. Yet a smartphone today has nearly as much horsepower as a server from just one or two generations ago. As a result, the world has plenty of computing power available today: each 'x' capable of also being an 'o.' Connectivity would be a concern; however, the power of 4G LTE and forthcoming 5G, combined with fiber internet build-outs across the country, means abundant and robust network bandwidth is available on demand.

Consumers are already comfortable installing their own applications via app stores, and packaging a robust application has never been easier thanks to the rapidly evolving container ecosystem, including Docker and Kubernetes. Yet if the applications are on the device, how would two devices interact directly without a centralized service? How about Peer-to-Peer (P2P) technology, whose value is not inextricably linked to BitTorrent? This is the Serverless Computing architecture: pushing compute and even storage as far out of the data center as possible, enabling our connective infrastructure to connect the world while operating at the scale of a single person.
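To make the idea concrete, here is a minimal sketch, not taken from the article, of two peers exchanging messages directly over a socket with no central service in the path. Both peers run in one process on localhost purely for illustration; the address, port, and message contents are placeholder assumptions, and a real device-to-device deployment would also need peer discovery and NAT traversal.

```python
# Minimal P2P sketch: two peers talk directly, no data center in the path.
# The address, port, and message contents are hypothetical examples.
import socket
import threading

PEER_HOST = "127.0.0.1"  # in practice, the other device's address
PEER_PORT = 9000         # arbitrary port chosen for this sketch

ready = threading.Event()  # lets peer B wait until peer A is listening


def peer_listener():
    """Peer A: accept one inbound connection and send a reply."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
        srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        srv.bind((PEER_HOST, PEER_PORT))
        srv.listen(1)
        ready.set()  # signal that peer A is ready for a connection
        conn, _ = srv.accept()
        with conn:
            request = conn.recv(1024).decode("utf-8")
            print(f"Peer A received: {request}")
            conn.sendall(b"hello from peer A")


def peer_caller():
    """Peer B: connect straight to peer A and exchange a message."""
    ready.wait()  # wait until peer A is listening
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as cli:
        cli.connect((PEER_HOST, PEER_PORT))
        cli.sendall(b"hello from peer B")
        reply = cli.recv(1024).decode("utf-8")
        print(f"Peer B received: {reply}")


if __name__ == "__main__":
    listener = threading.Thread(target=peer_listener)
    listener.start()
    peer_caller()
    listener.join()
```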

As with any architecture, there are limitations, so only a subset of applications benefit from going serverless, and data centers continue to play an important role as collection, coordination, and distribution points. However, as in the brick-and-mortar retail world, value is delivered locally to the individual. Making Serverless Computing work requires additional advancements. Battery life for mobile devices needs to be extended fairly significantly, probably to three or four times existing capacity, to deliver the current required to shift computing responsibilities to the device. Power efficiency and cooling become even more important, as devices that double as Dutch ovens serve nobody's need.

Mobile data costs will need to drop, which is likely to happen with continued innovation but could be accelerated by reversing data charges, much like 1-800 numbers in the voice world. To manage such an extensively distributed world, we'll need better tools for orchestration and monitoring, tools that will have to work in a P2P world. The good news is that none of these needs is unique to Serverless Computing, nor are they new.

I further believe market forces are already driving us toward a Serverless Computing foundation. First and foremost is privacy. There is a growing sense of animosity toward data collection and tracking, with governments around the globe wading through the details and pushing back on service providers. When the application and data are pushed to the endpoints, there is an opportunity for consumers to take ownership of their data: their location, profile, habits, likes and dislikes. Second, and building on that newly enabled privacy platform, the architecture provides a better model for digital advertising, the economic engine of the Internet. By reversing the ad model from push to pull, relevancy, the goal of every advertiser, automatically improves. The architecture also creates the opportunity to put the bandwidth cost burden on the advertiser rather than the consumer. Other benefits accrue as well, such as distributing the waste heat and power consumption while significantly reducing capital needs by shifting the burden to the end user's asset, which they're going to purchase regardless. And not to forget security: the entire model of computing would change, eliminating the penetrate-once vulnerability of data centers and replacing it with a crowdsourced approach in which every device becomes a sensor detecting unexplained activity.

To me, this is the cloud as I imagined it in 2001, before we called it cloud, when I started thinking about bringing grid computing and P2P together with Service Oriented Architecture and the Web. For telcos and cable companies, Serverless Computing could be a tremendous gift of future revenue and relevance, given their ownership of the last mile. While I do not see this as a panacea, I do see it as at least an advancement in the conversation about how to deal with a world where our endpoints are no longer fixed. Consider how much data is being collected today, from the GPS location in your phone to the RFID tag on your razor blades. We are living in the Data Age, where every single device powered by electricity will likely become part of the Internet of Things. Since everything else is essentially free compared with the cost of moving data, there is an economic incentive to move data as short a distance as possible. We live locally, we're served locally, why not compute locally?

