Value Driven Computing
Facebook and Google sell the power to change your mind
They argue that paying them will make you more likely to buy the products and services they promote, because their data and algorithms give them an edge over any other company or medium: they turn electric power directly into mind-changing power, via computed analytics driving irresistible advertising.
Electric power has a very real and significant cost, and mind-changing power definitely has a benefit for a vendor. But how do the two relate, and how do the hyper-scalers decide when to invest more energy in a deeper analysis, and when to stop wasting electrons on a consumer's mind that just won't bend far enough to buy?
Whenever you interact via Web, Search, Android or Messenger, they place a bet, investing electricity against the chance of influencing you. The value of that bet can only be a small part of the product cost, as the vendor needs to keep most of the revenue to design, produce, distribute, support, collect and recycle your new heart’s desire.
Putting too many electrons on the table would kill a Web-giant’s bottom line; putting too few could mean somebody else tips your mind in the other direction, losing all the electrons already expended.
When every transaction in a Web-giant’s data center becomes an electron venture funding decision, computing as we know it changes rather fundamentally: the focus moves from performance and computing capacity to economy and opportunistic, value driven computing.
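To make the "electron venture funding decision" concrete, here is a minimal sketch of the bet described above, with entirely hypothetical numbers and function names: invest in a deeper analysis only while the expected advertising gain exceeds the energy cost.

```python
# Hypothetical sketch: is spending more electricity on a deeper
# consumer analysis a positive expected-value bet?

def worth_the_electrons(p_conversion: float, margin_share: float,
                        joules_needed: float, price_per_joule: float) -> bool:
    """Invest only while expected ad revenue exceeds the energy cost.

    p_conversion    - estimated chance the deeper analysis changes a mind
    margin_share    - revenue shared by the vendor per converted customer
    joules_needed   - energy the extra analysis would consume
    price_per_joule - current spot price of electricity
    """
    expected_gain = p_conversion * margin_share
    energy_cost = joules_needed * price_per_joule
    return expected_gain > energy_cost

# Cheap energy tips the bet toward deeper analysis:
print(worth_the_electrons(0.01, 2.00, 5000, 1e-6))   # True
# Expensive energy, same consumer, same ad: stop here.
print(worth_the_electrons(0.01, 2.00, 5000, 1e-5))   # False
```

The point is not the arithmetic, which is trivial, but that this comparison would run for every single interaction, at data-center scale, against a constantly moving energy price.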
On a cool, sunny day with a nice breeze, green energy may be so cheap that you could be paid just to take it out of the grid. Spikes in demand, price and opportunity can be very significant, and so must be the ability to do, not do, pre-process or postpone work, to go deeper or extra shallow on the analysis, and to deal with the “semantic elasticity” of the results produced.
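A crude sketch of such a do / not-do / postpone decision, driven purely by an assumed spot price of energy (the thresholds and names are invented for illustration; negative spot prices do occur on windy, sunny days):

```python
# Hypothetical sketch: choose an action for a pending analysis job
# from the current spot price of energy and the job's deadline.

def schedule(spot_price: float, deadline_hours: float) -> str:
    if spot_price < 0:
        return "go deeper"       # we are paid to take energy off the grid
    if spot_price < 0.05:
        return "do"              # cheap enough: run the full analysis now
    if deadline_hours > 6:
        return "postpone"        # wait for the next breeze
    return "extra shallow"       # must answer now, so keep it cheap

print(schedule(-0.01, 12))   # go deeper
print(schedule(0.10, 12))    # postpone
print(schedule(0.10, 2))     # extra shallow
```

Real schedulers would of course fold in forecasted prices, queue depth and the value of the result, but the shape of the decision stays the same.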
Your code could plainly refuse to deliver any result for the electrons offered and the quality of response requested; it could first work on a rough hunch and then progressively look wider or deeper for better answers, until it has either exhausted its electron budget or the chances of a changed mind become too low. Your workloads might follow the sun or a storm around the globe; your data might follow, or decide it’s not worth the move. You might have to consider using accelerators, but only because they are cheap today, or because you get extra for being quick about it.
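The "rough hunch first, then progressively deeper" behaviour is essentially an anytime algorithm with an energy budget. A minimal sketch, with made-up costs and decay rates, of a loop that refines an answer round by round, refuses to answer when the budget cannot even pay for one round, and stops early when a changed mind becomes unlikely:

```python
# Hypothetical sketch of value driven anytime analysis: refine in rounds,
# paying for each round from a fixed electron budget, stopping early once
# the chance of changing a mind drops below a floor.

def analyse(budget_joules: float, cost_per_round: float,
            min_chance: float = 0.05):
    chance = 0.5              # rough hunch: even odds to start with
    answer = None
    depth = 0
    while budget_joules >= cost_per_round and chance >= min_chance:
        budget_joules -= cost_per_round
        depth += 1
        answer = f"analysis at depth {depth}"
        chance *= 0.6         # each deeper look narrows the remaining upside
    if depth == 0:
        return None           # refuse to answer for the electrons offered
    return answer, budget_joules

print(analyse(10.0, 3.0))   # ('analysis at depth 3', 1.0)
print(analyse(2.0, 3.0))    # None: not worth even a shallow look
```

The interesting property is that every intermediate result is usable, so the caller can cut the budget at any moment and still walk away with the best answer paid for so far.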
As if that weren't complicated enough, consider that you won’t be the only one making such decisions, that computers in the clouds do not really grow or disappear with the weather, and that bored computers today still eat only slightly fewer electrons, with depreciation sauce, than busy ones.
You won’t ever compute again without a very good idea of the value of the results versus the cost of doing, or not doing, IT: welcome to the world of Value Driven Computing!
In this Next Generation Architectures series of blogs we explore the challenges of this new type of computing: everyone wants to use more data for deeper insights, and to connect and involve potentially every little manufactured Thing to the Internet to provide more comfort and better services that customers will pay for, but the bottom line depends more than ever on your ability to find and exploit opportunities.
We also explore what IT service providers and device vendors need to change to sustain value in a world where the power of criminals and the risks of cyberwarfare rise naturally with the reach of the Internet and the capabilities of the things it connects.
The design of Next Generation Architectures is driven by the needs of a world:
- Where the cost of energy isn't static, yet remains one of the most important obstacles: you need to constantly find ways to compute better and deeper with less energy
- Where millions or billions of Things connected to the Internet (IoT) can generate huge amounts of data that needs to be turned into valuable information: filtered, correlated and enriched into knowledge at several places along its path to the core of the cloud
- Where data center and application support tools use logic and machine learning to negotiate value against effort via APIs
- Where security mechanisms and processes designed for data centers cannot be economically replicated to ever smaller and cheaper devices, ever closer to consumers, while their integrity and security become more critical to the person who buys them for their home, for their family, to carry on them or even within their body
- Where networks designed for communication between sovereign mainframes need to be usable as nervous systems for distributed robots and services
- Where very few of the software and hardware assets created during the last seven decades of IT still fit, but nobody can afford to redesign from scratch or wait until it's finished.
We will dig into:
- Serverless architectures, Lambda design, AI based workload automation to support dynamic and opportunistic workload optimizations, client side offloading of compute
- Potential software and hardware techniques to make IT intrinsically resilient against exploitation
- New ways of networking that make the links between the different parts of a distributed bot more natural to create and safer to use
- The importance of globally accepted, high quality standards to enable broad and sustainable value creation.
In the next post ("Quality kills IoT") we will deal with what makes IoT systems so easy to kill: Quality!