|By Reuven Cohen||
|March 2, 2009 07:00 AM EST||
It's that dreaded time of the month again, the time of the month that we, the 400,000+ Amazon Web Services consumers, await with a mix of anticipation and horror. What I'm talking about is the Amazon Web Services billing statement sent at the beginning of each month. A surprise every time. In honor of this monthly event, I thought I'd take a minute to discuss some of the hurdles, as well as opportunities, for Billing, Metering & Measuring the Cloud.
I keep hearing that one of the biggest issues facing IaaS users today is a lack of insight into costing, billing and metering. The AWS costing problem is straightforward enough: unlike other cloud services, Amazon has decided not to offer any kind of real-time reporting or API for its cloud billing (EC2, S3, etc.). There are some reporting features for DevPay and the Flexible Payments Service (Amazon FPS), as well as an Account Activity page, but who has time for a dashboard when what we really want is a real-time API?
To give some background: when Amazon launched S3 and later EC2, the reasoning was fairly straightforward, since they were new services still in beta. So, without officially confirming it, the word was that a billing API was coming soon. But three years later, there's still no billing API. So I have to ask: what gives?
Other cloud services have done a great job of providing a real-time view of what the cloud is costing you. One of the best examples is GoGrid's myaccount.billing.get call and widget, which offer a variety of metrics through their open source GoGrid API.
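To make this concrete, here is a minimal sketch of what querying such a billing endpoint could look like. The endpoint path mirrors GoGrid's myaccount.billing.get naming, and the request signing (an MD5 hash of the API key, shared secret, and a Unix timestamp) follows the general pattern GoGrid documented; the key and secret values are placeholders, and you should treat the exact parameter names as assumptions to check against the current API reference.

```python
import hashlib
import time
import urllib.parse

# Base URL follows GoGrid's REST naming; version and parameter names
# are assumptions based on their documented API pattern.
API_BASE = "https://api.gogrid.com/api/myaccount/billing/get"

def build_billing_request(api_key: str, shared_secret: str, now: int) -> str:
    """Build a signed request URL for the billing endpoint.

    The signature is the MD5 hex digest of key + secret + timestamp,
    per GoGrid's documented signing scheme.
    """
    sig = hashlib.md5((api_key + shared_secret + str(now)).encode()).hexdigest()
    params = {"api_key": api_key, "sig": sig, "format": "json", "v": "1.5"}
    return API_BASE + "?" + urllib.parse.urlencode(params)

url = build_billing_request("my-key", "my-secret", int(time.time()))
# Fetch with urllib.request.urlopen(url) and parse the JSON response
# to read the current billing cycle's usage and charges.
```

The point is less the specific endpoint than the shape of the thing: a signed, machine-readable call you can poll every few minutes, which is exactly what AWS customers don't have.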
Billing APIs aside, another major problem remains for most cloud users: a basis for comparing the quality and cost of cloud compute capacity between providers. This brings us to the problem of metering the cloud, which Yi-Jian Ngo at Microsoft pointed out last year. In his post he stated that "Failing to come up with an appropriate yardstick could lead to hairy billing issues, savvy customers tinkering with clever arbitrage schemes and potentially the inability of cloud service providers to effectively predict how much to charge in order to cover their costs."
Yi-Jian Ngo couldn't have been more right in pointing to Wittgenstein's Rule: "Unless you have confidence in the ruler's reliability, if you use a ruler to measure a table, you may as well be using the table to measure the ruler."
A few companies have attempted to define cloud capacity. Notably, Amazon's Elastic Compute Cloud service uses the EC2 Compute Unit as the basis for its EC2 pricing scheme (along with bandwidth and storage). Amazon states that it uses a variety of measurements to provide each EC2 instance with a consistent and predictable amount of CPU capacity, and the amount of CPU allocated to a particular instance is expressed in terms of EC2 Compute Units. Amazon explains that it uses several benchmarks and tests to manage the consistency and predictability of performance from an EC2 Compute Unit: one EC2 Compute Unit provides the equivalent CPU capacity of a 1.0-1.2 GHz 2007 Opteron or 2007 Xeon processor, which Amazon says is also equivalent to an early-2006 1.7 GHz Xeon processor. Amazon makes no mention of how it arrives at these benchmarks, and users of EC2 are given no real insight into the benchmark numbers. There are currently no standards for cloud capacity, so there is no effective way for users to compare providers in order to make the best decision for their application demands.
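Even with an opaque unit, you can at least normalize within Amazon's own catalog. A quick sketch, using illustrative prices and ECU ratings in the ballpark of Amazon's published figures (treat the exact numbers as assumptions, not current list prices):

```python
# Normalize instance prices to cost per EC2 Compute Unit-hour, so that
# differently sized instances can be compared on a single yardstick.
# ECU ratings and hourly prices below are illustrative placeholders.
instances = {
    "m1.small":  {"ecu": 1.0, "usd_per_hour": 0.10},
    "c1.medium": {"ecu": 5.0, "usd_per_hour": 0.20},
}

def cost_per_ecu_hour(spec: dict) -> float:
    """Dollars per hour divided by rated compute units."""
    return spec["usd_per_hour"] / spec["ecu"]

for name, spec in instances.items():
    print(name, round(cost_per_ecu_hour(spec), 3))
```

The catch, of course, is that this only works inside one provider's catalog; without a common unit, the same arithmetic across providers compares apples to oranges.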
An idea I suggested in a post last year was to create an open, universal compute unit that could be used to make an "apples-to-apples" comparison between cloud capacity providers. My rough concept was a Universal Compute Unit (UCU) specification and benchmark test based on integer operations, which could form an approximate indicator of the likely performance of a given virtual application within a given cloud such as Amazon EC2 or GoGrid, or even a virtualized data center such as VMware. One potential point of analysis could be a standard clock rate measured in hertz, derived by multiplying instructions per cycle by clock speed (measured in cycles per second). It could be defined more precisely within the context of both a virtual machine kernel and standard single- and multicore processor types.
My other suggestion was to create a Universal Compute Cycle (UCC), the inverse of the Universal Compute Unit. The UCC would be used when direct access to the underlying system or operating system is not available, as with Google's App Engine or Microsoft Azure. The UCC could be based on cycles per instruction: the number of clock cycles that elapse while an instruction is being executed. This allows an inverse calculation to be performed to determine the UCU value, as well as providing a secondary level of performance evaluation and benchmarking.
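The inversion itself is simple arithmetic, sketched below under the same hypothetical baseline as above: measure cycles per instruction (CPI) from whatever counters the platform exposes, then divide the clock rate by CPI to recover an effective instruction rate.

```python
# Universal Compute Cycle (UCC) as the inverse view: cycles per
# instruction (CPI), usable when you cannot touch the host directly,
# e.g. on a PaaS like App Engine or Azure. The inversion back to an
# instruction rate is an illustrative calculation, not a standard.
def cpi(cycles: float, instructions: float) -> float:
    """Clock cycles consumed per instruction executed."""
    return cycles / instructions

def effective_rate_hz(clock_hz: float, measured_cpi: float) -> float:
    """Invert CPI back to an instruction rate: rate = clock / CPI."""
    return clock_hz / measured_cpi

# 2.0e9 cycles spent executing 1.0e9 instructions -> CPI of 2.0,
# so a 2 GHz clock yields an effective 1.0e9 instructions/second.
c = cpi(2.0e9, 1.0e9)
print(effective_rate_hz(2.0e9, c))
```

In other words, the UCC and UCU are two views of the same measurement, which is what makes one usable as a cross-check on the other.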
I'm not the only one thinking about this. One company trying to address this need is Satori Tech, with a capacity measurement metric they call the Computing Resource Unit ("CRU"). They claim the CRU allows for dynamic monitoring of available and used computing capacity on physical servers and virtual pools/instances, enables uniform comparison of capacity, usage and cost efficiency in heterogeneous computing environments, and abstracts away operating details for financial optimization. Unfortunately, the CRU is a patented, closed format available only to Satori Tech's customers.
And before you say it: I know that the UCU, UCC or CRU could be "gamed" by unsavory cloud providers attempting to pull an "Enron," which is why we would need an auditable specification that includes a "certified measurement" to address this kind of cloud benchmarking. A potential avenue is IBM's new "Resilient Cloud Validation" program, which I've come to appreciate lately. (Sorry about my previous lipstick-on-a-pig remarks.) The program will allow businesses that collaborate with IBM on a rigorous, consistent and proven program of benchmarking and design validation to use the IBM "Resilient Cloud" logo when marketing their services. These types of certification programs may serve as the basis for defining a level playing field among cloud providers, although I feel a more impartial trade group such as the IEEE might be a better entity to handle the certification process.