By Glenn Rossman
May 8, 2009 04:11 PM EDT
Zimory Public Cloud delivers server capacity for short-term computing power needs. Via the Zimory Internet platform you choose from a variety of data centers around the world offering the resources that you need. You define the quantity and quality of performance (including security), and set the desired usage period. After completion of the online order (usually less than 5 minutes), you have access to your resources.
With Zimory Public Cloud you receive computing power from certified datacenters with binding service level agreements (SLAs). As an intermediary between providers and users, Zimory handles all pricing, contracting, and accounting. The price is based on usage (CPU hours, RAM, data storage and data transfer).
Zimory Public Cloud enables a flexible extension of your infrastructure - you "rent" computing power in a predefined way - helping you to manage projects and peak loads automatically and at low cost.
Zimory Public Cloud combines diverse virtual servers from data centers around the world into a homogeneous "Compute Cloud". Via the Zimory Internet platform you can draw the desired computing power - in a pre-defined quality - at very short notice.
You get access to Zimory Public Cloud through the online platform "Zimory Cloud Manager" - the world's first Internet platform for the trading of computing resources. Zimory Cloud Manager handles ordering and billing, as well as the entire administration of the virtual servers. It provides the following functions:
- Web-based interface
Users rent or offer virtual servers using Zimory Cloud Manager - the interface for online selection and reservation of virtual servers. The entire administration of a virtual server, including remote access, is also handled through Cloud Manager.
- Search and filter appropriate resources
Cloud Manager supplies a transparent representation of resources by data center location, hardware and SLA. It offers comprehensive search functions as well as various technical and non-technical filtering mechanisms - allowing resources that technically do not fit an application, for example because they are based on a different virtualization layer, to be filtered out in advance.
- Interfaces for monitoring and billing
The system provides an administration client for monitoring and billing. This allows admins to transfer user-related data directly into the billing system. Zimory Cloud Manager offers flexible billing models, which can be based, for example, on the current resource utilization.
- Automated Administration - using time-based or load-based triggers
The user can define rules that trigger automated management operations on the virtual machines. For example, if CPU load remains continuously high, additional server instances can be added to a cluster; or an instance can be stopped automatically after a defined period.
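A load-based trigger of this kind can be sketched as a simple rule check. The threshold, window length, and scaling action below are illustrative assumptions, not Zimory's documented trigger configuration:

```python
# Hypothetical trigger rule: scale up when CPU load stays above a
# threshold for every sample in a monitoring window. The threshold and
# window values are assumptions for illustration only.
CPU_THRESHOLD = 0.80      # trigger when CPU load stays above 80%

def check_scale_up(samples, threshold=CPU_THRESHOLD):
    """Return True if every CPU sample in the window exceeds the threshold."""
    return len(samples) > 0 and all(s > threshold for s in samples)

# Example: five one-minute samples, all above 80% -> scale up
recent_load = [0.91, 0.88, 0.95, 0.83, 0.87]
if check_scale_up(recent_load):
    print("trigger: add instance to cluster")
```

A real deployment would feed this check from the monitoring interface and call the management API to start the additional instance.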
- User Account Management
Authorized users of the Zimory Public Cloud can add and manage additional users within their accounts. For example, the employees of a team can jointly administer a pool of virtual machines. The account manager can see usage statistics for the other users and remove or block them from the account.
- Backup and recovery of virtual machines
Zimory Cloud Manager provides the end user with complete VM backup management, enabling them to save their machines with little effort and restore them later - or create new clone instances from backup.
- API to manage virtual machines
All functions are controllable through SOAP- and REST-based APIs. Users can build automated workflows with simple implementations (e.g. using Ruby).
- Add virtual machines
After registration and approval by Zimory, users can add additional virtual machines to Zimory Public Cloud and offer them to other users via Zimory Cloud Manager. The new machines will be automatically checked and adjusted for the selected cloud resource.
- Planning mechanisms
Zimory Cloud Manager includes a planning module which distributes virtual machines optimally across cloud resources. Since cloud resources can be dynamically added and switched off within Zimory Enterprise Cloud, this planning module is required to select the best resources for each application, based on usage profiles and heuristics.
How Zimory Public Cloud works
Simply go to https://cloud.zimory.com/ and provide your name, login, password, credit card and contact e-mail address. Companies also have the option to pay by invoice (please contact [email protected]). By entering your login name and your password, you generate a personal account. There are no costs or contracts to generate a Zimory account.
Now you can start your virtual server. Zimory distinguishes between appliances and deployments. Appliances are preconfigured images that are either just the operating system or a complete application with the associated operating system included. This can be a Debian OS with 40 MB, an SAP application with 80 GB or a complete Microsoft Exchange Server on a Windows 2008 server. Zimory offers preconfigured turnkey appliances with different operating systems, a standard web application stack, and storage. Alternatively, you can also upload and use your own virtual appliances, generated using Xen or VMware.
The following example uses the pre-configured Zimory appliance with Debian 4.0 Etch, a preconfigured web server and 8 GB of storage.
A deployment is an appliance that has been used in the Zimory Public Cloud at least once. During deployment, a Zimory preconfigured appliance or a self-made appliance receives a unique name and a defined hardware environment:
- Memory (RAM) can be selected between 128 and 8192 MB. This value is part of the billing.
- An external IP address can now be assigned to the deployment, which is needed for external access. This is not necessary for configuration of an internal cluster or a load balancer because an internal IP address is sufficient and the communication within the cloud can be handled with internal IP addresses.
- Now you choose a cloud provider based on the criteria of "proximity" and "desired quality". The selected quality level - gold, silver or bronze - reflects various SLAs (see Section 6) with different prices for each deployment.
- After setting the various parameters, the appliance will be provisioned and started at the cloud provider of your choice - and will turn into a deployment.
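The deployment parameters above can be captured in a small validation helper. The field names and structure below are illustrative, not Zimory's actual deployment schema:

```python
# Hypothetical deployment specification mirroring the parameters above;
# field names are assumptions for illustration, not Zimory's schema.
def make_deployment(name, ram_mb, quality, external_ip=False):
    """Validate the chosen parameters and assemble a deployment request."""
    if not 128 <= ram_mb <= 8192:
        raise ValueError("RAM must be between 128 and 8192 MB")
    if quality not in ("gold", "silver", "bronze"):
        raise ValueError("quality must be gold, silver or bronze")
    return {
        "name": name,
        "ram_mb": ram_mb,           # billed resource
        "quality": quality,         # maps to the data center's SLA level
        "external_ip": external_ip, # only needed for external access
    }

spec = make_deployment("web-1", 1024, "silver", external_ip=True)
```

Deployments that only talk to other machines inside the cloud would be created with `external_ip=False`, matching the internal-cluster case described above.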
Control of Deployment
The external IP address of a deployment is available immediately after the start. If the virtual server contains a web server for example, it is instantly available via HTTP on the external IP address.
In addition, all deployments can be accessed via a Java VNC viewer with the "Console" icon in the Zimory Cloud Manager. The VNC password is generated individually with a random generator.
Now you will see the login prompt - either on the graphical web interface or directly on the console. The default user name is either 'root' or 'zimory', the default password is 'zimory'.
Zimory provides some pre-configured images - but users can also upload their own images. Images that are made with Xen 3.2, VMware Server 1.x or VMware ESX 3.5 can be used in the Zimory Public Cloud (and also in the Zimory Enterprise Cloud). With Zimory, images can be migrated quickly across different platforms. Thus, virtual machines can easily be deployed in different locations. Moreover, users can very quickly set up a large number of identical machines. This technology works with Xen as well as with VMware.
Zimory image management creates maximum transparency for administrators - distribution of images and storage of "Disk Snapshots" is carried out automatically. The user only needs to select the target platform using the criteria described above - the rest happens automatically.
Supported Virtualization Layers and Applications
Zimory currently supports VMware Server 1.x, VMware ESX(i) 3.5 and Xen 3.2, which must be used to generate boot images. Support for other virtualization layers such as KVM, VMware Server 2.0 and Hyper-V is planned.
Currently Zimory offers preconfigured standard images for Ubuntu and CentOS Linux systems. Users can also upload their own images that are already pre-configured with their own software stack. Zimory standard images range from a simple Linux operating system to a fully configured "application stack", such as a LAMP server. In addition, all applications that are available for Ubuntu or CentOS can be installed in the standard images. At the moment, Windows appliances have to be created and uploaded directly, but in the near future, standard appliances for Windows will be offered as well.
To operate an application with a highly fluctuating load within Zimory Public Cloud, the application must be scalable within itself. Given that, Zimory Public Cloud works well for multi-tier software architectures. Multi-tier architectures are scalable, since the individual layers are logically separated. For example, in distributed system architectures, the data layer runs on a central database server, the logic layer runs on a remote application server, and the delivery is handled by a web server. In such an architecture, the individual components can be adapted to increasing load by replication. For example, if many users use the application, a clone of the application server can be created, which shares the requests with the first server. This clone operation can be triggered through the API of the Public Cloud with a rule-based trigger. This means that the clone operation will be started if the virtual machine already has a significant CPU load for a defined time period.
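Once a clone of the application server exists, incoming requests have to be shared between the original and the clone. A minimal round-robin sketch of that sharing step, with illustrative server names (a real setup would use a proper load balancer):

```python
import itertools

# Minimal sketch of request sharing after a clone has been created.
# Server names are illustrative; a real setup would use a load balancer.
servers = ["app-server-1", "app-server-2"]   # original + clone
rr = itertools.cycle(servers)

def route(request_id):
    """Assign each incoming request to the next server in turn."""
    return (request_id, next(rr))

assignments = [route(i) for i in range(4)]
# Requests alternate between app-server-1 and app-server-2.
```

Because the logic layer is stateless relative to the data layer in such a multi-tier architecture, either replica can serve any request - which is exactly what makes the clone operation safe.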
Zimory uses standardized, proven hypervisor technologies, which do not cause any negative effects on the operation of other virtual machines when a single virtual machine malfunctions. At the network level, Zimory also uses standard technologies.
Individual users of the Zimory Cloud are separated by different VLANs - allowing the individual virtual machines of different users to interact with each other only via a firewall.
Using VLANs also ensures that other systems of a data center are protected from the virtual machines of the Zimory Cloud. Furthermore, the Zimory Cloud supports the use of different physical networks -- which allows a further separation.
Communication between components is done via standardized communication protocols, which simplifies integration into existing network-level security concepts.
Data are exchanged via HTTP, which handles server requests as completely independent transactions. In particular, requests are processed without reference to previous requests, and no session information is exchanged or stored.
Zimory Public Cloud continuously backs up data to secure back-end storage. In addition, you can generate your own snapshots with the corresponding Zimory functionality. By storing your backup data in different data centers, you increase the level of security for your data.
In case of failure of a single physical machine, or even an entire data center, the image can be started again very quickly in a remote data center. In doing so, however, the data from the RAM generated after the last backup is lost. Zimory technology makes it very simple - with low overhead - to create online backups of individual machines. This can also be automated.
Service Level Agreements and Support
Zimory offers highly available server resources from selected data center operators. In the first step, virtual servers in the Public Cloud are provided by Zimory partner T-Systems. Additional certified data centers will be connected soon.
Zimory makes certifications and quality standards of the various data centers transparent and easily understandable. Users have the option to select higher-level certifications for specific applications and to choose less expensive services for other applications. You can also select data centers with independent certified safety standards.
Zimory Public Cloud distinguishes three levels of quality - Gold, Silver and Bronze - according to the SLA of the connected data centers. All resources in the Public Cloud comply with the security standards described in the previous chapter. However, individual resources may offer additional quality characteristics such as certifications, failover systems and a guaranteed support level.
A major difference between Gold, Silver and Bronze levels is the classification of the connected data centers based on tier classification. The following definition is used:
- Tier 4: Has multiple active supply paths for power and air-conditioning, has redundant components, is fault-tolerant and provides an availability of at least 99.995%.
- Tier 3: Has multiple active supply paths for power and air-conditioning, with only one path active in standard use; has redundant components that are concurrently maintainable and provides an availability of at least 99.982%.
- Tier 2: Has one path each for power and air-conditioning; has redundant components and provides an availability of at least 99.741%.
- Tier 1: Has one path each for power and air-conditioning and provides an availability of at least 99.671%.
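To make these percentages concrete, the availability guarantees can be converted into the maximum downtime they allow per year:

```python
# Converting the tier availability guarantees into maximum downtime
# per year (365 days), to make the differences between tiers concrete.
HOURS_PER_YEAR = 365 * 24  # 8760

def max_downtime_hours(availability_pct):
    """Maximum yearly downtime implied by an availability percentage."""
    return HOURS_PER_YEAR * (1 - availability_pct / 100)

for tier, pct in [(4, 99.995), (3, 99.982), (2, 99.741), (1, 99.671)]:
    print(f"Tier {tier}: {max_downtime_hours(pct):.1f} hours/year")
```

The gap is substantial: a Tier 4 guarantee allows under half an hour of downtime per year, while a Tier 1 guarantee allows nearly 29 hours.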
You can implement various use cases using Zimory Public Cloud. Here are a few examples:
- Sudden peak loads and additional external resources
- Your company's marketing campaign is a complete success. From one day to the next, the number of visitors to your website increases tenfold. With Zimory Public Cloud your web server easily handles this increase. When visitor numbers drop again after the campaign ends, you can scale the hardware for your website back down - and you pay only for what you have used.
- Recurring peak loads
- Daily, weekly or monthly booking runs regularly cause short-term peak loads. These periods are well known and well defined. By using Zimory Public Cloud resources automatically and on demand, you manage peak loads cost-effectively without investing in additional hardware.
- Testing and development servers
- Whether for the introduction of a new content management system or an ERP system, companies require testing and development capacity for a few days or weeks. With Zimory Public Cloud, admins can see at a glance where to get the needed capacity.
The Zimory price plan is based on a pay-only-for-what-you-use price model. There are no hidden or additional costs.
The pricing is based on:
- Virtual CPU and RAM per hour
- Network traffic (upload) per GB
- Network traffic (download) per GB
- Storage per GB per month
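A worked example shows how the four components combine into a monthly bill. The rates below are hypothetical - Zimory's actual prices are shown in Cloud Manager before deployment and are not reproduced here:

```python
# Worked example with HYPOTHETICAL rates - the actual Zimory prices are
# displayed before deployment and are not reproduced in this document.
RATES = {
    "cpu_ram_hour":     0.10,  # EUR per virtual CPU/RAM hour (assumed)
    "upload_gb":        0.05,  # EUR per GB uploaded (assumed)
    "download_gb":      0.10,  # EUR per GB downloaded (assumed)
    "storage_gb_month": 0.15,  # EUR per GB-month (assumed)
}

def monthly_cost(hours, upload_gb, download_gb, storage_gb):
    """Sum the four usage-based components of the price model."""
    return (hours * RATES["cpu_ram_hour"]
            + upload_gb * RATES["upload_gb"]
            + download_gb * RATES["download_gb"]
            + storage_gb * RATES["storage_gb_month"])

# One server running 720 hours with 10 GB up, 50 GB down, 8 GB stored:
cost = monthly_cost(720, 10, 50, 8)  # 72.00 + 0.50 + 5.00 + 1.20 = 78.70
```

Because every component is metered, an idle or deleted deployment stops incurring CPU/RAM charges immediately - only retained storage continues to be billed.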
The Zimory Service Level Agreements are an important component of your security. You can choose from the three quality offerings: gold, silver and bronze. The final price will be shown before you deploy a virtual software appliance on a configured resource in the data center you have selected.