
Amazon’s Answer to SQL Azure - Amazon Relational Database Service

Makes it easier for you to set up, operate, and scale a relational database in the cloud

Today Amazon released its answer to SQL Azure, the hosted cloud database offered by Microsoft. The newest service from Amazon, the Amazon Relational Database Service, or Amazon RDS for short, now in beta, makes it easier for you to set up, operate, and scale a relational database in the cloud. You get direct database access without worrying about infrastructure provisioning, software maintenance, or common database management tasks.

Using the RDS APIs or the command-line tools, you can access the full capabilities of a complete, self-contained MySQL 5.1 database instance in a matter of minutes. You can scale the processing power and storage space as needed with a single API call and you can initiate fully consistent database snapshots at any time.
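
As a rough sketch of what that single API call looks like programmatically - using today's boto3 SDK rather than the original beta command-line tools, with the region, identifier, credentials, instance class, and storage size all illustrative - creating and sizing an instance is one request:

    import boto3

    # Hypothetical values throughout; nothing here is taken from the announcement.
    rds = boto3.client("rds", region_name="us-east-1")

    response = rds.create_db_instance(
        DBInstanceIdentifier="mydb",            # name you choose for the instance
        Engine="mysql",                         # the beta offered MySQL 5.1
        DBInstanceClass="db.m1.small",          # one of the available instance sizes
        AllocatedStorage=20,                    # storage in GB, up to 1 TB
        MasterUsername="admin",
        MasterUserPassword="choose-a-password",
        AvailabilityZone="us-east-1a",          # optional; match the AZ of your EC2 instances
    )
    print(response["DBInstance"]["DBInstanceStatus"])   # e.g. "creating"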

Much of what you already know about building applications with MySQL will still apply. Your code and your queries will work as expected; you can even import a dump file produced by mysqldump to get started.
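
To make that concrete, here is a minimal sketch of connecting to an RDS endpoint exactly as you would to any other MySQL server; the hostname, credentials, and the pymysql driver are assumptions for the example, and any MySQL client library behaves the same way:

    import pymysql

    # The endpoint below is a placeholder; RDS hands you a hostname once the
    # instance is available, and ordinary MySQL clients connect to it directly.
    conn = pymysql.connect(
        host="mydb.example.us-east-1.rds.amazonaws.com",
        user="admin",
        password="choose-a-password",
        database="myapp",
    )
    try:
        with conn.cursor() as cur:
            cur.execute("SELECT VERSION()")
            print(cur.fetchone())   # same queries, same driver, same results
    finally:
        conn.close()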

Amazon RDS is really easy to use. It comes with a suite of command-line tools, and keep in mind that everything they do can also be done through the APIs.

During the beta you can create up to twenty databases per AWS account, and each one can consume up to 1 TB of storage. You can specify an availability zone (which you should do if you plan to access it from an EC2 instance) or you can let RDS choose one for you.

Each DB Instance exports a number of metrics to CloudWatch including CPU Utilization (percent), Free Storage Space (bytes), and Database Connections (count).
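
A short sketch of pulling one of those metrics through the CloudWatch API (boto3 again; the instance identifier is illustrative, and the other two metrics are fetched the same way with a different MetricName):

    from datetime import datetime, timedelta
    import boto3

    cw = boto3.client("cloudwatch", region_name="us-east-1")

    # Average CPU utilization for the instance over the last hour,
    # in five-minute buckets.
    stats = cw.get_metric_statistics(
        Namespace="AWS/RDS",
        MetricName="CPUUtilization",
        Dimensions=[{"Name": "DBInstanceIdentifier", "Value": "mydb"}],
        StartTime=datetime.utcnow() - timedelta(hours=1),
        EndTime=datetime.utcnow(),
        Period=300,
        Statistics=["Average"],
    )
    for point in sorted(stats["Datapoints"], key=lambda p: p["Timestamp"]):
        print(point["Timestamp"], round(point["Average"], 1), "%")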

Once you’ve deployed RDS for production use, you can scale up to larger instance sizes, add storage (up to a total of 1 TB per RDS instance), and make backups with ease. You can snapshot a production database and then bring a copy back to the lab to dig into a problem.
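
A hedged sketch of what that looks like with boto3 - the instance identifiers, target class, and storage figures are made up for the example:

    import boto3

    rds = boto3.client("rds", region_name="us-east-1")

    # Scale up: a bigger instance class and more storage in a single call.
    rds.modify_db_instance(
        DBInstanceIdentifier="mydb",
        DBInstanceClass="db.m1.large",   # illustrative target size
        AllocatedStorage=200,            # grow toward the 1 TB ceiling
        ApplyImmediately=True,
    )

    # Snapshot production...
    rds.create_db_snapshot(
        DBSnapshotIdentifier="mydb-before-investigation",
        DBInstanceIdentifier="mydb",
    )

    # ...and restore it as a separate instance to dig into a problem in the lab.
    rds.restore_db_instance_from_db_snapshot(
        DBInstanceIdentifier="mydb-lab-copy",
        DBSnapshotIdentifier="mydb-before-investigation",
    )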

RDS usage is charged by the DB Instance hour; there are five instance sizes, each with a corresponding hourly rate. You’ll also pay 10 cents per GB per month for your provisioned storage and 10 cents for every million I/O requests. You get backup space to store 100% of your provisioned storage at no additional charge, with additional space priced at 15 cents per GB per month. The usual AWS charges for data transferred in and out of the cloud also apply.
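
As a back-of-the-envelope illustration of the storage-related portion of the bill at the quoted rates (instance-hour and data-transfer charges are left out since the hourly rates aren’t listed here, and the usage figures are invented):

    # Rough monthly cost for the storage-related charges at the quoted beta rates.
    provisioned_gb = 100          # provisioned storage
    io_requests    = 25_000_000   # I/O requests in the month
    backup_gb      = 150          # backup space used

    storage_cost = provisioned_gb * 0.10              # $0.10 per GB-month
    io_cost      = (io_requests / 1_000_000) * 0.10   # $0.10 per million I/O requests

    # Backup space up to 100% of provisioned storage is free;
    # anything beyond that is $0.15 per GB-month.
    extra_backup = max(0, backup_gb - provisioned_gb)
    backup_cost  = extra_backup * 0.15

    print(f"storage ${storage_cost:.2f}  io ${io_cost:.2f}  backup ${backup_cost:.2f}")
    # storage $10.00  io $2.50  backup $7.50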

There are a number of enhancements planned for the future. Here are some of the features planned for the coming months:

  • Reserved DB Instances so that you can pay a low one-time fee and then receive a substantial discount on your hourly usage charges.
  • A High Availability offering so that you can easily and cost-effectively provision synchronously replicated RDS instances in two different availability zones.

RDS will make a really nice complement to Amazon SimpleDB; each of the services has a number of unique features and use cases.

As always, Amazon provides plenty of documentation, libraries, and FAQs.


More Stories By Alin Irimie

Alin Irimie is a software engineer - architect, designer, and developer - with over 10 years of experience in various languages and technologies. Currently he is Messaging Security Manager at Sunbelt Software, a security company. He is also the CTO of RADSense Software, a software consulting company. He has expertise in Microsoft technologies such as the .NET Framework, ASP.NET, AJAX, SQL Server, C#, and C++, as well as Ruby on Rails and cloud computing (Amazon and Windows Azure), and he also blogs about cloud technologies here.
