You can hardly have a conversation about IT today without cloud computing coming up at least once. But the term isn't as simple as you may imagine: there are many issues, solutions, architectures, and cloud types to digest. A brief explanation of the cloud can help non-experts understand just what it is.
It is common to think of the cloud as one huge network of computers holding tons of information. In reality, there are many different types of clouds, each suited to different kinds of problems. The specific benefits and features of each cloud type can shape decisions about deploying and developing cloud solutions, and sometimes a single cloud type won't be enough, so multiple types will be required.
The utility cloud manages and provisions large networks of machines to provide on-demand computing resources that can scale horizontally on almost any hardware. Utility clouds are usually accessible through APIs, which lets almost anyone provision resources to address IT needs. Examples of utility clouds include Amazon EC2, VMware vCloud Suite, Apache CloudStack, and Microsoft System Center.
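The provision-on-demand, scale-out pattern that these APIs expose can be sketched in a few lines. The `UtilityCloud` class and its method names below are hypothetical stand-ins, not any real provider's API; services like EC2 offer analogous calls (for example, launching and terminating instances).

```python
# Minimal sketch of the provision/scale-out pattern a utility cloud
# exposes through its API. UtilityCloud and its methods are hypothetical;
# real providers offer analogous launch/terminate calls.
import itertools

class UtilityCloud:
    _ids = itertools.count(1)

    def __init__(self):
        self.instances = {}  # instance id -> instance type

    def provision(self, instance_type, count=1):
        """Provision `count` machines on demand and return their ids."""
        ids = [f"i-{next(self._ids):04d}" for _ in range(count)]
        for iid in ids:
            self.instances[iid] = instance_type
        return ids

    def terminate(self, instance_id):
        """Release a machine back to the pool (scale in)."""
        self.instances.pop(instance_id, None)

cloud = UtilityCloud()
web_tier = cloud.provision("small", count=3)  # scale out horizontally
print(len(cloud.instances))                   # 3 running instances
```

The point of the pattern is that adding capacity is a single API call rather than a hardware purchase, which is what makes horizontal scaling practical.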
The storage cloud covers a number of subtypes. One provides API-accessible storage that applications use like block devices, and is typically used for backup, data retention, archiving, and document storage. Storage clouds can also synchronize content across various systems and devices, and they can offer both human interfaces and APIs for accessing the stored content.
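The key-addressed put/get interface that such storage services expose can be sketched as follows. The `ObjectStore` class and its method names are illustrative stand-ins, not a real provider's API (services like Amazon S3 expose analogous object operations).

```python
# Hedged sketch of a storage cloud's object interface: put/get by key,
# plus prefix listing (useful for backup and sync workflows).
# ObjectStore is a hypothetical stand-in, not a real client library.
class ObjectStore:
    def __init__(self):
        self._objects = {}  # key -> raw bytes

    def put(self, key, data):
        """Store an object under a key."""
        self._objects[key] = data

    def get(self, key):
        """Retrieve an object's bytes by key."""
        return self._objects[key]

    def list(self, prefix=""):
        """List keys under a prefix, as a sync client would."""
        return sorted(k for k in self._objects if k.startswith(prefix))

store = ObjectStore()
store.put("backups/2024/db.dump", b"snapshot bytes")
store.put("docs/report.pdf", b"%PDF-1.7 example")
print(store.list("backups/"))
```

Prefix listing is what lets a sync client discover which objects changed under a given "folder" and pull only those.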
The data cloud is among the fastest-growing cloud types and is characterized by horizontally scalable, analytics-driven processing of data. Of all the types, it is the most complicated in terms of software components and in the criteria you must analyze to decide which data cloud frameworks to deploy.
Data clouds need analytics frameworks to make sense of the mass of information they hold. The analytics run complex algorithms for correlation, entity disambiguation, trend and relationship identification, data efficacy, and more. The chosen framework, however, largely dictates how those analytics can run.
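A map/reduce sketch makes the point concrete: the framework fixes the shape of the computation (map, shuffle, reduce), and the analytic supplies only the two functions. This is an illustrative toy, not any specific framework's API.

```python
# Toy map/reduce: the framework owns the dataflow, the analytic plugs in
# a mapper and a reducer. Names here are illustrative only.
from collections import defaultdict

def map_reduce(records, mapper, reducer):
    shuffled = defaultdict(list)
    for record in records:
        for key, value in mapper(record):
            shuffled[key].append(value)  # shuffle values by key
    return {key: reducer(values) for key, values in shuffled.items()}

# Analytic: word frequency across documents.
docs = ["big data cloud", "data cloud analytics"]
result = map_reduce(
    docs,
    mapper=lambda doc: [(word, 1) for word in doc.split()],
    reducer=sum,
)
print(result["data"])  # 2
```

Because the framework owns the dataflow, the same two small functions scale from a laptop to a cluster, which is exactly why the framework choice constrains what analytics are feasible.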
Twitter's Big Data problem was about more than letting users track the happenings and whereabouts of Justin Bieber and Lady Gaga. The service also provides Trends by location, so users can see what is popular in real time where they are. Beyond Twitter Trends, Twitter acquired Storm in 2011 and now provides it as a free, open-source real-time computation system.
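A toy version of that real-time trend computation shows the idea: count hashtags as tweets stream in and report the current top terms. This is a sketch only; it does not use Storm's actual API, where such a pipeline would be built as a topology of spouts and bolts.

```python
# Toy streaming trend counter, in the spirit of real-time systems like
# Storm. Illustrative sketch only, not Storm's API.
from collections import Counter

def top_trends(tweet_stream, k=2):
    """Count hashtags across a stream and return the k most frequent."""
    counts = Counter()
    for tweet in tweet_stream:
        counts.update(w for w in tweet.split() if w.startswith("#"))
    return [tag for tag, _ in counts.most_common(k)]

stream = [
    "#bieber concert tonight",
    "#gaga new single out",
    "#bieber trending again",
]
print(top_trends(stream))  # ['#bieber', '#gaga']
```

A production system differs mainly in scale: the counting is partitioned across many machines and windowed over time, but the per-term tallying is the same idea.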
There are also various repositories for the data cloud, each with its own benefits. Data versioning is just one of the things Apache HBase is capable of; it can also handle billions of rows and millions of columns. Then there is Accumulo, a robust, scalable, open-source, sorted, high-performance data storage and retrieval system.
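The cell-versioning idea that HBase offers can be sketched briefly: each (row, column) cell keeps several timestamped values, reads return the newest by default, and old versions are pruned past a limit. Class and method names below are illustrative, not HBase's real client API.

```python
# Tiny sketch of HBase-style cell versioning. Names are illustrative,
# not the real HBase client API.
from collections import defaultdict

class VersionedTable:
    def __init__(self, max_versions=3):
        self.max_versions = max_versions
        self._cells = defaultdict(list)  # (row, col) -> [(ts, value), ...]

    def put(self, row, col, value, ts):
        """Write a timestamped version of a cell, pruning old versions."""
        versions = self._cells[(row, col)]
        versions.append((ts, value))
        versions.sort(reverse=True)       # newest timestamp first
        del versions[self.max_versions:]  # keep at most max_versions

    def get(self, row, col):
        """Return the newest value for the cell, or None if absent."""
        versions = self._cells[(row, col)]
        return versions[0][1] if versions else None

t = VersionedTable()
t.put("user1", "email", "old@example.com", ts=1)
t.put("user1", "email", "new@example.com", ts=2)
print(t.get("user1", "email"))  # new@example.com
```

Keeping a bounded history per cell is what lets such stores answer "what was this value yesterday?" without the cost of unbounded storage growth.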
The prototyping cloud is really a kind of utility cloud, but where utility clouds usually focus on scaling production workloads, prototyping clouds focus on exploring new capabilities. Most utility cloud providers today offer fast access to prebuilt server images, which is what makes this kind of rapid prototyping possible.