Google just launched a data storage service that allows multinational corporations and other large organizations to run big-data analysis on its cloud platform.
Google Cloud Bigtable is not an entirely new service, as Google has been using it internally for years to manage its core services, such as Google Analytics, Gmail, and Google Search.
Finance companies can use the new service to store petabytes of trading data for analysis of emerging trends. The service can also be used to store sensor data from internet-of-things monitoring systems.
Other industries that stand to benefit from the service include energy firms, digital advertising firms, telecommunications companies, biomedical organizations and other data-intensive businesses.
Google Cloud Bigtable is a hosted NoSQL data store. Users read and write data through the application programming interface (API) for Apache HBase.
Because Bigtable is accessed through HBase commands, it can be used alongside existing Hadoop software. Hadoop is an open-source data processing platform that lets users work with extremely large data sets. Bigtable also works with other Google Cloud services, including Google Cloud Dataflow and Google BigQuery.
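To make the trading-data use case concrete: Bigtable (like HBase) keeps rows sorted by row key, so time-series data is typically stored with an entity and timestamp encoded in the key, letting a range scan pull back one entity's history in order. The plain-Python sketch below simulates that pattern with an in-memory sorted structure; it is illustrative only, not the real client API, and all table, column, and key names are invented for the example.

```python
import bisect

class SortedStore:
    """Toy stand-in for a Bigtable/HBase table: rows kept sorted by row key."""

    def __init__(self):
        self._keys = []   # sorted list of row keys
        self._rows = {}   # row key -> {column: value}

    def put(self, row_key, column, value):
        """Insert or update one cell, keeping row keys sorted."""
        if row_key not in self._rows:
            bisect.insort(self._keys, row_key)
            self._rows[row_key] = {}
        self._rows[row_key][column] = value

    def scan(self, start_key, stop_key):
        """Return (key, row) pairs with start_key <= key < stop_key,
        analogous to a range Scan in the HBase API."""
        lo = bisect.bisect_left(self._keys, start_key)
        hi = bisect.bisect_left(self._keys, stop_key)
        return [(k, self._rows[k]) for k in self._keys[lo:hi]]

# Row-key design for time-series trading data: "SYMBOL#timestamp", so all
# ticks for one symbol are contiguous and readable with a single range scan.
store = SortedStore()
store.put("GOOG#20150506T093000", "trade:price", 535.10)
store.put("GOOG#20150506T093001", "trade:price", 535.20)
store.put("MSFT#20150506T093000", "trade:price", 47.60)

# "~" (0x7E) sorts after digits and letters, so this bounds the GOOG prefix.
goog_ticks = store.scan("GOOG#", "GOOG#~")
```

Here `goog_ticks` contains only the two GOOG rows, in timestamp order, while the MSFT row is skipped; in the real service the same effect is achieved with an HBase Scan over a row-key range.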
Google fully manages the service, encrypting data for security and replicating it for backup. Additional storage is provisioned automatically as data grows.
Pricing for the service depends on several factors, including the amount of storage used, the number of nodes deployed, and network usage. The service is still in beta, and those interested in using it can sign up for a free trial.