Big Data / Hadoop / HPC
Hadoop is a valuable tool for mining huge amounts of data quickly by leveraging farms of inexpensive commodity servers and disks. When there is one dataset, one organization mining that data, and a static or steadily growing load on that dataset, Hadoop works well without help from any cloud technology.
However, when elasticity, multi-tenancy and flexibility are required, running Hadoop on a private cloud platform like Nimbula Director can provide huge cost and operational benefits.
Nimbula and MapR bring Hadoop to the private cloud. Customers can now deploy Hadoop on top of Nimbula Director in minutes and start running their Hadoop jobs on the cloud.
Customers benefit from:
- Rapid provisioning: Deploying Hadoop clusters in under 2 minutes
- Self-service Hadoop: Launching their own Hadoop clusters on a private cloud without needing to requisition hardware
- Highly available Hadoop: Operating their Hadoop clusters in a lights-out data center with automated re-provisioning of failed nodes and a no-NameNode architecture
- Multi-tenant Hadoop: Sharing infrastructure between Hadoop clusters with complete permissions, network and resource isolation
- Multi-purpose infrastructure: Sharing infrastructure between Hadoop and non-Hadoop workloads
Nimbula Director takes bare metal servers with local disks and turns them into a large multi-tenant pool of compute. End users can deploy their own Hadoop instances on demand, copy in datasets, run as many jobs as they need, save the results and deprovision their Hadoop deployment when finished.
In this way, many users can have their own private Hadoop farm on a shared infrastructure. They can modify and manipulate their own farm without impacting others’. Their data and network are completely isolated. Consumption by each user can be regulated by quotas and metered for chargeback.
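The provision → run → deprovision lifecycle and per-user quota model described above can be sketched as a small simulation. All class and method names here are hypothetical, invented for illustration; this is not the Nimbula Director API.

```python
# Hypothetical sketch of a self-service, quota-regulated cluster lifecycle.
# Names are illustrative only; this is NOT the Nimbula Director API.

class PrivateCloud:
    def __init__(self):
        self.quotas = {}   # user -> max nodes allowed
        self.usage = {}    # user -> nodes currently provisioned
        self.metered = {}  # user -> node-units consumed (for chargeback)

    def set_quota(self, user, max_nodes):
        self.quotas[user] = max_nodes
        self.usage.setdefault(user, 0)
        self.metered.setdefault(user, 0)

    def provision_cluster(self, user, nodes):
        # Reject requests that would exceed the user's quota.
        if self.usage[user] + nodes > self.quotas[user]:
            raise RuntimeError(f"quota exceeded for {user}")
        self.usage[user] += nodes
        self.metered[user] += nodes  # meter consumption for chargeback
        return {"user": user, "nodes": nodes}

    def deprovision_cluster(self, user, cluster):
        # Release capacity back to the shared pool when the work is done.
        self.usage[user] -= cluster["nodes"]

cloud = PrivateCloud()
cloud.set_quota("alice", 10)
cluster = cloud.provision_cluster("alice", 8)
# ... copy in datasets, run Hadoop jobs, save results ...
cloud.deprovision_cluster("alice", cluster)
```

The key point the sketch captures is that capacity returns to the shared pool on deprovisioning, while the metered total keeps growing for chargeback.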
HA is simplified, automated and made more cost effective. When a server is lost, Nimbula Director, through its orchestration feature, will automatically redeploy any lost Hadoop instances on available nodes. Once the Hadoop instance starts, the Hadoop file system will automatically start replicating data into the new instance. Using the HA features of Nimbula Director along with the recovery features of Hadoop, recovery from hardware failure can now be accomplished without any user intervention.
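The automated recovery described above amounts to a reconciliation loop: compare the desired instance count against the surviving instances and redeploy the difference onto healthy nodes. The sketch below is a simplified model of that idea with hypothetical names; it is not how Nimbula Director is implemented internally.

```python
# Simplified reconciliation-loop model of automated HA re-provisioning.
# All names are hypothetical; this is not Nimbula Director's internal code.

def reconcile(desired_count, running_instances, healthy_nodes):
    """Redeploy missing instances; return (placements, updated instances)."""
    missing = desired_count - len(running_instances)
    placements = []
    for i in range(missing):
        # Spread replacement instances across the available healthy nodes.
        node = healthy_nodes[i % len(healthy_nodes)]
        placements.append(node)
        # Once a replacement starts, the Hadoop file system begins
        # re-replicating data into it without user intervention.
        running_instances.append({"node": node, "state": "replicating"})
    return placements, running_instances

# A node failure leaves 3 of 5 instances running; reconcile redeploys 2.
instances = [{"node": n, "state": "running"} for n in ("n1", "n2", "n3")]
placed, instances = reconcile(5, instances, healthy_nodes=["n4", "n5"])
```

Pairing this orchestration-level loop with Hadoop's own block re-replication is what makes recovery hands-off: the cloud restores the instance, and the file system restores the data.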
Hadoop development and testing for enterprises and SaaS companies
An elastic Hadoop deployment on top of Nimbula Director allows each user to deploy Hadoop sandboxes in mere minutes, increasing efficiency and quality of the development results.
Multiple users, groups and tenants can deploy their own Hadoop deployment on the same infrastructure, completely isolated from each other. They can each grow and shrink their deployment over time based on needs.
Shared infrastructure for Hadoop and non-Hadoop workloads
Not all Hadoop deployments will run at full capacity at all times. Customers can now run their varying types of work, both Hadoop and non-Hadoop jobs, on a single private cloud infrastructure.