Cluster Setup

In this chapter

This chapter explains the automated cluster setup process.

Introduction

A cluster consists of a set of connected computers that work together so that in many respects they can be viewed as a single system.

Installing a Hadoop cluster typically involves unpacking the software on all the machines in the cluster. Typically one machine in the cluster is designated as the NameNode and another machine as the ResourceManager, exclusively. These are the masters. The rest of the machines in the cluster act as both DataNode and NodeManager. These are the slaves.
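As a rough sketch, a manual install along these lines repeats the same unpack step on every machine in the cluster. The hostnames and tarball name below are placeholders for illustration, not values taken from QueryIO:

```shell
# Dry-run sketch of a manual Hadoop install (hypothetical hostnames and
# tarball name). It only prints the per-host commands an administrator
# would otherwise run by hand on every machine in the cluster.
HOSTS="master-node worker-1 worker-2"
TARBALL="hadoop-3.3.6.tar.gz"
for host in $HOSTS; do
  echo "scp $TARBALL $host:/opt/"
  echo "ssh $host tar -xzf /opt/$TARBALL -C /opt/"
done
```

Repeating this by hand for every machine, then editing each node's configuration files, is exactly the tedium the automated setup below removes.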

Cluster setup can be a time-consuming task if you want to add a large number of nodes on different machines. To ease this task, QueryIO provides automated cluster setup.

The first screen that appears on the QueryIO server when no host is present is Cluster Setup, through which you can configure your entire cluster directly.

To configure the cluster, go to Admin > Cluster Setup.

QueryIO provides two types of cluster setup:

  1. Single Machine Cluster (For evaluation users)
  2. Multiple Machines Cluster (For advanced users)

Single Machine Cluster (For evaluation users)

If you are not a professional user and just want to evaluate the product, you may want to choose the single machine cluster setup.

This setup installs all required nodes on a single machine.

Just provide the machine's credentials and QueryIO will install a NameNode, DataNode, ResourceManager and NodeManager with default settings.
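Before entering credentials, it can help to confirm that an SSH login to the machine actually works. This is a hypothetical pre-check, not a step required by QueryIO; the hostname and username are placeholders to substitute with your own:

```shell
# Hypothetical pre-check before single machine setup: verify that the
# credentials you are about to enter allow an SSH login. HOST and
# SSH_USER are placeholders; replace them with your machine's values.
HOST="cluster-host"
SSH_USER="hadoop"
if ssh -o BatchMode=yes -o ConnectTimeout=3 "$SSH_USER@$HOST" true 2>/dev/null; then
  MSG="SSH login to $HOST ok"
else
  MSG="SSH login to $HOST failed - check credentials and connectivity"
fi
echo "$MSG"
```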

The following details need to be provided:


Multiple Machines Cluster (For advanced users)

If you are an advanced user and want to configure cluster for production use, select "Multiple Machines Cluster" option to install nodes on multiple machines.

This setup allows you to install multiple nodes on multiple systems simultaneously.
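Since this setup touches several machines at once, a quick reachability check of the targets beforehand can surface dead hosts early. This is an optional sketch with placeholder node names (replace them with your own machines), not part of QueryIO's procedure:

```shell
# Hypothetical reachability check (node names are placeholders) before
# a Multiple Machines Cluster setup: each host is pinged once and
# reported either way, so unreachable machines are flagged up front.
for host in node-1 node-2 node-3; do
  if ping -c 1 -W 2 "$host" >/dev/null 2>&1; then
    echo "$host: reachable"
  else
    echo "$host: UNREACHABLE"
  fi
done
```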

Follow these steps to configure the cluster:

Copyright 2018 QueryIO Corporation. All Rights Reserved.

QueryIO, "Big Data Intelligence" and the QueryIO Logo are trademarks of QueryIO Corporation. Apache, Hadoop and HDFS are trademarks of The Apache Software Foundation.