How To Install Apache HBase : Ubuntu Single Cloud Server Instance

By Abhishek Ghosh, August 19, 2018 (updated August 19, 2018)

Apache HBase is a column-oriented distributed datastore. Previously, we have shown how to install Apache Accumulo, a distributed key/value store built on Hadoop. In another guide we have shown how to install Apache Cassandra, a column-oriented distributed datastore inspired by BigTable. We have also shown how to install Hadoop on a single server instance. We can install HBase without installing Hadoop.
The main reason to use Apache HBase instead of plain Apache Hadoop is to do random reads and writes. With Hadoop alone, a MapReduce job reads the whole dataset. Hadoop consists of a distributed file system (HDFS) and MapReduce (a framework for distributed computing). HBase is a key-value data store built on top of Hadoop, that is, on top of HDFS.

Hadoop comprises HDFS and MapReduce. HDFS is a file system which provides reliable storage with high fault tolerance by replicating data across a set of nodes. HDFS consists of two components: the NameNode (where the metadata about the file system is stored) and the DataNodes (where the actual distributed data is stored).

MapReduce is a set of two types of Java daemons, the JobTracker and the TaskTracker. The JobTracker daemon governs the jobs to be executed, whereas the TaskTracker daemons run on top of the DataNodes across which the data is distributed, so that the program logic supplied by the user executes next to the data held by the corresponding DataNode.


HDFS is the storage component and MapReduce is the execution component. As far as HBase is concerned, without HDFS it falls back to its own local file system and cannot form a cluster, so a remotely accessible, distributed deployment is not possible in that mode. HBase comprises the HMaster (which holds the metadata) and the RegionServers. RegionServers are another set of daemons running on top of the DataNodes in the HDFS cluster to store and serve the database data. Storing the data in HDFS lets HBase exploit the core functionality of HDFS, namely data replication and fault tolerance. The difference between the MapReduce daemons and the HBase RegionServer daemons running on top of HDFS is that the MapReduce daemons only perform MapReduce (aggregation) type jobs, whereas the RegionServer daemons perform the database functionality such as reads and writes.

HBase does not support SQL scripting. HBase is not a direct replacement for a classic SQL database, but the Apache Phoenix project provides a SQL layer for HBase as well as a JDBC driver that can be integrated with various analytics and business intelligence applications. The Apache Trafodion project provides a SQL query engine with ODBC and JDBC drivers and distributed ACID transaction protection across multiple statements, tables and rows, using HBase as a storage engine.

Here is a step by step guide on how to install Apache HBase on a single Ubuntu cloud server instance.

Steps on How To Install Apache HBase

Requirements :

  • DNS pointing properly
  • SSH with root user
  • Java 8 from Oracle or OpenJDK (HBase 2.x requires JDK 8)
  • Setup of Loopback IP
  • NTP
  • Tweaking ulimit and nproc

Point DNS with a FQDN; you cannot use localhost on a public server. Both forward and reverse DNS resolution must work. HBase expects the loopback IP address to be 127.0.0.1. We will tweak the settings later.
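
A quick sanity check before going further (a sketch only; your.domain.com and 203.0.113.10 are placeholders for your own FQDN and public IP, and the host utility comes from the dnsutils package):

#forward lookup: the FQDN should resolve to the server's public IP
host your.domain.com

#reverse lookup: the public IP should resolve back to the FQDN
host 203.0.113.10

#the name the server reports for itself
hostname -f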

Create ssh passwordless login like we did for installation of Hadoop.

In the file /etc/security/limits.conf add :

hadoop  -       nofile  32768
hadoop hard nproc 32000
hadoop soft nproc 32000

In the file /etc/pam.d/common-session add as the last line in the file:

session required  pam_limits.so
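
These limits take effect for new login sessions once pam_limits is active. After logging in again as the hadoop user (an assumption carried over from the Hadoop guide; adjust the user name if yours differs), you can confirm them:

#open-files and max-processes limits for the current session
ulimit -n
ulimit -u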

/etc/hosts should look something like this:

...
127.0.0.1 localhost
127.0.0.1 your.domain.com
...
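
To see how the local resolver actually resolves these names (your.domain.com is again a placeholder):

getent hosts localhost
getent hosts your.domain.com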

Install commonly needed things :

sudo apt-get update
sudo apt-get install -y git wget ntp maven tar make gcc ant

Log out and log back in again for the changes to take effect. Commonly we install Oracle Java in this way :

sudo add-apt-repository ppa:webupd8team/java
sudo apt-get update
sudo apt-get install oracle-java8-installer

We can install it in a different way as well :

#JDK installation
#download the JDK archive
wget http://download.oracle.com/otn/java/jdk/8u172-b11/a58eab1ec242421181065cdc37240b08/jdk-8u172-linux-x64.tar.gz
sudo cp jdk-8u172-linux-x64.tar.gz /usr/lib
cd /usr/lib
#extract the JDK archive
sudo tar -xvf jdk-8u172-linux-x64.tar.gz
#remove the compressed file
sudo rm jdk-8u172-linux-x64.tar.gz

To set up the PATH and JAVA_HOME variables manually, add the following lines to the ~/.profile file (point JAVA_HOME at the directory where your JDK actually lives, for example the directory extracted above under /usr/lib, or the OpenJDK path shown here) :

export JAVA_HOME=/usr/lib/jvm/java-1.8.0-openjdk-amd64
export PATH=$PATH:$JAVA_HOME/bin

Then reload the profile :

source ~/.profile
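
Whichever route you took, it is worth confirming that the shell now sees the right Java before touching HBase:

echo $JAVA_HOME
java -version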

Now we need to install HBase :

#Install HBase
#create hbase directory
sudo mkdir -p /usr/lib/hbase

#download latest HBase
## http://www.apache.org/dyn/closer.cgi/hbase/
## https://github.com/apache/hbase
wget http://www-eu.apache.org/dist/hbase/2.1.0/hbase-2.1.0-bin.tar.gz
sudo cp hbase-* /usr/lib/hbase
cd /usr/lib/hbase
#extract hbase files
sudo tar -xzvf hbase-2.1.0-bin.tar.gz
#remove hbase compressed file
#sudo rm hbase-2.1.0-bin.tar.gz
#make sure that the hbase folder is at a path like
# /usr/lib/hbase/hbase-2.1.0
cd /usr/lib/hbase/hbase-2.1.0/conf

You’ll get a file named hbase-site.xml there; move it aside as a backup :

sudo mv hbase-site.xml hbase-site.xml.backup

Create an empty file named hbase-site.xml :

sudo touch hbase-site.xml
sudo nano hbase-site.xml

It will be like the following. Note that HBase does not expand shell variables inside this file, so replace $HOME with the absolute path of the home directory of the user that will run HBase (for example /home/hadoop) :

<?xml version="1.0"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
<configuration>

<property>
<name>hbase.rootdir</name>
<value>$HOME/HBASE/hbase/</value>
</property>

<property>
<name>hbase.zookeeper.property.dataDir</name>
<value>$HOME/HBASE/zookeeper/</value>
</property>

</configuration>
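
While still in the conf directory, it is a good idea to point HBase at the JDK explicitly, since start-hbase.sh otherwise relies on JAVA_HOME being exported in the environment. A minimal sketch, assuming the OpenJDK path used in the ~/.profile example above (adjust the path if you installed the Oracle JDK under /usr/lib instead):

#append JAVA_HOME to HBase's environment file
echo 'export JAVA_HOME=/usr/lib/jvm/java-1.8.0-openjdk-amd64' | sudo tee -a hbase-env.sh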

Final steps: create the local data directories referenced in hbase-site.xml (under the home directory of the user that will run HBase) and set HBASE_HOME :

#create the data directories under the home directory
cd ~
mkdir -p HBASE/hbase
mkdir -p HBASE/zookeeper
echo "export HBASE_HOME=/usr/lib/hbase/hbase-2.1.0" >> ~/.profile
echo "export PATH=\$PATH:\$HBASE_HOME/bin" >> ~/.profile

To build from source instead :
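
The Maven commands below assume you are inside a checkout of the HBase source tree; a minimal sketch of getting one (the rel/2.1.0 tag name is an assumption, so list the tags to confirm the exact name):

git clone https://github.com/apache/hbase.git
cd hbase
git tag | grep 2.1.0
git checkout rel/2.1.0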

mvn package -DskipTests
# Run tests
mvn test -fn

For Ubuntu 18.04 LTS, you need this patch :

wget https://issues.apache.org/jira/secure/attachment/12899868/HBASE-19188.branch-1.2.002.patch
patch -p1 -i HBASE-19188.branch-1.2.002.patch

Finally, we can start HBase with the start-hbase.sh script from the bin directory :

/usr/lib/hbase/hbase-2.1.0/bin/start-hbase.sh
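
Once the script returns, a quick way to confirm that standalone HBase is up is to look for the HMaster JVM; the master web UI (served on port 16010 in HBase 2.x) should also be reachable in a browser:

#the HMaster process should be listed
jps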

We can also start the HBase shell :

hbase shell
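
Inside the shell, a few basic commands are enough to verify that the random writes and reads discussed earlier actually work. A minimal sketch (the table and column family names are just examples):

status

# create a table 'test' with one column family 'cf'
create 'test', 'cf'

# write a cell and read it back
put 'test', 'row1', 'cf:greeting', 'hello hbase'
get 'test', 'row1'

# scan the whole (tiny) table
scan 'test'

# clean up
disable 'test'
drop 'test'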
