
Hive properties:-

Default location where database and table directories are created is: "/user/hive/warehouse"

hive.metastore.warehouse.dir = path where you want to store your table and database directories and their contents (set in hive-site.xml)
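As a sketch, the property might appear in hive-site.xml like this (the value shown is the default path; substitute any HDFS directory you prefer):

```xml
<property>
  <name>hive.metastore.warehouse.dir</name>
  <value>/user/hive/warehouse</value>
  <description>Base directory where Hive database and table data is stored</description>
</property>
```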

hive.exec.mode.local.auto = true

"Setting this property makes Hive use local mode more aggressively.
It means Hive runs a query in local mode whenever it estimates that will perform better, e.g. when the data set is small, even when Hadoop is running in distributed or pseudo-distributed mode."
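For the current session only, the same property can be toggled from the Hive CLI (a sketch; it can also be set permanently in hive-site.xml):

```sql
-- enable aggressive local-mode selection for this session only
SET hive.exec.mode.local.auto=true;
```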

javax.jdo.option.ConnectionURL: -
jdbc:derby:;databaseName=/home/me/hive/metastore_db;create=true

Here, in the databaseName part of the value string, we specify the location where the metadata will be stored. This change eliminates the problem of Hive dropping a metastore_db directory in the current working directory every time we start a new Hive session. Now we always have access to all our metadata, no matter what directory we are working in.
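A sketch of the corresponding hive-site.xml entry (the path /home/me/hive/metastore_db is the example location from the value string above):

```xml
<property>
  <name>javax.jdo.option.ConnectionURL</name>
  <value>jdbc:derby:;databaseName=/home/me/hive/metastore_db;create=true</value>
  <description>JDBC connection string for the embedded Derby metastore</description>
</property>
```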

The default behaviour of the Hive metastore is to use the current working directory:

Creating a metastore_db subdirectory under whatever working directory you happen to be in is not convenient, as Derby "forgets" about previous metastores when you change to a new working directory.

Executing Hive Queries from Files


If you are at the bash shell prompt, use
> hive -f "path of file having hive queries"

If we are already inside the hive shell:

hive> source "path of file having hive queries";
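For example, both forms running the same script (the file name /home/me/queries.hql is hypothetical):

```
$ hive -f /home/me/queries.hql

hive> source /home/me/queries.hql;
```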
--------------------------------------------------------------------------
If you want to run a shell command from the hive CLI, just use ! followed by the command and then a semicolon.

For example: hive> ! pwd;

Don't invoke interactive commands that require user input. Shell "pipes" don't work and neither do file "globs." For example, ! ls *.hql; will look for a file named *.hql, rather than all files that end with the .hql extension.
---------------------------------------------------------------------
You can run the hadoop dfs ... commands from within the hive CLI; just drop the hadoop word from
the command and add the semicolon at the end:

hive> dfs -ls / ;


hive.cli.print.header = true

"Will print the column headers in query results"; by default it is false.
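A sketch of the effect (the table employees and its columns are hypothetical):

```sql
SET hive.cli.print.header=true;
SELECT * FROM employees LIMIT 1;
-- output now begins with a header row, e.g.:
-- name    salary
-- John    75000.0
```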

------------------------------------------------

Values of the new TIMESTAMP type can be:

- integers, interpreted as seconds since the Unix epoch (midnight, January 1, 1970);
- floats, interpreted as seconds since the epoch with nanosecond resolution (up to 9 decimal places);
- strings, interpreted according to the JDBC date string format convention, YYYY-MM-DD hh:mm:ss.fffffffff.
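The integer and string forms can be illustrated outside Hive; the sketch below (plain Python, epoch value hypothetical) converts an epoch integer to the JDBC string format quoted above:

```python
from datetime import datetime, timezone

# 1. Integer form: seconds since the Unix epoch (midnight, 1970-01-01 UTC)
seconds = 1325462400  # 2012-01-02 00:00:00 UTC

# 2. Float form: seconds since the epoch with fractional precision
fractional = 1325462400.123456789

# 3. String form: JDBC convention YYYY-MM-DD hh:mm:ss.fffffffff
jdbc_string = datetime.fromtimestamp(seconds, tz=timezone.utc).strftime(
    "%Y-%m-%d %H:%M:%S")
print(jdbc_string)  # 2012-01-02 00:00:00
```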

----------------------------------------------------------------------------

Hive QL DDL

Create database command:-

- CREATE DATABASE databasename;

- CREATE DATABASE IF NOT EXISTS databasename; (this suppresses the error if the database already exists)
- CREATE DATABASE databasename LOCATION 'path where you want to create the database';
- CREATE DATABASE databasename COMMENT 'comments you want to provide for the database';

- CREATE DATABASE financials
  WITH DBPROPERTIES ('creator' = 'Mark Moneybags', 'date' = '2012-01-02');

Show databases command:-

It is used to list all the databases in the system:-

- SHOW DATABASES;
- SHOW DATABASES LIKE 'h.*';
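Putting the options together, a sketch of a full session (the database financials and its DBPROPERTIES come from the example above; the COMMENT string is hypothetical):

```sql
CREATE DATABASE IF NOT EXISTS financials
COMMENT 'Holds all financial tables'
WITH DBPROPERTIES ('creator' = 'Mark Moneybags', 'date' = '2012-01-02');

SHOW DATABASES LIKE 'f.*';
```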
