Creating External Tables

Let us understand how to create an external table in the Spark Metastore, using orders as an example. We will also see how to load data into the external table.

Let us start the Spark context for this Notebook so that we can execute the code provided. You can sign up for our 10-node state-of-the-art cluster/labs to learn Spark SQL using our unique integrated LMS.

import org.apache.spark.sql.SparkSession

val username = System.getProperty("user.name")
val spark = SparkSession.
    builder.
    config("spark.ui.port", "0").
    config("spark.sql.warehouse.dir", s"/user/${username}/warehouse").
    enableHiveSupport.
    appName(s"${username} | Spark SQL - Managing Tables - Basic DDL and DML").
    master("yarn").
    getOrCreate

If you are going to use CLIs, you can launch Spark SQL using one of the following three approaches.

Using Spark SQL

spark2-sql \
    --master yarn \
    --conf spark.ui.port=0 \
    --conf spark.sql.warehouse.dir=/user/${USER}/warehouse

Using Scala

spark2-shell \
    --master yarn \
    --conf spark.ui.port=0 \
    --conf spark.sql.warehouse.dir=/user/${USER}/warehouse

Using Pyspark

pyspark2 \
    --master yarn \
    --conf spark.ui.port=0 \
    --conf spark.sql.warehouse.dir=/user/${USER}/warehouse

  • We just need to add the EXTERNAL keyword in the CREATE clause and LOCATION after the STORED AS clause, or just LOCATION as part of the CREATE TABLE statement.

  • We can use the same LOAD commands to get data from either the local file system or HDFS that we used for managed tables.

  • Once the table is created, we can run DESCRIBE FORMATTED orders to check the metadata of the table and confirm whether it is a managed or an external table (see the sketch after this list).

  • We need to specify the location while creating external tables.
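
For instance, here is a minimal sketch of that check, assuming the spark session created above and an existing orders table. The Type row of the DESCRIBE FORMATTED output reads MANAGED or EXTERNAL.

// Pull just the Type row out of DESCRIBE FORMATTED to confirm
// whether orders is a managed or an external table.
spark.sql("DESCRIBE FORMATTED orders").
    filter("col_name = 'Type'").
    show(false)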

Here is the script to create an external table in the Spark Metastore.

%%sql

USE itversity_retail

%%sql

DROP TABLE IF EXISTS orders

import sys.process._

val username = System.getProperty("user.name")
s"hdfs dfs -rm -R /user/${username}/external/retail_db/orders" !
s"hdfs dfs -mkdir -p /user/${username}/external/retail_db/orders" !

%%sql

CREATE EXTERNAL TABLE orders (
  order_id INT COMMENT 'Unique order id',
  order_date STRING COMMENT 'Date on which order is placed',
  order_customer_id INT COMMENT 'Customer id who placed the order',
  order_status STRING COMMENT 'Current status of the order'
) COMMENT 'Table to save order level details'
ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
LOCATION '/user/itversity/external/retail_db/orders'
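
Note that the LOCATION in the cell above hardcodes the itversity user, while the directory we created earlier is under /user/${username}; if your username differs, the paths will not line up. Here is a sketch of the same statement issued via spark.sql so the location can be interpolated, assuming the session and username defined earlier.

// Same CREATE statement via spark.sql, so LOCATION can use the
// username captured earlier instead of a hardcoded user.
spark.sql(s"""
    CREATE EXTERNAL TABLE IF NOT EXISTS orders (
        order_id INT COMMENT 'Unique order id',
        order_date STRING COMMENT 'Date on which order is placed',
        order_customer_id INT COMMENT 'Customer id who placed the order',
        order_status STRING COMMENT 'Current status of the order'
    ) COMMENT 'Table to save order level details'
    ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
    LOCATION '/user/${username}/external/retail_db/orders'
""")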
s"hdfs dfs -ls /user/${username}/external/retail_db/orders" !
%%sql

LOAD DATA LOCAL INPATH '/data/retail_db/orders' 
  INTO TABLE orders
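
Since this is an external table, an alternative to LOAD DATA is to copy the files straight into the table's location, which is just a directory the table points at. A sketch, assuming the same local source path as above and that bash is available for glob expansion:

import sys.process._

// Copy the source files directly into the external table location.
// bash -c is used so the * glob is expanded by the shell.
Seq("bash", "-c",
    s"hdfs dfs -put /data/retail_db/orders/* /user/${username}/external/retail_db/orders") !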
s"hdfs dfs -ls /user/${username}/external/retail_db/orders" !
%%sql

SELECT * FROM orders LIMIT 10

%%sql

SELECT count(1) FROM orders

spark.sql("DESCRIBE FORMATTED orders").show(200, false)