Managed Tables vs. External Tables

Let us compare and contrast Managed Tables and External Tables.

Let us start the Spark context for this Notebook so that we can execute the code provided. You can sign up for our 10 node state of the art cluster/labs to learn Spark SQL using our unique integrated LMS.

import org.apache.spark.sql.SparkSession

val username = System.getProperty("user.name")
val spark = SparkSession.
    builder.
    config("spark.ui.port", "0").
    config("spark.sql.warehouse.dir", s"/user/${username}/warehouse").
    enableHiveSupport.
    appName(s"${username} | Spark SQL - Managing Tables - Basic DDL and DML").
    master("yarn").
    getOrCreate
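
The spark.sql.warehouse.dir setting above determines where the data for Managed Tables will be stored. As a quick sanity check (a minimal sketch using the session created above), you can read the value back from the session configuration:

// Confirm the warehouse location this session will use for Managed Tables.
spark.conf.get("spark.sql.warehouse.dir")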

If you are going to use CLIs, you can access Spark SQL using one of the following three approaches.

Using Spark SQL

spark2-sql \
    --master yarn \
    --conf spark.ui.port=0 \
    --conf spark.sql.warehouse.dir=/user/${USER}/warehouse

Using Scala

spark2-shell \
    --master yarn \
    --conf spark.ui.port=0 \
    --conf spark.sql.warehouse.dir=/user/${USER}/warehouse

Using PySpark

pyspark2 \
    --master yarn \
    --conf spark.ui.port=0 \
    --conf spark.sql.warehouse.dir=/user/${USER}/warehouse

  • When we specify EXTERNAL along with LOCATION, or even LOCATION alone, as part of CREATE TABLE, the table becomes EXTERNAL (see the sketch after this list).

  • The rest of the syntax is the same as for a Managed Table.

  • However, when we drop a Managed Table, it deletes the metadata from the metastore as well as the data from HDFS.

  • When we drop an External Table, only the metadata is dropped, not the data.

  • Typically we use External Tables when the same dataset is processed by multiple frameworks such as Hive, Pig, Spark etc.

  • We cannot run the TRUNCATE TABLE command against External Tables.
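
Here is a minimal sketch of the first point above. The table name external_orders, its column list and the LOCATION path are illustrative assumptions, not part of this exercise; the point is that EXTERNAL plus LOCATION creates an External Table whose data survives a DROP TABLE.

// A hedged sketch, not part of this exercise: external_orders, its columns
// and the LOCATION path are illustrative assumptions.
spark.sql(s"""
    CREATE EXTERNAL TABLE external_orders (
        order_id INT,
        order_date STRING,
        order_customer_id INT,
        order_status STRING
    ) ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
    LOCATION '/user/${username}/external/retail_db/orders'
""")

// Dropping an External Table removes only the metastore entry;
// the files under the LOCATION above are left in place.
spark.sql("DROP TABLE external_orders")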

%%sql

USE itversity_retail

%%sql

SHOW TABLES
spark.sql("DESCRIBE FORMATTED orders").show(200, false)

%%sql

TRUNCATE TABLE orders

spark.sql("DESCRIBE FORMATTED order_items").show(200, false)

%%sql

TRUNCATE TABLE order_items

%%sql

DROP TABLE orders

%%sql

DROP TABLE order_items

import sys.process._

s"hdfs dfs -ls /user/${username}/retail_db/orders" !