Validating Functions

Let us see how we can validate Spark SQL functions.

Let us start the Spark session for this Notebook so that we can execute the code provided.

import org.apache.spark.sql.SparkSession

val username = System.getProperty("user.name")
val spark = SparkSession.
    builder.
    config("spark.ui.port", "0"). // 0 lets Spark bind the UI to a random available port
    config("spark.sql.warehouse.dir", s"/user/${username}/warehouse"). // per-user warehouse directory
    enableHiveSupport. // required to run Hive-compatible SQL (databases, tables)
    appName(s"${username} | Spark SQL - Predefined Functions").
    master("yarn").
    getOrCreate
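
Once the session is created, you can validate a function straight away by running a query through spark.sql. A minimal sketch, assuming the session above started successfully:

// Quick sanity check: validate a predefined function right from the session.
spark.sql("SELECT current_date").show()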

If you are going to use CLIs, you can run Spark SQL using one of these three approaches.

Using Spark SQL

spark2-sql \
    --master yarn \
    --conf spark.ui.port=0 \
    --conf spark.sql.warehouse.dir=/user/${USER}/warehouse

Using Scala

spark2-shell \
    --master yarn \
    --conf spark.ui.port=0 \
    --conf spark.sql.warehouse.dir=/user/${USER}/warehouse

Using Pyspark

pyspark2 \
    --master yarn \
    --conf spark.ui.port=0 \
    --conf spark.sql.warehouse.dir=/user/${USER}/warehouse
  • Spark SQL follows the MySQL style. To validate functions we can simply use a SELECT clause without a FROM clause, e.g.: SELECT current_date;

  • Another example: SELECT substr('Hello World', 1, 5);

  • If you prefer the Oracle style, you can create a table named dual and insert one record into it.

  • You can also create a temporary view on top of a Data Frame and start writing SQL queries against it. We will see an example with the Scala based approach. Here are the code snippets using both Scala as well as Pyspark.

Using Scala

val orders = spark.read.
    schema("order_id INT, order_date STRING, order_customer_id INT, order_status STRING").
    csv("/public/retail_db/orders")
orders.createOrReplaceTempView("orders_temp")

Using Python

orders = spark.read. \
    schema("order_id INT, order_date STRING, order_customer_id INT, order_status STRING"). \
    csv("/public/retail_db/orders")
orders.createOrReplaceTempView("orders_temp")
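
The %%sql cells below assume a notebook environment where the SQL magic is available. If you are validating from spark-shell instead, the same queries can be issued through spark.sql; a minimal sketch, assuming the orders_temp view registered above:

// Validate functions programmatically from spark-shell.
spark.sql("SELECT current_date AS current_date").show()
spark.sql("SELECT substr('Hello World', 1, 5) AS result").show()
spark.sql("SELECT * FROM orders_temp LIMIT 10").show(false)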
%%sql

SELECT current_date AS current_date
%%sql

SELECT substr('Hello World', 1, 5) AS result
%%sql

USE itversity_retail
%%sql

SELECT current_database()
%%sql

DROP TABLE IF EXISTS dual
%%sql

CREATE TABLE dual (dummy STRING)
%%sql

INSERT INTO dual VALUES ('X')
%%sql

SELECT current_date AS current_date FROM dual
%%sql

SELECT substr('Hello World', 1, 5) AS result FROM dual
  • Here is how you can validate functions using a Data Frame. An equivalent Data Frame API based sketch follows the SQL query below.

    • Create a Data Frame

    • Create a temporary view using the Data Frame

    • Run queries against the view with the relevant functions

val orders = spark.read.
    schema("order_id INT, order_date STRING, order_customer_id INT, order_status STRING").
    csv("/public/retail_db/orders")
orders.createOrReplaceTempView("orders_temp")
%%sql

SELECT o.*, lower(order_status) AS order_status_lower FROM orders_temp AS o LIMIT 10
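
The same function can also be validated through the Data Frame API without registering a view. A minimal sketch, assuming the orders Data Frame created above and using lower from org.apache.spark.sql.functions:

import org.apache.spark.sql.functions.{col, lower}

// lower() from the functions object mirrors the SQL lower function.
orders.
    withColumn("order_status_lower", lower(col("order_status"))).
    show(10, false)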