Overview of Functions

Let us get an overview of the predefined functions in Spark SQL.

Let us start the Spark context for this Notebook so that we can execute the code provided.

import org.apache.spark.sql.SparkSession

val username = System.getProperty("user.name")
val spark = SparkSession.
    builder.
    config("spark.ui.port", "0").
    config("spark.sql.warehouse.dir", s"/user/${username}/warehouse").
    enableHiveSupport.
    appName(s"${username} | Spark SQL - Predefined Functions").
    master("yarn").
    getOrCreate

If you are going to use CLIs, you can access Spark SQL using one of the following three approaches.

Using Spark SQL

spark2-sql \
    --master yarn \
    --conf spark.ui.port=0 \
    --conf spark.sql.warehouse.dir=/user/${USER}/warehouse

Using Scala

spark2-shell \
    --master yarn \
    --conf spark.ui.port=0 \
    --conf spark.sql.warehouse.dir=/user/${USER}/warehouse

Using Pyspark

pyspark2 \
    --master yarn \
    --conf spark.ui.port=0 \
    --conf spark.sql.warehouse.dir=/user/${USER}/warehouse
  • We can get the list of functions by running SHOW FUNCTIONS

  • We can use the DESCRIBE command to get the syntax and semantics of a function - e.g., DESCRIBE FUNCTION substr

  • The following are the more commonly used categories of functions.

    • String Manipulation

    • Date Manipulation

    • Numeric Functions

    • Type Conversion Functions

    • CASE and WHEN

    • and more
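As a quick illustration, assuming the Spark session created above is available, a single query can exercise one predefined function from each of these categories. This is only a sketch of what the categories cover, not an exhaustive reference:

```scala
// Illustrative only: one predefined function per category listed above
spark.sql("""
    SELECT substr('Hello World', 1, 5) AS string_result,        -- String Manipulation
           date_format(current_date, 'yyyy-MM') AS date_result, -- Date Manipulation
           round(10.58) AS numeric_result,                      -- Numeric Functions
           cast('12' AS INT) AS converted,                      -- Type Conversion Functions
           CASE WHEN 1 = 1 THEN 'yes' ELSE 'no' END AS flag     -- CASE and WHEN
""").show(false)
```

Each of these functions is covered in detail in the subsequent sections.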

%%sql

SHOW functions

spark.sql("SHOW functions").show(300, false)

spark.catalog.listFunctions.show(300, false)

%%sql

DESCRIBE FUNCTION substr

spark.sql("DESCRIBE FUNCTION substr").show(false)
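The list returned by SHOW functions is quite long. When looking for functions in a particular area, spark.catalog.listFunctions can be filtered on the Scala side. For example, to see only functions whose name contains "date" (a sketch, assuming the session defined above):

```scala
// Filter the function catalog by name; here, names containing "date"
spark.catalog.listFunctions.
    filter(_.name.toLowerCase.contains("date")).
    show(50, false)
```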