Using to_date and to_timestamp

Let us understand how to convert non-standard dates and timestamps to standard dates and timestamps.

  • yyyy-MM-dd is the standard date format

  • yyyy-MM-dd HH:mm:ss.SSS is the standard timestamp format

  • Most of the date manipulation functions expect dates and times in the standard format. However, we might not have data in the expected standard format.

  • In those scenarios we can use to_date and to_timestamp to convert non-standard dates and timestamps to standard ones, respectively, as shown in the quick sketch below.
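
As a quick preview, here is a minimal, self-contained sketch of both functions. It creates its own local session purely for illustration; the notebook's cluster session is created below.

from pyspark.sql import SparkSession
from pyspark.sql.functions import lit, to_date, to_timestamp

spark = SparkSession.builder.master('local[*]').appName('preview').getOrCreate()

# Non-standard strings converted to standard date and timestamp values
spark.range(1). \
    select(
        to_date(lit('02/03/2021'), 'dd/MM/yyyy').alias('date'),
        to_timestamp(lit('02/03/2021 17:30:15'), 'dd/MM/yyyy HH:mm:ss').alias('timestamp')
    ). \
    show()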

Let us start the Spark context for this notebook so that we can execute the code provided.

from pyspark.sql import SparkSession

import getpass
username = getpass.getuser()

spark = SparkSession. \
    builder. \
    config('spark.ui.port', '0'). \
    config("spark.sql.warehouse.dir", f"/user/{username}/warehouse"). \
    enableHiveSupport(). \
    appName(f'{username} | Python - Processing Column Data'). \
    master('yarn'). \
    getOrCreate()

If you are going to use CLIs, you can practice Spark SQL using one of the following three approaches.

Using Spark SQL

spark2-sql \
    --master yarn \
    --conf spark.ui.port=0 \
    --conf spark.sql.warehouse.dir=/user/${USER}/warehouse

Using Scala

spark2-shell \
    --master yarn \
    --conf spark.ui.port=0 \
    --conf spark.sql.warehouse.dir=/user/${USER}/warehouse

Using Pyspark

pyspark2 \
    --master yarn \
    --conf spark.ui.port=0 \
    --conf spark.sql.warehouse.dir=/user/${USER}/warehouse

Tasks

Let us perform a few tasks to convert non-standard dates and timestamps to standard ones.

  • Create a Dataframe named datetimesDF with columns date and time.

datetimes = [(20140228, "28-Feb-2014 10:00:00.123"),
             (20160229, "20-Feb-2016 08:08:08.999"),
             (20171031, "31-Dec-2017 11:59:59.123"),
             (20191130, "31-Aug-2019 00:00:00.000")
            ]
datetimesDF = spark.createDataFrame(datetimes, schema="date BIGINT, time STRING")
datetimesDF.show(truncate=False)
+--------+------------------------+
|date    |time                    |
+--------+------------------------+
|20140228|28-Feb-2014 10:00:00.123|
|20160229|20-Feb-2016 08:08:08.999|
|20171031|31-Dec-2017 11:59:59.123|
|20191130|31-Aug-2019 00:00:00.000|
+--------+------------------------+
from pyspark.sql.functions import lit, to_date
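# Single-row dummy DataFrame used to evaluate functions on literal values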
l = [("X", )]
df = spark.createDataFrame(l).toDF("dummy")
df.show()
+-----+
|dummy|
+-----+
|    X|
+-----+
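# year, month and day of month to standard date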
df.select(to_date(lit('20210302'), 'yyyyMMdd').alias('to_date')).show()
+----------+
|   to_date|
+----------+
|2021-03-02|
+----------+
# year and day of year to standard date
df.select(to_date(lit('2021061'), 'yyyyDDD').alias('to_date')).show()
+----------+
|   to_date|
+----------+
|2021-03-02|
+----------+
df.select(to_date(lit('02/03/2021'), 'dd/MM/yyyy').alias('to_date')).show()
+----------+
|   to_date|
+----------+
|2021-03-02|
+----------+
df.select(to_date(lit('02-03-2021'), 'dd-MM-yyyy').alias('to_date')).show()
+----------+
|   to_date|
+----------+
|2021-03-02|
+----------+
df.select(to_date(lit('02-Mar-2021'), 'dd-MMM-yyyy').alias('to_date')).show()
+----------+
|   to_date|
+----------+
|2021-03-02|
+----------+
df.select(to_date(lit('02-March-2021'), 'dd-MMMM-yyyy').alias('to_date')).show()
+----------+
|   to_date|
+----------+
|2021-03-02|
+----------+
df.select(to_date(lit('March 2, 2021'), 'MMMM d, yyyy').alias('to_date')).show()
+----------+
|   to_date|
+----------+
|2021-03-02|
+----------+
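If the string does not match the given pattern, to_date returns null rather than failing (assuming the default non-ANSI behavior; with spark.sql.ansi.enabled set to true the query would error out instead). A quick sketch with a deliberately mismatched literal:

# The separators do not match the pattern, so the result is null
df.select(to_date(lit('2021/03/02'), 'yyyy-MM-dd').alias('to_date')).show()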
from pyspark.sql.functions import to_timestamp
df.select(to_timestamp(lit('02-Mar-2021'), 'dd-MMM-yyyy').alias('to_timestamp')).show()
+-------------------+
|       to_timestamp|
+-------------------+
|2021-03-02 00:00:00|
+-------------------+
df.select(to_timestamp(lit('02-Mar-2021 17:30:15'), 'dd-MMM-yyyy HH:mm:ss').alias('to_timestamp')).show()
+-------------------+
|       to_timestamp|
+-------------------+
|2021-03-02 17:30:15|
+-------------------+
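For 12-hour-clock input we would use hh together with the am/pm marker a instead of HH. A minimal sketch with a made-up literal:

# hh plus the am/pm marker (a) parses 12-hour-clock values
df.select(to_timestamp(lit('02-Mar-2021 05:30:15 PM'), 'dd-MMM-yyyy hh:mm:ss a').alias('to_timestamp')).show()
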
  • Let us convert the data in datetimesDF to standard dates and timestamps.

datetimesDF.printSchema()
root
 |-- date: long (nullable = true)
 |-- time: string (nullable = true)
datetimesDF.show(truncate=False)
+--------+------------------------+
|date    |time                    |
+--------+------------------------+
|20140228|28-Feb-2014 10:00:00.123|
|20160229|20-Feb-2016 08:08:08.999|
|20171031|31-Dec-2017 11:59:59.123|
|20191130|31-Aug-2019 00:00:00.000|
+--------+------------------------+
from pyspark.sql.functions import col, to_date, to_timestamp
datetimesDF. \
    withColumn('to_date', to_date(col('date').cast('string'), 'yyyyMMdd')). \
    withColumn('to_timestamp', to_timestamp(col('time'), 'dd-MMM-yyyy HH:mm:ss.SSS')). \
    show(truncate=False)
+--------+------------------------+----------+-----------------------+
|date    |time                    |to_date   |to_timestamp           |
+--------+------------------------+----------+-----------------------+
|20140228|28-Feb-2014 10:00:00.123|2014-02-28|2014-02-28 10:00:00.123|
|20160229|20-Feb-2016 08:08:08.999|2016-02-29|2016-02-20 08:08:08.999|
|20171031|31-Dec-2017 11:59:59.123|2017-10-31|2017-12-31 11:59:59.123|
|20191130|31-Aug-2019 00:00:00.000|2019-11-30|2019-08-31 00:00:00    |
+--------+------------------------+----------+-----------------------+
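
Once converted, the standard date manipulation functions work directly on the new columns. A brief follow-up sketch, reusing the same datetimesDF (year and dayofmonth are standard Spark functions):

from pyspark.sql.functions import year, dayofmonth

# Standard functions can now be applied to the standardized date column
datetimesDF. \
    withColumn('to_date', to_date(col('date').cast('string'), 'yyyyMMdd')). \
    withColumn('year', year('to_date')). \
    withColumn('day_of_month', dayofmonth('to_date')). \
    show(truncate=False)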