Date and Time Arithmetic

Let us perform date and time arithmetic using the relevant functions over Spark DataFrames.

  • Adding days to a date or timestamp - date_add

  • Subtracting days from a date or timestamp - date_sub

  • Getting difference between 2 dates or timestamps - datediff

  • Getting the number of months between 2 dates or timestamps - months_between

  • Adding months to a date or timestamp - add_months

  • Getting the next day from a given date - next_day (see the example at the end of this topic)

  • These functions are largely self-explanatory. We can apply them to standard date or timestamp columns. Note that all of them return a date, even when applied to a timestamp field.

Let us start the Spark context for this Notebook so that we can execute the code provided. You can sign up for our 10-node state-of-the-art cluster/labs to learn Spark SQL using our unique integrated LMS.

from pyspark.sql import SparkSession

import getpass
username = getpass.getuser()

spark = SparkSession. \
    builder. \
    config('spark.ui.port', '0'). \
    config("spark.sql.warehouse.dir", f"/user/{username}/warehouse"). \
    enableHiveSupport(). \
    appName(f'{username} | Python - Processing Column Data'). \
    master('yarn'). \
    getOrCreate()

If you are going to use CLIs, you can run Spark SQL using one of the following three approaches.

Using Spark SQL

spark2-sql \
    --master yarn \
    --conf spark.ui.port=0 \
    --conf spark.sql.warehouse.dir=/user/${USER}/warehouse

Using Scala

spark2-shell \
    --master yarn \
    --conf spark.ui.port=0 \
    --conf spark.sql.warehouse.dir=/user/${USER}/warehouse

Using Pyspark

pyspark2 \
    --master yarn \
    --conf spark.ui.port=0 \
    --conf spark.sql.warehouse.dir=/user/${USER}/warehouse

Tasks

Let us perform some tasks related to date arithmetic.

  • First, get help on each function and understand what arguments need to be passed.

  • Create a DataFrame named datetimesDF with columns date and time.

datetimes = [("2014-02-28", "2014-02-28 10:00:00.123"),
                     ("2016-02-29", "2016-02-29 08:08:08.999"),
                     ("2017-10-31", "2017-12-31 11:59:59.123"),
                     ("2019-11-30", "2019-08-31 00:00:00.000")
                ]
datetimesDF = spark.createDataFrame(datetimes, schema="date STRING, time STRING")
datetimesDF.show(truncate=False)
+----------+-----------------------+
|date      |time                   |
+----------+-----------------------+
|2014-02-28|2014-02-28 10:00:00.123|
|2016-02-29|2016-02-29 08:08:08.999|
|2017-10-31|2017-12-31 11:59:59.123|
|2019-11-30|2019-08-31 00:00:00.000|
+----------+-----------------------+
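
Note that both columns in datetimesDF are strings. Spark's date and time functions accept strings in the standard yyyy-MM-dd and yyyy-MM-dd HH:mm:ss formats and convert them implicitly, but we can also cast the columns to proper types up front. A minimal sketch, assuming the default formats (the column names date_typed and time_typed are illustrative):

from pyspark.sql.functions import to_date, to_timestamp

# Cast the string columns to proper date and timestamp types
datetimesDF. \
    withColumn("date_typed", to_date("date")). \
    withColumn("time_typed", to_timestamp("time")). \
    printSchema()
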
  • Add 10 days to both date and time values.

  • Subtract 10 days from both date and time values.

from pyspark.sql.functions import date_add, date_sub
help(date_add)
Help on function date_add in module pyspark.sql.functions:

date_add(start, days)
    Returns the date that is `days` days after `start`
    
    >>> df = spark.createDataFrame([('2015-04-08',)], ['dt'])
    >>> df.select(date_add(df.dt, 1).alias('next_date')).collect()
    [Row(next_date=datetime.date(2015, 4, 9))]
    
    .. versionadded:: 1.5
help(date_sub)
Help on function date_sub in module pyspark.sql.functions:

date_sub(start, days)
    Returns the date that is `days` days before `start`
    
    >>> df = spark.createDataFrame([('2015-04-08',)], ['dt'])
    >>> df.select(date_sub(df.dt, 1).alias('prev_date')).collect()
    [Row(prev_date=datetime.date(2015, 4, 7))]
    
    .. versionadded:: 1.5
datetimesDF. \
    withColumn("date_add_date", date_add("date", 10)). \
    withColumn("date_add_time", date_add("time", 10)). \
    withColumn("date_sub_date", date_sub("date", 10)). \
    withColumn("date_sub_time", date_sub("time", 10)). \
    show()
+----------+--------------------+-------------+-------------+-------------+-------------+
|      date|                time|date_add_date|date_add_time|date_sub_date|date_sub_time|
+----------+--------------------+-------------+-------------+-------------+-------------+
|2014-02-28|2014-02-28 10:00:...|   2014-03-10|   2014-03-10|   2014-02-18|   2014-02-18|
|2016-02-29|2016-02-29 08:08:...|   2016-03-10|   2016-03-10|   2016-02-19|   2016-02-19|
|2017-10-31|2017-12-31 11:59:...|   2017-11-10|   2018-01-10|   2017-10-21|   2017-12-21|
|2019-11-30|2019-08-31 00:00:...|   2019-12-10|   2019-09-10|   2019-11-20|   2019-08-21|
+----------+--------------------+-------------+-------------+-------------+-------------+
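
As a side note, date_add also accepts negative values, in which case it behaves like date_sub. A quick sketch to confirm the equivalence (the derived column names are illustrative):

from pyspark.sql.functions import date_add, date_sub

# date_add with a negative offset matches date_sub with a positive one
datetimesDF. \
    withColumn("minus_10_via_add", date_add("date", -10)). \
    withColumn("minus_10_via_sub", date_sub("date", 10)). \
    show()
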
  • Get the difference between current_date and the date values, as well as between current_timestamp and the time values.

from pyspark.sql.functions import current_date, current_timestamp, datediff
datetimesDF. \
    withColumn("datediff_date", datediff(current_date(), "date")). \
    withColumn("datediff_time", datediff(current_timestamp(), "time")). \
    show()
+----------+--------------------+-------------+-------------+
|      date|                time|datediff_date|datediff_time|
+----------+--------------------+-------------+-------------+
|2014-02-28|2014-02-28 10:00:...|         2559|         2559|
|2016-02-29|2016-02-29 08:08:...|         1828|         1828|
|2017-10-31|2017-12-31 11:59:...|         1218|         1157|
|2019-11-30|2019-08-31 00:00:...|          458|          549|
+----------+--------------------+-------------+-------------+
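
Keep in mind that datediff(end, start) returns the number of days from start to end, so the sign flips when the arguments are swapped. A minimal sketch (the derived column names are illustrative):

from pyspark.sql.functions import current_date, datediff

# Swapping the arguments flips the sign of the result
datetimesDF. \
    withColumn("days_since", datediff(current_date(), "date")). \
    withColumn("days_until", datediff("date", current_date())). \
    show()
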
  • Get the number of months between current_date and the date values, as well as between current_timestamp and the time values.

  • Add 3 months to both the date and the time values.

from pyspark.sql.functions import months_between, add_months, round
datetimesDF. \
    withColumn("months_between_date", round(months_between(current_date(), "date"), 2)). \
    withColumn("months_between_time", round(months_between(current_timestamp(), "time"), 2)). \
    withColumn("add_months_date", add_months("date", 3)). \
    withColumn("add_months_time", add_months("time", 3)). \
    show(truncate=False)
+----------+-----------------------+-------------------+-------------------+---------------+---------------+
|date      |time                   |months_between_date|months_between_time|add_months_date|add_months_time|
+----------+-----------------------+-------------------+-------------------+---------------+---------------+
|2014-02-28|2014-02-28 10:00:00.123|84.16              |84.17              |2014-05-31     |2014-05-31     |
|2016-02-29|2016-02-29 08:08:08.999|60.13              |60.14              |2016-05-31     |2016-05-31     |
|2017-10-31|2017-12-31 11:59:59.123|40.06              |38.07              |2018-01-31     |2018-03-31     |
|2019-11-30|2019-08-31 00:00:00.000|15.1               |18.08              |2020-02-29     |2019-11-30     |
+----------+-----------------------+-------------------+-------------------+---------------+---------------+
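
Finally, next_day was listed above but not yet demonstrated. It returns the first date later than the given date that falls on the specified day of the week, and like the other functions it returns a date even when applied to a timestamp. A minimal sketch using "Sun" for Sunday (the derived column names are illustrative):

from pyspark.sql.functions import next_day

# next_day returns a date even for the timestamp column
datetimesDF. \
    withColumn("next_sunday_date", next_day("date", "Sun")). \
    withColumn("next_sunday_time", next_day("time", "Sun")). \
    show(truncate=False)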