Boolean Operators
Let us understand how to use boolean operators while filtering data in Spark Data Frames.
If we have to validate against multiple columns, then we need to use boolean operators such as
AND,
OR,
or both. Here are some of the examples where we end up using boolean operators.
Get the count of flights which departed late at the origin and reached the destination early or on time.
Get the count of flights which departed early or on time but arrived late by at least 15 minutes.
Get the number of flights which departed late on Saturdays as well as on Sundays.
Let us start the Spark context for this Notebook so that we can execute the code provided.
from pyspark.sql import SparkSession
import getpass
username = getpass.getuser()
spark = SparkSession. \
builder. \
config('spark.ui.port', '0'). \
config("spark.sql.warehouse.dir", f"/user/{username}/warehouse"). \
enableHiveSupport(). \
appName(f'{username} | Python - Basic Transformations'). \
master('yarn'). \
getOrCreate()
If you are going to use CLIs, you can launch Spark using one of the following three approaches.
Using Spark SQL
spark2-sql \
--master yarn \
--conf spark.ui.port=0 \
--conf spark.sql.warehouse.dir=/user/${USER}/warehouse
Using Scala
spark2-shell \
--master yarn \
--conf spark.ui.port=0 \
--conf spark.sql.warehouse.dir=/user/${USER}/warehouse
Using PySpark
pyspark2 \
--master yarn \
--conf spark.ui.port=0 \
--conf spark.sql.warehouse.dir=/user/${USER}/warehouse
Tasks
Let us perform some tasks to understand filtering in detail. Solve all the problems by passing conditions using both SQL Style as well as API Style.
Read the data for the month of January 2008.
airtraffic_path = "/public/airtraffic_all/airtraffic-part/flightmonth=200801"
airtraffic = spark. \
read. \
parquet(airtraffic_path)
airtraffic.printSchema()
root
|-- Year: integer (nullable = true)
|-- Month: integer (nullable = true)
|-- DayofMonth: integer (nullable = true)
|-- DayOfWeek: integer (nullable = true)
|-- DepTime: string (nullable = true)
|-- CRSDepTime: integer (nullable = true)
|-- ArrTime: string (nullable = true)
|-- CRSArrTime: integer (nullable = true)
|-- UniqueCarrier: string (nullable = true)
|-- FlightNum: integer (nullable = true)
|-- TailNum: string (nullable = true)
|-- ActualElapsedTime: string (nullable = true)
|-- CRSElapsedTime: integer (nullable = true)
|-- AirTime: string (nullable = true)
|-- ArrDelay: string (nullable = true)
|-- DepDelay: string (nullable = true)
|-- Origin: string (nullable = true)
|-- Dest: string (nullable = true)
|-- Distance: string (nullable = true)
|-- TaxiIn: string (nullable = true)
|-- TaxiOut: string (nullable = true)
|-- Cancelled: integer (nullable = true)
|-- CancellationCode: string (nullable = true)
|-- Diverted: integer (nullable = true)
|-- CarrierDelay: string (nullable = true)
|-- WeatherDelay: string (nullable = true)
|-- NASDelay: string (nullable = true)
|-- SecurityDelay: string (nullable = true)
|-- LateAircraftDelay: string (nullable = true)
|-- IsArrDelayed: string (nullable = true)
|-- IsDepDelayed: string (nullable = true)
Get the count of flights which departed late at the origin and reached the destination early or on time.
airtraffic. \
select('IsDepDelayed', 'IsArrDelayed', 'Cancelled'). \
distinct(). \
show()
+------------+------------+---------+
|IsDepDelayed|IsArrDelayed|Cancelled|
+------------+------------+---------+
| NO| NO| 0|
| YES| YES| 1|
| NO| YES| 0|
| YES| NO| 0|
| YES| YES| 0|
+------------+------------+---------+
airtraffic. \
filter("IsDepDelayed = 'YES' AND IsArrDelayed = 'NO' AND Cancelled = 0"). \
show()
airtraffic. \
filter("IsDepDelayed = 'YES' AND IsArrDelayed = 'NO' AND Cancelled = 0"). \
count()
54233
API Style
from pyspark.sql.functions import col
airtraffic. \
filter((col("IsDepDelayed") == "YES") &
(col("IsArrDelayed") == "NO") &
(col("Cancelled") == 0)
). \
count()
54233
airtraffic. \
filter((airtraffic["IsDepDelayed"] == "YES") &
(airtraffic.IsArrDelayed == "NO") &
(airtraffic.Cancelled == 0)
). \
count()
54233
Get the count of flights which departed early or on time but arrived late by at least 15 minutes.
airtraffic. \
select('IsDepDelayed', 'IsArrDelayed', 'Cancelled'). \
distinct(). \
show()
+------------+------------+---------+
|IsDepDelayed|IsArrDelayed|Cancelled|
+------------+------------+---------+
| NO| NO| 0|
| YES| YES| 1|
| NO| YES| 0|
| YES| NO| 0|
| YES| YES| 0|
+------------+------------+---------+
# Cancelled is always 0 when there is no delay related to departure
# We can ignore check against Cancelled
airtraffic. \
filter("IsDepDelayed = 'NO' AND ArrDelay >= 15"). \
count()
20705
airtraffic. \
filter("IsDepDelayed = 'NO' AND ArrDelay >= 15 AND Cancelled = 0"). \
count()
20705
API Style
from pyspark.sql.functions import col
airtraffic. \
filter((col("IsDepDelayed") == "NO") &
(col("ArrDelay") >= 15)
). \
count()
20705
Get the number of flights which departed late on Sundays as well as on Saturdays. We can also solve such problems using the
IN
operator.
from pyspark.sql.functions import col, concat, lpad
airtraffic. \
withColumn("FlightDate",
concat(col("Year"),
lpad(col("Month"), 2, "0"),
lpad(col("DayOfMonth"), 2, "0")
)
). \
show()
l = [('X',)]
df = spark.createDataFrame(l, "dummy STRING")
from pyspark.sql.functions import current_date
df.select(current_date()).show()
+--------------+
|current_date()|
+--------------+
| 2021-03-01|
+--------------+
from pyspark.sql.functions import date_format
df.select(current_date(), date_format(current_date(), 'EE').alias('day_name')).show()
+--------------+--------+
|current_date()|day_name|
+--------------+--------+
| 2021-03-01| Mon|
+--------------+--------+
from pyspark.sql.functions import date_format
df.select(current_date(), date_format(current_date(), 'EEEE').alias('day_name')).show()
+--------------+--------+
|current_date()|day_name|
+--------------+--------+
| 2021-03-01| Monday|
+--------------+--------+
from pyspark.sql.functions import col, concat, lpad
airtraffic. \
withColumn("FlightDate",
concat(col("Year"),
lpad(col("Month"), 2, "0"),
lpad(col("DayOfMonth"), 2, "0")
)
). \
filter("""
IsDepDelayed = 'YES' AND Cancelled = 0 AND
(date_format(to_date(FlightDate, 'yyyyMMdd'), 'EEEE') = 'Saturday'
OR date_format(to_date(FlightDate, 'yyyyMMdd'), 'EEEE') = 'Sunday'
)
"""). \
count()
57873
API Style
from pyspark.sql.functions import col, concat, lpad, date_format, to_date
airtraffic. \
withColumn("FlightDate",
concat(col("Year"),
lpad(col("Month"), 2, "0"),
lpad(col("DayOfMonth"), 2, "0")
)
). \
filter((col("IsDepDelayed") == "YES") & (col("Cancelled") == 0) &
((date_format(
to_date("FlightDate", "yyyyMMdd"), "EEEE"
) == "Saturday") |
(date_format(
to_date("FlightDate", "yyyyMMdd"), "EEEE"
) == "Sunday")
)
). \
count()
57873