pyspark.sql.functions.try_add
pyspark.sql.functions.try_add(left, right)
Returns the sum of `left` and `right`; the result is null on overflow. The acceptable input types are the same as with the `+` operator.
New in version 3.5.0.
Examples
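The examples below assume an active SparkSession bound to the name `spark`, as in the PySpark shell. A minimal setup sketch if you are running them standalone:

>>> from pyspark.sql import SparkSession
>>> spark = SparkSession.builder.getOrCreate()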
Example 1: Integer plus Integer.
>>> import pyspark.sql.functions as sf
>>> spark.createDataFrame(
...     [(1982, 15), (1990, 2)], ["birth", "age"]
... ).select("*", sf.try_add("birth", "age")).show()
+-----+---+-------------------+
|birth|age|try_add(birth, age)|
+-----+---+-------------------+
| 1982| 15|               1997|
| 1990|  2|               1992|
+-----+---+-------------------+
Example 2: Date plus Integer.
>>> import pyspark.sql.functions as sf
>>> spark.sql(
...     "SELECT * FROM VALUES (DATE('2015-09-30')) AS TAB(date)"
... ).select("*", sf.try_add("date", sf.lit(1))).show()
+----------+----------------+
|      date|try_add(date, 1)|
+----------+----------------+
|2015-09-30|      2015-10-01|
+----------+----------------+
Example 3: Date plus Interval.
>>> import pyspark.sql.functions as sf
>>> spark.sql(
...     "SELECT * FROM VALUES (DATE('2015-09-30'), INTERVAL 1 YEAR) AS TAB(date, itvl)"
... ).select("*", sf.try_add("date", "itvl")).show()
+----------+-----------------+-------------------+
|      date|             itvl|try_add(date, itvl)|
+----------+-----------------+-------------------+
|2015-09-30|INTERVAL '1' YEAR|         2016-09-30|
+----------+-----------------+-------------------+
Example 4: Interval plus Interval.
>>> import pyspark.sql.functions as sf
>>> spark.sql(
...     "SELECT * FROM VALUES (INTERVAL 1 YEAR, INTERVAL 2 YEAR) AS TAB(itvl1, itvl2)"
... ).select("*", sf.try_add("itvl1", "itvl2")).show()
+-----------------+-----------------+---------------------+
|            itvl1|            itvl2|try_add(itvl1, itvl2)|
+-----------------+-----------------+---------------------+
|INTERVAL '1' YEAR|INTERVAL '2' YEAR|    INTERVAL '3' YEAR|
+-----------------+-----------------+---------------------+
Example 5: Overflow results in NULL when ANSI mode is on.
>>> import sys
>>> import pyspark.sql.functions as sf
>>> origin = spark.conf.get("spark.sql.ansi.enabled")
>>> spark.conf.set("spark.sql.ansi.enabled", "true")
>>> try:
...     spark.range(1).select(sf.try_add(sf.lit(sys.maxsize), sf.lit(sys.maxsize))).show()
... finally:
...     spark.conf.set("spark.sql.ansi.enabled", origin)
+-------------------------------------------------+
|try_add(9223372036854775807, 9223372036854775807)|
+-------------------------------------------------+
|                                             NULL|
+-------------------------------------------------+
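For contrast, a sketch (not part of the original example set) of what the plain `+` operator does with the same operands under ANSI mode: instead of returning NULL, the overflowing addition raises an arithmetic error when the query runs, which is exactly the case `try_add` is designed to avoid. The generic `except Exception` is deliberate, since the exact exception class surfaced to Python can vary across Spark versions.

>>> import sys
>>> import pyspark.sql.functions as sf
>>> origin = spark.conf.get("spark.sql.ansi.enabled")
>>> spark.conf.set("spark.sql.ansi.enabled", "true")
>>> try:
...     spark.range(1).select(sf.lit(sys.maxsize) + sf.lit(sys.maxsize)).show()
... except Exception:
...     print("overflow raised an error")
... finally:
...     spark.conf.set("spark.sql.ansi.enabled", origin)
overflow raised an error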