functions (Spark 4.0.0 JavaDoc)

java.lang.Object
    org.apache.spark.sql.functions

public class functions extends Object

Commonly used functions available for DataFrame operations. Using the functions defined here provides a bit more compile-time safety, ensuring that the function exists.

You can call the functions defined here in two ways: _FUNC_(...) and functions.expr("_FUNC_(...)").

As an example, regr_count is a function that is defined here. You can use regr_count(col("yCol"), col("xCol")) to invoke the regr_count function. This way the programming language's compiler ensures regr_count exists and is of the proper form. You can also use expr("regr_count(yCol, xCol)") to invoke the same function. In this case, Spark itself will ensure regr_count exists when it analyzes the query.
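
For example, here is a minimal end-to-end sketch in Java; the inline DataFrame and its columns yCol and xCol are hypothetical, created only to show both call styles:

  import static org.apache.spark.sql.functions.col;
  import static org.apache.spark.sql.functions.expr;
  import static org.apache.spark.sql.functions.regr_count;

  import org.apache.spark.sql.Dataset;
  import org.apache.spark.sql.Row;
  import org.apache.spark.sql.SparkSession;

  public class RegrCountExample {
    public static void main(String[] args) {
      SparkSession spark = SparkSession.builder()
          .appName("regr_count example")
          .master("local[*]")
          .getOrCreate();

      // Hypothetical DataFrame of (yCol, xCol) pairs, built inline for this sketch.
      Dataset<Row> df = spark.sql(
          "SELECT * FROM VALUES (1.0, 2.0), (3.0, 4.0) AS t(yCol, xCol)");

      // Typed API: the compiler checks that regr_count exists and takes Column arguments.
      df.select(regr_count(col("yCol"), col("xCol"))).show();

      // SQL expression: Spark resolves regr_count when it analyzes the query.
      df.select(expr("regr_count(yCol, xCol)")).show();

      spark.stop();
    }
  }

Both select calls produce the same single-row result; the difference is only where the existence check happens (at compile time versus at query analysis time).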

You can find the entire list of functions in the SQL API documentation for your Spark version; see also the latest list.

This function API usually has methods with Column signatures only, because a Column can support not only Column values but also other types such as a native string. The other variants currently exist for historical reasons.
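
One way to illustrate this, continuing the hypothetical df from the sketch above: lit(...) lifts a native Java value into a Column, so a Column-only signature covers native values as well.

  import static org.apache.spark.sql.functions.col;
  import static org.apache.spark.sql.functions.greatest;
  import static org.apache.spark.sql.functions.lit;

  // lit(2.5) wraps the native double 2.5 in a Column, so greatest(Column...)
  // accepts it next to an actual column reference.
  df.select(greatest(col("yCol"), lit(2.5))).show();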

Since:
1.3.0
