pyspark.ml.UnaryTransformer
Abstract class for transformers that take one input column, apply a transformation, and output the result as a new column.
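Concrete subclasses implement createTransformFunc, outputDataType, and validateInputType; the inputCol and outputCol params are inherited. A minimal sketch of such a subclass (SquareTransformer and its behavior are illustrative, not part of the API):

    from pyspark.ml import UnaryTransformer
    from pyspark.sql.types import DataType, DoubleType

    class SquareTransformer(UnaryTransformer):
        """Hypothetical transformer that squares a numeric input column."""

        def createTransformFunc(self):
            # Plain Python function applied to each value of the input column;
            # it is wrapped in a UDF internally by transform().
            return lambda x: x * x

        def outputDataType(self) -> DataType:
            return DoubleType()

        def validateInputType(self, inputType: DataType) -> None:
            if inputType != DoubleType():
                raise TypeError("Expected DoubleType, got %s" % inputType)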
Methods
clear(param)
    Clears a param from the param map if it has been explicitly set.
copy([extra])
    Creates a copy of this instance with the same uid and some extra params.
createTransformFunc()
    Creates the transform function using the given param map.
explainParam(param)
    Explains a single param and returns its name, doc, and optional default value and user-supplied value in a string.
explainParams()
    Returns the documentation of all params with their optionally default values and user-supplied values.
extractParamMap([extra])
    Extracts the embedded default param values and user-supplied values, and then merges them with extra values from input into a flat param map, where the latter value is used if there exist conflicts, i.e., with ordering: default param values < user-supplied values < extra.
getInputCol()
    Gets the value of inputCol or its default value.
getOrDefault(param)
    Gets the value of a param in the user-supplied param map or its default value.
getOutputCol()
    Gets the value of outputCol or its default value.
getParam(paramName)
    Gets a param by its name.
hasDefault(param)
    Checks whether a param has a default value.
hasParam(paramName)
    Tests whether this instance contains a param with a given (string) name.
isDefined(param)
    Checks whether a param is explicitly set by user or has a default value.
isSet(param)
    Checks whether a param is explicitly set by user.
outputDataType()
    Returns the data type of the output column.
set(param, value)
    Sets a parameter in the embedded param map.
setInputCol(value)
    Sets the value of inputCol.
setOutputCol(value)
    Sets the value of outputCol.
transform(dataset[, params])
    Transforms the input dataset with optional parameters.
transformSchema(schema)
validateInputType(inputType)
    Validates the input type.
Attributes
inputCol
    input column name.
outputCol
    output column name.
params
    Returns all params ordered by name.
Methods Documentation
clear(param: pyspark.ml.param.Param) → None
Clears a param from the param map if it has been explicitly set.
copy(extra: Optional[ParamMap] = None) → P
Creates a copy of this instance with the same uid and some extra params. The default implementation creates a shallow copy using copy.copy(), and then copies the embedded and extra parameters over and returns the copy. Subclasses should override this method if the default approach is not sufficient.
Parameters
    extra : dict, optional
        Extra parameters to copy to the new instance
Returns
    Params
        Copy of this instance
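For example, with the hypothetical SquareTransformer sketched above, copying with extra params leaves the original instance untouched:

    t = SquareTransformer().setInputCol("x")
    t2 = t.copy({t.outputCol: "x_squared"})
    assert t2.uid == t.uid              # the uid is preserved
    assert t2.getOutputCol() == "x_squared"
    assert not t.isSet(t.outputCol)     # original instance unchanged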
createTransformFunc() → Callable[[...], Any]
Creates the transform function using the given param map. The input param map already takes account of the embedded param map. So the param values should be determined solely by the input param map.
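When the transform depends on params, a common pattern is to resolve their values eagerly so the returned closure does not reference self when it runs on executors. A sketch with a made-up shift Param (shift is hypothetical, not part of UnaryTransformer; the remaining abstract methods would be implemented as in the sketch near the top):

    from pyspark.ml import UnaryTransformer
    from pyspark.ml.param import Param, Params

    class ShiftTransformer(UnaryTransformer):
        # 'shift' is a hypothetical Param defined for this example only.
        shift = Param(Params._dummy(), "shift", "constant added to each value.")

        def __init__(self, shift: float = 0.0):
            super().__init__()
            self._set(shift=shift)

        def createTransformFunc(self):
            # Resolve the param value now; the returned closure is serialized
            # to executors and should not depend on `self` at execution time.
            shift = self.getOrDefault(self.shift)
            return lambda x: x + shift

        # outputDataType and validateInputType omitted; see the sketch above.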
explainParam(param: Union[str, pyspark.ml.param.Param]) → str
Explains a single param and returns its name, doc, and optional default value and user-supplied value in a string.
explainParams() → str
Returns the documentation of all params with their optionally default values and user-supplied values.
extractParamMap(extra: Optional[ParamMap] = None) → ParamMap
Extracts the embedded default param values and user-supplied values, and then merges them with extra values from input into a flat param map, where the latter value is used if there exist conflicts, i.e., with ordering: default param values < user-supplied values < extra.
Parameters
    extra : dict, optional
        extra param values
Returns
    dict
        merged param map
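A small illustration of the precedence ordering, again with the hypothetical SquareTransformer:

    t = SquareTransformer().setOutputCol("user_value")
    pm = t.extractParamMap({t.outputCol: "extra_value"})
    assert pm[t.outputCol] == "extra_value"   # extra wins over user-supplied
    pm = t.extractParamMap()
    assert pm[t.outputCol] == "user_value"    # user-supplied wins over default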
getInputCol() → str
Gets the value of inputCol or its default value.
getOrDefault(param: Union[str, pyspark.ml.param.Param[T]]) → Union[Any, T]
Gets the value of a param in the user-supplied param map or its default value. Raises an error if neither is set.
getOutputCol() → str
Gets the value of outputCol or its default value.
getParam(paramName: str) → pyspark.ml.param.Param
Gets a param by its name.
hasDefault(param: Union[str, pyspark.ml.param.Param[Any]]) → bool
Checks whether a param has a default value.
hasParam(paramName: str) → bool
Tests whether this instance contains a param with a given (string) name.
isDefined(param: Union[str, pyspark.ml.param.Param[Any]]) → bool
Checks whether a param is explicitly set by user or has a default value.
isSet(param: Union[str, pyspark.ml.param.Param[Any]]) → bool
Checks whether a param is explicitly set by user.
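To illustrate how isSet, hasDefault, and isDefined relate, using the hypothetical SquareTransformer (this assumes, as in current PySpark versions, that outputCol carries a default while inputCol does not):

    t = SquareTransformer()
    assert not t.isSet(t.inputCol)       # never set explicitly
    assert not t.hasDefault(t.inputCol)  # inputCol has no default value
    assert not t.isDefined(t.inputCol)   # neither set nor defaulted

    assert not t.isSet(t.outputCol)      # not set explicitly...
    assert t.hasDefault(t.outputCol)     # ...but it has a default
    assert t.isDefined(t.outputCol)      # so it is defined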
outputDataType() → pyspark.sql.types.DataType
Returns the data type of the output column.
set(param: pyspark.ml.param.Param, value: Any) → None
Sets a parameter in the embedded param map.
setInputCol(value: str) → P
Sets the value of inputCol.
setOutputCol(value: str) → P
Sets the value of outputCol.
transform(dataset: pyspark.sql.dataframe.DataFrame, params: Optional[ParamMap] = None) → pyspark.sql.dataframe.DataFrame
Transforms the input dataset with optional parameters.
Parameters
    dataset : pyspark.sql.DataFrame
        input dataset
    params : dict, optional
        an optional param map that overrides embedded params.
Returns
    pyspark.sql.DataFrame
        transformed dataset
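A usage sketch with the hypothetical SquareTransformer defined above, assuming an active SparkSession bound to the name spark:

    df = spark.createDataFrame([(1.0,), (2.0,), (3.0,)], ["x"])
    t = SquareTransformer().setInputCol("x").setOutputCol("x_squared")
    t.transform(df).show()

    # A param map overrides embedded params for this one call only:
    t.transform(df, {t.outputCol: "x2"}).show()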
transformSchema(schema: pyspark.sql.types.StructType) → pyspark.sql.types.StructType
validateInputType(inputType: pyspark.sql.types.DataType) → None
Validates the input type. Throws an exception if it is invalid.
Attributes Documentation
inputCol: Param[str] = Param(parent='undefined', name='inputCol', doc='input column name.')
outputCol: Param[str] = Param(parent='undefined', name='outputCol', doc='output column name.')
params
Returns all params ordered by name. The default implementation uses dir() to get all attributes of type Param.
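For the hypothetical SquareTransformer above, this yields the two inherited column params:

    t = SquareTransformer()
    print([p.name for p in t.params])  # ['inputCol', 'outputCol']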