pyspark.sql.functions — PySpark 3.2.0 documentation
"""A collections of builtin functions"""
import sys
import functools
import warnings

from pyspark import since, SparkContext
from pyspark.rdd import PythonEvalType
from pyspark.sql.column import Column, _to_java_column, _to_seq, _create_column_from_literal
from pyspark.sql.dataframe import DataFrame
from pyspark.sql.types import StringType
...
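To give a sense of how these builtin column functions are typically used, here is a minimal sketch; the session name, DataFrame contents, and column names are invented for the example:

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("functions-demo").getOrCreate()

# Toy DataFrame purely for illustration.
df = spark.createDataFrame([("alice", 3), ("bob", 7)], ["name", "score"])

# Builtin functions such as col, lit, and upper build Column expressions
# that Spark evaluates on the executors.
result = df.select(
    F.upper(F.col("name")).alias("name_upper"),
    (F.col("score") * F.lit(10)).alias("score_x10"),
)
result.show()
```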
PySpark Documentation — PySpark 3.2.0 documentation
PySpark is an interface for Apache Spark in Python. It not only allows you to write Spark applications using Python APIs, but also provides the PySpark shell for interactively analyzing your data in a distributed environment. PySpark supports most of Spark’s features such as Spark SQL, DataFrame, Streaming, MLlib (Machine Learning) and Spark Core.
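A short sketch of what such an application looks like, combining the DataFrame API and Spark SQL; the app name and sample rows below are made up for the example:

```python
from pyspark.sql import SparkSession

# Start (or reuse) a SparkSession; the application name is arbitrary.
spark = SparkSession.builder.appName("pyspark-overview").getOrCreate()

# DataFrame API: build a small DataFrame from local data.
people = spark.createDataFrame([("Ada", 36), ("Grace", 45)], ["name", "age"])

# Spark SQL: expose the DataFrame as a temporary view and query it with SQL.
people.createOrReplaceTempView("people")
spark.sql("SELECT name FROM people WHERE age > 40").show()
```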
PySpark Functions — Glow documentation
Glow includes a number of functions that operate on PySpark columns. These functions are interoperable with functions provided by PySpark or other libraries.

glow.add_struct_fields(struct, *fields)
Adds fields to a struct. Added in version 0.3.0.
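A rough sketch of how add_struct_fields might be called. It assumes Glow is installed and registered on the session, and that the variable-length fields argument is supplied as alternating name/value columns; the registration idiom, the DataFrame, and the field names here are assumptions for illustration, not taken from the excerpt above:

```python
import glow
from pyspark.sql import SparkSession, Row
from pyspark.sql.functions import lit

spark = SparkSession.builder.appName("glow-demo").getOrCreate()
glow.register(spark)  # assumption: exact registration idiom may vary by Glow version

# Invented DataFrame with a single struct column.
df = spark.createDataFrame([Row(base=Row(a=1))])

# Assumption: new fields are passed as alternating name/value columns,
# producing a struct with the original field plus the added one.
out = df.select(glow.add_struct_fields("base", lit("b"), lit(2)).alias("base"))
out.show(truncate=False)
```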