pyspark.pandas.DataFrame.expanding

DataFrame.expanding(min_periods=1)

Provide expanding transformations.

Note

‘min_periods’ in pandas-on-Spark works as a fixed window size, unlike pandas. Also unlike pandas, NA values are counted toward the period. This behavior may change in the future.

Parameters
min_periods: int, default 1

Minimum number of observations in the window required to have a value; otherwise the result is NA.

Returns
a Window object sub-classed for the expanding operation
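
Examples

A minimal usage sketch, assuming an active Spark session and that pyspark.pandas is importable; the column name and values below are illustrative only:

>>> import pyspark.pandas as ps
>>> psdf = ps.DataFrame({"B": [0, 1, 2, None, 4]})
>>> # Each expanding window grows from the first row up to the current row;
>>> # rows whose window holds fewer than min_periods observations yield NA.
>>> result = psdf.expanding(min_periods=2).sum()

The returned object exposes the usual aggregation methods (for example sum, count, mean), which are applied per column over the growing window.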