Interface | Description |
---|---|
TaskCompletionListener | :: DeveloperApi :: Listener providing a callback to be invoked when a task's execution completes (see the usage sketch below, which covers both listener interfaces). |
TaskFailureListener | :: DeveloperApi :: Listener providing a callback to be invoked when a task fails. |
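
Both listeners are normally registered from inside a running task through TaskContext. A minimal sketch, assuming it runs in a task body (for example inside rdd.mapPartitions); the println bodies are illustrative placeholders:

```scala
import org.apache.spark.TaskContext
import org.apache.spark.util.{TaskCompletionListener, TaskFailureListener}

val ctx = TaskContext.get() // only valid on an executor, inside a task

ctx.addTaskCompletionListener(new TaskCompletionListener {
  // Invoked when the task finishes, whether it succeeded or failed;
  // commonly used to release per-partition resources.
  override def onTaskCompletion(context: TaskContext): Unit =
    println(s"task ${context.taskAttemptId()} completed")
})

ctx.addTaskFailureListener(new TaskFailureListener {
  // Invoked only when the task fails, with the causing error.
  override def onTaskFailure(context: TaskContext, error: Throwable): Unit =
    println(s"task ${context.taskAttemptId()} failed: ${error.getMessage}")
})
```
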
Class | Description |
---|---|
AccumulatorContext | An internal class used by Spark itself to track accumulators. |
AccumulatorV2<IN,OUT> | The base class for accumulators, which can accumulate inputs of type IN and produce output of type OUT (see the accumulator sketch after this table). |
CausedBy | Extractor object for pulling out the root cause of an error. |
ClosureCleaner | A cleaner that renders closures serializable when it is safe to do so. |
CollectionAccumulator<T> | An accumulator for collecting a list of elements. |
CollectionsUtils | |
DoubleAccumulator | An accumulator for computing the sum, count, and average of double-precision floating-point numbers. |
EnumUtil | |
InnerClosureFinder | |
IntParam | An extractor object for parsing strings into integers. |
JsonProtocol | Serializes SparkListener events to/from JSON. |
LegacyAccumulatorWrapper<R,T> | |
LongAccumulator | An accumulator for computing the sum, count, and average of 64-bit integers. |
MemoryParam | An extractor object for parsing JVM memory strings, such as "10g", into an Int representing the number of megabytes. |
MethodIdentifier<T> | Helper class to identify a method. |
MutablePair<T1,T2> | :: DeveloperApi :: A tuple of 2 elements. |
ReturnStatementFinder | |
RpcUtils | |
ShutdownHookManager | Various utility methods used by Spark. |
SignalUtils | Contains utilities for working with POSIX signals. |
SizeEstimator | :: DeveloperApi :: Estimates the sizes of Java objects (the number of bytes of memory they occupy), for use in memory-aware caches (usage sketch below). |
SparkExitCode | |
SparkShutdownHook | |
SparkUncaughtExceptionHandler | The default uncaught exception handler for Executors; it terminates the whole process to avoid getting stuck in a bad state indefinitely. |
StatCounter | A class for tracking the statistics of a set of numbers (count, mean, and variance) in a numerically robust way (usage sketch below). |
ThreadUtils | |
Utils | Various utility methods used by Spark. |
VersionUtils | Utilities for working with Spark version strings. |
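
The concrete accumulators above (LongAccumulator, DoubleAccumulator, CollectionAccumulator) all extend AccumulatorV2 and are normally obtained from SparkContext rather than constructed directly. A minimal sketch, assuming an existing SparkContext named sc; the data and accumulator names are illustrative:

```scala
val errors  = sc.longAccumulator("parse-errors")            // LongAccumulator
val badRows = sc.collectionAccumulator[String]("bad-rows")  // CollectionAccumulator[String]

sc.parallelize(Seq("1", "2", "oops", "4")).foreach { s =>
  // Tasks only add to accumulators; the merged value is read on the driver.
  if (scala.util.Try(s.toInt).isFailure) {
    errors.add(1L)
    badRows.add(s)
  }
}

// Back on the driver: LongAccumulator also exposes sum, count, and avg.
println(s"${errors.value} bad rows: ${badRows.value}")
```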
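
SizeEstimator is useful when deciding what fits in a memory-aware cache. A minimal sketch; the measured object is illustrative:

```scala
import org.apache.spark.util.SizeEstimator

// Estimates the in-memory footprint, in bytes, of an object graph,
// following references from the given object.
val payload = Array.fill(10000)("some cached value")
println(s"~${SizeEstimator.estimate(payload)} bytes")
```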
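
StatCounter keeps its running statistics with a numerically stable online update, and is also what RDD[Double].stats() returns; merging two counters is how Spark combines per-partition statistics. A minimal sketch:

```scala
import org.apache.spark.util.StatCounter

val stats = new StatCounter()
Seq(3.0, 1.0, 4.0, 1.0, 5.0).foreach(x => stats.merge(x))

println(stats.count)     // 5
println(stats.mean)      // 2.8
println(stats.variance)  // population variance; sampleVariance is also available
println(stats.stdev)     // population standard deviation

// StatCounters merge pairwise as well, e.g. across partitions.
stats.merge(StatCounter(2.0, 7.0))
```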