getDefaultReducePartitions

Class: matlab.compiler.mlspark.RDD
Namespace: matlab.compiler.mlspark

Get the number of default reduce partitions in an RDD

Syntax

numPartitions = getDefaultReducePartitions(obj)

Description

numPartitions = getDefaultReducePartitions(obj) gets the number of default reduce partitions in obj.
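
A minimal call, assuming sc is an already created SparkContext (see the Examples section below for the full setup), looks like this:

rdd = sc.parallelize({1,2,3});                  % illustrative input RDD
numPartitions = rdd.getDefaultReducePartitions();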

Input Arguments

obj
An input RDD, specified as an RDD object.

Output Arguments

numPartitions
The number of default reduce partitions in the input RDD, returned as a scalar value.

Examples


Get the number of default reduce partitions in an RDD.

%% Connect to Spark
sparkProp = containers.Map({'spark.executor.cores'}, {'1'});
conf = matlab.compiler.mlspark.SparkConf('AppName','myApp', ...
                        'Master','local[1]','SparkProperties',sparkProp);
sc = matlab.compiler.mlspark.SparkContext(conf);

%% getDefaultReducePartitions
x = sc.parallelize({1,2,3});
y = x.map(@(x)({x,1}));                 % map each element to a {key,value} pair
z1 = y.reduceByKey(@(a,b)(a+b));        % default number of reduce partitions
z2 = y.reduceByKey(@(a,b)(a+b), 3);     % explicitly request 3 reduce partitions

z1.getDefaultReducePartitions() % ans is 1
z2.getDefaultReducePartitions() % ans is 3; the second argument to reduceByKey sets the number of reduce partitions
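
The returned count can also be passed back to operations that take an explicit number of reduce partitions, such as the optional second argument of reduceByKey used above. A short sketch, reusing the variables y and z2 from this example:

n = z2.getDefaultReducePartitions();    % query the partition count (3 here)
z3 = y.reduceByKey(@(a,b)(a+b), n);     % build another reduced RDD with the same count
z3.getDefaultReducePartitions()         % ans is 3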

Version History

Introduced in R2016b