Shared Variables
A very important feature of Spark is shared variables.
By default, when the function passed to an operator uses an external variable, the variable's value is copied into every task (thread). Each task can then only operate on its own copy of the variable, so by default there is no way for multiple tasks to share one variable.
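A minimal sketch of that pitfall, assuming a SparkContext named sc as created in the examples below:

// Pitfall: counter is captured in the closure, and a copy is shipped to each task.
// On a cluster each task mutates its own copy, so the driver's counter stays 0
// (local mode may appear to update it, which makes the bug easy to miss).
var counter = 0
val numbers = sc.parallelize(Array(1, 2, 3, 4, 5))
numbers.foreach { num => counter += num }
println(counter)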
To address this, Spark provides two kinds of shared variables:
Broadcast Variable: copies the variable to each node only once, rather than once per task. A broadcast variable is read-only; its main purpose is performance optimization, by reducing both the network traffic of shipping the variable to every node and the memory it consumes on each node.
import org.apache.spark.SparkConf
import org.apache.spark.SparkContext

/**
 * @author Administrator
 */
object BroadcastVariable {

  def main(args: Array[String]) {
    val conf = new SparkConf()
      .setAppName("BroadcastVariable")
      .setMaster("local")
    val sc = new SparkContext(conf)

    // Create a broadcast variable by calling SparkContext's broadcast() method.
    val factor = 3
    val factorBroadcast = sc.broadcast(factor)

    val numberArray = Array(1, 2, 3, 4, 5)
    val numbers = sc.parallelize(numberArray, 1)

    // Inside the task, read the broadcast value via value.
    val multipleNumbers = numbers.map { num => num * factorBroadcast.value }
    multipleNumbers.foreach { num => println(num) }

    sc.stop()
  }

}
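One point worth knowing: a broadcast variable holds memory on every executor until it is released. The Broadcast API offers two cleanup methods; a short sketch, continuing from the factorBroadcast variable above:

// unpersist() asynchronously removes the copies cached on the executors;
// the variable can still be re-broadcast the next time it is used.
factorBroadcast.unpersist()
// destroy() removes all state everywhere; the variable cannot be used afterwards.
// factorBroadcast.destroy()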
--------------------------------------------------------------------------------------------
import java.util.Arrays;
import java.util.List;

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.api.java.function.Function;
import org.apache.spark.api.java.function.VoidFunction;
import org.apache.spark.broadcast.Broadcast;

/**
 * Broadcast variable
 * @author Administrator
 *
 */
public class BroadcastVariable {

    public static void main(String[] args) {
        SparkConf conf = new SparkConf()
                .setAppName("BroadcastVariable")
                .setMaster("local");
        JavaSparkContext sc = new JavaSparkContext(conf);

        // In Java, a shared variable is also created by calling SparkContext's
        // broadcast() method; the return type is Broadcast<T>.
        final int factor = 3;
        final Broadcast<Integer> factorBroadcast = sc.broadcast(factor);

        List<Integer> numberList = Arrays.asList(1, 2, 3, 4, 5);
        JavaRDD<Integer> numbers = sc.parallelize(numberList);

        // Multiply each number in the collection by the externally defined factor.
        JavaRDD<Integer> multipleNumbers = numbers.map(new Function<Integer, Integer>() {

            private static final long serialVersionUID = 1L;

            @Override
            public Integer call(Integer v1) throws Exception {
                // To use the shared variable, call its value() method to get the wrapped value.
                int factor = factorBroadcast.value();
                return v1 * factor;
            }

        });

        multipleNumbers.foreach(new VoidFunction<Integer>() {

            private static final long serialVersionUID = 1L;

            @Override
            public void call(Integer t) throws Exception {
                System.out.println(t);
            }

        });

        sc.close();
    }

}
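The factor example is deliberately tiny; broadcasting pays off when the shared value is large. A common pattern is shipping a lookup table to every node once instead of once per task. A minimal Scala sketch, assuming a SparkContext named sc (the table contents are made up for illustration):

// Broadcast a lookup map once per node, then read it inside a transformation.
val countryNames = Map("CN" -> "China", "US" -> "United States", "FR" -> "France")
val countryNamesBroadcast = sc.broadcast(countryNames)

val codes = sc.parallelize(Array("CN", "FR", "US"))
val names = codes.map { code =>
  // Look up in the node-local copy; the map is not re-serialized for every task.
  countryNamesBroadcast.value.getOrElse(code, "unknown")
}
names.foreach { name => println(name) }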
Accumulator: lets multiple tasks jointly operate on a single variable, mainly for shared operations across nodes such as summing values. However, tasks can only add to an Accumulator; they cannot read its value. Only the driver can read an Accumulator's value.
import org.apache.spark.SparkConf
import org.apache.spark.SparkContext

/**
 * @author Administrator
 */
object AccumulatorVariable {

  def main(args: Array[String]) {
    val conf = new SparkConf()
      .setAppName("AccumulatorVariable")
      .setMaster("local")
    val sc = new SparkContext(conf)

    // Create an Accumulator by calling SparkContext's accumulator() method.
    val sum = sc.accumulator(0)

    val numberArray = Array(1, 2, 3, 4, 5)
    val numbers = sc.parallelize(numberArray, 1)

    // Tasks may only add to the accumulator; += delegates to add().
    numbers.foreach { num => sum += num }

    // Only the driver may read the accumulated value.
    println(sum.value)

    sc.stop()
  }

}
------------------------------------------------------------------------------------
import java.util.Arrays;
import java.util.List;

import org.apache.spark.Accumulator;
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.api.java.function.VoidFunction;

/**
 * Accumulator variable
 * @author Administrator
 *
 */
public class AccumulatorVariable {

    public static void main(String[] args) {
        SparkConf conf = new SparkConf()
                .setAppName("Accumulator")
                .setMaster("local");
        JavaSparkContext sc = new JavaSparkContext(conf);

        // Create an Accumulator variable by calling SparkContext's accumulator() method.
        final Accumulator<Integer> sum = sc.accumulator(0);

        List<Integer> numberList = Arrays.asList(1, 2, 3, 4, 5);
        JavaRDD<Integer> numbers = sc.parallelize(numberList);

        numbers.foreach(new VoidFunction<Integer>() {

            private static final long serialVersionUID = 1L;

            @Override
            public void call(Integer t) throws Exception {
                // Inside the task function, the accumulator's add() method
                // can be called to accumulate the value.
                sum.add(t);
            }

        });

        // In the driver program, call the Accumulator's value() method to read its value.
        System.out.println(sum.value());
        sc.close();
    }

}
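Two caveats apply to both versions above. First, the sc.accumulator() API shown here was deprecated in Spark 2.0 in favor of AccumulatorV2; a minimal Scala sketch of the replacement (method names per the Spark 2.x API):

// Spark 2.x replacement: a named long accumulator (the name also shows up in the web UI).
val sum = sc.longAccumulator("sum")
sc.parallelize(Array(1, 2, 3, 4, 5)).foreach { num => sum.add(num) }
println(sum.value)

Second, Spark only guarantees that accumulator updates are applied exactly once when they happen inside actions such as foreach. Updates made inside transformations such as map can be re-applied if a task is retried or a stage is recomputed, so accumulator values produced there should be treated as approximate.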