[root@localhost:3 spark-1.2.1-bin-hadoop2.4]# ./bin/run-example SparkPi 10 > Sparkpilog.txt
Spark assembly has been built with Hive, including Datanucleus jars on classpath
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
15/05/19 09:58:38 WARN Utils: Your hostname, localhost.localdomain resolves to a loopback address: 127.0.0.1; using 10.10.19.186 instead (on interface eth0)
15/05/19 09:58:38 WARN Utils: Set SPARK_LOCAL_IP if you need to bind to another address
15/05/19 09:58:38 INFO SecurityManager: Changing view acls to: root
15/05/19 09:58:38 INFO SecurityManager: Changing modify acls to: root
15/05/19 09:58:38 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(root); users with modify permissions: Set(root)
15/05/19 09:58:43 INFO DAGScheduler: Stopping DAGScheduler
15/05/19 09:58:44 INFO MapOutputTrackerMasterActor: MapOutputTrackerActor stopped!
15/05/19 09:58:44 INFO MemoryStore: MemoryStore cleared
15/05/19 09:58:44 INFO BlockManager: BlockManager stopped
15/05/19 09:58:44 INFO BlockManagerMaster: BlockManagerMaster stopped
15/05/19 09:58:44 INFO SparkContext: Successfully stopped SparkContext
15/05/19 09:58:44 INFO RemoteActorRefProvider$RemotingTerminator: Shutting down remote daemon.
15/05/19 09:58:44 INFO RemoteActorRefProvider$RemotingTerminator: Remote daemon shut down; proceeding with flushing remote transports.
15/05/19 09:58:44 INFO RemoteActorRefProvider$RemotingTerminator: Remoting shut down.
Pi is roughly 3.142888
import java.util.ArrayList;
import java.util.List;

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.api.java.function.Function;
import org.apache.spark.api.java.function.Function2;

public final class JavaSparkPi {

  public static void main(String[] args) throws Exception {
    SparkConf sparkConf = new SparkConf().setAppName("JavaSparkPi");
    JavaSparkContext jsc = new JavaSparkContext(sparkConf);

    int slices = (args.length == 1) ? Integer.parseInt(args[0]) : 2;
    int n = 100000 * slices;
    List<Integer> l = new ArrayList<Integer>(n);
    for (int i = 0; i < n; i++) {
      l.add(i);
    }

    JavaRDD<Integer> dataSet = jsc.parallelize(l, slices);

    int count = dataSet.map(new Function<Integer, Integer>() {
      @Override
      public Integer call(Integer integer) {
        double x = Math.random() * 2 - 1;
        double y = Math.random() * 2 - 1;
        return (x * x + y * y < 1) ? 1 : 0;
      }
    }).reduce(new Function2<Integer, Integer, Integer>() {
      @Override
      public Integer call(Integer integer, Integer integer2) {
        return integer + integer2;
      }
    });

    System.out.println("Pi is roughly " + 4.0 * count / n);

    jsc.stop();
  }
}
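The example above is a Monte Carlo estimate: it samples random points in the square [-1, 1] x [-1, 1] and counts the fraction landing inside the unit circle, whose area ratio to the square is pi/4. The same computation can be sketched in plain Java without Spark (the class name, `estimate` method, and fixed seed below are illustrative, not part of the Spark example):

```java
import java.util.Random;

public class MonteCarloPi {

  // Sample n points uniformly in [-1, 1]^2 and count hits inside
  // the unit circle; the hit ratio approximates pi / 4.
  static double estimate(int n, long seed) {
    Random rnd = new Random(seed);
    int count = 0;
    for (int i = 0; i < n; i++) {
      double x = rnd.nextDouble() * 2 - 1;
      double y = rnd.nextDouble() * 2 - 1;
      if (x * x + y * y < 1) {
        count++;
      }
    }
    return 4.0 * count / n;
  }

  public static void main(String[] args) {
    System.out.println("Pi is roughly " + estimate(1_000_000, 42L));
  }
}
```

Spark's version distributes the same loop: `parallelize` splits the samples across `slices` partitions, `map` does the per-point test, and `reduce` sums the hits.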
FWDIR="$(cd "`dirname "$0"`"/..; pwd)"
export SPARK_HOME="$FWDIR"
EXAMPLES_DIR="$FWDIR"/examples
if [ -n "$1" ]; then
  EXAMPLE_CLASS="$1"
  shift
fi
"$FWDIR"/bin/spark-submit \
  --master $EXAMPLE_MASTER \
  --class $EXAMPLE_CLASS \
  "$SPARK_EXAMPLES_JAR" \
  "$@"
while (($#)); do
  if [ "$1" = "--deploy-mode" ]; then
    SPARK_SUBMIT_DEPLOY_MODE=$2
  elif [ "$1" = "--properties-file" ]; then
    SPARK_SUBMIT_PROPERTIES_FILE=$2
  elif [ "$1" = "--driver-memory" ]; then
    export SPARK_SUBMIT_DRIVER_MEMORY=$2
  elif [ "$1" = "--driver-library-path" ]; then
    export SPARK_SUBMIT_LIBRARY_PATH=$2
  elif [ "$1" = "--driver-class-path" ]; then
    export SPARK_SUBMIT_CLASSPATH=$2
  elif [ "$1" = "--driver-java-options" ]; then
    export SPARK_SUBMIT_OPTS=$2
  elif [ "$1" = "--master" ]; then
    export MASTER=$2
  fi
  shift
done
exec "$SPARK_HOME"/bin/spark-class org.apache.spark.deploy.SparkSubmit "${ORIG_ARGS[@]}"
[root@localhost:3 spark-1.2.1-bin-hadoop2.4]# ./bin/run-example JavaWordCount ./wordcountdata.txt
For: 4
SQLMLlib: 1
subfolder: 1
OS).: 1
Streaming,: 1
APIs: 1
full: 1
--master: 3
through: 1
Provisioning3rd-Party: 1
applications: 4
graph: 3
over: 1
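The JavaWordCount output above is a list of "word: count" pairs produced by splitting each input line on spaces and summing occurrences per word (the Spark example does this with flatMap, mapToPair, and reduceByKey). A plain-Java sketch of the same tallying logic, without Spark (class and method names are illustrative):

```java
import java.util.Arrays;
import java.util.Map;
import java.util.stream.Collectors;

public class WordCount {

  // Split the text on single spaces (as the Spark example does)
  // and count how many times each token appears.
  static Map<String, Long> count(String text) {
    return Arrays.stream(text.split(" "))
        .collect(Collectors.groupingBy(w -> w, Collectors.counting()));
  }

  public static void main(String[] args) {
    count("to be or not to be")
        .forEach((word, n) -> System.out.println(word + ": " + n));
  }
}
```

Tokens such as "OS)." and "Provisioning3rd-Party" appear in the output because the example splits on spaces only, so punctuation and run-together words are kept verbatim.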
Source: 中电网技术论坛, a Chinese electronics engineering forum (http://bbs.eccn.com/) | Powered by Discuz! 7.0.0