1. Environment Preparation
spark-jobserver: 172.16.23.64 (CentOS 7.0.1406)
MySQL: 172.16.18.73
2. Installation Steps
2.1 Version Selection
[https://github.com/spark-jobserver/spark-jobserver](https://github.com/spark-jobserver/spark-jobserver)
Choose the stable release v0.9.0: [https://github.com/spark-jobserver/spark-jobserver/releases/tag/v0.9.0](https://github.com/spark-jobserver/spark-jobserver/releases/tag/v0.9.0). The official documentation states that the 0.9 line targets Spark 2.3.2; we run Spark 2.3.4 and have not hit any incompatibilities so far.
2.2 Build, Package, and Deploy
The $JOBSERVER_HOME/config directory already contains a set of configuration templates; copy and adjust them rather than writing files from scratch. Create local.conf and local.sh there and change the settings described below.
Deploy-mode settings (cluster mode with driver supervision):

```
submit.deployMode = "cluster"
driver.supervise = "true"
master = "spark://hadoop-master:6066"
```
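These keys belong in local.conf; a sketch of the surrounding block, assuming the layout of the shipped local.conf.template (hostname and port are the standalone master's REST submission endpoint from above):

```
spark {
  master = "spark://hadoop-master:6066"  # REST submission endpoint, required for cluster deploy mode
  submit.deployMode = "cluster"          # run the driver on a worker, not on the job-server host
  driver.supervise = "true"              # restart the driver automatically if it dies
}
```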
Data-store configuration, using MySQL:

```
sqldao {
  # Slick database driver, full classpath
  slick-driver = slick.driver.MySQLDriver

  # JDBC driver, full classpath
  jdbc-driver = com.mysql.jdbc.Driver

  # Directory where default H2 driver stores its data. Only needed for H2.
  rootdir = /tmp/spark-jobserver/sqldao/data

  # Full JDBC URL / init string, along with username and password. Sorry, needs to match above.
  # Substitutions may be used to launch job-server, but leave it out here in the default or tests won't pass
  jdbc {
    url = "jdbc:mysql://172.16.18.73:3306/spark_jobserver?useUnicode=true&characterEncoding=utf8&useSSL=false&autoReconnect=true"
    user = "root"
    password = "EcarxMysql!01"
  }
}

flyway.locations = "db/mysql/migration"
```
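The tables themselves are created by Flyway migrations on first start, but the spark_jobserver database named in jdbc.url must already exist. A minimal sketch (the character set here is an assumption; match it to your requirements):

```sql
-- Database name taken from the jdbc.url above; Flyway creates the tables itself.
CREATE DATABASE IF NOT EXISTS spark_jobserver DEFAULT CHARACTER SET utf8;
```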
JAR upload size limit and timeouts (spray server settings):

```
spray.can.server {
  parsing.max-content-length = 150m
  idle-timeout = 400s
  request-timeout = 300s
}
```
Build and package (`local` selects the local.conf/local.sh pair created above):

```
$JOBSERVER_HOME/bin/server_package local   # build and package
```
After a successful build, copy the generated job-server.tar.gz to the target machine; it should be deployed on a server where Spark is already installed.
3. Start and Verify
Start the service with $JOBSERVER_HOME/bin/server_start. It listens on port 8090 by default; edit local.conf before starting to change this.
```
# Create a context
curl -d "" 'localhost:8090/contexts/test-context?num-cpu-cores=2&memory-per-node=1G'

# Upload a JAR under the app name "test"
curl -X POST localhost:8090/binaries/test -H "Content-Type: application/java-archive" --data-binary @job-server-tests/target/scala-2.11/job-server-tests_2.11-0.9.1-SNAPSHOT.jar

# Run a job; if no context is specified, a one-off context is created automatically
# and torn down when the job finishes. Append &context=test-context to run in the
# context created above.
curl -d "" 'localhost:8090/jobs?appName=test&classPath=spark.jobserver.VeryShortDoubleJob'
```
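Each of the POST calls above has a matching GET endpoint that can be used to verify the result (assuming the server is reachable on localhost:8090):

```shell
# List uploaded binaries, live contexts, and recent jobs with their status
curl localhost:8090/binaries
curl localhost:8090/contexts
curl localhost:8090/jobs
```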