Step 1: calling PySpark from Jupyter (IPython)
The basic procedure is as follows (see the references at the end of this post).
Generate the notebook configuration file:
jupyter notebook --generate-config
Edit the generated notebook configuration file:
vi ~/.jupyter/jupyter_notebook_config.py
c.NotebookApp.ip = '1xx.xxx.xxx.xxx'
If you want the notebook to be reachable from outside, set ip to the machine's external IP address; otherwise set it to 127.0.0.1, which allows access from this machine only.
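For reference, a minimal ~/.jupyter/jupyter_notebook_config.py could look like the sketch below (the port and open_browser values are illustrative assumptions, not taken from the original setup):
c = get_config()
c.NotebookApp.ip = '127.0.0.1'       # or the server's external IP for remote access
c.NotebookApp.port = 8888            # illustrative; any free port works
c.NotebookApp.open_browser = False   # handy on a headless server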
Error: Unrecognized alias: '--profile=pyspark', it will probably have no effect.
Cause:
"IPython has moved to version 5.0, which means that if you are using it, it will be reading its configuration from ~/.jupyter, not ~/.ipython.
You have to create a new configuration file with
jupyter notebook --generate-config
and then edit the resulting
~/.jupyter/jupyter_notebook_config.py."
In short: from IPython 5.0 onward, the configuration directory is ~/.jupyter rather than ~/.ipython.
Edit
vi ~/.jupyter/jupyter_notebook_config.py
and set c.NotebookApp.ip = '127.0.0.1'
(again, use the external IP address if access from outside is needed).
Start Jupyter:
jupyter-notebook --config='~/.jupyter/jupyter_notebook_config.py'
Test PySpark in Jupyter by creating a SparkContext object:
import findspark
findspark.init()              # locate the Spark installation (honors SPARK_HOME if set)
import pyspark
sc = pyspark.SparkContext()   # create the SparkContext
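To confirm the SparkContext actually works, a minimal sanity check (any small RDD job will do):
rdd = sc.parallelize(range(100))   # distribute a small local dataset
print(rdd.sum())                   # should print 4950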
Step 2: adding a Scala kernel to Jupyter
The basic approach (see the references at the end):
# install toree
pip install toree
# point toree at the Spark installation
jupyter toree install --spark_home=your-spark-home
About spark_home here: it is the top-level Spark directory, the one whose sbin subdirectory holds the control scripts. For example, from /opt/spark-2.0.0-bin-hadoop2.7/sbin you can run:
# stop spark
./stop-all.sh
# start spark
./start-all.sh
So your-spark-home above is /opt/spark-2.0.0-bin-hadoop2.7/.
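If you are not sure whether a path really is a Spark home, a quick check from Python (a hypothetical helper, not part of toree):
import os
spark_home = '/opt/spark-2.0.0-bin-hadoop2.7'   # the path you plan to pass to --spark_home
# a Spark distribution contains bin/spark-submit and the python/pyspark sources
for sub in ('bin/spark-submit', 'python/pyspark'):
    print(sub, '->', os.path.exists(os.path.join(spark_home, sub)))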
List the installed kernels:
jupyter kernelspec list
Result: the listing now includes the Toree (Scala) kernel.
Start Jupyter:
jupyter-notebook --config='~/.jupyter/jupyter_notebook_config.py'
It fails with the following error:
java.lang.NoSuchMethodError: scala.collection.immutable.HashSet$.empty()Lscala/collection/immutable/HashSet;
This NoSuchMethodError on scala.collection.immutable.HashSet typically indicates a Scala binary mismatch: the installed Toree build was compiled against a different Scala version than the Scala 2.11 bundled with Spark 2.0.0. While looking for a fix there was a small detour: trying one of the approaches from the Jupyter website ran into an sbt problem:
sbt: command not found
sbt:Getting org.scala-sbt sbt 0.13.6
After several attempts this could not be resolved, so I decided to reconfigure from scratch.
Step 3: still stuck, so reconfigure from scratch (in fact you could have started directly from here...)
First remove all the kernels. Start by checking the kernel details and their install paths:
jupyter kernelspec list
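The same information is also available programmatically through jupyter_client (a small sketch, handy when scripting the cleanup):
from jupyter_client.kernelspec import KernelSpecManager
# maps each kernel name to the directory its kernel.json lives in
for name, path in KernelSpecManager().find_kernel_specs().items():
    print(name, '->', path)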
Delete everything under the kernels directory, but keep the kernels directory itself:
# /root/.local/share/jupyter/kernels/ below is this machine's kernel path; the trailing '*' deletes everything inside it
rm -rf /root/.local/share/jupyter/kernels/*
Then uninstall toree:
pip uninstall toree
3.1 Installing toree, method a
From one of the referenced discussions it turns out that for an environment of Spark 2.0 + Scala 2.11, the toree version to install is toree 0.2.0.dev1. This method requires Python 2.7 + conda.
As suggested there, run:
pip install -i https://pypi.anaconda.org/hyoon/simple toree
3.2 Installing toree, method b
If you do not have Python 2.7 & conda, download the tgz file and then:
tar zxvf toree-0.2.0.dev1.tar.gz
pip install -e toree-0.2.0.dev1
3.3 Installing toree, method c
wget https://dist.apache.org/repos/dist/dev/incubator/toree/0.2.0/snapshots/dev1/toree-pip/toree-0.2.0.dev1.tar.gz
pip install toree-0.2.0.dev1.tar.gz
3.4 After toree is installed, configure the kernel
With toree reinstalled, point it at the Spark directory again:
jupyter toree install --spark_home=/opt/spark-2.0.0-bin-hadoop2.7
Check the kernel list again:
jupyter kernelspec list
Start Jupyter:
jupyter-notebook --config='~/.jupyter/jupyter_notebook_config.py'
3.5 Fixing the error: Unsupported major.minor version 52.0
The error is:
Exception in thread "main" java.lang.UnsupportedClassVersionError: com/typesafe/config/ConfigMergeable : Unsupported major.minor version 52.0
The key part is "Unsupported major.minor version 52.0": a class compiled for a newer Java is being run on an older JVM. Major version 52 corresponds to Java 8, so some component here is still being run with Java 7 or older. The class-file major versions map to Java releases as follows:
Java SE 9 = 53,
Java SE 8 = 52,
Java SE 7 = 51,
Java SE 6.0 = 50,
Java SE 5.0 = 49,
JDK 1.4 = 48,
JDK 1.3 = 47,
JDK 1.2 = 46,
JDK 1.1 = 45
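To see where a number like 52 comes from, you can read the version straight out of any .class file (a minimal sketch; the file name is only an illustration):
import struct

def class_file_version(path):
    # a .class file starts with the magic 0xCAFEBABE, followed by a
    # 2-byte minor and a 2-byte major version, all big-endian
    with open(path, 'rb') as f:
        magic, minor, major = struct.unpack('>IHH', f.read(8))
    assert magic == 0xCAFEBABE, 'not a Java class file'
    return major, minor

# a class compiled by JDK 8 reports major version 52
print(class_file_version('ConfigMergeable.class'))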
Check the versions on this machine.
Java version:
java -version
Result:
java version "1.8.0_151"
Java(TM) SE Runtime Environment (build 1.8.0_151-b12)
Java HotSpot(TM) 64-Bit Server VM (build 25.151-b12, mixed mode)
javac version:
javac -version
Result:
javac 1.8.0_121
Check whether more than one JDK is installed:
sudo update-alternatives --config javac
Result:
There is 1 program that provides 'javac'.
Selection Command
-----------------------------------------------
*+ 1 java-1.8.0-openjdk.x86_64 (/usr/lib/jvm/java-1.8.0-openjdk-1.8.0.121-0.b13.el7_3.x86_64/bin/javac)
On this machine, as root, check the /etc/profile file:
vim /etc/profile
Here both JAVA_HOME and JAVA_BIN point to jdk1.8.0_151.
So the problem most likely is that the Java version Spark uses does not match the system Java.
Reference: a post on fixing the "unsupported major.minor version 52.0" error when submitting jars to Spark (see the references at the end).
Check the spark-env.sh file under the conf directory of the Spark installation:
vim /opt/spark-2.0.0-bin-hadoop2.7/conf/spark-env.sh
Result: sure enough, the Java path configured there was wrong and did not match the system's /etc/profile (it had presumably been set up when an older Java was installed). Copy the JAVA_HOME and JAVA_BIN values from /etc/profile into spark-env.sh and save.
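After saving the fix, a quick way to confirm which JAVA_HOME and java binary the current shell sees (a hypothetical check, not from the original post; spark-env.sh should now point at the same JDK):
import os, subprocess
print('JAVA_HOME:', os.environ.get('JAVA_HOME', '(not set)'))
# java -version writes to stderr, so capture stderr together with stdout
out = subprocess.check_output(['java', '-version'], stderr=subprocess.STDOUT)
print(out.decode())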
Looking at the /usr/java directory: a junior labmate who set up the Spark environment earlier had apparently tried both jdk1.7 and jdk1.8 and found that jdk1.7 is incompatible with Spark 2.0.0. His work left no documentation or logs behind, which is a painful reminder of how important it is to keep records of what you do!
Restart Jupyter, and it finally works.
Versions on this machine:
Linux (CentOS) + JDK 1.8 + Spark 2.0.0 + Scala 2.11.8 + Hadoop 2.7.3 + Python 2.7.12 | Anaconda 4.2.0 (64-bit)
References:
https://m.2cto.com/kf/201611/566880.html
https://www.cnblogs.com/NaughtyBaby/p/5469469.html
http://blog.csdn.net/u012948976/article/details/52372644
http://blog.csdn.net/qq_30901367/article/details/73296887
http://blog.csdn.net/xmo_jiao/article/details/72674687?utm_source=itdadao&utm_medium=referral
https://datascience.stackexchange.com/questions/6555/issue-with-ipython-jupyter-on-spark-unrecognized-alias
https://issues.apache.org/jira/browse/TOREE-354
https://stackoverflow.com/questions/39535858/installing-scala-kernel-or-spark-toree-for-jupyter-anaconda
http://jupyter-client.readthedocs.io/en/latest/kernels.html#kernelspecs
https://www.cnblogs.com/liujStudy/p/7217480.html