Kylin version: apache-kylin-3.0.0-beta-bin-hadoop3
HDP version: 3.1.0.0
1. Permission denied: user=root, access=WRITE, inode="/kylin":hdfs:hdfs
Solution:
# su - hdfs
# hdfs dfs -mkdir /kylin
# hdfs dfs -chmod a+rwx /kylin
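To verify (optional), check that /kylin now exists and is world-writable:
# hdfs dfs -ls / | grep kylin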
2. Something wrong with Hive CLI or Beeline, please execute Hive CLI or Beeline CLI in terminal to find the root cause.
Solution:
# vim bin/find-hive-dependency.sh (line 37)
In the line hive_env=`hive ${hive_conf_properties} -e set 2>&1 | grep 'env:CLASSPATH'`, remove the ${hive_conf_properties} variable (it is not configured), i.e. change the line to
hive_env=`hive -e set 2>&1 | grep 'env:CLASSPATH'`
Then start Kylin again:
# bin/kylin.sh start
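If the error persists, run the same probe Kylin uses by hand to see the real failure (this is just the line from find-hive-dependency.sh executed directly):
# hive -e set 2>&1 | grep 'env:CLASSPATH'
If this prints nothing or errors out, the problem is in the Hive CLI itself rather than in Kylin.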
3. spark not found, set SPARK_HOME, or run bin/download-spark.sh
Solution:
# vim /etc/profile
Add the following three lines:
export SPARK_HOME=/usr/hdp/current/spark2-client
export HIVE_CONF=/etc/hive/conf
export HCAT_HOME=/usr/hdp/current/hive-webhcat
# source /etc/profile
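A quick sanity check that the variable is visible to the shell that starts Kylin:
# echo $SPARK_HOME
/usr/hdp/current/spark2-client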
4. ${KYLIN_HOME}/tomcat/conf/.keystore (No such file or directory)
Solution: The server.xml of Kylin's bundled Tomcat contains a connector that enables HTTPS support; if HTTPS is not in use, comment that section out:
<Connector port="7443" protocol="org.apache.coyote.http11.Http11Protocol"
maxThreads="150" SSLEnabled="true" scheme="https" secure="true"
keystoreFile="conf/.keystore" keystorePass="changeit"
clientAuth="false" sslProtocol="TLS" />
5. Caused by: java.lang.NoClassDefFoundError: org/apache/commons/configuration/ConfigurationException
Solution: Copy commons-configuration-*.jar into Kylin's tomcat/lib directory:
# cp /usr/hdp/share/hst/hst-common/lib/commons-configuration-1.10.jar tomcat/lib/
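If the jar is not at that exact path on your cluster (the path varies with the HDP install), locate it first:
# find /usr/hdp -name 'commons-configuration-*.jar' 2>/dev/null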
6. spark2/jars/derbyLocale_cs.jar (No such file or directory)
spark2/jars/derbyLocale_de_DE.jar (No such file or directory)
spark2/jars/derbyLocale_es.jar (No such file or directory)
spark2/jars/derbyLocale_fr.jar (No such file or directory)
spark2/jars/derbyLocale_hu.jar (No such file or directory)
spark2/jars/derbyLocale_it.jar (No such file or directory)
spark2/jars/derbyLocale_ja_JP.jar (No such file or directory)
spark2/jars/derbyLocale_ko_KR.jar (No such file or directory)
spark2/jars/derbyLocale_pl.jar (No such file or directory)
spark2/jars/derbyLocale_pt_BR.jar (No such file or directory)
spark2/jars/derbyLocale_ru.jar (No such file or directory)
spark2/jars/derbyLocale_zh_CN.jar (No such file or directory)
spark2/jars/derbyLocale_zh_TW.jar (No such file or directory)
Solution: These warnings can be ignored. (Alternatively, download the corresponding jars from the Maven central repository and place them in the directory; the current version is 10.15.1.3.)
For example, download derbyLocale_cs-10.15.1.3.jar into /usr/hdp/3.1.0.0-78/spark2/jars/:
# cd /usr/hdp/3.1.0.0-78/spark2/jars/
# ln -s derbyLocale_cs-10.15.1.3.jar derbyLocale_cs.jar
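If you download all of the locale jars, a small loop saves repeating ln -s for each one (a sketch, assuming every jar was downloaded into this directory with version 10.15.1.3):
# for f in derbyLocale_*-10.15.1.3.jar; do ln -s "$f" "${f%-10.15.1.3.jar}.jar"; done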
7. Error: Could not open client transport for any of the Server URI's in ZooKeeper: Failed to open new session: java.lang.IllegalArgumentException: Cannot modify dfs.replication at runtime. It is not in list of params that are allowed to be modified at runtime (state=08S01,code=0)
This error occurs when running the sample script: # bin/sample.sh
Solution: In conf/kylin_hive_conf.xml, comment out the dfs.replication property, and comment out the mapreduce.job.split.metainfo.maxsize property as well.
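After the change, the two entries in conf/kylin_hive_conf.xml look like this (the values shown are the shipped defaults; yours may differ):
<!--
<property>
<name>dfs.replication</name>
<value>2</value>
</property>
<property>
<name>mapreduce.job.split.metainfo.maxsize</name>
<value>-1</value>
</property>
-->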
8. ERROR : FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. MetaException(message:Got exception: org.apache.hadoop.security.AccessControlException Permission denied: user=root, access=EXECUTE, inode="/warehouse/tablespace/managed/hive":hive:hadoop:drwx------
This error occurs when building a cube. Solution: In the HDFS service, set the dfs.permissions.enabled parameter to false,
then restart the HDFS components and Kylin.
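In HDP this change is normally made through Ambari (HDFS => Configs); the equivalent hdfs-site.xml entry is:
<property>
<name>dfs.permissions.enabled</name>
<value>false</value>
</property>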
Test with the demo: bin/sample.sh
Restart Kylin Server or click Web UI => System Tab => Reload Metadata to take effect;
only then will the sample project show up in the web UI.
9. logs/kylin.log reports: SLF4J: Class path contains multiple SLF4J bindings. At first this looked like it was caused by multiple copies of StaticLoggerBinder.class in the Hadoop and Hive jars, but the log also contains the following error:
Error: Error while processing statement: Cannot modify mapred.job.name at runtime. It is not in list of params that are allowed to be modified at runtime (state=42000,code=1)
This error occurs when building a cube. Solution: Add the following custom parameters to hive-site:
<property>
<name>hive.security.authorization.sqlstd.confwhitelist</name>
<value>mapred.*|hive.*|mapreduce.*|spark.*</value>
</property>
<property>
<name>hive.security.authorization.sqlstd.confwhitelist.append</name>
<value>mapred.*|hive.*|mapreduce.*|spark.*</value>
</property>
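To verify the whitelist is active after restarting HiveServer2, try setting one of the parameters over JDBC (the URL below is a placeholder; substitute your HiveServer2 host):
# beeline -u "jdbc:hive2://<hiveserver2-host>:10000" -e "set mapred.job.name=kylin_test;"
If the statement succeeds instead of failing with "Cannot modify ... at runtime", the change took effect.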