1. After Hudi finished writing data files, syncing the table metadata to Hive failed with the following error:
Caused by: java.sql.SQLException: Error while processing statement: FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. Unable to alter table. Exception thrown when executing query : SELECT DISTINCT 'org.apache.hadoop.hive.metastore.model.MPartition' AS `NUCLEUS_TYPE`,`A0`.`CREATE_TIME`,`A0`.`LAST_ACCESS_TIME`,`A0`.`PART_NAME`,`A0`.`PART_ID`,`A0`.`PART_NAME` AS `NUCORDER0` FROM `PARTITIONS` `A0` LEFT OUTER JOIN `TBLS` `B0` ON `A0`.`TBL_ID` = `B0`.`TBL_ID` LEFT OUTER JOIN `DBS` `C0` ON `B0`.`DB_ID` = `C0`.`DB_ID` WHERE `B0`.`TBL_NAME` = ? AND `C0`.`NAME` = ? ORDER BY `NUCORDER0`
at org.apache.hive.jdbc.HiveStatement.execute(HiveStatement.java:296)
at org.apache.hudi.hive.HoodieHiveClient.updateHiveSQL(HoodieHiveClient.java:367)
... 53 more
Running the same SQL directly against the hive metastore database in MySQL succeeds, so the query itself is not the problem.
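To reproduce that check by hand, the metastore query from the error can be run with its two '?' placeholders bound. The values below ('test_table' for the Hive table name and 'default' for the Hive database name) are illustrative only, not taken from the original log:
-- Bind the two placeholders of the failing DataNucleus/metastore query with example values
SELECT DISTINCT 'org.apache.hadoop.hive.metastore.model.MPartition' AS `NUCLEUS_TYPE`,
       `A0`.`CREATE_TIME`, `A0`.`LAST_ACCESS_TIME`, `A0`.`PART_NAME`, `A0`.`PART_ID`,
       `A0`.`PART_NAME` AS `NUCORDER0`
FROM `PARTITIONS` `A0`
LEFT OUTER JOIN `TBLS` `B0` ON `A0`.`TBL_ID` = `B0`.`TBL_ID`
LEFT OUTER JOIN `DBS` `C0` ON `B0`.`DB_ID` = `C0`.`DB_ID`
WHERE `B0`.`TBL_NAME` = 'test_table' AND `C0`.`NAME` = 'default'
ORDER BY `NUCORDER0`;
If this returns normally in the MySQL hive schema, the failure is on the metastore/MySQL server side rather than in the SQL.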
2. Exporting the Hive service logs revealed the following error:
java.sql.SQLException: Error writing file '/tmp/MYzjeP0J' (Errcode: 28 - No space left on device)
Cause: inspecting the physical machine hosting MySQL showed that the disk partition where MySQL stores mysql-slow.log had run out of space, so MySQL could no longer write its temporary files and the metastore query failed.
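A minimal sketch of how the situation can be confirmed and relieved from the MySQL side, assuming the slow query log is the main space consumer; actually truncating or archiving mysql-slow.log still has to be done at the OS level on that partition:
-- Confirm where the slow query log is written and whether it is enabled
SHOW VARIABLES LIKE 'slow_query_log%';
-- Temporarily stop writing to the slow log while the partition is cleaned up
SET GLOBAL slow_query_log = OFF;
-- After truncating or archiving mysql-slow.log on disk, reopen the log file handle
FLUSH SLOW LOGS;
SET GLOBAL slow_query_log = ON;
Once free space is available again, re-running the Hudi job lets the Hive metadata sync (the ALTER TABLE / add-partition step) complete normally.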