Hive Error Log Notes
Posted: 2019-06-25


Error: Zero length BigInteger

Log contents:

java.lang.RuntimeException: org.apache.hadoop.hive.ql.metadata.HiveException: Hive Runtime Error while processing row (tag=0) [Error getting row data with exception java.lang.NumberFormatException: Zero length BigInteger
	at java.math.BigInteger.<init>(BigInteger.java:171)
	at org.apache.hadoop.hive.serde2.io.HiveDecimalWritable.getHiveDecimal(HiveDecimalWritable.java:85)
	at org.apache.hadoop.hive.serde2.objectinspector.primitive.WritableHiveDecimalObjectInspector.getPrimitiveJavaObject(WritableHiveDecimalObjectInspector.java:43)
	at org.apache.hadoop.hive.serde2.SerDeUtils.buildJSONString(SerDeUtils.java:322)
	at org.apache.hadoop.hive.serde2.SerDeUtils.buildJSONString(SerDeUtils.java:392)
	at org.apache.hadoop.hive.serde2.SerDeUtils.buildJSONString(SerDeUtils.java:392)
	at org.apache.hadoop.hive.serde2.SerDeUtils.buildJSONString(SerDeUtils.java:392)
	at org.apache.hadoop.hive.serde2.SerDeUtils.getJSONString(SerDeUtils.java:236)
	at org.apache.hadoop.hive.serde2.SerDeUtils.getJSONString(SerDeUtils.java:222)
	at org.apache.hadoop.hive.ql.exec.mr.ExecReducer.reduce(ExecReducer.java:265)
	at org.apache.hadoop.mapred.ReduceTask.runOldReducer(ReduceTask.java:462)
	at org.apache.hadoop.mapred.ReduceTask.run(ReduceTask.java:408)
	at org.apache.hadoop.mapred.LocalJobRunner$Job.run(LocalJobRunner.java:443) ]
	at org.apache.hadoop.hive.ql.exec.mr.ExecReducer.reduce(ExecReducer.java:282)
	at org.apache.hadoop.mapred.ReduceTask.runOldReducer(ReduceTask.java:462)
	at org.apache.hadoop.mapred.ReduceTask.run(ReduceTask.java:408)
	at org.apache.hadoop.mapred.LocalJobRunner$Job.run(LocalJobRunner.java:443)
Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: Hive Runtime Error while processing row (tag=0) [Error getting row data with exception java.lang.NumberFormatException: Zero length BigInteger
	at java.math.BigInteger.<init>(BigInteger.java:171)
	at org.apache.hadoop.hive.serde2.io.HiveDecimalWritable.getHiveDecimal(HiveDecimalWritable.java:85)
	at org.apache.hadoop.hive.serde2.objectinspector.primitive.WritableHiveDecimalObjectInspector.getPrimitiveJavaObject(WritableHiveDecimalObjectInspector.java:43)
	at org.apache.hadoop.hive.serde2.SerDeUtils.buildJSONString(SerDeUtils.java:322)
	at org.apache.hadoop.hive.serde2.SerDeUtils.buildJSONString(SerDeUtils.java:392)
	at org.apache.hadoop.hive.serde2.SerDeUtils.buildJSONString(SerDeUtils.java:392)
	at org.apache.hadoop.hive.serde2.SerDeUtils.buildJSONString(SerDeUtils.java:392)
	at org.apache.hadoop.hive.serde2.SerDeUtils.getJSONString(SerDeUtils.java:236)
	at org.apache.hadoop.hive.serde2.SerDeUtils.getJSONString(SerDeUtils.java:222)
	at org.apache.hadoop.hive.ql.exec.mr.ExecReducer.reduce(ExecReducer.java:265)
	at org.apache.hadoop.mapred.ReduceTask.runOldReducer(ReduceTask.java:462)
	at org.apache.hadoop.mapred.ReduceTask.run(ReduceTask.java:408)
	at org.apache.hadoop.mapred.LocalJobRunner$Job.run(LocalJobRunner.java:443) ]
	at org.apache.hadoop.hive.ql.exec.mr.ExecReducer.reduce(ExecReducer.java:270)
	... 3 more
Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.NumberFormatException: Zero length BigInteger
	at org.apache.hadoop.hive.ql.exec.GroupByOperator.processOp(GroupByOperator.java:808)
	at org.apache.hadoop.hive.ql.exec.mr.ExecReducer.reduce(ExecReducer.java:261)
	... 3 more
Caused by: java.lang.NumberFormatException: Zero length BigInteger
	at java.math.BigInteger.<init>(BigInteger.java:171)
	at org.apache.hadoop.hive.serde2.io.HiveDecimalWritable.getHiveDecimal(HiveDecimalWritable.java:96)
	at org.apache.hadoop.hive.serde2.lazybinary.LazyBinaryHiveDecimal.init(LazyBinaryHiveDecimal.java:48)
	at org.apache.hadoop.hive.serde2.lazybinary.LazyBinaryStruct.uncheckedGetField(LazyBinaryStruct.java:216)
	at org.apache.hadoop.hive.serde2.lazybinary.LazyBinaryStruct.getField(LazyBinaryStruct.java:197)
	at org.apache.hadoop.hive.serde2.lazybinary.objectinspector.LazyBinaryStructObjectInspector.getStructFieldData(LazyBinaryStructObjectInspector.java:64)
	at org.apache.hadoop.hive.ql.udf.generic.GenericUDAFAverage$AbstractGenericUDAFAverageEvaluator.merge(GenericUDAFAverage.java:353)
	at org.apache.hadoop.hive.ql.udf.generic.GenericUDAFEvaluator.aggregate(GenericUDAFEvaluator.java:186)
	at org.apache.hadoop.hive.ql.exec.GroupByOperator.updateAggregations(GroupByOperator.java:641)
	at org.apache.hadoop.hive.ql.exec.GroupByOperator.processAggr(GroupByOperator.java:905)
	at org.apache.hadoop.hive.ql.exec.GroupByOperator.processKey(GroupByOperator.java:737)
	at org.apache.hadoop.hive.ql.exec.GroupByOperator.processOp(GroupByOperator.java:803)
	... 4 more

Analysis: likely caused by a UNION ALL of two subqueries in which the columns being merged at the same position have inconsistent data types, so the aggregated DECIMAL bytes cannot be deserialized.

Fix: check the data type of each column position across the UNION ALL branches and CAST them all to a single common type.
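As a minimal sketch of the fix (the table and column names here are hypothetical, not from the original log): the trace runs through GenericUDAFAverage, which suggests an AVG() over a column whose two UNION ALL branches produce different types, e.g. DECIMAL in one and INT or STRING in the other. Casting both branches explicitly removes the mismatch:

```sql
-- Hypothetical repro: orders_a.amount is DECIMAL, orders_b.amount is STRING.
-- Without the CASTs, AVG() over the merged column can hit
-- "Zero length BigInteger" when deserializing the partial aggregates.
SELECT id, AVG(amount) AS avg_amount
FROM (
    SELECT id, CAST(amount AS DECIMAL(18,2)) AS amount FROM orders_a
    UNION ALL
    SELECT id, CAST(amount AS DECIMAL(18,2)) AS amount FROM orders_b
) t
GROUP BY id;
```

The point is that every branch of the UNION ALL must emit the same type at each column position; relying on implicit conversion is what triggers the error.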

Error: java.lang.ArrayIndexOutOfBoundsException: 2

Log contents:

Error: java.lang.RuntimeException: org.apache.hadoop.hive.ql.metadata.HiveException: Hive Runtime Error while processing row
	at org.apache.hadoop.hive.ql.exec.mr.ExecMapper.map(ExecMapper.java:185)
	at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:54)
	at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:450)
	at org.apache.hadoop.mapred.MapTask.run(MapTask.java:343)
	at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:163)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:415)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628)
	at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158)
Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: Hive Runtime Error while processing row
	at org.apache.hadoop.hive.ql.exec.vector.VectorMapOperator.process(VectorMapOperator.java:52)
	at org.apache.hadoop.hive.ql.exec.mr.ExecMapper.map(ExecMapper.java:176)
	... 8 more
Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: Error evaluating 1
	at org.apache.hadoop.hive.ql.exec.vector.VectorSelectOperator.processOp(VectorSelectOperator.java:126)
	at org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:815)
	at org.apache.hadoop.hive.ql.exec.vector.VectorFilterOperator.processOp(VectorFilterOperator.java:111)
	at org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:815)
	at org.apache.hadoop.hive.ql.exec.TableScanOperator.processOp(TableScanOperator.java:95)
	at org.apache.hadoop.hive.ql.exec.MapOperator$MapOpCtx.forward(MapOperator.java:157)
	at org.apache.hadoop.hive.ql.exec.vector.VectorMapOperator.process(VectorMapOperator.java:45)
	... 9 more
Caused by: java.lang.ArrayIndexOutOfBoundsException: 2
	at org.apache.hadoop.hive.ql.exec.vector.expressions.ConstantVectorExpression.evaluateLong(ConstantVectorExpression.java:102)
	at org.apache.hadoop.hive.ql.exec.vector.expressions.ConstantVectorExpression.evaluate(ConstantVectorExpression.java:150)
	at org.apache.hadoop.hive.ql.exec.vector.VectorSelectOperator.processOp(VectorSelectOperator.java:124)
	... 15 more

Analysis: likely the same root cause as above, a UNION ALL of two subqueries in which the columns being merged at the same position have inconsistent data types, here surfacing inside the vectorized execution path.

Fix: check the data type of each column position across the UNION ALL branches and CAST them all to a single common type.
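Note that this trace fails inside vectorized operators (VectorSelectOperator, ConstantVectorExpression). Aligning the column types is the real fix, but while debugging, a session-level workaround is to turn vectorized execution off so the query runs through the non-vectorized path (these are standard Hive settings; whether they suppress this particular failure depends on the query):

```sql
-- Temporary workaround while the type mismatch is being fixed:
-- disable vectorized execution for this session only.
SET hive.vectorized.execution.enabled = false;
SET hive.vectorized.execution.reduce.enabled = false;
```

If the error disappears with vectorization off, that reinforces that the vectorized expression was fed a column whose type differs between the UNION ALL branches.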

Reposted from: http://eealo.baihongyu.com/
