Below is the error log. Why does this happen? Does anyone know? Many thanks!
Python 3.5.2+ (default, Sep 22 2016, 12:18:14)
[GCC 6.2.0 20160927] on linux
Type "help", "copyright", "credits" or "license" for more information.
2018-04-04 14:44:34 WARN  Utils:66 - Your hostname, ubuntu resolves to a loopback address: 127.0.1.1; using 172.16.0.2 instead (on interface enp2s0)
2018-04-04 14:44:34 WARN  Utils:66 - Set SPARK_LOCAL_IP if you need to bind to another address
2018-04-04 14:44:42 WARN  NativeCodeLoader:62 - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /__ / .__/\_,_/_/ /_/\_\   version 2.3.0
      /_/
Using Python version 3.5.2+ (default, Sep 22 2016 12:18:14)
SparkSession available as 'spark'.
>>> text_file = sc.textFile("/home/sujian/log.log")
>>> text_file.count()
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/usr/local/spark/python/pyspark/rdd.py", line 1056, in count
    return self.mapPartitions(lambda i: [sum(1 for _ in i)]).sum()
  File "/usr/local/spark/python/pyspark/rdd.py", line 1047, in sum
    return self.mapPartitions(lambda x: [sum(x)]).fold(0, operator.add)
  File "/usr/local/spark/python/pyspark/rdd.py", line 921, in fold
    vals = self.mapPartitions(func).collect()
  File "/usr/local/spark/python/pyspark/rdd.py", line 824, in collect
    port = self.ctx._jvm.PythonRDD.collectAndServe(self._jrdd.rdd())
  File "/usr/local/spark/python/lib/py4j-0.10.6-src.zip/py4j/java_gateway.py", line 1160, in __call__
  File "/usr/local/spark/python/pyspark/sql/utils.py", line 63, in deco
    return f(*a, **kw)
  File "/usr/local/spark/python/lib/py4j-0.10.6-src.zip/py4j/protocol.py", line 320, in get_return_value
py4j.protocol.Py4JJavaError: An error occurred while calling z:org.apache.spark.api.python.PythonRDD.collectAndServe.
: java.lang.NullPointerException
       	at org.apache.spark.util.FieldAccessFinder$$anon$3$$anonfun$visitMethodInsn$2.apply(ClosureCleaner.scala:449)
       	at org.apache.spark.util.FieldAccessFinder$$anon$3$$anonfun$visitMethodInsn$2.apply(ClosureCleaner.scala:432)
       	at scala.collection.TraversableLike$WithFilter$$anonfun$foreach$1.apply(TraversableLike.scala:733)
       	at scala.collection.mutable.HashMap$$anon$1$$anonfun$foreach$2.apply(HashMap.scala:103)
       	at scala.collection.mutable.HashMap$$anon$1$$anonfun$foreach$2.apply(HashMap.scala:103)
       	at scala.collection.mutable.HashTable$class.foreachEntry(HashTable.scala:230)
       	at scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:40)
       	at scala.collection.mutable.HashMap$$anon$1.foreach(HashMap.scala:103)
       	at scala.collection.TraversableLike$WithFilter.foreach(TraversableLike.scala:732)
       	at org.apache.spark.util.FieldAccessFinder$$anon$3.visitMethodInsn(ClosureCleaner.scala:432)
       	at org.apache.xbean.asm5.ClassReader.a(Unknown Source)
       	at org.apache.xbean.asm5.ClassReader.b(Unknown Source)
       	at org.apache.xbean.asm5.ClassReader.accept(Unknown Source)
       	at org.apache.xbean.asm5.ClassReader.accept(Unknown Source)
       	at org.apache.spark.util.ClosureCleaner$$anonfun$org$apache$spark$util$ClosureCleaner$$clean$14.apply(ClosureCleaner.scala:262)
       	at org.apache.spark.util.ClosureCleaner$$anonfun$org$apache$spark$util$ClosureCleaner$$clean$14.apply(ClosureCleaner.scala:261)
       	at scala.collection.immutable.List.foreach(List.scala:381)
       	at org.apache.spark.util.ClosureCleaner$.org$apache$spark$util$ClosureCleaner$$clean(ClosureCleaner.scala:261)
       	at org.apache.spark.util.ClosureCleaner$.clean(ClosureCleaner.scala:159)
       	at org.apache.spark.SparkContext.clean(SparkContext.scala:2292)
       	at org.apache.spark.SparkContext.runJob(SparkContext.scala:2066)
       	at org.apache.spark.SparkContext.runJob(SparkContext.scala:2092)
       	at org.apache.spark.rdd.RDD$$anonfun$collect$1.apply(RDD.scala:939)
       	at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
       	at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
       	at org.apache.spark.rdd.RDD.withScope(RDD.scala:363)
       	at org.apache.spark.rdd.RDD.collect(RDD.scala:938)
       	at org.apache.spark.api.python.PythonRDD$.collectAndServe(PythonRDD.scala:153)
       	at org.apache.spark.api.python.PythonRDD.collectAndServe(PythonRDD.scala)
       	at jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(java.base@9-Ubuntu/Native Method)
       	at jdk.internal.reflect.NativeMethodAccessorImpl.invoke(java.base@9-Ubuntu/NativeMethodAccessorImpl.java:62)
       	at jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(java.base@9-Ubuntu/DelegatingMethodAccessorImpl.java:43)
       	at java.lang.reflect.Method.invoke(java.base@9-Ubuntu/Method.java:535)
       	at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:244)
       	at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
       	at py4j.Gateway.invoke(Gateway.java:282)
       	at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132)
       	at py4j.commands.CallCommand.execute(CallCommand.java:79)
       	at py4j.GatewayConnection.run(GatewayConnection.java:214)
       	at java.lang.Thread.run(java.base@9-Ubuntu/Thread.java:843)
>>>
#1  hcymk2  2018-04-04 17:02:50 +08:00
#3  hcymk2  2018-04-04 17:15:22 +08:00
Is your file /home/sujian/log.log on HDFS, or is it a local file?
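For context on #3's question: when no URI scheme is given, sc.textFile() resolves the path against Hadoop's fs.defaultFS, so the same string can point at the local disk or at HDFS depending on configuration. A minimal sketch that makes the intent explicit; the hdfs:// host and port below are placeholders, not from the thread:

```python
# Without a scheme, sc.textFile() resolves paths against fs.defaultFS
# (the local filesystem by default, HDFS on a configured cluster).
# An explicit scheme removes the ambiguity hcymk2 is asking about.
local_rdd = sc.textFile("file:///home/sujian/log.log")
# hdfs_rdd = sc.textFile("hdfs://namenode:9000/home/sujian/log.log")  # placeholder host:port

print(local_rdd.count())
```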
#4  ufo22940268  2018-04-04 17:17:43 +08:00
```
at org.apache.xbean.asm5.ClassReader.a(Unknown Source)
at org.apache.xbean.asm5.ClassReader.b(Unknown Source)
at org.apache.xbean.asm5.ClassReader.accept(Unknown Source)
at org.apache.xbean.asm5.ClassReader.accept(Unknown Source)
```
It looks like this is thrown from xbean. Maybe try removing the xbean dependency?
#6  sujin190 (OP)
@ufo22940268 #4 How would I remove it... can that even be removed?
#7  ufo22940268  2018-04-04 17:21:49 +08:00
#8  ufo22940268  2018-04-04 17:23:33 +08:00
@sujin190 Actually, never mind, I think my guess was wrong. Please ignore it.
#9  sujin190 (OP)
@ufo22940268 #8 The conf directory still only has spark-env.sh.template, so nothing should be configured. This is really strange...
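An aside on the original question: the reflection frames in the trace read java.base@9-Ubuntu, which means the driver JVM is Java 9. Spark 2.3.0 supports Java 8 only, and this exact NullPointerException in ClosureCleaner's FieldAccessFinder is the well-known symptom of running Spark 2.x on Java 9 or newer. A quick check from the pyspark prompt, using py4j's _jvm handle on the existing SparkContext (an internal API, but fine for poking around):

```python
# Ask the driver JVM (the one py4j is talking to) which Java it runs on.
# Anything above 1.8 means Spark 2.3.0 will hit the ClosureCleaner NPE above.
print(sc._jvm.java.lang.System.getProperty("java.version"))
print(sc._jvm.java.lang.System.getProperty("java.home"))
```

If this reports Java 9, pointing JAVA_HOME at a Java 8 install before launching pyspark (the exact path depends on where your JDK 8 lives) should let the count() succeed.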