
Problems encountered installing the "Hadoop free" build of Spark

2023-10-20

Running ./sbin/start-master.sh produced:


    Command: /usr/lib/jvm/java-8-openjdk-amd64/jre/bin/java -cp /home/server/spark/conf/:/home/server/spark/jars/*:/home/server/hadoop/etc/hadoop/:/home/server/hadoop/share/hadoop/common/lib/:/home/server/hadoop/share/hadoop/common/:/home/server/hadoop/share/hadoop/mapreduce/:/home/server/hadoop/share/hadoop/mapreduce/lib/:/home/server/hadoop/share/hadoop/yarn/:/home/server/hadoop/share/hadoop/yarn/lib/ -Xmx1g org.apache.spark.deploy.master.Master --host thinkpad-w550s-lab --port 7077 --webui-port 8080
    ========================================
    Error: A JNI error has occurred, please check your installation and try again
    Exception in thread "main" java.lang.NoClassDefFoundError: org/slf4j/Logger
    at java.lang.Class.getDeclaredMethods0(Native Method)
    at java.lang.Class.privateGetDeclaredMethods(Class.java:2701)
    at java.lang.Class.privateGetMethodRecursive(Class.java:3048)
    at java.lang.Class.getMethod0(Class.java:3018)
    at java.lang.Class.getMethod(Class.java:1784)
    at sun.launcher.LauncherHelper.validateMainClass(LauncherHelper.java:544)
    at sun.launcher.LauncherHelper.checkAndLoadMain(LauncherHelper.java:526)
    Caused by: java.lang.ClassNotFoundException: org.slf4j.Logger
    at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
    ... 7 more
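The root cause: the "Hadoop free" Spark build here carries no slf4j jar of its own (on a normal build it arrives via the bundled Hadoop jars), so the JVM cannot find org.slf4j.Logger on the classpath. A quick way to confirm, assuming the install paths from the log above:

    # the hadoop-free Spark build ships no slf4j jar of its own
    ls /home/server/spark/jars | grep -i slf4j          # expect: no output
    # Hadoop's common libraries do provide it
    ls /home/server/hadoop/share/hadoop/common/lib | grep -i slf4j
    # expect something like: slf4j-api-1.7.x.jar  slf4j-log4j12-1.7.x.jar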

See the official Spark documentation: http://spark.apache.org/docs/latest/hadoop-provided.html

Spark uses Hadoop client libraries for HDFS and YARN. Starting in version Spark 1.4, the project packages “Hadoop free” builds that lets you more easily connect a single Spark binary to any Hadoop version. To use these builds, you need to modify SPARK_DIST_CLASSPATH to include Hadoop’s package jars. The most convenient place to do this is by adding an entry in conf/spark-env.sh.

This page describes how to connect Spark to Hadoop for different types of distributions.

For Apache distributions, you can use Hadoop’s ‘classpath’ command. For instance:


    ### in conf/spark-env.sh ###
    # If 'hadoop' binary is on your PATH
    export SPARK_DIST_CLASSPATH=$(hadoop classpath)
    # With explicit path to 'hadoop' binary
    export SPARK_DIST_CLASSPATH=$(/path/to/hadoop/bin/hadoop classpath)
    # Passing a Hadoop configuration directory
    export SPARK_DIST_CLASSPATH=$(hadoop --config /path/to/configs classpath)
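Note that a fresh Spark unpack ships only conf/spark-env.sh.template, so the file usually has to be created first. A minimal sketch, assuming the Spark home from the log above:

    cd /home/server/spark
    # spark-env.sh does not exist by default; start from the shipped template
    cp conf/spark-env.sh.template conf/spark-env.sh
    # single quotes keep $(hadoop classpath) from being expanded at write time
    echo 'export SPARK_DIST_CLASSPATH=$(hadoop classpath)' >> conf/spark-env.sh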

In the end, I added the following line to spark-env.sh:


    export SPARK_DIST_CLASSPATH=$(/usr/local/hadoop/hadoop-2.7.3/bin/hadoop classpath)

Started the master again: success!
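For reference, a quick way to confirm the master really is up (assuming the default log directory under the Spark home and the web UI port 8080 from the launch command above):

    # the master writes its log under $SPARK_HOME/logs
    tail -n 20 logs/spark-*-org.apache.spark.deploy.master.Master-*.out
    # a healthy start ends with lines like:
    #   INFO Master: Starting Spark master at spark://...:7077
    #   INFO Master: I have been elected leader! New state: ALIVE
    # the web UI should respond as well
    curl -s http://localhost:8080 | head -n 5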

