How to find the installation directory of Spark?

2022-09-01 12:49:20

I want to change spark-env.sh. How can I find the installation directory in Ubuntu?

I looked in the UI, but did not find anything.

whereis spark 

Result: spark:

Here is the output of the locate command, locate spark:

/home/sys6002/.netbeans/8.0.2/apache-tomcat-8.0.15.0_base/temp/spark-ba1ea995-b959-43f4-ab6c-7d9f1ee5fcc1/blockmgr-db3a931b-7f1a-423e-b5da-b75a958a1909/11
/home/sys6002/.netbeans/8.0.2/apache-tomcat-8.0.15.0_base/temp/spark-ba1ea995-b959-43f4-ab6c-7d9f1ee5fcc1/blockmgr-db3a931b-7f1a-423e-b5da-b75a958a1909/13
/home/sys6002/.netbeans/8.0.2/apache-tomcat-8.0.15.0_base/temp/spark-ba1ea995-b959-43f4-ab6c-7d9f1ee5fcc1/httpd-16b4313e-72dc-4665-b4ac-df491869386d/files
/home/sys6002/.netbeans/8.0.2/apache-tomcat-8.0.15.0_base/temp/spark-ba1ea995-b959-43f4-ab6c-7d9f1ee5fcc1/httpd-16b4313e-72dc-4665-b4ac-df491869386d/jars
/home/sys6002/Desktop/diff spark hadoop.png
/home/sys6002/Desktop/sparkmain
/home/sys6002/Downloads/learning-spark-master.zip
/home/sys6002/Downloads/mongo-spark-master
/home/sys6002/Downloads/spark-1.5.1
/home/sys6002/Downloads/spark-1.5.1-bin-hadoop2.6
/home/sys6002/Downloads/spark-1.5.1-bin-hadoop2.6 (2)
/home/sys6002/Downloads/spark-1.5.1-bin-hadoop2.6.tgz
/home/sys6002/Downloads/spark-1.5.1-bin-without-hadoop
/home/sys6002/Downloads/spark-cassandra-connector-master
/home/sys6002/Downloads/spark-core_2.9.3-0.8.0-incubati
/home/sys6002/anaconda3/pkgs/odo-0.3.2-np19py34_0/lib/python3.4/site-packages/odo/backends/tests/__pycache__/test_sparksql.cpython-34.pyc
/home/sys6002/spark-example/a.txt
/home/sys6002/spark-example/a.txt~
/home/sys6002/spark-example/pom.xml
/home/sys6002/spark-example/pom.xml~
/home/sys6002/spark-example/src
/home/sys6002/spark-example/src/main
/home/sys6002/spark-example/src/test
/home/sys6002/spark-example/src/main/java
/home/sys6002/spark-example/src/main/java/com
/home/sys6002/spark-example/src/main/java/com/geekcap
/home/sys6002/spark-example/src/main/java/com/geekcap/javaworld
/home/sys6002/spark-example/src/main/java/com/geekcap/javaworld/App.java
/home/sys6002/spark-example/src/main/java/com/geekcap/javaworld/WordCount.java~
/home/sys6002/spark-example/src/main/java/com/geekcap/javaworld/sparkexample
/home/sys6002/spark-example/src/main/java/com/geekcap/javaworld/sparkexample/WordCount.java
/home/sys6002/spark-example/src/main/java/com/geekcap/javaworld/sparkexample/WordCount.java~
/home/sys6002/spark-example/src/test/java/com/geekcap/javaworld/AppTest.java
/usr/share/app-install/desktop/lightspark:lightspark.desktop
/usr/share/app-install/desktop/sparkleshare:sparkleshare-invite-opener.desktop
/usr/share/app-install/desktop/sparkleshare:sparkleshare.desktop

Answer 1

echo 'sc.getConf.get("spark.home")' | spark-shell

After a moment, your Spark home will be printed, and you will see something like this:

scala> sc.getConf.get("spark.home")
res0: String = /usr/local/lib/python3.7/site-packages/pyspark

So in this case, my Spark home is /usr/local/lib/python3.7/site-packages/pyspark.
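
If Spark's environment is already configured on the machine, you can also check without starting spark-shell at all. A minimal sketch, assuming spark-shell is on your PATH:

echo "$SPARK_HOME"
readlink -f "$(which spark-shell)"

The first command prints the SPARK_HOME variable if it has been set; the second resolves the real path of the spark-shell launcher, which lives in the bin/ directory directly under the Spark home.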


Answer 2

You can try the following two commands:

  1. locate spark

  2. whereis spark

locate - For each given pattern, locate searches one or more databases of file names and displays the file names that contain the pattern. Patterns can contain shell-style metacharacters: '*', '?', and '[]'. The metacharacters do not treat '/' or '.' specially. Therefore, a pattern 'foo*bar' can match a file name that contains 'foo3/bar', and a pattern '*duck*' can match a file name that contains 'lake/.ducky'. Patterns that contain metacharacters should be quoted to protect them from expansion by the shell.
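
For example, to narrow the search down to the file in question instead of everything containing "spark", you can search by name. A minimal sketch (run sudo updatedb first if the locate database is missing entries or stale):

locate spark-env.sh
locate spark-env.sh.template

Note that a fresh Spark download ships only conf/spark-env.sh.template; copy it to conf/spark-env.sh before editing.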

whereis locates the source/binary files and manual sections for the specified files. The supplied names are first stripped of leading pathname components and any (single) trailing extension of the form .ext, for example, .c. Prefixes of s. resulting from use of source code control are also dealt with. whereis then attempts to locate the desired program in a list of standard Linux places.
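
This also explains the empty result in the question: whereis only searches standard system locations, so it finds nothing for a Spark that was simply unpacked under ~/Downloads. For comparison, it behaves as expected for a system-installed binary:

whereis bash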