How to create a SparkSession with Hive support (fails with "Hive classes are not found")?
2022-09-01 11:50:54
I get an error when trying to run the following code:
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;
public class App {
  public static void main(String[] args) throws Exception {
    SparkSession
      .builder()
      .enableHiveSupport()
      .getOrCreate();        
  }
}
Output:
Exception in thread "main" java.lang.IllegalArgumentException: Unable to instantiate SparkSession with Hive support because Hive classes are not found.
    at org.apache.spark.sql.SparkSession$Builder.enableHiveSupport(SparkSession.scala:778)
    at com.training.hivetest.App.main(App.java:21)
How can I resolve this?
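For context, this exception is thrown when `enableHiveSupport()` cannot find Spark's Hive integration classes on the classpath. Assuming a Maven build (an assumption; the question does not state the build tool), the usual cause is that only `spark-sql` is declared and the `spark-hive` module is missing. A sketch of the dependency to add, using Spark 3.3.0 with Scala 2.12 as an illustrative version (adjust both to match the versions already in your project):

```xml
<!-- Illustrative versions; must match your existing spark-sql/spark-core artifacts -->
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-hive_2.12</artifactId>
    <version>3.3.0</version>
</dependency>
```

Note that the Scala suffix (`_2.12` here) and the version must be identical across all Spark artifacts in the project, or class loading errors can follow.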