1. Background
Most of the code is written in Scala, with a small portion in Java. Everything runs correctly when tested locally. The local log output looks like this:
19/09/04 20:01:32 INFO TopoSparkSubmitter: Loading Spark default config file: Some(/etc/spark2/conf/spark-defaults.conf)
19/09/04 20:01:35 INFO SparkConfigUtils: Initializing array configShouleNotAdd : [SPARK_MASTER_WEBUI_PORT, HADOOP_CONF_DIR, JAVA_HOME, SCALA_HOME, SPARK_MASTER_PORT, SPARK_MASTER_HOST, SPARK_WORKER_INSTANCES, SPARK_WORKER_PORT, SPARK_WORKER_MEMORY, SPARK_WORKER_CORES, SPARK_WORKER_WEBUI_PORT]
19/09/04 20:01:40 INFO SparkConfigUtils: Loading CDH environment variables, path: /etc/spark2/conf/spark-defaults.conf
19/09/04 20:01:41 INFO SparkConfigUtils: Loading runtime environment parameters from path: /etc/spark2/conf/spark-defaults.conf
19/09/04 20:02:01 INFO SparkConfigUtils: SPARK_HOME:/Users/lcc/soft/spark/spark-2.3.0-bin-hadoop2.7
19/09/04 20:02:01 INFO SparkConfigUtils: Loading runtime environment parameters from path: /Users/lcc/soft/spark/spark-2.3.0-bin-hadoop2.7/conf/spark-defaults.conf.template
19/09/04 20:02:02 INFO SparkConfigUtils: Loading runtime environment parameters from path: /Users/lcc/soft/spark/spark-2.3.0-bin-hadoop2.7/conf/spark-defaults.conf
19/09/04 20:02:03 WARN SparkConfigUtils: Warning: path /Users/lcc/soft/spark/spark-2.3.0-bin-hadoop2.7/conf/spark-defaults.conf does not exist or is not a regular file
19/09/04 20:02:03 INFO SparkConfigUtils: Loading runtime environment parameters from path: /Users/lcc/soft/spark/spark-2.3.0-bin-hadoop2.7/conf/spark-env.sh
The log lines above are produced by the Java code. But after packaging the jar and running it on the server, the log looks like this:
19/09/04 19:52:52 INFO TopoSparkSubmitter: Loading Spark default config file: Some(/etc/spark2/conf/spark-defaults.conf)
19/09/04 19:52:52 INFO internal.SharedState: loading hive config file: file:/etc/spark2/conf.cloudera.spark2_on_yarn2/hive-site.xml
19/09/04 19:52:52 INFO internal.SharedState: spark.sql.warehouse.dir is not set, but hive.metastore.warehouse.dir is set. Setting spark.sql.warehouse.dir to the value of hive.metastore.warehouse.dir ('/user/hive/warehouse').
19/09/04 19:52:52 INFO internal.SharedState: Warehouse path is '/user/hive/warehouse'.
The log lines from the Java code are not printed at all.
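One common cause of exactly this symptom (an assumption here, not something the logs confirm) is that the effective log level for the Java classes' package differs between the two environments: locally INFO is enabled, but the cluster's logging configuration (e.g. its log4j.properties) sets that package to WARN or higher, so INFO lines are silently dropped while other loggers keep printing. A minimal sketch of the mechanism, using java.util.logging purely for self-containment (the project itself presumably logs through slf4j/log4j, and the logger name `com.example.app.JavaWorker` is made up):

```java
import java.util.logging.Handler;
import java.util.logging.Level;
import java.util.logging.LogRecord;
import java.util.logging.Logger;

public class PackageLevelDemo {
    public static void main(String[] args) {
        final int[] published = {0};
        Logger logger = Logger.getLogger("com.example.app.JavaWorker");
        logger.setUseParentHandlers(false);   // count records ourselves instead of printing
        logger.addHandler(new Handler() {
            @Override public void publish(LogRecord r) { published[0]++; }
            @Override public void flush() {}
            @Override public void close() {}
        });

        logger.setLevel(Level.INFO);
        logger.info("printed in the local run");          // passes the level check, counted

        logger.setLevel(Level.WARNING);                   // what a stricter cluster config does
        logger.info("silently dropped on the cluster");   // fails the level check, not counted

        System.out.println("records published: " + published[0]);
    }
}
```

If this is the cause, the fix is not in the code but in the cluster's logging configuration: raise the level for the Java classes' package back to INFO.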
2. Environment
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <parent>
        <artifactId>cheetah-streaming</artifactId>
        <groupId>com.dtwave.cheetah</groupId>
        <version>1.1.0-SNAPSHOT</version>
    </parent>
    <modelVersion>4.0.0</modelVersion>
    <artifactId>spark-structured-streaming</artifactId>
    <packaging>jar</packaging>
    <dependencies>
        <!-- protocol package -->
        <dependency>
            <groupId>com.dtwave.dipper</groupId>
            <artifactId>node-protocol</artifactId>
        </dependency>
        <dependency>
            <groupId>org.scala-lang</groupId>
            <artifactId>scala-library</artifactId>
            <version>${scala.version}</version>
        </dependency>
        <dependency>
            <groupId>org.scala-lang</groupId>
            <artifactId>scala-compiler</artifactId>
            <version>${scala.version}</version>
        </dependency>
        <dependency>
            <groupId>org.apache.avro</groupId>
            <artifactId>avro</artifactId>
            <version>${avro.version}</version>
        </dependency>
        <!--kudu.version-->
        <dependency>
            <groupId>org.apache.kudu</groupId>
            <artifactId>kudu-client</artifactId>
            <version>${kudu.version}</version>
        </dependency>
        <dependency>
            <groupId>com.oracle.jdbc</groupId>
            <artifactId>ojdbc8</artifactId>
        </dependency>
        <!-- mysql database -->
        <dependency>
            <groupId>mysql</groupId>
            <artifactId>mysql-connector-java</artifactId>
        </dependency>
        <!--spark -->
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-sql_${scala.compat.version}</artifactId>
            <exclusions>
                <exclusion>
                    <artifactId>avro-mapred</artifactId>
                    <groupId>org.apache.avro</groupId>
                </exclusion>
            </exclusions>
        </dependency>
        <!-- spark sql kafka-->
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-sql-kafka-0-10_${scala.compat.version}</artifactId>
        </dependency>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-hive_${scala.compat.version}</artifactId>
            <exclusions>
                <exclusion>
                    <artifactId>avro-mapred</artifactId>
                    <groupId>org.apache.avro</groupId>
                </exclusion>
                <exclusion>
                    <artifactId>hive-metastore</artifactId>
                    <groupId>org.apache.hive</groupId>
                </exclusion>
                <exclusion>
                    <artifactId>parquet-hadoop</artifactId>
                    <groupId>com.twitter</groupId>
                </exclusion>
                <exclusion>
                    <artifactId>hive-serde</artifactId>
                    <groupId>org.apache.hive</groupId>
                </exclusion>
                <exclusion>
                    <artifactId>hive-shims</artifactId>
                    <groupId>org.apache.hive</groupId>
                </exclusion>
            </exclusions>
        </dependency>
        <!--kudu spark-->
        <dependency>
            <groupId>org.apache.kudu</groupId>
            <artifactId>kudu-spark2_${scala.compat.version}</artifactId>
        </dependency>
        <dependency>
            <groupId>com.dtwave.boot</groupId>
            <artifactId>wolf</artifactId>
            <exclusions>
                <exclusion>
                    <artifactId>spring-core</artifactId>
                    <groupId>org.springframework</groupId>
                </exclusion>
            </exclusions>
        </dependency>
        <dependency>
            <groupId>org.slf4j</groupId>
            <artifactId>slf4j-api</artifactId>
            <version>${slf4j-api.version}</version>
        </dependency>
        <!-- Test -->
        <dependency>
            <groupId>junit</groupId>
            <artifactId>junit</artifactId>
            <version>${junit.version}</version>
            <scope>test</scope>
        </dependency>
    </dependencies>
    <build>
        <plugins>
            <plugin>
                <!-- see http://davidb.github.com/scala-maven-plugin -->
                <groupId>net.alchim31.maven</groupId>
                <artifactId>scala-maven-plugin</artifactId>
                <version>${net.alchim31.version}</version>
                <executions>
                    <execution>
                        <goals>
                            <goal>compile</goal>
                            <goal>testCompile</goal>
                        </goals>
                        <configuration>
                            <args>
                                <arg>-dependencyfile</arg>
                                <arg>${project.build.directory}/.scala_dependencies</arg>
                            </args>
                        </configuration>
                    </execution>
                </executions>
            </plugin>
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-surefire-plugin</artifactId>
                <version>${maven.surefire.version}</version>
                <configuration>
                    <!-- Tests will be run with scalatest-maven-plugin instead -->
                    <skipTests>true</skipTests>
                </configuration>
            </plugin>
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-compiler-plugin</artifactId>
                <version>${maven.compiler.version}</version>
                <configuration>
                    <encoding>utf-8</encoding>
                    <source>${maven.compiler.source}</source>
                    <target>${maven.compiler.target}</target>
                </configuration>
            </plugin>
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-shade-plugin</artifactId>
                <version>${maven.shade.version}</version>
                <configuration>
                    <createDependencyReducedPom>false</createDependencyReducedPom>
                </configuration>
                <executions>
                    <execution>
                        <phase>package</phase>
                        <goals>
                            <goal>shade</goal>
                        </goals>
                        <configuration>
                            <finalName>spark-structured-streaming</finalName>
                            <createDependencyReducedPom>false</createDependencyReducedPom>
                            <filters>
                                <filter>
                                    <artifact>*:*</artifact>
                                    <excludes>
                                        <exclude>META-INF/*.SF</exclude>
                                        <exclude>META-INF/*.DSA</exclude>
                                        <exclude>META-INF/*.RSA</exclude>
                                    </excludes>
                                </filter>
                            </filters>
                            <transformers>
                                <transformer implementation="org.apache.maven.plugins.shade.resource.AppendingTransformer">
                                    <resource>META-INF/services/org.apache.spark.sql.sources.DataSourceRegister</resource>
                                </transformer>
                                <transformer implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer">
                                    <mainClass>com.dtwave.cheetah.node.spark.structured.streaming.StructureStreamingExecutor</mainClass>
                                </transformer>
                            </transformers>
                        </configuration>
                    </execution>
                </executions>
            </plugin>
        </plugins>
    </build>
</project>
I opened the compiled classes and confirmed that the Java code does contain the relevant logging statements, yet they are never printed.
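Since the statements are present in the bytecode, another possibility worth checking (again an assumption, not confirmed by the post) is that the shaded fat jar bundles a logging configuration file from some dependency, and that file silences the Java classes' package when loaded on the cluster. The sketch below simulates this with java.util.logging for self-containment: a configuration that sets one package to SEVERE mutes that package's INFO logging while an unaffected logger keeps working. The package names `com.example.app` and `com.example.driver` are hypothetical stand-ins for the Java and Scala sides of the project.

```java
import java.io.ByteArrayInputStream;
import java.nio.charset.StandardCharsets;
import java.util.logging.Level;
import java.util.logging.LogManager;
import java.util.logging.Logger;

public class ConfigOverrideDemo {
    public static void main(String[] args) throws Exception {
        // Simulate a logging config bundled into the fat jar that silences one package
        String props = "com.example.app.level=SEVERE\n";
        LogManager.getLogManager().readConfiguration(
                new ByteArrayInputStream(props.getBytes(StandardCharsets.UTF_8)));

        Logger javaSide  = Logger.getLogger("com.example.app.JavaWorker");
        Logger scalaSide = Logger.getLogger("com.example.driver.ScalaDriver");

        // The java-side package inherits SEVERE from the config; the other inherits root INFO
        System.out.println("java-side INFO loggable: " + javaSide.isLoggable(Level.INFO));
        System.out.println("scala-side INFO loggable: " + scalaSide.isLoggable(Level.INFO));
    }
}
```

A quick way to check for this in practice is to list the shaded jar's contents and look for stray `log4j.properties` or similar configuration files pulled in from dependencies; the shade plugin merges all resources unless they are filtered out.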