Java: Logging in to HDFS via Kerberos Authentication for Hadoop File Upload/Download (Working Version)

2023-05-16


1. Error 1: java.lang.ClassNotFoundException: org.apache.commons.configuration.Configuration
    Resolved: add commons-configuration to the Maven dependencies.
2. Error 2: Can't get Kerberos realm
    Resolved: add two system properties in HDFSMain.java:
    System.setProperty("java.security.krb5.kdc","192.168.13.7:21732");
    System.setProperty("java.security.krb5.realm","HADOOP.com");
    It later turned out that
    System.setProperty("java.security.krb5.conf","/tmp/hcweb/krb5.conf");
    can replace both, so the two properties above were removed (the sketch after this list shows the surviving properties together).
3. Error 3: java.io.IOException: HADOOP_HOME or hadoop.home.dir are not set
    Resolved: set the hadoop.home.dir system property in HDFSMain.java:
    System.setProperty("hadoop.home.dir", "/opt/client/HDFS/hadoop");
4. Error 4: javax.security.auth.login.LoginException: Unable to obtain password from user
    Resolved; approaches tried:
    a. Copy cacerts from /opt/client/JDK/jdk/jre/lib/security/ to /home/run/jre/lib/security/ to replace the original certificate store.
    b. Copy local_policy.jar from /opt/client/JDK/jdk/jre/lib/security to /home/run/jre/lib/security/.
    c. Adjust the component_env configuration under the client HDFS directory.
    d. Copy US_export_policy.jar from /opt/client/JDK/jdk/jre/lib/security to /home/run/jre/lib/security/ to enable AES 192/256-bit encryption and decryption.
    e. Add JVM startup parameters in /tomcat/bin/catalina.bat:
        set JAVA_OPTS=-Dhttps.protocols="TLSv1.1,TLSv1.2"
        set JAVA_OPTS=-Djdk.tls.client.protocols="TLSv1.1,TLSv1.2"
    f. Set the same JVM parameters in code:
        System.setProperty("https.protocols", "TLSv1.1,TLSv1.2");
        System.setProperty("jdk.tls.client.protocols", "TLSv1.1,TLSv1.2");
    g. Download local_policy.jar and US_export_policy.jar from CSDN and replace the same jars under /home/run/jre/lib/security/, enabling AES 192/256-bit encryption and decryption (the sketch after this list includes a quick check that these policy jars took effect).
5. Error 5: javax.security.auth.login.LoginException: ICMP Port Unreachable
    Resolved; approaches tried:
    a. Manually add an entry for "192.168.13.7" to /etc/hosts.
    b. Check krb5.conf in the Hadoop client and set the java.security.krb5.kdc host and port according to its realms section.
    c. Remove the startup parameters
        System.setProperty("java.security.krb5.kdc","192.168.13.7:21732");
        System.setProperty("java.security.krb5.realm","HADOOP.com");
6. Error 6: java.lang.NoClassDefFoundError: org/apache/htrace/SamplerBuilder
    Resolved by importing htrace-core-3.1.0-incubating.jar (about 0.5 hours).
7. Error 7: java.lang.NoClassDefFoundError: org/apache/commons/cli/ParseException
    Imported commons-cli-1.2.jar.
    Checked for other missing jars and imported everything this case still lacked (about 1.5 hours).
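
The fixes for errors 2 and 3 reduce to two JVM system properties, and error 4 comes down to whether the JRE actually in use allows 256-bit AES. A minimal pre-flight sketch, assuming the paths from this environment (/tmp/hcweb/krb5.conf and /opt/client/HDFS/hadoop); adjust them for yours:

import javax.crypto.Cipher;

public class KerberosPreflight {
    public static void main(String[] args) throws Exception {
        // Point the JVM at the Hadoop client's krb5.conf
        // (this replaces java.security.krb5.kdc / java.security.krb5.realm, see error 2).
        System.setProperty("java.security.krb5.conf", "/tmp/hcweb/krb5.conf");
        // Avoid "HADOOP_HOME or hadoop.home.dir are not set" (error 3).
        System.setProperty("hadoop.home.dir", "/opt/client/HDFS/hadoop");

        // Error 4: if this prints a value below 256, the unlimited-strength policy jars
        // (local_policy.jar / US_export_policy.jar) are not in the JRE actually being used.
        System.out.println("Max AES key length: " + Cipher.getMaxAllowedKeyLength("AES"));
    }
}
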
  * Login succeeds and the FileSystem is initialized; the code is as follows:


package com.talkweb.huicai.io;

import java.io.File;
import java.io.FileNotFoundException;
import java.io.IOException;

import com.talkweb.huicai.common.PropertyUtil;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.BlockLocation;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.LocatedFileStatus;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.fs.RemoteIterator;
import org.apache.hadoop.security.UserGroupInformation;
import org.springframework.stereotype.Service;

import javax.annotation.Resource;

/**
 * Hadoop-related file transfer methods
 *
 * @author zhaoxinyu
 * @create 2018/02/02
 */
@Service("fileSystemHadoop")
public class FileSystemHadoop {

    @Resource(name = "propertyUtil")
    private PropertyUtil propertyUtil;
    /**
     * Base path for files stored on HDFS (resolved in init(), after propertyUtil has been injected)
     */
    public String hadoopBasePath;
    /**
     * Local base path for generated files (resolved in init())
     */
    public String localBasePath;

    /**
     * Absolute path of the configuration files
     */
    private static String CONFIG_PATH = "/tmp/hcweb/";

    private static final String PRINCIPAL = "username.client.kerberos.principal";
    private static final String KEYTAB = "username.client.keytab.file";
    private static final String KRBFILE = "java.security.krb5.conf";
    
    private static String HDFS_SITE_PATH = CONFIG_PATH+"hdfs-site.xml";
    private static String CORE_SITE_PATH = CONFIG_PATH+"core-site.xml";
    private static String USER_KEYTAB_PATH = CONFIG_PATH+"hwcdm_user.keytab";
    private static String KRB5_CONF_PATH = CONFIG_PATH+"krb5.conf";
    
    private static Configuration conf;
    private static FileSystem fileSystem;

    private static String PRNCIPAL_NAME = "hwcdm@HADOOP.COM";
    /**
     * Initialize: obtain a FileSystem instance
     * @throws IOException
     */
    public void init() throws IOException {
            // Resolve the property-based paths here instead of in field initializers,
            // because @Resource injection has not yet happened when the fields are initialized.
            hadoopBasePath = propertyUtil.getProperty("hadoopBasePath");
            localBasePath = propertyUtil.getProperty("localBasePath");
            confLoad();
            authentication();
            instanceBuild();
    }
    /**
     * Load the configuration files
     */
    public void confLoad() throws IOException {
        conf = new Configuration();
        conf.addResource(new Path(HDFS_SITE_PATH));
        conf.addResource(new Path(CORE_SITE_PATH));
    }

    /**
     * kerberos security authentication
     */
    public void authentication() throws IOException{
        if ("kerberos".equalsIgnoreCase(conf.get("hadoop.security.authentication"))) {
            System.setProperty("java.security.krb5.conf",KRB5_CONF_PATH);
            System.setProperty("hadoop.home.dir","/opt/client/HDFS/hadoop");
            System.setProperty("sun.security.krb5.debug", "true");
            System.setProperty("https.protocols", "TLSv1.1,TLSv1.2");
            System.setProperty("jdk.tls.client.protocols", "TLSv1.1,TLSv1.2");
            
            System.out.println("https.protocols == "+System.getProperty("https.protocols"));
            System.out.println("jdk.tls.client.protocols == "+System.getProperty("jdk.tls.client.protocols"));
            conf.set(PRINCIPAL, PRNCIPAL_NAME);
            conf.set(KEYTAB, USER_KEYTAB_PATH);
            conf.set("hdfs.connection.timeout","5000");
            System.setProperty(KRBFILE, KRB5_CONF_PATH);
            UserGroupInformation.setConfiguration(conf);
            try {
                UserGroupInformation.loginUserFromKeytab(conf.get(PRINCIPAL), conf.get(KEYTAB));
                System.out.println("Login success!!!!!!!!!!!!!!");
            } catch (IOException e) {
                e.printStackTrace();
            }
        }
    }

    /**
     * build HDFS instance
     */
    public void instanceBuild() throws IOException{
        fileSystem = FileSystem.get(conf);
    }

    /**
     * Upload a file to HDFS
     */
    public void addFileToHdfs(String localPath,String fileName) throws Exception {
        Path src = new Path(localPath);
        boolean isExists = fileSystem.exists(new Path(hadoopBasePath));
        if(!isExists){
            fileSystem.mkdirs(new Path(hadoopBasePath));
        }
        Path dst = new Path(hadoopBasePath+ File.separator +fileName);
        fileSystem.copyFromLocalFile(src, dst);
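        // Note: this closes the shared FileSystem, so init() must be called again before any further HDFS operation.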
        fileSystem.close();
    }

    /**
     * Copy a file from HDFS to the local file system
     */
    public void downloadFileToLocal(String resultPath) throws IllegalArgumentException, IOException {
        fileSystem.copyToLocalFile(new Path(resultPath), new Path(localBasePath));
        fileSystem.close();
    }

    /**
     * Delete a file
     */
    public static void delete(String filePath) throws IOException{
        Path path = new Path(filePath);
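        // deleteOnExit() returns true once the path is marked for deletion; the file is actually removed when fileSystem.close() runs below.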
        boolean isok = fileSystem.deleteOnExit(path);
        if(isok){
            System.out.println("delete ok!");
        }else{
            System.out.println("delete failure");
        }
        fileSystem.close();
    }

    /**
     * List directory contents, showing files only
     */
    public void listFiles() throws FileNotFoundException, IllegalArgumentException, IOException {
        System.out.println("-------------- directory listing, files only --------------");
        RemoteIterator<LocatedFileStatus> listFiles = fileSystem.listFiles(new Path("/"), true);

        while (listFiles.hasNext()) {
            LocatedFileStatus fileStatus = listFiles.next();

            System.out.println(fileStatus.getPath().getName());
            System.out.println(fileStatus.getBlockSize());
            System.out.println(fileStatus.getPermission());
            System.out.println(fileStatus.getLen());
            BlockLocation[] blockLocations = fileStatus.getBlockLocations();
            for (BlockLocation bl : blockLocations) {
                System.out.println("block-length:" + bl.getLength() + "--" + "block-offset:" + bl.getOffset());
                String[] hosts = bl.getHosts();
                for (String host : hosts) {
                    System.out.println(host);
                }
            }
            System.out.println("--------------分割线--------------");
        }
    }

    /**
     * List both files and directories
     */
    public void listAll() throws FileNotFoundException, IllegalArgumentException, IOException {
        System.out.println("-------------- files and directories --------------");
        FileStatus[] listStatus = fileSystem.listStatus(new Path("/"));

        for (FileStatus fstatus : listStatus) {
            // Label each entry per iteration as file (f--) or directory (d--)
            String flag = fstatus.isFile() ? "f--         " : "d--             ";
            System.out.println(flag + fstatus.getPath().getName());
        }
    }
}  
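
For reference, a rough calling sketch; assumptions: the bean is wired up by Spring under the name "fileSystemHadoop" as declared above, "applicationContext.xml" stands in for the project's real Spring configuration, and the file names are made up:

import org.springframework.context.ApplicationContext;
import org.springframework.context.support.ClassPathXmlApplicationContext;

import com.talkweb.huicai.io.FileSystemHadoop;

public class FileSystemHadoopDemo {
    public static void main(String[] args) throws Exception {
        // "applicationContext.xml" is a placeholder for the project's actual Spring config file.
        ApplicationContext ctx = new ClassPathXmlApplicationContext("applicationContext.xml");
        FileSystemHadoop hdfs = (FileSystemHadoop) ctx.getBean("fileSystemHadoop");

        hdfs.init();      // load hdfs-site.xml/core-site.xml, run the Kerberos login, build the FileSystem
        hdfs.listAll();   // sanity check: list "/" after login
        hdfs.addFileToHdfs("/tmp/hcweb/report.csv", "report.csv");   // hypothetical local file
        // addFileToHdfs() closes the FileSystem, so call init() again before further operations.
    }
}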

The pom.xml is shown below; the HDFS jars are local (all of the com.huicai artifacts were installed into the local Maven repository by hand).


<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
  xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
  <modelVersion>4.0.0</modelVersion>
  <groupId>huicai</groupId>
  <artifactId>huicai</artifactId>
  <packaging>war</packaging>
  <version>1.0-SNAPSHOT</version>
  <name>huicai Maven Webapp</name>
  <url>http://maven.apache.org</url>

  <properties>
    <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
    <junit.version>4.12</junit.version>
    <spring.version>3.2.3.RELEASE</spring.version>
    <mybatis-spring.version>1.2.1</mybatis-spring.version>
    <mybatis.version>3.2.6</mybatis.version>
    <jackson.version>1.9.3</jackson.version>
    <servlet.version>2.5</servlet.version>
    <slf4j.version>1.7.6</slf4j.version>
    <commons-dbcp.version>1.4</commons-dbcp.version>
    <mysql.version>5.1.45</mysql.version>
    <gson.version>2.4</gson.version>
    <hadoop.version>2.7.2</hadoop.version>
  </properties>

  <dependencies>
    <!-- 远程调用shell依赖包 -->
    <dependency>
      <groupId>org.jvnet.hudson</groupId>
      <artifactId>ganymed-ssh2</artifactId>
      <version>build210-hudson-1</version>
    </dependency>
    <!-- test -->
    <dependency>
      <groupId>junit</groupId>
      <artifactId>junit</artifactId>
      <version>${junit.version}</version>
      <scope>test</scope>
    </dependency>

    <!-- servlet -->
    <dependency>
      <groupId>javax.servlet</groupId>
      <artifactId>servlet-api</artifactId>
      <version>${servlet.version}</version>
      <scope>provided</scope>
    </dependency>

    <!-- spring -->
    <dependency>
      <groupId>org.springframework</groupId>
      <artifactId>spring-orm</artifactId>
      <version>${spring.version}</version>
    </dependency>
    <dependency>
      <groupId>org.springframework</groupId>
      <artifactId>spring-context-support</artifactId>
      <version>${spring.version}</version>
    </dependency>
    <dependency>
      <groupId>org.springframework</groupId>
      <artifactId>spring-beans</artifactId>
      <version>${spring.version}</version>
    </dependency>
    <dependency>
      <groupId>org.springframework</groupId>
      <artifactId>spring-context</artifactId>
      <version>${spring.version}</version>
    </dependency>
    <dependency>
      <groupId>org.springframework</groupId>
      <artifactId>spring-webmvc</artifactId>
      <version>${spring.version}</version>
    </dependency>
    <dependency>
      <groupId>com.jcraft</groupId>
      <artifactId>jsch</artifactId>
      <version>0.1.53</version>
    </dependency>

    <dependency>
      <groupId>javax.servlet</groupId>
      <artifactId>jstl</artifactId>
      <version>1.2</version>
    </dependency>

    <!-- aspect -->
    <dependency>
      <groupId>org.aspectj</groupId>
      <artifactId>aspectjweaver</artifactId>
      <version>1.7.4</version>
    </dependency>

    <!-- mybatis -->
    <dependency>
      <groupId>org.mybatis</groupId>
      <artifactId>mybatis-spring</artifactId>
      <version>${mybatis-spring.version}</version>
    </dependency>
    <dependency>
      <groupId>org.mybatis</groupId>
      <artifactId>mybatis</artifactId>
      <version>${mybatis.version}</version>
    </dependency>

    <!-- json -->
    <dependency>
      <groupId>org.codehaus.jackson</groupId>
      <artifactId>jackson-core-asl</artifactId>
      <version>${jackson.version}</version>
    </dependency>
    <dependency>
      <groupId>org.codehaus.jackson</groupId>
      <artifactId>jackson-mapper-asl</artifactId>
      <version>${jackson.version}</version>
    </dependency>
    <dependency>
      <groupId>com.fasterxml.jackson.core</groupId>
      <artifactId>jackson-databind</artifactId>
      <version>2.4.4</version>
    </dependency>

    <!-- log -->
    <dependency>
      <groupId>org.slf4j</groupId>
      <artifactId>slf4j-api</artifactId>
      <version>${slf4j.version}</version>
    </dependency>
    <dependency>
      <groupId>org.slf4j</groupId>
      <artifactId>slf4j-log4j12</artifactId>
      <version>${slf4j.version}</version>
    </dependency>
    
    <!-- dbcp -->
    <dependency>
      <groupId>commons-dbcp</groupId>
      <artifactId>commons-dbcp</artifactId>
      <version>${commons-dbcp.version}</version>
    </dependency>
    <!-- fileupload -->
    <dependency>
      <groupId>commons-fileupload</groupId>
      <artifactId>commons-fileupload</artifactId>
      <version>1.2.2</version>
    </dependency>
    <dependency>
      <groupId>commons-io</groupId>
      <artifactId>commons-io</artifactId>
      <version>2.4</version>
    </dependency>

    <!-- mysql -->
    <dependency>
      <groupId>mysql</groupId>
      <artifactId>mysql-connector-java</artifactId>
      <version>${mysql.version}</version>
    </dependency>

    <!--hadoop -->
    <dependency>
      <groupId>com.huicai</groupId>
      <artifactId>hadoop-annotations</artifactId>
      <version>2.7.2</version>
    </dependency>
    <dependency>
      <groupId>com.huicai</groupId>
      <artifactId>hadoop-auth</artifactId>
      <version>2.7.2</version>
    </dependency>
    <dependency>
      <groupId>com.huicai</groupId>
      <artifactId>hadoop-client</artifactId>
      <version>2.7.2</version>
    </dependency>
    <dependency>
      <groupId>com.huicai</groupId>
      <artifactId>hadoop-common</artifactId>
      <version>2.7.2</version>
    </dependency>
    <dependency>
      <groupId>com.huicai</groupId>
      <artifactId>hadoop-hdfs</artifactId>
      <version>2.7.2</version>
    </dependency>
    <dependency>
      <groupId>com.huicai</groupId>
      <artifactId>hadoop-hdfs-client</artifactId>
      <version>2.7.2</version>
    </dependency>
    <dependency>
      <groupId>com.huicai</groupId>
      <artifactId>hadoop-hdfs-colocation</artifactId>
      <version>2.7.2</version>
    </dependency>
    
    <!-- hdfs-jetty-util -->
    <dependency>
      <groupId>com.huicai</groupId>
      <artifactId>jetty-util</artifactId>
      <version>6.1.26</version>
    </dependency>
    
    <!-- hdfs-jetty -->
    <dependency>
      <groupId>com.huicai</groupId>
      <artifactId>jetty</artifactId>
      <version>6.1.26</version>
    </dependency>
    
    <!-- hdfs-jersey-server -->
    <dependency>
      <groupId>com.huicai</groupId>
      <artifactId>jersey-server</artifactId>
      <version>1.9</version>
    </dependency>
    
    <!-- hdfs-jersey-core -->
    <dependency>
      <groupId>com.huicai</groupId>
      <artifactId>jersey-core</artifactId>
      <version>1.9</version>
    </dependency>
    
    <!-- hdfs-commons-daemon -->
    <dependency>
      <groupId>com.huicai</groupId>
      <artifactId>commons-daemon</artifactId>
      <version>1.0.13</version>
    </dependency>

    <!-- hdfs-inode-provider -->
    <dependency>
      <groupId>com.huicai</groupId>
      <artifactId>hdfs-inode-provider</artifactId>
      <version>2.7.2</version>
    </dependency>

    <!-- hdfs-javaluator -->
    <dependency>
      <groupId>com.huicai</groupId>
      <artifactId>javaluator</artifactId>
      <version>3.0.1</version>
    </dependency>
    
    <!-- hdfs-commons-configuration -->
    <dependency>
      <groupId>commons-configuration</groupId>
      <artifactId>commons-configuration</artifactId>
      <version>1.7</version>
    </dependency>

    <!-- hdfs-htrace-core -->
    <dependency>
      <groupId>com.huicai</groupId>
      <artifactId>htrace-core</artifactId>
      <version>3.1.0</version>
    </dependency>

    <!-- hdfs-commons-codec -->
    <dependency>
      <groupId>com.huicai</groupId>
      <artifactId>commons-codec</artifactId>
      <version>1.4</version>
    </dependency>
    
    <!-- hdfs-commons-cli -->
    <dependency>
      <groupId>com.huicai</groupId>
      <artifactId>commons-cli</artifactId>
      <version>1.2</version>
    </dependency>

    <!-- hdfs-asm -->
    <dependency>
      <groupId>com.huicai</groupId>
      <artifactId>asm</artifactId>
      <version>3.2</version>
    </dependency>

    <!-- hdfs-jsr305 -->
    <dependency>
      <groupId>com.huicai</groupId>
      <artifactId>jsr305</artifactId>
      <version>3.0.0</version>
    </dependency>

    <!-- hdfs-leveldbjni-all -->
    <dependency>
      <groupId>com.huicai</groupId>
      <artifactId>leveldbjni-all</artifactId>
      <version>1.8</version>
    </dependency>

    <!-- hdfs-log4j -->
    <dependency>
      <groupId>com.huicai</groupId>
      <artifactId>log4j</artifactId>
      <version>1.2.17</version>
    </dependency>

    <!-- hdfs-netty -->
    <dependency>
      <groupId>com.huicai</groupId>
      <artifactId>netty</artifactId>
      <version>3.6.2</version>
    </dependency>

    <!-- hdfs-netty-all -->
    <dependency>
      <groupId>com.huicai</groupId>
      <artifactId>netty-all</artifactId>
      <version>4.0.23</version>
    </dependency>

    <!-- hdfs-protobuf-java -->
    <dependency>
      <groupId>com.huicai</groupId>
      <artifactId>protobuf-java</artifactId>
      <version>2.5.0</version>
    </dependency>

    <!-- hdfs-xercesImpl -->
    <dependency>
      <groupId>com.huicai</groupId>
      <artifactId>xercesImpl</artifactId>
      <version>2.9.1</version>
    </dependency>

    <!-- hdfs-xml-apis -->
    <dependency>
      <groupId>com.huicai</groupId>
      <artifactId>xml-apis</artifactId>
      <version>1.3.04</version>
    </dependency>

    <!-- hdfs-xmlenc -->
    <dependency>
      <groupId>com.huicai</groupId>
      <artifactId>xmlenc</artifactId>
      <version>0.52</version>
    </dependency>

    <!-- greenplum -->
    <dependency>
      <groupId>com.huicai</groupId>
      <artifactId>greenplum</artifactId>
      <version>1.0</version>
    </dependency>

    <!-- sso -->
    <dependency>
      <groupId>com.huicai</groupId>
      <artifactId>sso-client</artifactId>
      <version>1.0</version>
    </dependency>
    <dependency>
      <groupId>com.huicai</groupId>
      <artifactId>sso-common</artifactId>
      <version>1.0</version>
    </dependency>
    <dependency>
      <groupId>com.huicai</groupId>
      <artifactId>commons-lang</artifactId>
      <version>1.0</version>
    </dependency>
    <dependency>
      <groupId>com.huicai</groupId>
      <artifactId>jdom</artifactId>
      <version>1.0</version>
    </dependency>
    <dependency>
      <groupId>com.huicai</groupId>
      <artifactId>jdom</artifactId>
      <version>1.0</version>
    </dependency>
    <dependency>
      <groupId>com.huicai</groupId>
      <artifactId>commons-logging</artifactId>
      <version>1.0</version>
    </dependency>
    
    <!-- json -->
    <dependency>
      <groupId>net.sf.json-lib</groupId>
      <artifactId>json-lib</artifactId>
      <version>2.4</version>
      <classifier>jdk15</classifier>
    </dependency>
    <dependency>
      <groupId>org.json</groupId>
      <artifactId>json</artifactId>
      <version>20160212</version>
    </dependency>
    <!-- gson -->
    <dependency>
      <groupId>com.google.code.gson</groupId>
      <artifactId>gson</artifactId>
      <version>${gson.version}</version>
    </dependency>
    
    <!-- guava -->
    <dependency>
      <groupId>com.google.guava</groupId>
      <artifactId>guava</artifactId>
      <version>18.0</version>
    </dependency>
    
    <!--commons-httpclient -->
    <dependency>
      <groupId>commons-httpclient</groupId>
      <artifactId>commons-httpclient</artifactId>
      <version>3.1</version>
    </dependency>
    
    <!-- poi -->
    <dependency>
      <groupId>org.apache.poi</groupId>
      <artifactId>poi</artifactId>
      <version>3.9</version>
    </dependency>
    <dependency>
      <groupId>org.apache.poi</groupId>
      <artifactId>poi-ooxml</artifactId>
      <version>3.9</version>
    </dependency>
    <dependency>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-surefire-plugin</artifactId>
      <version>2.12.4</version>
    </dependency>
    <!-- quartz -->
    <dependency>
      <groupId>org.quartz-scheduler</groupId>
      <artifactId>quartz</artifactId>
      <version>1.8.5</version>
    </dependency>
    <dependency>
      <groupId>org.apache.httpcomponents</groupId>
      <artifactId>httpclient</artifactId>
      <version>4.5</version>
    </dependency>
  </dependencies>

  <build>
    <finalName>huicai</finalName>
    <plugins>
      <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-compiler-plugin</artifactId>
        <configuration>
          <source>1.7</source>
          <target>1.7</target>
        </configuration>
      </plugin>
    </plugins>
    <resources>
      <resource>
        <directory>src/main/java</directory>
        <includes>
          <include>**/*.xml</include>
          <!-- 将源码包中的xml文件打包 -->
        </includes>
        <filtering>true</filtering>
      </resource>
      <resource>
        <directory>src/main/resources</directory>
        <includes>
          <!-- 将src/main/resources目录下的properties、xml和json文件打包 -->
          <include>**/*.properties</include>
          <include>**/*.xml</include>
          <include>**/*.json</include>
        </includes>
        <filtering>true</filtering>
      </resource>
    </resources>
  </build>
</project>  

Reposted from: https://my.oschina.net/dreambreeze/blog/1811269
