Table of Contents

1. Basic Concepts

2. Installing the Elasticsearch Server

3. A Quick Look at the HTTP REST API

4. Integrating with Spring Boot via the RestHighLevelClient

5. What's Next


There are plenty of Elasticsearch tutorials online, but most are dated - even parts of the official documentation lag behind - and because Elasticsearch keeps discarding old concepts as it evolves, with new major versions not backward compatible with old ones, those older tutorials can be very confusing for newcomers.

This article uses the current Elasticsearch release, 7.10.2, together with Spring Boot 2.x, and walks through the HTTP REST API plus a set of common operations wrapped around the RestHighLevelClient, as a reference for people just getting started.

1. Basic Concepts

This article focuses on basic usage of Elasticsearch; for its features and underlying concepts, see the official documentation or other articles. Only the concepts relevant here are summarized below.

  • index
    Before 7.0 an index was roughly analogous to a database. Since 7.0 removed types (or rather, only the single type _doc remains), an index is now closer to a database table.
  • document
    Analogous to a row in a table.
  • mapping
    Analogous to a table schema. Elasticsearch supports dynamic mapping; the dynamic property accepts three values: true, false and strict.
  • field
    Analogous to a column; field types are covered later.

2. Installing the Elasticsearch Server

Download the latest release for your operating system from the Elasticsearch website and unpack it - no extra configuration is needed. Elasticsearch runs on Java; since 7.0 the distribution bundles its own JDK, so a separate Java installation is only required if you want to run it on your own JVM.

Windows example: unpack to D:\Apps\elasticsearch-7.10.2

Start it by running .\elasticsearch.bat in the bin directory. The default HTTP port is 9200 and the transport (TCP) port is 9300.

Once it is up, open localhost:9200 in a browser; you should get a JSON response with the node name, cluster name and version information.

To install the IK analyzer, download the plugin release matching your Elasticsearch version from GitHub (https://github.com/medcl/elasticsearch-analysis-ik), unpack it into the elasticsearch/plugins directory and restart Elasticsearch.

3. A Quick Look at the HTTP REST API

elasticsearch.rest ({{esClusterNode}} is a placeholder for a cluster node, e.g. http://localhost:9200)

### View an index ${index}
GET {{esClusterNode}}/index_user
Accept: application/json

### Create an index ${index} with explicit settings and mappings; the address field is analyzed with the IK analyzer.
### analyzer vs. search_analyzer:
### At index time, Elasticsearch checks whether the field defines analyzer; if not, it falls back to the default analyzer.
### At query time, it first checks whether search_analyzer is set; if not, it falls back to the analyzer
### defined at index creation, and only if neither is set does it use the default analyzer.
### Analyzers are used in two situations:
### 1. When a document is indexed, text fields are tokenized before being written to the inverted index - this uses the analyzer.
### 2. When a query runs, the text input is tokenized before the inverted index is searched - this uses the search_analyzer.
PUT {{esClusterNode}}/index_user
Content-Type: application/json

{
  "settings": {
    "number_of_shards": 3,
    "number_of_replicas": 2
  },
  "mappings": {
    "properties": {
      "id": {"type": "keyword"},
      "name": {"type": "keyword"},
      "age": {"type": "integer"},
      "sex": {"type": "keyword"},
      "email": {"type": "keyword"},
      "address": {
        "type": "text",
        "analyzer": "ik_max_word",
        "search_analyzer": "ik_max_word"
      }
    }
  }
}

### Delete an index ${index}
DELETE {{esClusterNode}}/index_user
Content-Type: application/x-www-form-urlencoded

### View a mapping ${index}/_mapping
GET {{esClusterNode}}/index_user/_mapping
Content-Type: application/x-www-form-urlencoded

### Add to a mapping ${index}/_mapping
PUT {{esClusterNode}}/index_user/_mapping
Content-Type: application/json

{
  "properties": {
    "score": {"type": "float"}
  }
}

### Create a document ${index}/_doc with an auto-generated ID (not recommended); "ID" here means the built-in _id, not the custom id field
POST {{esClusterNode}}/index_user/_doc
Content-Type: application/json

{
  "id": "100001",
  "name": "张三",
  "age": 23,
  "sex": "男",
  "email": "zhangsan@163.com",
  "address": "广东省深圳市龙华新区民治工业园29号"
}

### Search ${index}/_search - all documents
GET {{esClusterNode}}/index_user/_search
Content-Type: application/json

{
  "query": {"match_all": {}}
}

### Get a document ${index}/_doc/${_id}
GET {{esClusterNode}}/index_user/_doc/-MVMlXcBnpDI88FS39oA
Accept: application/json

### Create a document ${index}/_doc/${_id} with an explicit ID (recommended)
POST {{esClusterNode}}/index_user/_doc/100002
Content-Type: application/json

{
  "id": "100002",
  "name": "李四",
  "age": 45,
  "sex": "男",
  "email": "lisi@163.com",
  "address": "广东省深圳市龙华新区民治工业园27号"
}

### Get a document ${index}/_doc/${_id}
GET {{esClusterNode}}/index_user/_doc/100002
Accept: application/json

### Update a document ${index}/_doc/${_id}; if the ID does not exist, the document is created
PUT {{esClusterNode}}/index_user/_doc/100002
Content-Type: application/json

{
  "id": "100002",
  "name": "李四",
  "age": 35,
  "sex": "男",
  "email": "lisi666@163.com",
  "address": "广东省广州市越秀区中心工业园81号"
}

### Delete a document ${index}/_doc/${_id}
DELETE {{esClusterNode}}/index_user/_doc/6WoDdncBA_HwaxaTfHFC
Content-Type: application/json

### Search ${index}/_search - all documents
GET {{esClusterNode}}/index_user/_search
Content-Type: application/json

{
  "query": {"match_all": {}}
}

### Search ${index}/_search - match on a text field
GET {{esClusterNode}}/index_user/_search
Content-Type: application/json

{
  "query": {"match": {"address": "广东省"}}
}

### Search ${index}/_search - match on a numeric field
GET {{esClusterNode}}/index_user/_search
Content-Type: application/json

{
  "query": {"match": {"age": "23"}}
}

### Search ${index}/_search - the query string is analyzed too: the input 深圳市 is tokenized into 深圳市 / 深圳 / 市 before searching
GET {{esClusterNode}}/index_user/_search
Content-Type: application/json

{
  "query": {"match": {"address": "深圳市"}}
}

### Search ${index}/_search - fuzzy match query with highlighting
GET {{esClusterNode}}/index_user/_search
Content-Type: application/json

{
  "query": {"match": {"address": "广东省深圳市龙华新区民治工业园"}},
  "highlight": {"fields": {"address": {}}}
}

### Search ${index}/_search - exact term query with highlighting
GET {{esClusterNode}}/index_user/_search
Content-Type: application/json

{
  "query": {"term": {"address": "广州"}},
  "highlight": {"fields": {"address": {}}}
}

### View term vectors (tokenization result) GET /${index}/_doc/${id}/_termvectors?fields=${fields_name}
GET {{esClusterNode}}/index_user/_doc/100002/_termvectors?fields=address
Accept: application/json

4. Integrating with Spring Boot via the RestHighLevelClient

1. Add the Elasticsearch and RestHighLevelClient dependencies to the Spring Boot project's pom.xml

        <dependency>
            <groupId>org.elasticsearch.client</groupId>
            <artifactId>elasticsearch-rest-high-level-client</artifactId>
            <version>${elasticsearch.version}</version>
        </dependency>
        <dependency>
            <groupId>org.elasticsearch.client</groupId>
            <artifactId>elasticsearch-rest-client</artifactId>
            <version>${elasticsearch.version}</version>
        </dependency>
        <dependency>
            <groupId>org.elasticsearch</groupId>
            <artifactId>elasticsearch</artifactId>
            <version>${elasticsearch.version}</version>
        </dependency>

Use 7.10.2 for all of them. spring-boot-starter-data-elasticsearch is not recommended here, because the starter makes it hard to control the version of Elasticsearch and its transitive dependencies.

The elasticsearch-rest-high-level-client high-level client depends on the elasticsearch-rest-client low-level client. Since compatibility between Elasticsearch versions is poor, pin elasticsearch, elasticsearch-rest-high-level-client and elasticsearch-rest-client - and, if you use them later, the IK analyzer, Kibana and Logstash - to the same version number to avoid unexpected problems.
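The ${elasticsearch.version} property referenced by the dependencies above can be pinned once in the POM, for example:

```xml
<properties>
    <!-- keep every Elasticsearch artifact on the same version -->
    <elasticsearch.version>7.10.2</elasticsearch.version>
</properties>
```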

2. Configuring the RestHighLevelClient

  • Add the cluster configuration to the application properties.
    Separate multiple cluster nodes with commas, e.g. cluster-nodes: localhost:9200,localhost:9201

# Elasticsearch
elasticsearch:
  cluster-nodes: localhost:9200

  • RestHighLevelClient bean configuration.
    The Elasticsearch client already load-balances requests across the configured nodes; documentation on exactly how it does so is sparse.
    ElasticSearchConfig.java
import lombok.extern.slf4j.Slf4j;
import org.apache.http.HttpHost;
import org.elasticsearch.client.RestClient;
import org.elasticsearch.client.RestHighLevelClient;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

import java.util.Arrays;

/**
 * Elasticsearch configuration.
 *
 * @author Aylvn
 * @date 2021-02-03
 */
@Slf4j
@Configuration
public class ElasticSearchConfig {

    @Value("${elasticsearch.cluster-nodes}")
    private String clusterNodes;

    @Bean
    public RestHighLevelClient restHighLevelClient() {
        String[] nodes = clusterNodes.split(",");
        return new RestHighLevelClient(RestClient.builder(
                Arrays.stream(nodes).map(this::buildHttpHost).toArray(HttpHost[]::new)));
    }

    private HttpHost buildHttpHost(String node) {
        String[] nodeInfo = node.split(":");
        return new HttpHost(nodeInfo[0].trim(), Integer.parseInt(nodeInfo[1].trim()), "http");
    }
}
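To verify at startup that the client can actually reach the cluster, a small check can be added; this is an illustrative sketch (the ElasticSearchHealthCheck class and esPing bean name are not from the original project), using the client's ping() method, which issues a lightweight request and returns true when the cluster answers:

```java
import lombok.extern.slf4j.Slf4j;
import org.elasticsearch.client.RequestOptions;
import org.elasticsearch.client.RestHighLevelClient;
import org.springframework.boot.CommandLineRunner;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

/**
 * Illustrative startup check, assuming ElasticSearchConfig above has
 * registered a RestHighLevelClient bean. Requires a running cluster.
 */
@Slf4j
@Configuration
public class ElasticSearchHealthCheck {

    @Bean
    public CommandLineRunner esPing(RestHighLevelClient client) {
        return args -> {
            // ping() returns true if the cluster responded
            boolean up = client.ping(RequestOptions.DEFAULT);
            log.info("Elasticsearch reachable: {}", up);
        };
    }
}
```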

3. Common wrapper classes

Index operations
Elasticsearch exchanges data as JSON. Because the fields are not fixed in advance, the code below uses Map<String, Object> key-value pairs as the parameter type when talking to Elasticsearch.
Constants: EsConstant.java


import org.elasticsearch.client.RequestOptions;

public class EsConstant {

    public static final RequestOptions DEFAULT_OPTIONS = RequestOptions.DEFAULT;

    public static final String ID = "id";
    public static final String INDEX_MAPPING_PROPERTIES = "properties";
    public static final String INDEX_MAPPING_DYNAMIC = "dynamic";
    public static final String INDEX_MAPPING_DYNAMIC_STRICT = "strict";
    public static final String INDEX_COMPANY = "index_company";
    public static final String DATE_FORMAT = "format";
    public static final String DATE_FORMAT_PATTERN =
            "yyyy-MM-dd HH:mm:ss||yyyy-MM-dd||yyyy/MM/dd HH:mm:ss||yyyy/MM/dd||epoch_millis";

    private EsConstant() {
    }
}

Mapping property definitions; text fields default to the IK analyzer with ik_max_word tokenization: MappingProperty.java


import java.util.HashMap;

import static com.aylvn.bn.elasticsearch.constant.EsConstant.DATE_FORMAT;
import static com.aylvn.bn.elasticsearch.constant.EsConstant.DATE_FORMAT_PATTERN;

public class MappingProperty extends HashMap<String, Object> {

    /**
     * text:
     * 1. Analyzed; supports full-text search with fuzzy and exact queries, but not aggregations or sorting.
     * 2. No practical limit on character length, so it suits large fields.
     * Use it for full-text data such as e-mail bodies, addresses, code blocks or article content.
     * By default the standard analyzer tokenizes the value into the inverted index,
     * and matches are scored by term frequency/relevance.
     */
    public static final String TEXT = "text";

    /**
     * keyword:
     * 1. Not analyzed; indexed as-is. Supports fuzzy and exact matching, aggregations and sorting.
     * 2. Limited to 32766 UTF-8 bytes; ignore_above can cap the indexed length - anything beyond it
     *    is not indexed and cannot be returned by a term query.
     * Use it for e-mail addresses, URLs, names, titles, phone numbers, host names, status codes,
     * postal codes, tags, age, gender, etc.; for filtering (e.g. select * from x where status = 'open'),
     * sorting and aggregations. The complete value is stored in the inverted index.
     */
    public static final String KEYWORD = "keyword";

    public static final String BYTE = "byte";
    public static final String SHORT = "short";
    public static final String INTEGER = "integer";
    public static final String LONG = "long";
    public static final String FLOAT = "float";
    public static final String HALF_FLOAT = "half_float";
    public static final String SCALED_FLOAT = "scaled_float";
    public static final String DOUBLE = "double";
    public static final String DATE = "date";
    public static final String INTEGER_RANGE = "integer_range";
    public static final String LONG_RANGE = "long_range";
    public static final String FLOAT_RANGE = "float_range";
    public static final String DOUBLE_RANGE = "double_range";
    public static final String DATE_RANGE = "date_range";
    public static final String BOOLEAN = "boolean";
    public static final String BINARY = "binary";
    public static final String OBJECT = "object";
    public static final String IP = "ip";

    /**
     * Different analyzers can be used at index time and at query time.
     * At index time Elasticsearch checks the field's analyzer and falls back to the default if unset.
     * At query time it checks search_analyzer first, then the field's analyzer, then the default.
     * Analyzers are used in two situations:
     * 1. Indexing: text fields are tokenized before being written to the inverted index (analyzer).
     * 2. Querying: the text input is tokenized before the inverted index is searched (search_analyzer).
     */
    public static final String ANALYZER = "analyzer";
    public static final String SEARCH_ANALYZER = "search_analyzer";

    /**
     * Coarsest-grained tokenization, e.g. 中华人民共和国 stays 中华人民共和国.
     */
    public static final String ANALYZER_IK_SMART = "ik_smart";

    /**
     * Finest-grained tokenization, e.g. 中华人民共和国人民大会堂 becomes
     * 中华人民共和国 / 中华人民 / 中华 / 华人 / 人民共和国 / 人民 / 共和国 and so on.
     */
    public static final String ANALYZER_IK_MAX_WORD = "ik_max_word";

    public MappingProperty() {
    }

    public MappingProperty(String type) {
        this.put("type", type);
        if (DATE.equals(type)) {
            // date fields get the supported date format patterns
            this.put(DATE_FORMAT, DATE_FORMAT_PATTERN);
        } else if (TEXT.equals(type)) {
            // text fields use the IK analyzer
            this.put(ANALYZER, ANALYZER_IK_MAX_WORD);
            this.put(SEARCH_ANALYZER, ANALYZER_IK_MAX_WORD);
        }
    }
}

Index operations: EsIndexOperation.java


import lombok.extern.slf4j.Slf4j;
import org.elasticsearch.action.admin.indices.delete.DeleteIndexRequest;
import org.elasticsearch.action.support.master.AcknowledgedResponse;
import org.elasticsearch.client.RestHighLevelClient;
import org.elasticsearch.client.indices.CreateIndexRequest;
import org.elasticsearch.client.indices.CreateIndexResponse;
import org.elasticsearch.client.indices.GetIndexRequest;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Service;

import java.io.IOException;
import java.util.HashMap;
import java.util.Map;

import static com.aylvn.bn.elasticsearch.constant.EsConstant.*;

@Slf4j
@Service
public class EsIndexOperation {

    @Autowired
    private RestHighLevelClient restHighLevelClient;

    /**
     * Check whether an index exists.
     */
    public boolean checkIndex(String indexName) {
        try {
            return restHighLevelClient.indices().exists(new GetIndexRequest(indexName), DEFAULT_OPTIONS);
        } catch (IOException e) {
            e.printStackTrace();
        }
        return Boolean.FALSE;
    }

    /**
     * Create an index with mapping properties only.
     */
    public boolean createIndex(String indexName, Map<String, ?> properties) {
        return createIndex(indexName, null, null, properties);
    }

    /**
     * Create an index with aliases, settings and mapping properties.
     */
    public boolean createIndex(String indexName, Map<String, ?> aliases, Map<String, ?> settings, Map<String, ?> properties) {
        if (checkIndex(indexName)) {
            return Boolean.TRUE;
        }
        try {
            CreateIndexRequest request = new CreateIndexRequest(indexName);
            if (aliases != null && aliases.size() > 0) {
                request.aliases(aliases);
            }
            if (settings != null && settings.size() > 0) {
                request.settings(settings);
            }
            if (properties != null && properties.size() > 0) {
                Map<String, Object> mapping = new HashMap<>();
                mapping.put(INDEX_MAPPING_PROPERTIES, properties);
                // dynamic=strict: reject documents containing fields not declared in the mapping
                mapping.put(INDEX_MAPPING_DYNAMIC, INDEX_MAPPING_DYNAMIC_STRICT);
                request.mapping(mapping);
            }
            CreateIndexResponse createIndexResponse = restHighLevelClient.indices().create(request, DEFAULT_OPTIONS);
            log.info("createIndexResponse:{}", createIndexResponse);
            return Boolean.TRUE;
        } catch (IOException e) {
            e.printStackTrace();
        }
        return Boolean.FALSE;
    }

    /**
     * Delete an index.
     */
    public boolean deleteIndex(String indexName) {
        try {
            if (checkIndex(indexName)) {
                DeleteIndexRequest request = new DeleteIndexRequest(indexName);
                AcknowledgedResponse response = restHighLevelClient.indices().delete(request, DEFAULT_OPTIONS);
                return response.isAcknowledged();
            }
        } catch (Exception e) {
            e.printStackTrace();
        }
        return Boolean.FALSE;
    }
}

Index operation test endpoints: EsIndexController.java


import com.aylvn.bn.common.dto.BaseResp;
import com.aylvn.bn.elasticsearch.constant.EsConstant;
import com.aylvn.bn.elasticsearch.constant.MappingProperty;
import com.aylvn.bn.elasticsearch.service.EsIndexOperation;
import io.swagger.annotations.Api;
import io.swagger.annotations.ApiOperation;
import org.springframework.web.bind.annotation.DeleteMapping;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.PostMapping;
import org.springframework.web.bind.annotation.RestController;

import javax.annotation.Resource;
import java.util.HashMap;
import java.util.Map;

@Api(tags = "ES index operations")
@RestController
public class EsIndexController {

    @Resource
    private EsIndexOperation esIndexOperation;

    @ApiOperation("checkIndexCompany")
    @GetMapping("/checkIndexCompany")
    public BaseResp<Boolean> checkIndexCompany() {
        return BaseResp.ok(esIndexOperation.checkIndex(EsConstant.INDEX_COMPANY));
    }

    @ApiOperation("createIndexCompany")
    @PostMapping("/createIndexCompany")
    public BaseResp<Boolean> createIndexCompany() {
        Map<String, Object> properties = new HashMap<>();
        properties.put("id", new MappingProperty(MappingProperty.INTEGER));
        properties.put("companyName", new MappingProperty(MappingProperty.TEXT));
        properties.put("tel", new MappingProperty(MappingProperty.KEYWORD));
        properties.put("tax", new MappingProperty(MappingProperty.KEYWORD));
        properties.put("mobile", new MappingProperty(MappingProperty.KEYWORD));
        properties.put("responsible", new MappingProperty(MappingProperty.KEYWORD));
        properties.put("contact", new MappingProperty(MappingProperty.KEYWORD));
        properties.put("position", new MappingProperty(MappingProperty.TEXT));
        properties.put("address", new MappingProperty(MappingProperty.TEXT));
        properties.put("postCode", new MappingProperty(MappingProperty.KEYWORD));
        properties.put("economicType", new MappingProperty(MappingProperty.KEYWORD));
        properties.put("registeredCapital", new MappingProperty(MappingProperty.KEYWORD));
        properties.put("operationMode", new MappingProperty(MappingProperty.KEYWORD));
        properties.put("annualSales", new MappingProperty(MappingProperty.KEYWORD));
        properties.put("buildDate", new MappingProperty(MappingProperty.KEYWORD));
        properties.put("numberOfWorkers", new MappingProperty(MappingProperty.KEYWORD));
        properties.put("email", new MappingProperty(MappingProperty.KEYWORD));
        properties.put("mainBusiness", new MappingProperty(MappingProperty.TEXT));
        properties.put("mainProduct", new MappingProperty(MappingProperty.TEXT));
        properties.put("website", new MappingProperty(MappingProperty.KEYWORD));
        properties.put("createdDt", new MappingProperty(MappingProperty.DATE));
        properties.put("createdBy", new MappingProperty(MappingProperty.KEYWORD));
        properties.put("updatedDt", new MappingProperty(MappingProperty.DATE));
        properties.put("updatedBy", new MappingProperty(MappingProperty.KEYWORD));
        return BaseResp.ok(esIndexOperation.createIndex(EsConstant.INDEX_COMPANY, properties));
    }

    @ApiOperation("deleteIndexCompany")
    @DeleteMapping("/deleteIndexCompany")
    public BaseResp<Boolean> deleteIndexCompany() {
        return BaseResp.ok(esIndexOperation.deleteIndex(EsConstant.INDEX_COMPANY));
    }
}

Document operations
Document create/update/delete: EsDataOperation.java

import lombok.extern.slf4j.Slf4j;
import org.elasticsearch.action.DocWriteRequest;
import org.elasticsearch.action.bulk.BulkRequest;
import org.elasticsearch.action.bulk.BulkResponse;
import org.elasticsearch.action.delete.DeleteRequest;
import org.elasticsearch.action.delete.DeleteResponse;
import org.elasticsearch.action.index.IndexRequest;
import org.elasticsearch.action.index.IndexResponse;
import org.elasticsearch.action.support.WriteRequest;
import org.elasticsearch.action.update.UpdateRequest;
import org.elasticsearch.action.update.UpdateResponse;
import org.elasticsearch.client.RestHighLevelClient;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Service;

import java.util.List;
import java.util.Map;

import static com.aylvn.bn.elasticsearch.constant.EsConstant.DEFAULT_OPTIONS;
import static com.aylvn.bn.elasticsearch.constant.EsConstant.ID;

@Slf4j
@Service
public class EsDataOperation {

    @Autowired
    private RestHighLevelClient restHighLevelClient;

    /**
     * Index a single document.
     */
    public boolean insert(String indexName, Map<String, Object> source) {
        try {
            IndexRequest request = new IndexRequest(indexName)
                    .id(source.get(ID).toString())
                    // opType CREATE: the id is mandatory; indexing fails if the id already exists
                    // opType INDEX: the id may be omitted (auto-generated); an existing id is overwritten
                    .opType(DocWriteRequest.OpType.CREATE)
                    .source(source);
            IndexResponse indexResponse = restHighLevelClient.index(request, DEFAULT_OPTIONS);
            log.info("indexResponse:{}", indexResponse);
            return Boolean.TRUE;
        } catch (Exception e) {
            e.printStackTrace();
        }
        return Boolean.FALSE;
    }

    /**
     * Bulk-index documents.
     */
    public boolean batchInsert(String indexName, List<Map<String, Object>> sourceList) {
        try {
            BulkRequest request = new BulkRequest();
            for (Map<String, Object> source : sourceList) {
                request.add(new IndexRequest(indexName)
                        .id(source.get(ID).toString())
                        // opType CREATE: the id is mandatory; indexing fails if the id already exists
                        // opType INDEX: the id may be omitted (auto-generated); an existing id is overwritten
                        .opType(DocWriteRequest.OpType.CREATE)
                        .source(source));
            }
            BulkResponse bulkResponse = restHighLevelClient.bulk(request, DEFAULT_OPTIONS);
            log.info("bulkResponse:{}", bulkResponse);
            return Boolean.TRUE;
        } catch (Exception e) {
            e.printStackTrace();
        }
        return Boolean.FALSE;
    }

    /**
     * Update a document; the partial update can also introduce fields into the mapping.
     */
    public boolean update(String indexName, Map<String, Object> source) {
        try {
            UpdateRequest updateRequest = new UpdateRequest(indexName, source.get(ID).toString());
            updateRequest.setRefreshPolicy(WriteRequest.RefreshPolicy.IMMEDIATE);
            // Gotcha: when using the Map variant, the argument must be of type Map<String, Object>
            updateRequest.doc(source);
            UpdateResponse updateResponse = restHighLevelClient.update(updateRequest, DEFAULT_OPTIONS);
            log.info("updateResponse:{}", updateResponse);
            return Boolean.TRUE;
        } catch (Exception e) {
            e.printStackTrace();
        }
        return Boolean.FALSE;
    }

    /**
     * Delete a document by id.
     */
    public boolean delete(String indexName, String id) {
        try {
            DeleteRequest deleteRequest = new DeleteRequest(indexName, id);
            DeleteResponse deleteResponse = restHighLevelClient.delete(deleteRequest, DEFAULT_OPTIONS);
            log.info("deleteResponse:{}", deleteResponse);
            return Boolean.TRUE;
        } catch (Exception e) {
            e.printStackTrace();
        }
        return Boolean.FALSE;
    }
}

Document operation test endpoints: EsDataController.java

import com.aylvn.bn.common.dao.entity.CompanyInfo;
import com.aylvn.bn.common.dao.mapper.CompanyInfoMapper;
import com.aylvn.bn.common.dto.BaseResp;
import com.aylvn.bn.common.utils.BeanUtil;
import com.aylvn.bn.common.utils.DateUtil;
import com.aylvn.bn.elasticsearch.constant.EsConstant;
import com.aylvn.bn.elasticsearch.dto.CompanyInfoDto;
import com.aylvn.bn.elasticsearch.service.EsDataOperation;
import io.swagger.annotations.Api;
import lombok.SneakyThrows;
import lombok.extern.slf4j.Slf4j;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.util.CollectionUtils;
import org.springframework.web.bind.annotation.DeleteMapping;
import org.springframework.web.bind.annotation.PostMapping;
import org.springframework.web.bind.annotation.PutMapping;
import org.springframework.web.bind.annotation.RestController;

import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

@Api(tags = "ES document operations")
@Slf4j
@RestController
public class EsDataController {

    @Autowired
    private EsDataOperation esDataOperation;

    @Autowired
    private CompanyInfoMapper companyInfoMapper;

    @SneakyThrows
    @PostMapping("/insert")
    public BaseResp<Boolean> insert(Integer companyInfoId) {
        CompanyInfo companyInfo = companyInfoMapper.getById(companyInfoId);
        if (companyInfo == null) {
            return BaseResp.error();
        }
        log.info("companyInfo:{}", companyInfo);
        Map<String, Object> source = new HashMap<>(BeanUtil.beanToMap(companyInfo));
        log.info("source:{}", source);
        return BaseResp.ok(esDataOperation.insert(EsConstant.INDEX_COMPANY, source));
    }

    @SneakyThrows
    @PostMapping("/batchInsert")
    public BaseResp<Boolean> batchInsert(CompanyInfoDto companyInfoDto) {
        List<CompanyInfo> companyInfos = companyInfoMapper.listByIds(companyInfoDto.getCompanyInfoIds());
        if (CollectionUtils.isEmpty(companyInfos)) {
            return BaseResp.error();
        }
        List<Map<String, Object>> sourceList = new ArrayList<>();
        companyInfos.forEach(companyInfo -> sourceList.add(new HashMap<>(BeanUtil.beanToMap(companyInfo))));
        return BaseResp.ok(esDataOperation.batchInsert(EsConstant.INDEX_COMPANY, sourceList));
    }

    @SneakyThrows
    @PutMapping("/update")
    public BaseResp<Boolean> update(Integer companyInfoId) {
        CompanyInfo companyInfo = companyInfoMapper.getById(companyInfoId);
        if (companyInfo == null) {
            return BaseResp.error();
        }
        companyInfo.setUpdatedDt(DateUtil.getDate());
        Map<String, String> sourceMap = BeanUtil.beanToMap(companyInfo);
        // note the parameter type: the update API expects Map<String, Object>
        Map<String, Object> source = new HashMap<>(sourceMap);
        return BaseResp.ok(esDataOperation.update(EsConstant.INDEX_COMPANY, source));
    }

    @DeleteMapping("/delete")
    public BaseResp<Boolean> delete(Integer companyInfoId) {
        return BaseResp.ok(esDataOperation.delete(EsConstant.INDEX_COMPANY, companyInfoId.toString()));
    }
}

Query operations
Queries: EsQueryOperation.java

import com.aylvn.bn.common.dto.Pager;
import lombok.extern.slf4j.Slf4j;
import org.elasticsearch.action.search.SearchRequest;
import org.elasticsearch.action.search.SearchResponse;
import org.elasticsearch.client.RestHighLevelClient;
import org.elasticsearch.client.core.CountRequest;
import org.elasticsearch.client.core.CountResponse;
import org.elasticsearch.index.query.BoolQueryBuilder;
import org.elasticsearch.index.query.QueryBuilder;
import org.elasticsearch.index.query.QueryBuilders;
import org.elasticsearch.search.SearchHit;
import org.elasticsearch.search.builder.SearchSourceBuilder;
import org.elasticsearch.search.sort.SortOrder;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Service;

import java.util.ArrayList;
import java.util.List;
import java.util.Map;

import static com.aylvn.bn.elasticsearch.constant.EsConstant.DEFAULT_OPTIONS;

@Slf4j
@Service
public class EsQueryOperation {

    @Autowired
    private RestHighLevelClient restHighLevelClient;

    /**
     * Count all documents in an index.
     */
    public Long countAll(String indexName) {
        CountRequest countRequest = new CountRequest(indexName);
        QueryBuilder queryBuilder = QueryBuilders.matchAllQuery();
        countRequest.query(queryBuilder);
        try {
            CountResponse countResponse = restHighLevelClient.count(countRequest, DEFAULT_OPTIONS);
            return countResponse.getCount();
        } catch (Exception e) {
            e.printStackTrace();
        }
        return 0L;
    }

    /**
     * Count documents matching all of the given field conditions.
     */
    public Long countMatches(String indexName, Map<String, Object> matches) {
        CountRequest countRequest = new CountRequest(indexName);
        BoolQueryBuilder boolQueryBuilder = QueryBuilders.boolQuery();
        matches.forEach((k, v) -> boolQueryBuilder.must(QueryBuilders.matchQuery(k, v)));
        log.info("boolQueryBuilder:{}", boolQueryBuilder);
        countRequest.query(boolQueryBuilder);
        try {
            CountResponse countResponse = restHighLevelClient.count(countRequest, DEFAULT_OPTIONS);
            return countResponse.getCount();
        } catch (Exception e) {
            e.printStackTrace();
        }
        return 0L;
    }

    /**
     * Paged query.
     */
    public List<Map<String, Object>> queryMatches(String indexName, Map<String, Object> matches, Pager pager) {
        SearchSourceBuilder searchSourceBuilder = new SearchSourceBuilder();
        // paging
        searchSourceBuilder.from(pager.getOffset());
        searchSourceBuilder.size(pager.getPageSize());
        log.info("pager:{}, offset:{}", pager, pager.getOffset());
        // filters: every entry becomes a must-match clause
        BoolQueryBuilder boolQueryBuilder = QueryBuilders.boolQuery();
        matches.forEach((k, v) -> boolQueryBuilder.must(QueryBuilders.matchQuery(k, v)));
        log.info("boolQueryBuilder:{}", boolQueryBuilder);
        searchSourceBuilder.query(boolQueryBuilder);
        // sorting
        searchSourceBuilder.sort("updatedDt", SortOrder.DESC);
        searchSourceBuilder.sort("buildDate", SortOrder.ASC);
        SearchRequest searchRequest = new SearchRequest(indexName);
        searchRequest.source(searchSourceBuilder);
        try {
            SearchResponse searchResp = restHighLevelClient.search(searchRequest, DEFAULT_OPTIONS);
            List<Map<String, Object>> data = new ArrayList<>();
            SearchHit[] searchHitArr = searchResp.getHits().getHits();
            for (SearchHit searchHit : searchHitArr) {
                data.add(searchHit.getSourceAsMap());
            }
            return data;
        } catch (Exception e) {
            e.printStackTrace();
        }
        return new ArrayList<>();
    }
}
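The REST examples in section 3 also demonstrated highlighting; the Java equivalent is a HighlightBuilder attached to the SearchSourceBuilder. A sketch, using the address field from the earlier examples (the snippet is meant to be spliced into queryMatches above, not run standalone):

```java
import org.elasticsearch.search.fetch.subphase.highlight.HighlightBuilder;
import org.elasticsearch.search.fetch.subphase.highlight.HighlightField;

// Illustrative: request highlighted fragments for the address field
searchSourceBuilder.highlighter(new HighlightBuilder().field("address"));

// ... after executing the search, per hit:
Map<String, HighlightField> highlights = searchHit.getHighlightFields();
HighlightField addressHighlight = highlights.get("address");
if (addressHighlight != null) {
    // fragments() returns the matched snippets, wrapped in <em> tags by default
    log.info("highlight:{}", addressHighlight.fragments()[0].string());
}
```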

Query test endpoints: EsQueryController.java

import com.aylvn.bn.common.dto.BaseResp;
import com.aylvn.bn.common.dto.Pager;
import com.aylvn.bn.elasticsearch.constant.EsConstant;
import com.aylvn.bn.elasticsearch.service.EsQueryOperation;
import io.swagger.annotations.Api;
import io.swagger.annotations.ApiOperation;
import lombok.extern.slf4j.Slf4j;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RequestParam;
import org.springframework.web.bind.annotation.RestController;

import javax.annotation.Resource;
import java.util.List;
import java.util.Map;

@Api(tags = "ES query operations")
@Slf4j
@RestController
public class EsQueryController {

    @Resource
    private EsQueryOperation esQueryOperation;

    @ApiOperation("countAll")
    @GetMapping("/countAll")
    public BaseResp<Long> countAll() {
        return BaseResp.ok(esQueryOperation.countAll(EsConstant.INDEX_COMPANY));
    }

    @ApiOperation("countMatches")
    @GetMapping("/countMatches")
    public BaseResp<Long> countMatches(@RequestParam Map<String, Object> matches) {
        log.info("matches:{}", matches);
        return BaseResp.ok(esQueryOperation.countMatches(EsConstant.INDEX_COMPANY, matches));
    }

    @ApiOperation("queryMatches")
    @GetMapping("/queryMatches")
    public BaseResp<List<Map<String, Object>>> queryMatches(@RequestParam Map<String, Object> matches, Integer pageNum, Integer pageSize) {
        Pager pager = new Pager(pageNum, pageSize);
        log.info("matches:{}, pager:{}", matches, pager);
        // pageNum/pageSize also arrive in the request-parameter map; drop them from the filters
        matches.remove("pageNum");
        matches.remove("pageSize");
        return BaseResp.ok(esQueryOperation.queryMatches(EsConstant.INDEX_COMPANY, matches, pager));
    }
}

Supporting utility classes:
Pager.java


/**
 * MySQL-style pagination.
 *
 * @author aylvn
 * @date 2020-01-01
 */
public class Pager {

    /** total number of records */
    private Long total = 0L;
    /** current page number */
    private Integer pageNum = 1;
    /** records per page */
    private Integer pageSize = 10;
    /** number of pages */
    private Integer pages;
    /** records on the current page */
    private Integer size;

    public Pager() {
    }

    public Pager(Integer pageNum, Integer pageSize) {
        this.pageNum = pageNum;
        this.pageSize = pageSize;
    }

    public Long getTotal() {
        return total;
    }

    public void setTotal(Long total) {
        this.total = total;
    }

    public Integer getPageNum() {
        return pageNum;
    }

    public void setPageNum(Integer pageNum) {
        this.pageNum = pageNum;
    }

    public Integer getPageSize() {
        return pageSize;
    }

    public void setPageSize(Integer pageSize) {
        this.pageSize = pageSize;
    }

    /**
     * Zero-based offset of the first record on the current page.
     */
    public Integer getOffset() {
        return (pageNum - 1) * pageSize;
    }

    /**
     * Total number of pages.
     */
    public Long getPageCount() {
        Long pageCount = total / pageSize;
        if (total % pageSize != 0) {
            pageCount++;
        }
        return pageCount;
    }

    public Integer getPages() {
        return pages;
    }

    public void setPages(Integer pages) {
        this.pages = pages;
    }

    public Integer getSize() {
        return size;
    }

    public void setSize(Integer size) {
        this.size = size;
    }

    @Override
    public String toString() {
        return "Pager{" +
                "total=" + total +
                ", pageNum=" + pageNum +
                ", pageSize=" + pageSize +
                ", pages=" + pages +
                ", size=" + size +
                '}';
    }
}

BeanUtil.java

import org.apache.commons.beanutils.BeanUtils;
import org.apache.commons.beanutils.BeanUtilsBean;
import org.apache.commons.beanutils.ConvertUtilsBean;
import org.apache.commons.beanutils.converters.DateConverter;
import org.springframework.context.ConfigurableApplicationContext;

import java.lang.reflect.InvocationTargetException;
import java.util.HashMap;
import java.util.Map;

/**
 * Bean utilities.
 *
 * @author Aylvn
 * @date 2021-02-02
 */
public class BeanUtil<T> {

    /**
     * The applicationContext is kept in a static field so it can be used globally.
     */
    public static ConfigurableApplicationContext applicationContext;

    /* Register the date format used by BeanUtils.describe(). */
    static {
        ConvertUtilsBean convertUtils = BeanUtilsBean.getInstance().getConvertUtils();
        DateConverter dateConverter = new DateConverter();
        dateConverter.setPattern("yyyy-MM-dd HH:mm:ss");
        convertUtils.register(dateConverter, String.class);
    }

    private BeanUtil() {
    }

    /**
     * Fetch an already-instantiated bean from the context.
     */
    public static <T> T getBean(Class<T> c) {
        return applicationContext.getBean(c);
    }

    /**
     * Convert a bean to a Map, formatting dates and dropping the class entry.
     *
     * @author Aylvn
     * @date 2021-02-09
     */
    public static Map<String, String> beanToMap(Object bean) {
        try {
            ConvertUtilsBean convertUtils = BeanUtilsBean.getInstance().getConvertUtils();
            DateConverter dateConverter = new DateConverter();
            dateConverter.setPattern("yyyy-MM-dd HH:mm:ss");
            convertUtils.register(dateConverter, String.class);
            Map<String, String> map = BeanUtils.describe(bean);
            // remove the "class" entry added by describe()
            map.remove("class");
            return map;
        } catch (IllegalAccessException | InvocationTargetException | NoSuchMethodException e) {
            e.printStackTrace();
        }
        return new HashMap<>();
    }
}
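BeanUtils.describe() stringifies every property value, which is why the controllers copy the result into a Map<String, Object> before indexing. If you prefer to avoid commons-beanutils entirely, a bean can be converted with the JDK's own java.beans introspection; this sketch (the class name IntrospectionBeanToMap and the nested User bean are illustrative, not part of the original project) keeps the original value types instead of strings:

```java
import java.beans.Introspector;
import java.beans.PropertyDescriptor;
import java.util.HashMap;
import java.util.Map;

public class IntrospectionBeanToMap {

    /** Convert a JavaBean to a Map via its getters, skipping the built-in class property. */
    public static Map<String, Object> beanToMap(Object bean) throws Exception {
        Map<String, Object> map = new HashMap<>();
        for (PropertyDescriptor pd : Introspector.getBeanInfo(bean.getClass()).getPropertyDescriptors()) {
            if (!"class".equals(pd.getName()) && pd.getReadMethod() != null) {
                map.put(pd.getName(), pd.getReadMethod().invoke(bean));
            }
        }
        return map;
    }

    /** Minimal bean used only for demonstration. */
    public static class User {
        private final String name = "张三";
        private final int age = 23;
        public String getName() { return name; }
        public int getAge() { return age; }
    }

    public static void main(String[] args) throws Exception {
        Map<String, Object> map = beanToMap(new User());
        // values keep their original types: age stays an Integer rather than becoming a String
        System.out.println(map.get("name") + " " + map.get("age"));
    }
}
```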

DateUtil.java


import java.text.SimpleDateFormat;
import java.util.Calendar;
import java.util.Date;

public class DateUtil {

    public static final String DATE_TIME = "yyyy-MM-dd HH:mm:ss";
    public static final String DATE = "yyyy-MM-dd";
    public static final String TIME = "HH:mm:ss";

    private static Calendar calendar;
    private static Date date;

    public static synchronized String getDateStr(String dateFormat, int offset) {
        SimpleDateFormat sdf = new SimpleDateFormat(dateFormat);
        calendar = Calendar.getInstance();
        date = calendar.getTime();
        if (offset != 0) {
            date = new Date(date.getTime() + offset);
        }
        return sdf.format(date);
    }

    public static String getDateStr(String dateFormat) {
        return getDateStr(dateFormat, 0);
    }

    public static String getDateStr() {
        return getDateStr(DATE_TIME);
    }

    // note: the dateFormat parameter is unused here - the method only returns the (offset) current Date
    public static synchronized Date getDate(String dateFormat, int offset) {
        calendar = Calendar.getInstance();
        date = calendar.getTime();
        if (offset != 0) {
            date = new Date(date.getTime() + offset);
        }
        return date;
    }

    public static Date getDate(String dateFormat) {
        return getDate(dateFormat, 0);
    }

    public static Date getDate() {
        return getDate(DATE_TIME, 0);
    }
}
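SimpleDateFormat is not thread-safe, which is why the methods above are synchronized and share static state. On Java 8+ the same helpers can be written with java.time, whose DateTimeFormatter is immutable and thread-safe; the class below is an illustrative alternative (the name DateTimeUtil is not from the original project):

```java
import java.time.LocalDateTime;
import java.time.format.DateTimeFormatter;

public class DateTimeUtil {

    // DateTimeFormatter is immutable, so a single shared instance needs no synchronization
    public static final DateTimeFormatter DATE_TIME = DateTimeFormatter.ofPattern("yyyy-MM-dd HH:mm:ss");

    /** Format a timestamp. */
    public static String format(LocalDateTime dateTime) {
        return DATE_TIME.format(dateTime);
    }

    /** Current time, formatted. */
    public static String now() {
        return format(LocalDateTime.now());
    }

    public static void main(String[] args) {
        // → 2021-02-03 04:05:06
        System.out.println(format(LocalDateTime.of(2021, 2, 3, 4, 5, 6)));
    }
}
```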

BaseResp.java

import com.alibaba.fastjson.annotation.JSONField;
import com.aylvn.bn.common.constant.HttpStatus;
import io.swagger.annotations.Api;
import com.alibaba.fastjson.annotation.JSONField;
import io.swagger.annotations.Api;
import io.swagger.annotations.ApiModelProperty;
import lombok.Getter;
import lombok.Setter;
import lombok.ToString;

import java.util.Date;

/**
 * Wraps the JSON result returned by the controller layer.
 *
 * @author aylvn
 */
@Api(tags = "Common response bean")
@Setter
@Getter
@ToString
public class BaseResp<T> {

    @ApiModelProperty(value = "Status code")
    private int code;

    @ApiModelProperty(value = "Status: ok | error")
    private String status;

    @ApiModelProperty(value = "Message")
    private String msg;

    /**
     * The generic parameter T is only cast at runtime; the compiler does not check the conversion.
     */
    @ApiModelProperty(value = "Business data")
    private T data;

    @ApiModelProperty(value = "Paging parameters")
    private Pager pager;

    @ApiModelProperty(value = "Date and time")
    // fastjson serialization format
    @JSONField(format = "yyyy-MM-dd HH:mm:ss")
    private Date dateTime = new Date();

    @ApiModelProperty(value = "Timestamp")
    private Long timeStamp = System.currentTimeMillis();

    @ApiModelProperty(value = "token")
    private String token;

    @ApiModelProperty(value = "expires")
    private String expires;

    public BaseResp() {
    }

    private BaseResp(int code, String msg, T data, Pager pager) {
        this.code = code;
        this.status = HttpStatus.status(code);
        this.msg = msg;
        this.data = data;
        this.pager = pager;
    }

    /*********************** Success responses ***************************/

    public static <T> BaseResp<T> ok() {
        return status(HttpStatus.OK.code());
    }

    public static <T> BaseResp<T> ok(String msg) {
        return status(HttpStatus.OK.code(), msg);
    }

    public static <T> BaseResp<T> ok(T data) {
        return status(HttpStatus.OK.code(), data);
    }

    public static <T> BaseResp<T> ok(String msg, T data) {
        return status(HttpStatus.OK.code(), msg, data);
    }

    public static <T> BaseResp<T> ok(T data, Pager pager) {
        return status(HttpStatus.OK.code(), data, pager);
    }

    public static <T> BaseResp<T> ok(String msg, T data, Pager pager) {
        return status(HttpStatus.OK.code(), msg, data, pager);
    }

    /*********************** Error responses ***************************/

    public static <T> BaseResp<T> error() {
        return status(HttpStatus.ERROR.code());
    }

    public static <T> BaseResp<T> error(String msg) {
        return status(HttpStatus.ERROR.code(), msg);
    }

    public static <T> BaseResp<T> error(Exception e) {
        return status(HttpStatus.ERROR.code(), e != null ? e.getMessage() : null);
    }

    /*********************** Custom status codes ***************************/

    public static <T> BaseResp<T> status(int code) {
        return status(code, null, null, null);
    }

    public static <T> BaseResp<T> status(int code, String msg) {
        return status(code, msg, null, null);
    }

    private static <T> BaseResp<T> status(int code, T data) {
        return status(code, null, data, null);
    }

    public static <T> BaseResp<T> status(int code, String msg, T data) {
        return status(code, msg, data, null);
    }

    private static <T> BaseResp<T> status(int code, T data, Pager pager) {
        return status(code, null, data, pager);
    }

    private static <T> BaseResp<T> status(int code, String msg, T data, Pager pager) {
        return new BaseResp<>(code, msg, data, pager);
    }
}
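The Javadoc on the data field notes that the generic parameter T is only cast at runtime. A minimal standalone sketch illustrates why (the Resp class here is a hypothetical stand-in for BaseResp, without Lombok/Swagger/Spring): after type erasure every Resp instance shares one class, so the compiler has no way to verify what was actually stored in data.

```java
import java.util.List;

public class GenericCastDemo {

    // Stripped-down stand-in for BaseResp (hypothetical, for illustration only).
    static class Resp<T> {
        private T data;

        static <T> Resp<T> ok(T data) {
            Resp<T> r = new Resp<>();
            r.data = data;
            return r;
        }

        T getData() {
            return data;
        }
    }

    public static void main(String[] args) {
        Resp<List<String>> resp = Resp.ok(List.of("a", "b"));
        System.out.println(resp.getData().size()); // prints 2

        // Due to type erasure, Resp<List<String>> and Resp<Integer> share the same
        // runtime class, so the cast behind getData() is only checked at runtime.
        System.out.println(resp.getClass() == Resp.ok(42).getClass()); // prints true
    }
}
```

This is exactly why callers of BaseResp must deserialize data into the type they expect: a mismatch surfaces as a ClassCastException at runtime, not a compile error.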

HttpStatus.java


/**
 * Custom HTTP status codes.
 *
 * @author Aylvn
 * @date 2019/10/15 0:33
 */
public enum HttpStatus {

    /**
     * Normal response
     */
    OK(200, "Ok"),

    /**
     * Not authenticated
     */
    UNAUTHORIZED(401, "UnAuthorized"),

    /**
     * The server refuses to execute the request
     */
    FORBIDDEN(403, "Forbidden"),

    /**
     * Error
     */
    ERROR(500, "Error");

    /**
     * Status code
     */
    private final int code;

    /**
     * Status description
     */
    private final String status;

    HttpStatus(Integer code, String status) {
        this.code = code;
        this.status = status;
    }

    /**
     * Get the status description for a status code.
     *
     * @param code status code
     * @return status description
     */
    public static String status(int code) {
        for (HttpStatus httpStatus : HttpStatus.values()) {
            if (httpStatus.code() == code) {
                return httpStatus.status();
            }
        }
        return "Unknown";
    }

    public int code() {
        return this.code;
    }

    public String status() {
        return this.status;
    }
}
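Since the enum above has no external dependencies, its lookup behaviour is easy to exercise. The sketch below inlines a trimmed copy of the enum so it runs standalone (the demo class name is illustrative):

```java
public class HttpStatusDemo {

    // Trimmed copy of the HttpStatus enum above, inlined so the demo is self-contained.
    enum HttpStatus {
        OK(200, "Ok"), UNAUTHORIZED(401, "UnAuthorized"),
        FORBIDDEN(403, "Forbidden"), ERROR(500, "Error");

        private final int code;
        private final String status;

        HttpStatus(int code, String status) {
            this.code = code;
            this.status = status;
        }

        // Linear scan over the enum constants; unknown codes fall back to "Unknown".
        static String status(int code) {
            for (HttpStatus s : values()) {
                if (s.code == code) {
                    return s.status;
                }
            }
            return "Unknown";
        }
    }

    public static void main(String[] args) {
        System.out.println(HttpStatus.status(200)); // prints Ok
        System.out.println(HttpStatus.status(404)); // prints Unknown
    }
}
```

The "Unknown" fallback is what BaseResp's private constructor relies on when it is handed a status code outside the enum.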

五. Follow-up

ELK (Elasticsearch + Logstash + Kibana)
ELK, commonly known as the Elastic stack, is an acronym for three open-source projects: Elasticsearch, Logstash, and Kibana. Elasticsearch is a search and analytics engine. Logstash is a server-side data-processing pipeline that ingests data from multiple sources simultaneously, transforms it, and then sends it to a "stash" such as Elasticsearch. Kibana lets users visualize the data in Elasticsearch with charts and graphs.

1. Kibana installation

Download the Kibana version matching your Elasticsearch from the Elastic website and unzip it. Open config/kibana.yml in the Kibana directory and add the ES cluster address; the default is elasticsearch.hosts: ["http://localhost:9200"]. Run bin/kibana.bat to start Kibana, then open http://localhost:5601/ in a browser to reach the Kibana home page. On Kibana's Discover page you can configure an index pattern to inspect one index or a family of indices, and query directly from the search bar using KQL (Kibana Query Language).
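For example, against the index_user index created in part three (assuming its mapping with the name, age, sex and address fields), the Discover search bar accepts KQL queries such as:

```
name : "张三" and age >= 18
address : "南京" or address : "北京"
not sex : "男"
```

Once an index pattern matching index_user is configured, Kibana autocompletes field names as you type in the search bar.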

2. Logstash installation

Logstash can be used as an ETL tool here: it ships with the jdbc input plugin, which makes it easy to sync data from a MySQL database into Elasticsearch; by default the sync below runs once a minute. Note: the sync cannot pick up rows that are physically deleted from MySQL. Use soft deletes instead (add a deletion-flag column in MySQL), or soft-delete first and then run a scheduled job that purges records flagged as deleted from both MySQL and Elasticsearch.
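The soft-delete approach can be sketched in SQL (the is_deleted column name and the id value are assumptions for illustration, not part of the original schema):

```sql
-- Add a deletion flag to the MySQL table (assumed column name)
ALTER TABLE company_info ADD COLUMN is_deleted TINYINT NOT NULL DEFAULT 0;

-- "Delete" by flagging the row; bumping updated_dt lets the incremental sync
-- (updated_dt >= :sql_last_value) carry the flag over to Elasticsearch
UPDATE company_info SET is_deleted = 1, updated_dt = NOW() WHERE id = 1001;

-- A scheduled cleanup job later purges flagged rows from MySQL ...
DELETE FROM company_info WHERE is_deleted = 1;
-- ... and the matching documents from Elasticsearch, e.g. with
-- POST index_company/_delete_by_query  {"query": {"term": {"isDeleted": 1}}}
```

For the flag to reach Elasticsearch it must also be selected (for instance aliased as isDeleted) in the sync SQL file described below.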
Download the Logstash version matching your Elasticsearch from the Elastic website and unzip it. In the config directory, add a mysql.conf file, then start Logstash against it with bin/logstash.bat -f config/mysql.conf.

The configuration used here supports syncing multiple tables. mysql.conf:

input {
    stdin {}
    jdbc {
        jdbc_driver_library => "D:\Code\maven-repository\mysql\mysql-connector-java\8.0.19\mysql-connector-java-8.0.19.jar"
        jdbc_driver_class => "com.mysql.cj.jdbc.Driver"
        jdbc_connection_string => "jdbc:mysql://localhost:3306/db_notes?characterEncoding=utf8&useSSL=false&serverTimezone=UTC&rewriteBatchedStatements=true&allowPublicKeyRetrieval=true"
        jdbc_user => "dev"
        jdbc_password => "1234"
        # Time zone
        # jdbc_default_timezone => "Asia/Shanghai"
        # Pagination
        jdbc_paging_enabled => "true"
        jdbc_page_size => "1000"
        # Remember the value reached by the previous run
        record_last_run => "true"
        # Polling schedule; the fields are (left to right) minute, hour, day, month, year.
        # All asterisks means run every minute.
        schedule => "* * * * *"
        # Note: the SQL must not return a column named "type" -- that is a reserved ES field.
        # An inline statement could be used instead of statement_filepath:
        # statement => "SELECT
        # id, company_name, tel, tax, mobile, responsible, contact, position, address, post_code,
        # economic_type, registered_capital, operation_mode, annual_sales, build_date, number_of_workers,
        # email, main_business, main_product, website, created_dt, created_by, updated_dt, updated_by
        # FROM company_info
        # WHERE updated_dt >= :sql_last_value"
        statement_filepath => "D:\Apps\logstash-7.10.2\config\syncsql\company_info.sql"
        # Do not lowercase column names
        lowercase_column_names => "false"
        # Track an incrementing column; when the tracked value is a timestamp,
        # write updated_dt >= :sql_last_value so no rows are missed
        use_column_value => "true"
        # Type of the tracking column: numeric or timestamp
        tracking_column_type => "timestamp"
        # Name of the tracking column, as aliased in the result set
        tracking_column => "updatedDt"
        last_run_metadata_path => "syncpoint_table"
    }
}
# Post-process the rows queried from MySQL
filter {
    # If the test field is null, run a snippet of Ruby to set it to "".
    # (event.get("test") could likewise read the value for further processing.)
    if ![test] {
        ruby {
            code => 'event.set("test","")'
        }
    }
    mutate {
        # Convert the id field to integer
        convert => { "id" => "integer" }
        # Remove automatically generated fields; a strict mapping would otherwise reject the document
        remove_field => ["test"]
        remove_field => ["@version", "@timestamp"]
    }
}
output {
    elasticsearch {
        # ES cluster addresses (a single address is enough for a standalone node)
        hosts => ["localhost:9200"]
        # Index name
        index => "index_company"
        document_id => "%{id}"
        template_overwrite => "true"
    }
    # Debug output; comment this out in production
    stdout {
        # JSON-lines output
        codec => "json_lines"
    }
}

syncsql/company_info.sql

SELECT
    id,
    company_name AS companyName,
    tel,
    tax,
    mobile,
    responsible,
    contact,
    position,
    address,
    post_code AS postCode,
    economic_type AS economicType,
    registered_capital AS registeredCapital,
    operation_mode AS operationMode,
    annual_sales AS annualSales,
    build_date AS buildDate,
    number_of_workers AS numberOfWorkers,
    email,
    main_business AS mainBusiness,
    main_product AS mainProduct,
    website,
    DATE_FORMAT(created_dt, '%Y-%m-%d %T') AS createdDt,
    created_by AS createdBy,
    DATE_FORMAT(updated_dt, '%Y-%m-%d %T') AS updatedDt,
    updated_by AS updatedBy
FROM
    company_info
WHERE
    updated_dt >= :sql_last_value

This article ran long; I hope it was helpful!
