amos-boot-zx-biz
Commit d502cfd8
authored Jun 04, 2024 by wujiang
添加alarm工程 (add the alarm module)
parent 0d71826d
Showing 25 changed files with 1240 additions and 0 deletions
amos-boot-data/amos-boot-data-alarm/.factorypath (+0 -0)
amos-boot-data/amos-boot-data-alarm/pom.xml (+161 -0)
...larm/src/main/java/com/yeejoin/amos/AlarmApplication.java (+60 -0)
...va/com/yeejoin/amos/api/alarm/config/ClusterDbConfig.java (+73 -0)
...om/yeejoin/amos/api/alarm/config/EquipExecutorConfig.java (+40 -0)
...join/amos/api/alarm/config/KafkaInitialConfiguration.java (+51 -0)
...ava/com/yeejoin/amos/api/alarm/config/MasterDbConfig.java (+71 -0)
...src/main/java/com/yeejoin/amos/api/alarm/dto/BizInfo.java (+41 -0)
...n/java/com/yeejoin/amos/api/alarm/dto/DynamicDetails.java (+22 -0)
.../main/java/com/yeejoin/amos/api/alarm/dto/TabContent.java (+23 -0)
.../main/java/com/yeejoin/amos/api/alarm/dto/WarningDto.java (+35 -0)
...in/java/com/yeejoin/amos/api/alarm/entity/BaseEntity.java (+37 -0)
...n/java/com/yeejoin/amos/api/alarm/entity/PointSystem.java (+51 -0)
...n/java/com/yeejoin/amos/api/alarm/entity2/JumpConfig.java (+19 -0)
.../com/yeejoin/amos/api/alarm/mapper/PointSystemMapper.java (+16 -0)
.../com/yeejoin/amos/api/alarm/mapper2/JumpConfigMapper.java (+7 -0)
...m/yeejoin/amos/api/alarm/service/IPointSystemService.java (+14 -0)
...ejoin/amos/api/alarm/service/impl/AlarmKafkaConsumer.java (+82 -0)
...n/amos/api/alarm/service/impl/PointSystemServiceImpl.java (+179 -0)
.../yeejoin/amos/api/alarm/service/impl/producerServers.java (+80 -0)
...com/yeejoin/amos/api/alarm/utils/HttpContentTypeUtil.java (+0 -0)
...-data-alarm/src/main/resources/application-dev.properties (+68 -0)
...boot-data-alarm/src/main/resources/application.properties (+63 -0)
...a/amos-boot-data-alarm/src/main/resources/logback-dev.xml (+46 -0)
amos-boot-data/pom.xml (+1 -0)
amos-boot-data/amos-boot-data-alarm/.factorypath
0 → 100644
This diff is collapsed.
amos-boot-data/amos-boot-data-alarm/pom.xml
0 → 100644
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>
    <parent>
        <artifactId>amos-boot-data</artifactId>
        <groupId>com.amosframework.boot</groupId>
        <version>1.0.0</version>
    </parent>
    <artifactId>amos-boot-data-alarm</artifactId>
    <name>amos-boot-data-alarm</name>

    <dependencies>
        <dependency>
            <groupId>org.springframework.cloud</groupId>
            <artifactId>spring-cloud-starter-netflix-eureka-client</artifactId>
        </dependency>
        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-actuator</artifactId>
        </dependency>
        <dependency>
            <groupId>org.typroject</groupId>
            <artifactId>tyboot-core-foundation</artifactId>
            <version>${tyboot-version}</version>
        </dependency>
        <dependency>
            <groupId>org.typroject</groupId>
            <artifactId>tyboot-core-restful</artifactId>
            <version>${tyboot-version}</version>
            <exclusions>
                <exclusion>
                    <groupId>org.typroject</groupId>
                    <artifactId>*</artifactId>
                </exclusion>
            </exclusions>
        </dependency>
        <dependency>
            <groupId>org.typroject</groupId>
            <artifactId>tyboot-core-auth</artifactId>
            <version>${tyboot-version}</version>
            <exclusions>
                <exclusion>
                    <groupId>org.typroject</groupId>
                    <artifactId>*</artifactId>
                </exclusion>
            </exclusions>
        </dependency>
        <!-- Kafka dependency -->
        <dependency>
            <groupId>org.springframework.kafka</groupId>
            <artifactId>spring-kafka</artifactId>
        </dependency>
        <dependency>
            <groupId>org.typroject</groupId>
            <artifactId>tyboot-component-emq</artifactId>
            <version>1.1.20</version>
        </dependency>
        <dependency>
            <groupId>org.typroject</groupId>
            <artifactId>tyboot-component-event</artifactId>
            <version>${tyboot-version}</version>
            <exclusions>
                <exclusion>
                    <groupId>org.typroject</groupId>
                    <artifactId>*</artifactId>
                </exclusion>
            </exclusions>
        </dependency>
        <dependency>
            <groupId>org.typroject</groupId>
            <artifactId>tyboot-component-opendata</artifactId>
            <version>${tyboot-version}</version>
            <exclusions>
                <exclusion>
                    <groupId>org.typroject</groupId>
                    <artifactId>*</artifactId>
                </exclusion>
            </exclusions>
        </dependency>
        <dependency>
            <groupId>com.yeejoin</groupId>
            <artifactId>amos-feign-systemctl</artifactId>
            <version>${amos.version}</version>
        </dependency>
        <dependency>
            <groupId>com.yeejoin</groupId>
            <artifactId>amos-component-config</artifactId>
            <version>${amos.version}</version>
        </dependency>
        <dependency>
            <groupId>org.typroject</groupId>
            <artifactId>tyboot-core-rdbms</artifactId>
            <version>${tyboot-version}</version>
            <exclusions>
                <exclusion>
                    <groupId>org.typroject</groupId>
                    <artifactId>*</artifactId>
                </exclusion>
            </exclusions>
        </dependency>
        <dependency>
            <groupId>org.typroject</groupId>
            <artifactId>tyboot-component-cache</artifactId>
            <version>${tyboot-version}</version>
            <exclusions>
                <exclusion>
                    <groupId>org.typroject</groupId>
                    <artifactId>*</artifactId>
                </exclusion>
                <exclusion>
                    <groupId>org.springframework.boot</groupId>
                    <artifactId>spring-boot-starter-data-redis</artifactId>
                </exclusion>
            </exclusions>
        </dependency>
        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-redis</artifactId>
            <version>1.4.5.RELEASE</version>
        </dependency>
        <dependency>
            <groupId>org.jetbrains</groupId>
            <artifactId>annotations</artifactId>
            <version>19.0.0</version>
            <scope>compile</scope>
        </dependency>
        <dependency>
            <groupId>org.codehaus.jettison</groupId>
            <artifactId>jettison</artifactId>
            <version>1.3.7</version>
        </dependency>
        <dependency>
            <groupId>com.alibaba</groupId>
            <artifactId>druid-spring-boot-starter</artifactId>
            <version>1.1.10</version>
        </dependency>
    </dependencies>

    <build>
        <plugins>
            <plugin>
                <groupId>org.springframework.boot</groupId>
                <artifactId>spring-boot-maven-plugin</artifactId>
            </plugin>
        </plugins>
    </build>
</project>
amos-boot-data/amos-boot-data-alarm/src/main/java/com/yeejoin/amos/AlarmApplication.java
0 → 100644
package com.yeejoin.amos;

import org.apache.logging.log4j.LogManager;
import org.apache.logging.log4j.Logger;
import org.mybatis.spring.annotation.MapperScan;
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.boot.context.properties.EnableConfigurationProperties;
import org.springframework.boot.web.servlet.ServletComponentScan;
import org.springframework.cloud.client.discovery.EnableDiscoveryClient;
import org.springframework.cloud.netflix.eureka.EnableEurekaClient;
import org.springframework.cloud.openfeign.EnableFeignClients;
import org.springframework.context.ConfigurableApplicationContext;
import org.springframework.context.annotation.ComponentScan;
import org.springframework.core.env.Environment;
import org.springframework.scheduling.annotation.EnableAsync;
import org.springframework.scheduling.annotation.EnableScheduling;
import org.springframework.transaction.annotation.EnableTransactionManagement;
import org.typroject.tyboot.core.restful.exception.GlobalExceptionHandler;

import java.net.InetAddress;

/**
 * <pre>
 *
 * </pre>
 *
 * @author gwb
 * @version $Id: OpenapiApplication.java, v 0.1 2021年9月27日 下午3:29:30 gwb Exp $
 */
@SpringBootApplication
@EnableTransactionManagement
@EnableConfigurationProperties
@ServletComponentScan
@EnableDiscoveryClient
@EnableFeignClients
@EnableAsync
@EnableEurekaClient
@EnableScheduling
@MapperScan(value = {
        "org.typroject.tyboot.*.*.face.orm.dao",
        "com.yeejoin.amos.api.*.face.orm.dao",
        "org.typroject.tyboot.face.*.orm.dao*",
        "com.yeejoin.amos.api.*.mapper",
        "com.yeejoin.amos.boot.biz.common.dao.mapper",
        "com.yeejoin.amos.api.*.mapper2"
})
@ComponentScan({"org.typroject", "com.yeejoin.amos"})
public class AlarmApplication {

    private static final Logger logger = LogManager.getLogger(AlarmApplication.class);

    public static void main(String[] args) throws Exception {
        ConfigurableApplicationContext context = SpringApplication.run(AlarmApplication.class, args);
        GlobalExceptionHandler.setAlwaysOk(true);
        Environment env = context.getEnvironment();
        String ip = InetAddress.getLocalHost().getHostAddress();
        String port = env.getProperty("server.port");
        String path = env.getProperty("server.servlet.context-path");
        logger.info("\n----------------------------------------------------------\n\t"
                + "Application Amos-Biz-Boot is running! Access URLs:\n\t"
                + "Swagger文档: \thttp://" + ip + ":" + port + path + "/doc.html\n"
                + "----------------------------------------------------------");
    }
}
amos-boot-data/amos-boot-data-alarm/src/main/java/com/yeejoin/amos/api/alarm/config/ClusterDbConfig.java
0 → 100644
package com.yeejoin.amos.api.alarm.config;

import com.alibaba.druid.pool.DruidDataSource;
import com.baomidou.mybatisplus.extension.spring.MybatisSqlSessionFactoryBean;
import org.apache.ibatis.session.SqlSessionFactory;
import org.mybatis.spring.annotation.MapperScan;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.core.io.support.PathMatchingResourcePatternResolver;
import org.springframework.jdbc.datasource.DataSourceTransactionManager;

import javax.sql.DataSource;

/**
 * Secondary (cluster) data source configuration.
 * To configure more data sources, add their settings to the yml file and create a corresponding new configuration class.
 */
@Configuration
@MapperScan(basePackages = "com.yeejoin.amos.api.alarm.mapper2", sqlSessionFactoryRef = "clusterSqlSessionFactory")
public class ClusterDbConfig {

    private Logger logger = LoggerFactory.getLogger(ClusterDbConfig.class);

    // Scoped to the cluster directory so these mappers stay isolated from the other data sources
    private static final String MAPPER_LOCATION = "classpath*:mapper/cluster/*.xml";

    @Value("${spring.db2.datasource.url}")
    private String dbUrl;

    @Value("${spring.db2.datasource.username}")
    private String username;

    @Value("${spring.db2.datasource.password}")
    private String password;

    @Value("${spring.db2.datasource.driver-class-name}")
    private String driverClassName;

    @Bean(name = "clusterDataSource2") // declare it as a bean instance
    public DataSource clusterDataSource() {
        DruidDataSource datasource = new DruidDataSource();
        datasource.setUrl(this.dbUrl);
        datasource.setUsername(username);
        datasource.setPassword(password);
        datasource.setDriverClassName(driverClassName);
        return datasource;
    }

    @Bean(name = "clusterTransactionManager")
    public DataSourceTransactionManager clusterTransactionManager() {
        return new DataSourceTransactionManager(clusterDataSource());
    }

    @Bean(name = "clusterSqlSessionFactory")
    public SqlSessionFactory clusterSqlSessionFactory(@Qualifier("clusterDataSource2") DataSource culsterDataSource) throws Exception {
        final MybatisSqlSessionFactoryBean sessionFactory = new MybatisSqlSessionFactoryBean();
        sessionFactory.setDataSource(culsterDataSource);
        sessionFactory.setMapperLocations(new PathMatchingResourcePatternResolver()
                .getResources(ClusterDbConfig.MAPPER_LOCATION));
        sessionFactory.setTypeAliasesPackage("com.yeejoin.amos.boot.module.jxiop.biz.entity2");
        // MyBatis underscore-to-camelCase mapping between database columns and entity fields
        sessionFactory.getObject().getConfiguration().setMapUnderscoreToCamelCase(true);
        return sessionFactory.getObject();
    }
}
amos-boot-data/amos-boot-data-alarm/src/main/java/com/yeejoin/amos/api/alarm/config/EquipExecutorConfig.java
0 → 100644
package com.yeejoin.amos.api.alarm.config;

import lombok.extern.slf4j.Slf4j;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.scheduling.annotation.EnableAsync;
import org.springframework.scheduling.concurrent.ThreadPoolTaskExecutor;

import java.util.concurrent.Executor;
import java.util.concurrent.ThreadPoolExecutor;

@Slf4j
@Configuration
@EnableAsync
public class EquipExecutorConfig {

    @Bean(name = "equipAsyncExecutor")
    public Executor asyncServiceExecutor() {
        ThreadPoolTaskExecutor executor = new ThreadPoolTaskExecutor();
        // core pool size
        executor.setCorePoolSize(10);
        // maximum pool size
        executor.setMaxPoolSize(500);
        // queue capacity
        executor.setQueueCapacity(2000);
        // name prefix for threads created by this pool
        executor.setThreadNamePrefix("namePrefix");
        // idle time the pool keeps surplus threads alive
        executor.setKeepAliveSeconds(30);
        // rejection-policy: how to handle new tasks once the pool has reached max size
        // CALLER_RUNS: do not run the task in a new thread; the caller's thread runs it instead (rejection policy)
        executor.setRejectedExecutionHandler(new ThreadPoolExecutor.CallerRunsPolicy());
        // initialize the executor
        executor.initialize();
        // wait for running tasks to finish before shutting the pool down
        executor.setWaitForTasksToCompleteOnShutdown(true);
        return executor;
    }
}
amos-boot-data/amos-boot-data-alarm/src/main/java/com/yeejoin/amos/api/alarm/config/KafkaInitialConfiguration.java
0 → 100644
//package com.yeejoin.amos.api.alarm.config;
//
//import org.apache.kafka.clients.admin.AdminClient;
//import org.apache.kafka.clients.admin.AdminClientConfig;
//import org.apache.kafka.clients.admin.NewTopic;
//import org.springframework.beans.factory.annotation.Value;
//import org.springframework.context.annotation.Bean;
//import org.springframework.context.annotation.Configuration;
//import org.springframework.kafka.core.KafkaAdmin;
//
//import java.util.HashMap;
//import java.util.Map;
//
//import static org.bouncycastle.asn1.pkcs.PKCSObjectIdentifiers.md5;
//
//@Configuration
//public class KafkaInitialConfiguration {
//
//
//
//
// /***
// * Create a topic with 10 partitions and 1 replica
// * created via a bean (bean name: initialTopic)
// * @return
// */
//
// @Bean
// public NewTopic initialTopic1() {
//
// return new NewTopic("jf1",3, (short) 1 );
// }
//
//
// @Bean
// public KafkaAdmin kafkaAdmin() {
// Map<String, Object> props = new HashMap<>();
// // configure the Kafka broker connection address
// props.put(AdminClientConfig.BOOTSTRAP_SERVERS_CONFIG, "121.199.39.218:9092");
// KafkaAdmin admin = new KafkaAdmin(props);
// return admin;
// }
//
// @Bean
// public AdminClient adminClient() {
// return AdminClient.create(kafkaAdmin().getConfig());
// }
//
//
//
//}
amos-boot-data/amos-boot-data-alarm/src/main/java/com/yeejoin/amos/api/alarm/config/MasterDbConfig.java
0 → 100644
package com.yeejoin.amos.api.alarm.config;

import com.alibaba.druid.pool.DruidDataSource;
import com.baomidou.mybatisplus.extension.spring.MybatisSqlSessionFactoryBean;
import org.apache.ibatis.session.SqlSessionFactory;
import org.mybatis.spring.annotation.MapperScan;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.context.annotation.Primary;
import org.springframework.core.io.support.PathMatchingResourcePatternResolver;
import org.springframework.jdbc.datasource.DataSourceTransactionManager;

import javax.sql.DataSource;
import java.util.Properties;

@Configuration
@MapperScan(basePackages = "com.yeejoin.amos.api.alarm.mapper", sqlSessionFactoryRef = "masterSqlSessionFactory1")
public class MasterDbConfig {

    private Logger logger = LoggerFactory.getLogger(MasterDbConfig.class);

    // Scoped to the master directory so these mappers stay isolated from the other data sources
    private static final String MAPPER_LOCATION = "classpath*:mapper/*.xml";

    @Value("${spring.datasource.url}")
    private String dbUrl;

    @Value("${spring.datasource.username}")
    private String username;

    @Value("${spring.datasource.password}")
    private String password;

    @Value("${spring.datasource.driver-class-name}")
    private String driverClassName;

    @Bean(name = "masterDataSource") // declare it as a bean instance
    @Primary // among DataSources of the same type, the annotated one is used first
    public DataSource masterDataSource() {
        DruidDataSource datasource = new DruidDataSource();
        datasource.setUrl(this.dbUrl);
        datasource.setUsername(username);
        datasource.setPassword(password);
        datasource.setDriverClassName(driverClassName);
        return datasource;
    }

    @Bean(name = "masterTransactionManager")
    @Primary
    public DataSourceTransactionManager masterTransactionManager() {
        return new DataSourceTransactionManager(masterDataSource());
    }

    @Bean(name = "masterSqlSessionFactory1")
    @Primary
    public SqlSessionFactory masterSqlSessionFactory(@Qualifier("masterDataSource") DataSource masterDataSource) throws Exception {
        final MybatisSqlSessionFactoryBean sessionFactory = new MybatisSqlSessionFactoryBean();
        sessionFactory.setDataSource(masterDataSource);
        sessionFactory.setMapperLocations(new PathMatchingResourcePatternResolver()
                .getResources(MasterDbConfig.MAPPER_LOCATION));
        sessionFactory.setTypeAliasesPackage("com.yeejoin.amos.boot.module.jxiop.api.entity");
        // MyBatis underscore-to-camelCase mapping between database columns and entity fields
        sessionFactory.getObject().getConfiguration().setMapUnderscoreToCamelCase(true);
        return sessionFactory.getObject();
    }
}
amos-boot-data/amos-boot-data-alarm/src/main/java/com/yeejoin/amos/api/alarm/dto/BizInfo.java
0 → 100644
package com.yeejoin.amos.api.alarm.dto;

import lombok.Data;

import java.util.List;

/**
 * @description:
 * @author: tw
 * @createDate: 2023/6/19
 */
@Data
public class BizInfo {

    private String sourceAttributionDesc;
    private String sourceAttribution;
    private List<DynamicDetails> dynamicDetails;
    private String warningObjectCode;
    private String warningTime;
    private String warningObjectName;
    private String warningObjectType;
    private String warningObjectLinkUrl;

    public BizInfo(String sourceAttributionDesc, String sourceAttribution, List<DynamicDetails> dynamicDetails,
                   String warningObjectCode, String warningTime, String warningObjectName,
                   String warningObjectType, String warningObjectLinkUrl) {
        this.sourceAttributionDesc = sourceAttributionDesc;
        this.sourceAttribution = sourceAttribution;
        this.dynamicDetails = dynamicDetails;
        this.warningObjectCode = warningObjectCode;
        this.warningTime = warningTime;
        this.warningObjectName = warningObjectName;
        this.warningObjectType = warningObjectType;
        this.warningObjectLinkUrl = warningObjectLinkUrl;
    }
}
amos-boot-data/amos-boot-data-alarm/src/main/java/com/yeejoin/amos/api/alarm/dto/DynamicDetails.java
0 → 100644
package com.yeejoin.amos.api.alarm.dto;

import lombok.Data;

import java.util.List;

/**
 * @description:
 * @author: tw
 * @createDate: 2023/6/19
 */
@Data
public class DynamicDetails {

    private String tabName;
    private List<TabContent> tabContent;

    public DynamicDetails(String tabName, List<TabContent> tabContent) {
        this.tabName = tabName;
        this.tabContent = tabContent;
    }
}
amos-boot-data/amos-boot-data-alarm/src/main/java/com/yeejoin/amos/api/alarm/dto/TabContent.java
0 → 100644
package com.yeejoin.amos.api.alarm.dto;

import lombok.Data;

/**
 * @description:
 * @author: tw
 * @createDate: 2023/6/19
 */
@Data
public class TabContent {

    private String label;
    private String type;
    private Object value;
    private String key;

    public TabContent(String label, String type, Object value, String key) {
        this.label = label;
        this.type = type;
        this.value = value;
        this.key = key;
    }
}
amos-boot-data/amos-boot-data-alarm/src/main/java/com/yeejoin/amos/api/alarm/dto/WarningDto.java
0 → 100644
package com.yeejoin.amos.api.alarm.dto;

import lombok.Data;

import java.util.List;

/**
 * @description:
 * @author: tw
 * @createDate: 2023/6/19
 */
@Data
public class WarningDto {

    private BizInfo bizInfo;
    private String indexKey;
    private String indexValue;
    private String traceId;

    public WarningDto(String indexKey, String indexValue, String traceId, String sourceAttributionDesc,
                      String sourceAttribution, List<DynamicDetails> dynamicDetails, String warningObjectCode,
                      String warningTime, String warningObjectName, String warningObjectType,
                      String warningObjectLinkUrl) {
        this.bizInfo = new BizInfo(sourceAttributionDesc, sourceAttribution, dynamicDetails, warningObjectCode,
                warningTime, warningObjectName, warningObjectType, warningObjectLinkUrl);
        this.indexKey = indexKey;
        this.indexValue = indexValue;
        this.traceId = traceId;
    }
}
amos-boot-data/amos-boot-data-alarm/src/main/java/com/yeejoin/amos/api/alarm/entity/BaseEntity.java
0 → 100644
package com.yeejoin.amos.api.alarm.entity;

import com.baomidou.mybatisplus.annotation.FieldFill;
import com.baomidou.mybatisplus.annotation.IdType;
import com.baomidou.mybatisplus.annotation.TableField;
import com.baomidou.mybatisplus.annotation.TableId;
import com.fasterxml.jackson.databind.annotation.JsonSerialize;
import com.fasterxml.jackson.databind.ser.std.ToStringSerializer;
import lombok.Data;
import lombok.experimental.Accessors;

import java.io.Serializable;
import java.util.Date;

/**
 * @description: common base entity
 * @author: duanwei
 **/
@Data
@Accessors(chain = true)
public class BaseEntity implements Serializable {

    private static final long serialVersionUID = -5464322936854328207L;

    @TableId(type = IdType.ID_WORKER)
    @JsonSerialize(using = ToStringSerializer.class)
    private Long id;

    /**
     * Filled on insert and update
     */
    @TableField(value = "create_date", fill = FieldFill.INSERT)
    private Date createDate;
}
amos-boot-data/amos-boot-data-alarm/src/main/java/com/yeejoin/amos/api/alarm/entity/PointSystem.java
0 → 100644
package com.yeejoin.amos.api.alarm.entity;

import com.baomidou.mybatisplus.annotation.TableField;
import com.baomidou.mybatisplus.annotation.TableName;
import io.swagger.annotations.ApiModel;
import io.swagger.annotations.ApiModelProperty;
import lombok.Data;
import lombok.EqualsAndHashCode;
import lombok.experimental.Accessors;

/**
 * @description:
 * @author: tw
 * @createDate: 2023/6/19
 */
@Data
@EqualsAndHashCode(callSuper = true)
@Accessors(chain = true)
@TableName("dz_point_system")
@ApiModel(value = "PointSystem对象", description = "")
public class PointSystem extends BaseEntity {

    @ApiModelProperty(value = "场站")
    @TableField("station")
    private String station;

    @ApiModelProperty(value = "二维码")
    @TableField("number")
    private String number;

    @ApiModelProperty(value = "类型")
    @TableField("type")
    private String type;

    @ApiModelProperty(value = "'地址'")
    @TableField("address")
    private String address;

    @ApiModelProperty(value = "测点类型")
    @TableField("point_type")
    private String pointType;

    @ApiModelProperty(value = "测点值")
    @TableField("value")
    private String value;

    @ApiModelProperty(value = "功能码")
    @TableField("function_num")
    private String functionNum;

    @ApiModelProperty(value = "kks码")
    @TableField("kks")
    private String kks;

    @ApiModelProperty(value = "網管地址")
    @TableField("gateway_id")
    private String gatewayId;
}
amos-boot-data/amos-boot-data-alarm/src/main/java/com/yeejoin/amos/api/alarm/entity2/JumpConfig.java
0 → 100644
package com.yeejoin.amos.api.alarm.entity2;

import com.baomidou.mybatisplus.annotation.TableField;
import com.baomidou.mybatisplus.annotation.TableName;
import lombok.Data;
import lombok.experimental.Accessors;

@Data
@Accessors(chain = true)
@TableName("jump_config")
public class JumpConfig {

    @TableField("id")
    private Integer id;

    @TableField("url")
    private String url;

    @TableField("type")
    private String type;
}
amos-boot-data/amos-boot-data-alarm/src/main/java/com/yeejoin/amos/api/alarm/mapper/PointSystemMapper.java
0 → 100644
package com.yeejoin.amos.api.alarm.mapper;

import com.baomidou.mybatisplus.core.mapper.BaseMapper;
import com.yeejoin.amos.api.alarm.entity.PointSystem;

/**
 * @description:
 * @author: tw
 * @createDate: 2023/6/19
 */
public interface PointSystemMapper extends BaseMapper<PointSystem> {

    // push a warning
    public void sendWarning();
}
amos-boot-data/amos-boot-data-alarm/src/main/java/com/yeejoin/amos/api/alarm/mapper2/JumpConfigMapper.java
0 → 100644
package com.yeejoin.amos.api.alarm.mapper2;

import com.baomidou.mybatisplus.core.mapper.BaseMapper;
import com.yeejoin.amos.api.alarm.entity2.JumpConfig;

public interface JumpConfigMapper extends BaseMapper<JumpConfig> {
}
amos-boot-data/amos-boot-data-alarm/src/main/java/com/yeejoin/amos/api/alarm/service/IPointSystemService.java
0 → 100644
package com.yeejoin.amos.api.alarm.service;

/**
 * @description:
 * @author: tw
 * @createDate: 2023/6/19
 */
public interface IPointSystemService {

    // trigger a risk warning
    public void sendWarning(String address, String value, String valueLabe, String gatewayId);
}
amos-boot-data/amos-boot-data-alarm/src/main/java/com/yeejoin/amos/api/alarm/service/impl/AlarmKafkaConsumer.java
0 → 100644
package com.yeejoin.amos.api.alarm.service.impl;

import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.kafka.annotation.TopicPartition;
import org.springframework.kafka.support.Acknowledgment;
import org.springframework.stereotype.Service;

/**
 * @description: listen for equipment alarm messages
 * @author: tw
 * @createDate: 2023/6/27
 */
@Service
public class AlarmKafkaConsumer {

    @Autowired
    PointSystemServiceImpl pointSystemServiceImpl;

    // consumer that handles the alarm messages
    @KafkaListener(id = "alarmInfo", topics = {"${kafka.equipment.alarm}"})
    public void message1(String record, Acknowledgment ack) {
        // business handling
        String date = record;
        System.out.println("消息进来了" + record);
        // trigger the warning asynchronously
        pointSystemServiceImpl.sendWarningAsync(date);
        // manual offset commit
        ack.acknowledge();
    }

    @KafkaListener(id = "user2", topics = {"${kafka.equipment.test}"})
    public void message2(String record, Acknowledgment ack) {
        String date = record;
        System.out.println("消息进来了 8888888888888888888888");
    }

//    public void message1( ConsumerRecord<?, ?> record, Acknowledgment ack){
//        // which topic/partition the message came from; print the message content
//
//        StringBuffer sb = new StringBuffer();
//        // topic
//        sb.append(record.topic() + "-");
//        // partition
//        sb.append(record.partition() + "-");
//        // value to consume
//        sb.append(record.value() + "-");
//        // offset
//        sb.append(record.offset());
//
//        System.out.println( "消费者进行消费:"+ sb);
//        ack.acknowledge();
//
//    }
//    // simple consumers; groupId can be anything
//    @KafkaListener(id = "Consumer0", groupId = "jf0-group", topics = "jf1", topicPartitions = {
//            @TopicPartition(topic = "jf1", partitions = {"0"}),
//    }, containerFactory = "kafkaListenerContainerFactory")
//    public void consumer0(ConsumerRecord<String, String> records, Acknowledgment ack) {
//        this.message1(records,ack);
//    }
//
//    @KafkaListener(id = "Consumer1", groupId = "jf1-group", topics = "jf1", topicPartitions = {
//            @TopicPartition(topic = "jf1", partitions = {"1"}),
//    }, containerFactory = "kafkaListenerContainerFactory")
//    public void consumer1(ConsumerRecord<String, String> records, Acknowledgment ack) {
//        this.message1(records,ack);
//    }
//
//    @KafkaListener(id = "Consumer2", groupId = "jf2-group", topics = "jf1", topicPartitions = {
//            @TopicPartition(topic = "jf1", partitions = {"2"}),
//    }, containerFactory = "kafkaListenerContainerFactory")
//    public void consumer3(ConsumerRecord<String, String> records, Acknowledgment ack) {
//        this.message1(records,ack);
//    }
}
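A note for reviewers, not part of the committed code: the listener above hands the raw record to PointSystemServiceImpl.sendWarningAsync(), which parses it with fastjson and expects the fields address, value, valueLabel and gatewayId. The sketch below shows one way a matching test message could be published, assuming a KafkaTemplate<String, String> bean is available (the commented-out producerServers class later in this commit autowires the same type) and that kafka.equipment.alarm resolves to EQUIPMENT_ALARM as configured in application.properties. The class name and all field values are placeholders.

// Illustrative sketch only; not committed code. Field names follow
// PointSystemServiceImpl.sendWarningAsync(); every value below is a placeholder.
package com.yeejoin.amos.api.alarm.service.impl;

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.stereotype.Component;

@Component
public class AlarmMessageSmokeTest {

    @Autowired
    private KafkaTemplate<String, String> kafkaTemplate; // assumed to come from spring-kafka auto-configuration

    public void sendSample() {
        String payload = "{\"address\":\"40001\",\"value\":\"1\","
                + "\"valueLabel\":\"sample alarm\",\"gatewayId\":\"1668801435891929089\"}";
        // EQUIPMENT_ALARM is the value of kafka.equipment.alarm in application.properties
        kafkaTemplate.send("EQUIPMENT_ALARM", payload);
    }
}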
amos-boot-data/amos-boot-data-alarm/src/main/java/com/yeejoin/amos/api/alarm/service/impl/PointSystemServiceImpl.java
0 → 100644
package com.yeejoin.amos.api.alarm.service.impl;

import com.alibaba.fastjson.JSON;
import com.alibaba.fastjson.JSONArray;
import com.alibaba.fastjson.JSONObject;
import com.baomidou.mybatisplus.core.conditions.query.QueryWrapper;
import com.baomidou.mybatisplus.extension.service.impl.ServiceImpl;
import com.github.xiaoymin.knife4j.core.util.StrUtil;
import com.yeejoin.amos.api.alarm.dto.DynamicDetails;
import com.yeejoin.amos.api.alarm.dto.TabContent;
import com.yeejoin.amos.api.alarm.dto.WarningDto;
import com.yeejoin.amos.api.alarm.entity2.JumpConfig;
import com.yeejoin.amos.api.alarm.entity.PointSystem;
import com.yeejoin.amos.api.alarm.mapper.PointSystemMapper;
import com.yeejoin.amos.api.alarm.mapper2.JumpConfigMapper;
import com.yeejoin.amos.api.alarm.service.IPointSystemService;
import com.yeejoin.amos.api.alarm.utils.HttpContentTypeUtil;
import org.apache.commons.lang3.StringUtils;
import org.apache.logging.log4j.LogManager;
import org.apache.logging.log4j.Logger;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.scheduling.annotation.Async;
import org.springframework.stereotype.Service;
import org.typroject.tyboot.component.emq.EmqKeeper;

import java.text.SimpleDateFormat;
import java.util.*;
import java.util.stream.Collectors;

/**
 * @description:
 * @author: tw
 * @createDate: 2023/6/19
 */
@Service
public class PointSystemServiceImpl extends ServiceImpl<PointSystemMapper, PointSystem> implements IPointSystemService {

    private static final Logger logger = LogManager.getLogger(PointSystemServiceImpl.class);

    @Autowired
    PointSystemMapper pointSystemMapper;

    @Value("${power.station.url}")
    private String powerStationUrl;

    private final String TABNAME = "预警问题";
    private final String TEXT = "text";

    @Value("${power.station.warning:104/data/analysis}")
    private String STATIONWARNING;

    @Autowired
    protected EmqKeeper emqKeeper;

    @Autowired
    private JumpConfigMapper jumpConfigMapper;

    public String getJumpUrlByInfo(String sbbm) {
        List<JumpConfig> jumpConfigs = jumpConfigMapper.selectList(null);
        Map<String, String> collect = jumpConfigs.stream()
                .collect(Collectors.toMap(JumpConfig::getType, JumpConfig::getUrl));
        if (StringUtils.isEmpty(sbbm)) {
            return "";
        }
        if (sbbm.indexOf("BAT") != -1) {
            return collect.get("箱变");
        } else if (sbbm.indexOf("WG") != -1) {
            return collect.get("汇流箱");
        } else if (sbbm.indexOf("WC") != -1) {
            return collect.get("逆变器");
        } else if (sbbm.length() == 12 && sbbm.indexOf("MD") != -1) {
            return collect.get("风机");
        } else if (sbbm.length() > 12 && sbbm.indexOf("MD") != -1) {
            return collect.get("风机子系统");
        } else {
            return collect.get("默认");
        }
    }

    @Async("equipAsyncExecutor")
    public void sendWarningAsync(String date) {
        try {
            logger.info("收到告警信息" + date);
            com.alibaba.fastjson.JSONObject messageObj = JSON.parseObject(date);
            String address = messageObj.get("address").toString();
            String value = messageObj.get("value").toString();
            String valueLabe = messageObj.get("valueLabel").toString();
            String gatewayId = messageObj.get("gatewayId").toString();
            this.sendWarning(address, value, valueLabe, gatewayId);
        } catch (Exception e) {
            e.printStackTrace();
        }
    }

    @Override
    public void sendWarning(String address, String value, String valueLabe, String gatewayId) {
        try {
            // look up the kks code by measuring-point address and matching value
            QueryWrapper<PointSystem> pointSystemWrapper = new QueryWrapper<>();
            pointSystemWrapper.lambda().eq(PointSystem::getAddress, address);
            if (!value.equals("false") && !value.equals("true")) {
                pointSystemWrapper.lambda().eq(PointSystem::getValue, value);
            }
            pointSystemWrapper.lambda().eq(PointSystem::getGatewayId, gatewayId);
            List<PointSystem> pointSystems = pointSystemMapper.selectList(pointSystemWrapper);
            if (pointSystems == null || pointSystems.size() < 1) {
                throw new RuntimeException("获取kks码失败!");
            }
            PointSystem pointSystem = pointSystems.get(0);
            if (pointSystem.getType().equals("遥信")) {
                return;
            }
            // call the third-party interface to fetch the equipment details
            Map<String, String> maps = new HashMap<>();
            maps.put("type", "equipinfo");
            maps.put("kksbm", pointSystem.getKks());
            String data = HttpContentTypeUtil.sendHttpPost(powerStationUrl, maps);
            if (StringUtils.isEmpty(data) || !(Boolean) JSON.parseObject(data).get("success")) {
                throw new RuntimeException("获取设备信息失败!");
            }
            JSONObject json = JSON.parseObject(data);
            JSONObject jsond = (JSONObject) json.get("dataset");
            JSONArray list = (JSONArray) jsond.get("datas");
            JSONObject eqdata = null;
            if (list == null || list.isEmpty()) {
                throw new RuntimeException("获取设备信息失败!");
            }
            eqdata = (JSONObject) list.get(0);
            // assemble the payload and send the warning
            WarningDto warningDto = setWarningDto(pointSystem, eqdata, valueLabe);
            emqKeeper.getMqttClient().publish(STATIONWARNING, JSON.toJSONString(warningDto).getBytes(), 0, false);
        } catch (Exception e) {
            e.printStackTrace();
            throw new RuntimeException("预警消息发送失败!");
        }
    }

    public WarningDto setWarningDto(PointSystem pointSystem, JSONObject eqdata, String valueLabe) {
        SimpleDateFormat sdf = new SimpleDateFormat("yyyy-MM-dd HH:mm:ss");
        String time = sdf.format(new Date());
        String warningObjectCode = pointSystem.getKks();
        List<TabContent> tabContent = new ArrayList<>();
        tabContent.add(new TabContent("KKS编码", TEXT, warningObjectCode, "key1"));
        tabContent.add(new TabContent("设备名称", TEXT, eqdata.get("kksms"), "key2"));
        tabContent.add(new TabContent("告警原因", TEXT, valueLabe, "key3"));
        tabContent.add(new TabContent("发生时间", TEXT, time, "key4"));
        DynamicDetails dynamicDetails = new DynamicDetails(TABNAME, tabContent);
        List<DynamicDetails> dynamicDetailsList = new ArrayList<>();
        dynamicDetailsList.add(dynamicDetails);
        StringBuilder indexKey = new StringBuilder(pointSystem.getStation())
                .append("#")
                .append(pointSystem.getNumber())
                .append("#")
                .append(pointSystem.getFunctionNum());
        String indexValue = valueLabe;
        WarningDto WarningDto = new WarningDto(indexKey.toString(), indexValue, null,
                (String) eqdata.get("sourceAttributionDesc"),
                (String) eqdata.get("sourceAttribution"),
                dynamicDetailsList, warningObjectCode, time,
                (String) eqdata.get("kksms"), "equip",
                getJumpUrlByInfo(warningObjectCode));
        return WarningDto;
    }
}
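A second reviewer aid, also not part of the committed code: the snippet below rebuilds the same WarningDto that setWarningDto() assembles and prints its JSON form, which is roughly the payload published to the 104/data/analysis EMQ topic. It uses only the DTO constructors added in this commit; the class name and every value are placeholders.

// Illustrative sketch only; not committed code. All values are placeholders.
import com.alibaba.fastjson.JSON;
import com.yeejoin.amos.api.alarm.dto.DynamicDetails;
import com.yeejoin.amos.api.alarm.dto.TabContent;
import com.yeejoin.amos.api.alarm.dto.WarningDto;

import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

public class WarningPayloadPreview {

    public static void main(String[] args) {
        // Same tab layout as setWarningDto(): KKS code, equipment name, alarm reason, occurrence time
        List<TabContent> tabContent = new ArrayList<>(Arrays.asList(
                new TabContent("KKS编码", "text", "SAMPLE-KKS-CODE", "key1"),
                new TabContent("设备名称", "text", "sample equipment", "key2"),
                new TabContent("告警原因", "text", "sample alarm reason", "key3"),
                new TabContent("发生时间", "text", "2024-06-04 00:00:00", "key4")));
        List<DynamicDetails> details = new ArrayList<>();
        details.add(new DynamicDetails("预警问题", tabContent));

        // indexKey is station#number#functionNum, exactly as built in setWarningDto()
        WarningDto dto = new WarningDto("station#number#functionNum", "sample alarm reason", null,
                "sourceAttributionDesc", "sourceAttribution", details,
                "SAMPLE-KKS-CODE", "2024-06-04 00:00:00", "sample equipment", "equip",
                "http://example.invalid/jump");
        System.out.println(JSON.toJSONString(dto));
    }
}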
amos-boot-data/amos-boot-data-alarm/src/main/java/com/yeejoin/amos/api/alarm/service/impl/producerServers.java
0 → 100644
//package com.yeejoin.amos.api.alarm.service.impl;
//
//import com.alibaba.fastjson.JSON;
//import org.apache.kafka.clients.admin.NewTopic;
//import org.apache.kafka.clients.producer.ProducerRecord;
//import org.checkerframework.checker.units.qual.K;
//import org.springframework.beans.factory.annotation.Autowired;
//import org.springframework.kafka.core.KafkaTemplate;
//import org.springframework.kafka.support.SendResult;
//import org.springframework.scheduling.annotation.Scheduled;
//import org.springframework.stereotype.Service;
//import org.springframework.util.concurrent.ListenableFuture;
//import org.springframework.util.concurrent.ListenableFutureCallback;
//
//import javax.annotation.PostConstruct;
//import javax.annotation.Resource;
//
///**
// * @description:
// * @author: tw
// * @createDate: 2023/6/28
// */
//@Service
//public class producerServers {
//
//
// @Autowired
// private KafkaTemplate<String, String> kafkaTemplate;
//
//@Scheduled(fixedRate = 60000)
// public void send(){
// String gg1="1668801435891929089@18873";
// String gg2="1668801435891929089@18874";
// String gg3="1668801435891929089@18875";
// String gg4="1668801435891929089@18876";
// String gg5="1668801435891929089@18877";
// String gg6="1668801435891929089@18878";
// String gg7="1668801435891929089@18879";
// String gg8="1668801435891929089@18880";
//
//
// String topic="jf1";
//
// ProducerRecord<String, String> producerRecord1 = new ProducerRecord<String, String>( topic, gg1.hashCode()%3, gg1.hashCode()%3+"", gg1+"==============="+gg1.hashCode()%5);
// ProducerRecord<String, String> producerRecord2 = new ProducerRecord<String, String>( topic, gg2.hashCode()%3,gg2.hashCode()%3+"", gg2+"==============="+gg2.hashCode()%5);
// ProducerRecord<String, String> producerRecord3 = new ProducerRecord<String, String>( topic, gg3.hashCode()%3,gg3.hashCode()%3+"", gg3+"==============="+gg3.hashCode()%5);
// ProducerRecord<String, String> producerRecord4 = new ProducerRecord<String, String>( topic, gg4.hashCode()%3,gg4.hashCode()%3+"", gg4+"==============="+gg4.hashCode()%5);
// ProducerRecord<String, String> producerRecord5 = new ProducerRecord<String, String>( topic, gg5.hashCode()%3,gg5.hashCode()%3+"", gg5+"==============="+gg5.hashCode()%5);
// ProducerRecord<String, String> producerRecord6 = new ProducerRecord<String, String>( topic, gg6.hashCode()%3,gg6.hashCode()%3+"", gg6+"==============="+gg6.hashCode()%5);
// ProducerRecord<String, String> producerRecord7 = new ProducerRecord<String, String>( topic, gg7.hashCode()%3,gg7.hashCode()%3+"", gg7+"==============="+gg7.hashCode()%5);
// ProducerRecord<String, String> producerRecord8 = new ProducerRecord<String, String>( topic, gg8.hashCode()%3,gg8.hashCode()%3+"", gg8+"==============="+gg8.hashCode()%5);
//
// System.out.println(gg1.hashCode()%3);
// System.out.println(gg2.hashCode()%3);
// System.out.println(gg3.hashCode()%3);
// System.out.println(gg4.hashCode()%3);
// System.out.println(gg5.hashCode()%3);
// System.out.println(gg6.hashCode()%3);
// System.out.println(gg7.hashCode()%3);
// System.out.println(gg8.hashCode()%3);
//
// kafkaTemplate.send(producerRecord1);
// kafkaTemplate.send(producerRecord2);
// kafkaTemplate.send(producerRecord3);
// kafkaTemplate.send(producerRecord4);
// kafkaTemplate.send(producerRecord5);
// kafkaTemplate.send(producerRecord6);
// kafkaTemplate.send(producerRecord7);
// kafkaTemplate.send(producerRecord8);
//
//
//
//
// }
//
//
//
//
//
//}
amos-boot-data/amos-boot-data-alarm/src/main/java/com/yeejoin/amos/api/alarm/utils/HttpContentTypeUtil.java
0 → 100644
This diff is collapsed.
amos-boot-data/amos-boot-data-alarm/src/main/resources/application-dev.properties
0 → 100644
spring.application.name=AMOS-ALARM
server.servlet.context-path=/alarm
server.port=11007
# jdbc_config
spring.datasource.driver-class-name=com.mysql.cj.jdbc.Driver
spring.datasource.url=jdbc:mysql://172.16.10.220:3306/equipment?useUnicode=true&allowMultiQueries=true&characterEncoding=utf-8&useJDBCCompliantTimezoneShift=true&useLegacyDatetimeCode=false&serverTimezone=Asia/Shanghai
spring.datasource.username=root
spring.datasource.password=Yeejoin@2020
spring.datasource.type=com.zaxxer.hikari.HikariDataSource
spring.datasource.hikari.pool-name=DatebookHikariCP
spring.datasource.hikari.minimum-idle=3
spring.datasource.hikari.maximum-pool-size=30
spring.datasource.hikari.auto-commit=true
spring.datasource.hikari.idle-timeout=500000
spring.datasource.hikari.max-lifetime=1800000
spring.datasource.hikari.connection-timeout=60000
spring.datasource.hikari.connection-test-query=SELECT 1
## db2-sync_data
spring.db2.datasource.type: com.alibaba.druid.pool.DruidDataSource
spring.db2.datasource.url=jdbc:mysql://139.9.173.44:3306/jxiop_sync_data?allowMultiQueries=true&serverTimezone=GMT%2B8&characterEncoding=utf8
spring.db2.datasource.username=root
spring.db2.datasource.password=Yeejoin@2020
spring.db2.datasource.driver-class-name: com.mysql.cj.jdbc.Driver
# REDIS (RedisProperties)
spring.redis.database=1
spring.redis.host=172.16.10.220
spring.redis.port=6379
spring.redis.password=yeejoin@2020
spring.redis.lettuce.pool.max-active=200
spring.redis.lettuce.pool.max-wait=-1
spring.redis.lettuce.pool.max-idle=10
spring.redis.lettuce.pool.min-idle=0
spring.redis.expire.time=30000
# registry center (Eureka) address
eureka.client.registry-fetch-interval-seconds=5
management.endpoint.health.show-details=always
management.endpoints.web.exposure.include=*
eureka.instance.health-check-url-path=/actuator/health
eureka.instance.lease-expiration-duration-in-seconds=10
eureka.instance.lease-renewal-interval-in-seconds=5
eureka.instance.metadata-map.management.context-path=${server.servlet.context-path}/actuator
eureka.instance.status-page-url-path=/actuator/info
eureka.instance.metadata-map.management.api-docs=http://localhost:${server.port}${server.servlet.context-path}/doc.html
eureka.instance.hostname=172.16.10.220
eureka.instance.prefer-ip-address=true
eureka.client.serviceUrl.defaultZone=http://${spring.security.user.name}:${spring.security.user.password}@172.16.10.220:10001/eureka/
spring.security.user.name=admin
spring.security.user.password=a1234560
## emqx
emqx.clean-session=true
emqx.client-id=${spring.application.name}-${random.int[1024,65536]}
emqx.broker=tcp://172.16.10.220:1883
emqx.user-name=admin
emqx.password=public
mqtt.scene.host=mqtt://172.16.10.220:8083/mqtt
mqtt.client.product.id=mqtt
mqtt.topic=topic_mqtt
spring.mqtt.completionTimeout=3000
amos-boot-data/amos-boot-data-alarm/src/main/resources/application.properties
0 → 100644
spring.profiles.active=dev
server.compression.enabled=true
spring.jackson.dateFormat=yyyy-MM-dd HH:mm:ss
logging.config=classpath:logback-${spring.profiles.active}.xml
# file upload size limits
spring.servlet.multipart.maxFileSize=3MB
spring.servlet.multipart.maxRequestSize=3MB
## redis cache expiration time
redis.cache.failure.time=10800
# mybatis-plus
mybatis-plus.mapper-locations=classpath:mapper/*Mapper.xml
# consumer group name
# consumer broker address
spring.kafka.consumer.bootstrap-servers=121.199.39.218:9092
# whether offsets are committed automatically
spring.kafka.consumer.enable-auto-commit=false
# where to start consuming when there is no committed offset
spring.kafka.consumer.auto-offset-reset=earliest
# consumer deserializers
spring.kafka.consumer.key-deserializer=org.apache.kafka.common.serialization.StringDeserializer
spring.kafka.consumer.value-deserializer=org.apache.kafka.common.serialization.StringDeserializer
# manual acknowledgement mode
spring.kafka.listener.ack-mode=manual_immediate
# listener type
spring.kafka.listener.type=single
# concurrency
#spring.kafka.listener.concurrency=5
# number of times a message is resent after an error
spring.kafka.producer.retries=1
# producer broker address
spring.kafka.producer.bootstrap-servers=121.199.39.218:9092
# default batch size (in bytes)
spring.kafka.producer.batch-size=16384
# total bytes of memory the producer can use to buffer records waiting to be sent to the server
spring.kafka.producer.buffer-memory=33554432
# producer serializers
spring.kafka.producer.key-serializer=org.apache.kafka.common.serialization.StringSerializer
# Kafka's default String serializer
spring.kafka.producer.value-serializer=org.apache.kafka.common.serialization.StringSerializer
kafka.equipment.alarm=EQUIPMENT_ALARM
kafka.equipment.test=test88888
# third-party power-station interface for querying equipment KKS codes
power.station.url=http://139.9.169.123:5024/prod-api/fdgl/process/DataInterface
# power-station 104 data-acquisition warning topic
power.station.warning=104/data/analysis
\ No newline at end of file
amos-boot-data/amos-boot-data-alarm/src/main/resources/logback-dev.xml
0 → 100644
<?xml version="1.0" encoding="UTF-8"?>
<configuration debug="false">
    <!-- console output -->
    <appender name="STDOUT" class="ch.qos.logback.core.ConsoleAppender">
        <encoder class="ch.qos.logback.classic.encoder.PatternLayoutEncoder">
            <!-- output format: %d is the date, %thread the thread name, %-5level the level padded to 5 characters, %msg the log message, %n a newline -->
            <pattern>%d{yyyy-MM-dd HH:mm:ss.SSS} [%thread] %-5level %logger{50} - %msg%n</pattern>
        </encoder>
    </appender>
    <!-- show parameters for hibernate sql, tailored for Hibernate
    <logger name="org.hibernate.type.descriptor.sql.BasicBinder" level="TRACE" />
    <logger name="org.hibernate.type.descriptor.sql.BasicExtractor" level="DEBUG" />
    <logger name="org.hibernate.SQL" level="DEBUG" />
    <logger name="org.hibernate.engine.QueryParameters" level="DEBUG" />
    <logger name="org.hibernate.engine.query.HQLQueryPlan" level="DEBUG" />
    -->
    <!-- mybatis log configuration -->
    <logger name="com.apache.ibatis" level="ERROR"/>
    <logger name="java.sql.Connection" level="ERROR"/>
    <logger name="java.sql.Statement" level="ERROR"/>
    <logger name="java.sql.PreparedStatement" level="ERROR"/>
    <logger name="com.baomidou" level="ERROR"/>
    <logger name="org.springframework" level="INFO"/>
    <logger name="org.apache.activemq" level="INFO"/>
    <!-- root log level -->
    <root level="ERROR">
        <appender-ref ref="STDOUT"/>
    </root>
    <!-- asynchronous logging to a database -->
    <!--<appender name="DB" class="ch.qos.logback.classic.db.DBAppender">-->
    <!--<!– log asynchronously to the database –>-->
    <!--<connectionSource class="ch.qos.logback.core.db.DriverManagerConnectionSource">-->
    <!--<!– connection pool –>-->
    <!--<dataSource class="com.mchange.v2.c3p0.ComboPooledDataSource">-->
    <!--<driverClass>com.mysql.jdbc.Driver</driverClass>-->
    <!--<url>jdbc:mysql://127.0.0.1:3306/databaseName</url>-->
    <!--<user>root</user>-->
    <!--<password>root</password>-->
    <!--</dataSource>-->
    <!--</connectionSource>-->
    <!--</appender>-->
</configuration>
\ No newline at end of file
amos-boot-data/pom.xml
...
@@ -22,6 +22,7 @@
     <modules>
         <module>amos-boot-data-housepvapi</module>
+        <module>amos-boot-data-alarm</module>
     </modules>
 </project>