I have a bit of a problem: I have several microservices, but one of them throws an exception that the others don't throw; they work perfectly ...
[2020-09-28 16:55:38.304]|ERROR|TIBCO EMS Session Dispatcher (21297)|org.hibernate.engine.jdbc.spi.SqlExceptionHelper.logExceptions(142): --- ERROR: relation "computed.fluxeventlogging" does not exist
Position: 502
[2020-09-28 16:55:38.307]|INFO |TIBCO EMS Session Dispatcher (21297)|org.hibernate.event.internal.DefaultLoadEventListener.doOnLoad(116): --- HHH000327: Error performing load command
org.hibernate.exception.SQLGrammarException: could not extract ResultSet
at org.hibernate.exception.internal.SQLStateConversionDelegate.convert(SQLStateConversionDelegate.java:103) ~[hibernate-core-5.4.12.Final.jar:5.4.12.Final]
at org.hibernate.exception.internal.StandardSQLExceptionConverter.convert(StandardSQLExceptionConverter.java:42) ~[hibernate-core-5.4.12.Final.jar:5.4.12.Final]
I have the correct permissions on the database:
And my entity is located in a common project used by all the microservices, added as a Maven dependency.
@Entity
@Data
@Table(name = "fluxeventlogging", schema = "computed")
@IdClass(FluxEventLoggingIdEntity.class)
public class FluxEventLoggingEntity implements Serializable {

    @Id
    @Column(name = "fluxeventuuid", columnDefinition = "uuid")
    private UUID fluxEventUuid;

    @Column(name = "lastupdatedate")
    private Instant lastUpdateDate;

    @Column(name = "businessfluxtype")
    private String businessFluxType;

    @Column(name = "fluxprocessortype")
    private String fluxProcessorType;

    @Column(name = "valuewhichcauseupdate")
    private String valueWhichCauseUpdate;

    @Column(name = "oldvaluecause")
    private String oldValueCause;

    @Column(name = "newvaluecause")
    private String newValueCause;

    @Column(name = "currenteventlifestate")
    private String currentEventLifeState;

    @Column(name = "nexteventlifestate")
    private String nextEventLifeState;

    @Column(name = "generatederror", columnDefinition = "text")
    private String generatedError;
}
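The @IdClass referenced above, FluxEventLoggingIdEntity, is not shown here. For context, a minimal sketch of what such an id class typically looks like for this entity (this is an assumption about its shape, not the real class): JPA requires it to be Serializable, have a no-arg constructor, and implement equals/hashCode over fields whose names match the entity's @Id fields.

```java
import java.io.Serializable;
import java.util.Objects;
import java.util.UUID;

// Hypothetical sketch of the id class; the real FluxEventLoggingIdEntity
// from the common project may differ.
public class FluxEventLoggingIdEntity implements Serializable {

    // Field name must match the entity's @Id field
    private UUID fluxEventUuid;

    public FluxEventLoggingIdEntity() {
    }

    public FluxEventLoggingIdEntity(UUID fluxEventUuid) {
        this.fluxEventUuid = fluxEventUuid;
    }

    @Override
    public boolean equals(Object o) {
        if (this == o) return true;
        if (!(o instanceof FluxEventLoggingIdEntity)) return false;
        return Objects.equals(fluxEventUuid, ((FluxEventLoggingIdEntity) o).fluxEventUuid);
    }

    @Override
    public int hashCode() {
        return Objects.hash(fluxEventUuid);
    }
}
```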
However, I pinned the Hibernate version in the pom to:
<dependency>
    <groupId>org.hibernate</groupId>
    <artifactId>hibernate-core</artifactId>
    <version>5.4.12.Final</version>
</dependency>
because this project is the only one that uses a data source for JPA, configured like this:
custom.datasource.url: jdbc:postgresql://localhost:5432/postgres
custom.datasource.database: postgres
custom.datasource.driver: pool
custom.datasource.protocol: postgres
custom.datasource.localhost: localhost
custom.datasource.port: 5432
custom.datasource.password: postgres
custom.datasource.username: postgres
custom.datasource.driverclassname: org.postgresql.Driver
spring:
  jpa:
    properties:
      hibernate:
        dialect: org.hibernate.dialect.PostgreSQL82Dialect
    hibernate:
      ddl-auto: create
    show-sql: false
    database-platform: org.hibernate.dialect.PostgreSQLDialect
  datasource:
    driver-class-name: org.postgresql.Driver
    url: jdbc:postgresql://localhost:5432/postgres
    username: postgres
    password: postgres
    schema: classpath:/schema.sql
    initialization-mode: always
  r2dbc:
    url: r2dbc:postgresql://postgres:postgres@localhost:5432/postgres
    pool:
      enabled: true
      initial-size: 0
      max-size: 500
      max-idle-time: 30m
      validation-query: SELECT 1
And another data source for R2DBC (reactive PostgreSQL):
@Configuration
@Slf4j
public class DatabaseConfiguration {

    private static Map<String, Object> PROPERTIES;

    @Autowired
    DataSource dataSource;

    @Bean
    public ConnectionFactory r2dbcConnectionFactory() {
        if (PROPERTIES == null) {
            Yaml yaml = new Yaml();
            InputStream inputStream = this.getClass()
                .getClassLoader()
                .getResourceAsStream("application.yml");
            PROPERTIES = yaml.load(inputStream);
        }
        log.info("Init r2dbc with host: {}", PROPERTIES.get("custom.datasource.localhost").toString());
        log.info("Init r2dbc with port: {}", PROPERTIES.get("custom.datasource.port").toString());
        log.info("Init r2dbc with database: {}", PROPERTIES.get("custom.datasource.database").toString());
        log.info("Init r2dbc with username: {}", PROPERTIES.get("custom.datasource.username").toString());
        log.info("Init r2dbc with driver: {}", PROPERTIES.get("custom.datasource.driver").toString());
        log.info("Init r2dbc with protocol: {}", PROPERTIES.get("custom.datasource.protocol").toString());
        ConnectionFactoryOptions options = ConnectionFactoryOptions.builder()
            .option(ConnectionFactoryOptions.DRIVER, PROPERTIES.get("custom.datasource.driver").toString())
            .option(ConnectionFactoryOptions.PROTOCOL, PROPERTIES.get("custom.datasource.protocol").toString())
            .option(ConnectionFactoryOptions.USER, PROPERTIES.get("custom.datasource.username").toString())
            .option(ConnectionFactoryOptions.PASSWORD, PROPERTIES.get("custom.datasource.password").toString())
            .option(ConnectionFactoryOptions.HOST, PROPERTIES.get("custom.datasource.localhost").toString())
            .option(ConnectionFactoryOptions.PORT, Integer.parseInt(PROPERTIES.get("custom.datasource.port").toString()))
            .option(ConnectionFactoryOptions.DATABASE, PROPERTIES.get("custom.datasource.database").toString())
            .build();
        return ConnectionFactories.get(options);
        // return ConnectionFactories.get(ConnectionFactoryOptions.parse(PROPERTIES.get("custom.r2dbc.url").toString()));
    }

    @Bean
    public DataSource getDataSource() {
        if (PROPERTIES == null) {
            Yaml yaml = new Yaml();
            InputStream inputStream = this.getClass()
                .getClassLoader()
                .getResourceAsStream("application.yml");
            PROPERTIES = yaml.load(inputStream);
        }
        log.info("Init datasource with url: {}", PROPERTIES.get("custom.datasource.url").toString());
        log.info("Init datasource with username: {}", PROPERTIES.get("custom.datasource.username").toString());
        log.info("Init datasource with driver: {}", PROPERTIES.get("custom.datasource.driverclassname").toString());
        DataSourceBuilder dataSourceBuilder = DataSourceBuilder.create();
        dataSourceBuilder.url(PROPERTIES.get("custom.datasource.url").toString());
        dataSourceBuilder.username(PROPERTIES.get("custom.datasource.username").toString());
        dataSourceBuilder.password(PROPERTIES.get("custom.datasource.password").toString());
        dataSourceBuilder.driverClassName(PROPERTIES.get("custom.datasource.driverclassname").toString());
        return dataSourceBuilder.build();
    }

    @Bean
    public LocalContainerEntityManagerFactoryBean entityManagerFactory() {
        // JpaVendorAdapter can be autowired as well if it's configured in application properties.
        HibernateJpaVendorAdapter vendorAdapter = new HibernateJpaVendorAdapter();
        vendorAdapter.setGenerateDdl(false);
        LocalContainerEntityManagerFactoryBean factory = new LocalContainerEntityManagerFactoryBean();
        factory.setJpaVendorAdapter(vendorAdapter);
        // Packages to scan for entities.
        factory.setPackagesToScan("fr.microservice2.database", "fr.microservice.common");
        factory.setDataSource(dataSource);
        return factory;
    }

    @Bean
    public PlatformTransactionManager transactionManager(EntityManagerFactory entityManagerFactory) {
        JpaTransactionManager txManager = new JpaTransactionManager();
        txManager.setEntityManagerFactory(entityManagerFactory);
        return txManager;
    }
}
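Side note on the configuration class: lookups like PROPERTIES.get("custom.datasource.url") only work because application.yml declares those custom.* keys as one-line dotted keys. If the same settings were nested under custom: / datasource:, SnakeYAML would produce nested maps and the flat lookup would return null. A minimal stdlib-only sketch of the difference, with plain Maps standing in for what SnakeYAML would parse (the class name is made up for illustration):

```java
import java.util.Map;

public class FlatVsNestedKeys {

    // Shape SnakeYAML produces for nested YAML (custom: -> datasource: -> url:)
    static final Map<String, Object> NESTED = Map.of(
        "custom", Map.of("datasource", Map.of("url", "jdbc:postgresql://localhost:5432/postgres")));

    // Shape it produces when the file uses one-line dotted keys, as in the config above
    static final Map<String, Object> FLAT = Map.of(
        "custom.datasource.url", "jdbc:postgresql://localhost:5432/postgres");

    public static void main(String[] args) {
        // The dotted string is not a key of the nested map, only of the flat one
        System.out.println(NESTED.get("custom.datasource.url")); // null
        System.out.println(FLAT.get("custom.datasource.url"));   // the JDBC URL
    }
}
```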
I don't know why only this microservice can't find my table computed.fluxeventlogging
...
while there are plenty of other tables in this schema that don't cause any problem.
Does anyone have an idea, please? Thank you and best regards.