09:53:36.908 [vert.x-eventloop-thread-1] INFO LogUtil [45695eqId] org.folio.rest.RestVerticle runPostDeployHook no Post Deploy Hook implementation found, continuing with deployment
09:53:36.908 [vert.x-eventloop-thread-1] INFO LogUtil [45695eqId] org.folio.rest.RestVerticle start http server for apis and docs started on port 8081.
09:53:36.908 [vert.x-eventloop-thread-1] INFO LogUtil [45695eqId] org.folio.rest.RestVerticle start Documentation available at: http://localhost:8081/apidocs/
Jul 13, 2021 9:53:36 AM io.vertx.core.impl.launcher.commands.VertxIsolatedDeployer
INFO: Succeeded in deploying verticle
09:53:39.815 [vert.x-worker-thread-19] INFO ostgresClientFactory [48602eqId] Creating new database connection pool for tenant diku
09:53:40.507 [vert.x-worker-thread-19] DEBUG KafkaConsumerWrapper [49294eqId] Consumer - id: 10 subscriptionPattern: SubscriptionDefinition(eventType=DI_INVENTORY_INSTANCE_CREATED_READY_FOR_POST_PROCESSING, subscriptionPattern=folio\.Default\.\w{1,}\.DI_INVENTORY_INSTANCE_CREATED_READY_FOR_POST_PROCESSING) a Record has been received. key: 96 currentLoad: 2 globalLoad: 2
09:53:41.714 [vert.x-worker-thread-19] DEBUG taImportKafkaHandler [50501eqId] Data import event payload has been received with event type: DI_INVENTORY_INSTANCE_CREATED_READY_FOR_POST_PROCESSING and correlationId: 9c91294f-375e-4488-9439-f4cf5bbd667b
09:53:42.822 [vert.x-worker-thread-19] DEBUG KafkaConsumerWrapper [51609eqId] Consumer - id: 10 subscriptionPattern: SubscriptionDefinition(eventType=DI_INVENTORY_INSTANCE_CREATED_READY_FOR_POST_PROCESSING, subscriptionPattern=folio\.Default\.\w{1,}\.DI_INVENTORY_INSTANCE_CREATED_READY_FOR_POST_PROCESSING) a Record has been received. key: 97 currentLoad: 3 globalLoad: 3
09:53:43.208 [vert.x-worker-thread-19] DEBUG taImportKafkaHandler [51995eqId] Data import event payload has been received with event type: DI_INVENTORY_INSTANCE_CREATED_READY_FOR_POST_PROCESSING and correlationId: c459488c-2116-4515-a26f-5ba0a01251f7
09:53:44.521 [vert.x-worker-thread-19] DEBUG KafkaConsumerWrapper [53308eqId] Consumer - id: 10 subscriptionPattern: SubscriptionDefinition(eventType=DI_INVENTORY_INSTANCE_CREATED_READY_FOR_POST_PROCESSING, subscriptionPattern=folio\.Default\.\w{1,}\.DI_INVENTORY_INSTANCE_CREATED_READY_FOR_POST_PROCESSING) a Record has been received. key: 98 currentLoad: 4 globalLoad: 4
09:53:44.914 [vert.x-worker-thread-19] DEBUG taImportKafkaHandler [53701eqId] Data import event payload has been received with event type: DI_INVENTORY_INSTANCE_CREATED_READY_FOR_POST_PROCESSING and correlationId: c8f80c49-240e-4b0c-8faa-4a8d82372ae3
09:53:46.118 [vert.x-worker-thread-19] DEBUG KafkaConsumerWrapper [54905eqId] Consumer - id: 10 subscriptionPattern: SubscriptionDefinition(eventType=DI_INVENTORY_INSTANCE_CREATED_READY_FOR_POST_PROCESSING, subscriptionPattern=folio\.Default\.\w{1,}\.DI_INVENTORY_INSTANCE_CREATED_READY_FOR_POST_PROCESSING) a Record has been received. key: 99 currentLoad: 5 globalLoad: 5
09:53:46.314 [vert.x-worker-thread-19] DEBUG taImportKafkaHandler [55101eqId] Data import event payload has been received with event type: DI_INVENTORY_INSTANCE_CREATED_READY_FOR_POST_PROCESSING and correlationId: bbdb7cc6-5183-4ef3-9e60-9760f5995596
09:53:47.117 [vert.x-worker-thread-19] DEBUG KafkaConsumerWrapper [55904eqId] Consumer - id: 10 subscriptionPattern: SubscriptionDefinition(eventType=DI_INVENTORY_INSTANCE_CREATED_READY_FOR_POST_PROCESSING, subscriptionPattern=folio\.Default\.\w{1,}\.DI_INVENTORY_INSTANCE_CREATED_READY_FOR_POST_PROCESSING) kafkaConsumer.pause() requested
09:53:47.118 [vert.x-worker-thread-19] DEBUG KafkaConsumerWrapper [55905eqId] Consumer - id: 10 subscriptionPattern: SubscriptionDefinition(eventType=DI_INVENTORY_INSTANCE_CREATED_READY_FOR_POST_PROCESSING, subscriptionPattern=folio\.Default\.\w{1,}\.DI_INVENTORY_INSTANCE_CREATED_READY_FOR_POST_PROCESSING) a Record has been received.
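The KafkaConsumerWrapper lines above show the load counters climbing with each received record until `kafkaConsumer.pause()` is requested once globalLoad reaches 5. A minimal sketch of that load-gated backpressure, assuming a hypothetical threshold of 5 inferred from the log (the real limit is a mod-source-record-storage configuration value, and `PausableConsumer` is an illustrative name, not the actual class):

```python
class PausableConsumer:
    """Illustrative load-gated consumer: pause intake once in-flight records hit a cap."""

    def __init__(self, max_global_load=5):
        # max_global_load is an assumed threshold, inferred from the log where
        # pause() was requested after globalLoad reached 5.
        self.max_global_load = max_global_load
        self.global_load = 0
        self.paused = False

    def on_record(self, key):
        """Called per received record; returns whether the consumer is now paused."""
        self.global_load += 1
        if self.global_load >= self.max_global_load and not self.paused:
            self.paused = True  # corresponds to "kafkaConsumer.pause() requested"
        return self.paused

    def on_complete(self):
        """Called when a record finishes processing; resume below the cap."""
        self.global_load -= 1
        if self.paused and self.global_load < self.max_global_load:
            self.paused = False


consumer = PausableConsumer()
for key in (96, 97, 98, 99, 0):
    consumer.on_record(key)
print(consumer.paused)  # → True
```

Note that a record already in flight (key 0 above) can still arrive after the pause request, which matches the "a Record has been received" line following the pause in the log.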
key: 0 currentLoad: 6 globalLoad: 6
09:53:47.209 [vert.x-worker-thread-19] DEBUG taImportKafkaHandler [55996eqId] Data import event payload has been received with event type: DI_INVENTORY_INSTANCE_CREATED_READY_FOR_POST_PROCESSING and correlationId: 64882bdd-3fd5-40ba-ba44-491ed0e67cce
09:53:50.612 [vert.x-worker-thread-19] INFO JooqLogger [59399eqId]
[jOOQ ASCII-art logo banner] Thank you for using jOOQ 3.13.6
Jul 13, 2021 9:54:14 AM io.vertx.sqlclient.impl.SocketConnectionBase
WARNING: Backend notice: severity='WARNING', code='57P02', message='terminating connection because of crash of another server process', detail='The postmaster has commanded this server process to roll back the current transaction and exit, because another server process exited abnormally and possibly corrupted shared memory.', hint='In a moment you should be able to reconnect to the database and repeat your command.', position='null', internalPosition='null', internalQuery='null', where='SQL statement "delete from diku_mod_source_record_storage.marc_indexers where marc_id = NEW.id"
PL/pgSQL function insert_marc_indexers() line 4 at SQL statement', file='postgres.c', line='2663', routine='quickdie', schema='null', table='null', column='null', dataType='null', constraint='null'
[the identical Backend notice was logged four more times: three at 09:54:14.459 and one at 09:54:14.533]
09:54:18.092 [vert.x-worker-thread-13] ERROR ocessingEventHandler [86879eqId] Failed to handle instance event {}
io.netty.channel.AbstractChannel$AnnotatedNoRouteToHostException: Host is unreachable: pg-folio.folijet.svc.cluster.local/10.100.205.81:5432
Caused by: java.net.NoRouteToHostException: Host is unreachable
    at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method) ~[?:?]
    at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:779) ~[?:?]
    at io.netty.channel.socket.nio.NioSocketChannel.doFinishConnect(NioSocketChannel.java:330) ~[mod-source-record-storage-server-fat.jar:?]
    at io.netty.channel.nio.AbstractNioChannel$AbstractNioUnsafe.finishConnect(AbstractNioChannel.java:334) ~[mod-source-record-storage-server-fat.jar:?]
    at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:702) ~[mod-source-record-storage-server-fat.jar:?]
    at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:650) ~[mod-source-record-storage-server-fat.jar:?]
    at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:576) ~[mod-source-record-storage-server-fat.jar:?]
    at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:493) ~[mod-source-record-storage-server-fat.jar:?]
    at io.netty.util.concurrent.SingleThreadEventExecutor$4.run(SingleThreadEventExecutor.java:989) ~[mod-source-record-storage-server-fat.jar:?]
    at io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74) ~[mod-source-record-storage-server-fat.jar:?]
    at io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30) [mod-source-record-storage-server-fat.jar:?]
    at java.lang.Thread.run(Thread.java:834) [?:?]
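The Backend notice above carries SQLSTATE 57P02 (crash_shutdown), whose own hint says the failure is transient ("In a moment you should be able to reconnect"), and the subsequent NoRouteToHostException is likewise a connection-level error rather than a statement error. A hedged sketch of classifying such failures as retryable (illustrative only, not mod-source-record-storage code; the SQLSTATE values are taken from the PostgreSQL error-code appendix):

```python
# Hypothetical retry classifier. SQLSTATE class 08 covers connection
# failures; class 57 ("operator intervention") covers server shutdowns
# such as the 57P02 crash_shutdown seen in the Backend notice above.
RETRYABLE_SQLSTATES = {
    "57P01",  # admin_shutdown
    "57P02",  # crash_shutdown
    "57P03",  # cannot_connect_now
    "08001",  # sqlclient_unable_to_establish_sqlconnection
    "08006",  # connection_failure
}


def is_retryable(sqlstate: str) -> bool:
    """True if the statement should be retried after the pool reconnects."""
    return sqlstate in RETRYABLE_SQLSTATES or sqlstate.startswith("08")


print(is_retryable("57P02"))  # → True
print(is_retryable("23505"))  # unique_violation is a data error → False
```

Without such classification, every in-flight data-import event fails hard during the database's crash recovery, which is exactly what the ERROR lines that follow show.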
09:54:18.714 [vert.x-worker-thread-13] INFO AbstractConfig [87501eqId] ProducerConfig values:
    acks = -1
    batch.size = 16384
    bootstrap.servers = [kafka:tcp://10.100.110.212:9092]
    buffer.memory = 33554432
    client.dns.lookup = default
    client.id = producer-2
    compression.type = none
    connections.max.idle.ms = 540000
    delivery.timeout.ms = 120000
    enable.idempotence = true
    interceptor.classes = []
    key.serializer = class org.apache.kafka.common.serialization.StringSerializer
    linger.ms = 0
    max.block.ms = 60000
    max.in.flight.requests.per.connection = 5
    max.request.size = 1048576
    metadata.max.age.ms = 300000
    metadata.max.idle.ms = 300000
    metric.reporters = []
    metrics.num.samples = 2
    metrics.recording.level = INFO
    metrics.sample.window.ms = 30000
    partitioner.class = class org.apache.kafka.clients.producer.internals.DefaultPartitioner
    receive.buffer.bytes = 32768
    reconnect.backoff.max.ms = 1000
    reconnect.backoff.ms = 50
    request.timeout.ms = 30000
    retries = 2147483647
    retry.backoff.ms = 100
    sasl.client.callback.handler.class = null
    sasl.jaas.config = null
    sasl.kerberos.kinit.cmd = /usr/bin/kinit
    sasl.kerberos.min.time.before.relogin = 60000
    sasl.kerberos.service.name = null
    sasl.kerberos.ticket.renew.jitter = 0.05
    sasl.kerberos.ticket.renew.window.factor = 0.8
    sasl.login.callback.handler.class = null
    sasl.login.class = null
    sasl.login.refresh.buffer.seconds = 300
    sasl.login.refresh.min.period.seconds = 60
    sasl.login.refresh.window.factor = 0.8
    sasl.login.refresh.window.jitter = 0.05
    sasl.mechanism = GSSAPI
    security.protocol = PLAINTEXT
    security.providers = null
    send.buffer.bytes = 131072
    ssl.cipher.suites = null
    ssl.enabled.protocols = [TLSv1.2]
    ssl.endpoint.identification.algorithm = https
    ssl.key.password = null
    ssl.keymanager.algorithm = SunX509
    ssl.keystore.location = null
    ssl.keystore.password = null
    ssl.keystore.type = JKS
    ssl.protocol = TLSv1.2
    ssl.provider = null
    ssl.secure.random.implementation = null
    ssl.trustmanager.algorithm = PKIX
    ssl.truststore.location = null
    ssl.truststore.password = null
    ssl.truststore.type = JKS
    transaction.timeout.ms = 60000
    transactional.id = null
    value.serializer = class org.apache.kafka.common.serialization.StringSerializer

09:54:18.818 [vert.x-worker-thread-13] INFO KafkaProducer [87605eqId] [Producer clientId=producer-2] Instantiated an idempotent producer.
09:54:18.821 [vert.x-worker-thread-13] INFO KafkaProducer [87608eqId] [Producer clientId=producer-2] Overriding the default retries config to the recommended value of 2147483647 since the idempotent producer is enabled.
09:54:18.822 [vert.x-worker-thread-13] INFO KafkaProducer [87609eqId] [Producer clientId=producer-2] Overriding the default acks to all since idempotence is enabled.
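The KafkaProducer lines show the client overriding `retries` to 2147483647 and `acks` to all because `enable.idempotence = true` (the `acks = -1` in the config dump is the numeric form of `acks=all`). A sketch of that defaulting logic as the log describes it (a hypothetical re-creation for illustration; the real logic lives in the Kafka client's `ProducerConfig`):

```python
def apply_idempotence_overrides(config: dict) -> dict:
    """Return a copy of `config` with the idempotent-producer defaults applied,
    mirroring the overrides the KafkaProducer logs at startup."""
    cfg = dict(config)
    if cfg.get("enable.idempotence"):
        if "retries" not in cfg:
            # "Overriding the default retries config to the recommended
            #  value of 2147483647 since the idempotent producer is enabled."
            cfg["retries"] = 2147483647
        if "acks" not in cfg:
            # "Overriding the default acks to all since idempotence is enabled."
            cfg["acks"] = "all"
    return cfg


cfg = apply_idempotence_overrides({"enable.idempotence": True})
print(cfg["retries"], cfg["acks"])  # → 2147483647 all
```

Idempotence needs acknowledgement from all in-sync replicas and unbounded retries so that resent batches are deduplicated by sequence number rather than dropped or duplicated.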
09:54:18.912 [vert.x-worker-thread-13] INFO ppInfoParser$AppInfo [87699eqId] Kafka version: 2.5.0
09:54:18.919 [vert.x-worker-thread-13] INFO ppInfoParser$AppInfo [87706eqId] Kafka commitId: 66563e712b0b9f84
09:54:18.920 [vert.x-worker-thread-13] INFO ppInfoParser$AppInfo [87707eqId] Kafka startTimeMs: 1626170058909
09:54:19.016 [kafka-producer-network-thread | producer-2] INFO Metadata [87803eqId] [Producer clientId=producer-2] Cluster ID: ydtd602STbyZYzkk_c9eXA
09:54:19.017 [kafka-producer-network-thread | producer-2] INFO TransactionManager [87804eqId] [Producer clientId=producer-2] ProducerId set to 334012 with epoch 0
09:54:19.311 [vert.x-worker-thread-16] INFO KafkaEventPublisher [88098eqId] Event with type: DI_ERROR and correlationId: 64882bdd-3fd5-40ba-ba44-491ed0e67cce was sent to the topic folio.Default.diku.DI_ERROR
09:54:19.409 [vert.x-worker-thread-16] DEBUG KafkaConsumerWrapper [88196eqId] Consumer - id: 10 subscriptionPattern: SubscriptionDefinition(eventType=DI_INVENTORY_INSTANCE_CREATED_READY_FOR_POST_PROCESSING, subscriptionPattern=folio\.Default\.\w{1,}\.DI_INVENTORY_INSTANCE_CREATED_READY_FOR_POST_PROCESSING) Committing offset: 278
09:54:19.612 [vert.x-worker-thread-16] ERROR KafkaConsumerWrapper [88399eqId] Error while processing a record - id: 10 subscriptionPattern: SubscriptionDefinition(eventType=DI_INVENTORY_INSTANCE_CREATED_READY_FOR_POST_PROCESSING, subscriptionPattern=folio\.Default\.\w{1,}\.DI_INVENTORY_INSTANCE_CREATED_READY_FOR_POST_PROCESSING)
io.vertx.core.impl.NoStackTraceThrowable: Failed to process data import event payload
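Throughout the log, the consumer's subscriptionPattern `folio\.Default\.\w{1,}\.DI_INVENTORY_INSTANCE_CREATED_READY_FOR_POST_PROCESSING` embeds the tenant id as `\w{1,}`, which is how one consumer serves every tenant's topic under the `folio.<env>.<tenant>.<eventType>` naming scheme (here the tenant is `diku`). A quick check of the pattern exactly as it appears in the log (Python's `re` stands in for the Java regex engine; the semantics are the same for this pattern):

```python
import re

# Pattern copied verbatim from the KafkaConsumerWrapper log lines.
SUBSCRIPTION = re.compile(
    r"folio\.Default\.\w{1,}\.DI_INVENTORY_INSTANCE_CREATED_READY_FOR_POST_PROCESSING"
)

matching = "folio.Default.diku.DI_INVENTORY_INSTANCE_CREATED_READY_FOR_POST_PROCESSING"
other = "folio.Default.diku.DI_ERROR"

print(bool(SUBSCRIPTION.fullmatch(matching)))  # → True
print(bool(SUBSCRIPTION.fullmatch(other)))     # → False
```

The DI_ERROR event published at 09:54:19.311 therefore goes to a different topic (`folio.Default.diku.DI_ERROR`) with its own subscribers, which is why the failing payload surfaces as an error event rather than being re-consumed here.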