15 Oct 2021 12:24:31.935 [vert.x-worker-thread-3] INFO MappingMetadataCache [18955307eqId] MappingMetadata was loaded by jobExecutionId '25dbddee-371b-4f41-b75b-a0d0d28f9653'
15 Oct 2021 12:24:32.826 [vert.x-worker-thread-2] INFO KafkaConsumerWrapper [18956198eqId] Consumer - id: 4 subscriptionPattern: SubscriptionDefinition(eventType=DI_SRS_MARC_BIB_INSTANCE_HRID_SET, subscriptionPattern=metadata-spitfire\.Default\.\w{1,}\.DI_SRS_MARC_BIB_INSTANCE_HRID_SET) Committed offset: 7
15 Oct 2021 12:26:12.262 [vert.x-worker-thread-16] INFO ebRequestDiagnostics [19055634eqId] Handling GET /inventory/instances/2e740a82-9965-409c-8c12-679298b53637
15 Oct 2021 13:49:45.121 [vert.x-worker-thread-18] INFO ebRequestDiagnostics [24068493eqId] Handling GET /inventory/instances/2e740a82-9965-409c-8c12-679298b53637
15 Oct 2021 13:58:33.285 [vert.x-worker-thread-6] INFO taImportKafkaHandler [24596657eqId] Data import event payload has been received with event type: DI_SRS_MARC_BIB_RECORD_CREATED correlationId: b995cbba-5ae0-4638-b917-ec1db6d210f4
15 Oct 2021 13:58:33.332 [vert.x-worker-thread-14] INFO ProfileSnapshotCache [24596704eqId] JobProfileSnapshot was loaded by id '9eeaed69-8f9f-4af4-9798-43cd8e4969a5'
15 Oct 2021 13:58:33.616 [vert.x-worker-thread-1] INFO MappingMetadataCache [24596988eqId] MappingMetadata was loaded by jobExecutionId '9862e2d4-26b6-4b70-aef4-b2092be32a9f'
15 Oct 2021 13:58:34.317 [vert.x-worker-thread-5] INFO AbstractConfig [24597689eqId] ProducerConfig values: acks = -1 batch.size = 16384 bootstrap.servers = [kafka:9092] buffer.memory = 33554432 client.dns.lookup = default client.id = producer-19 compression.type = gzip connections.max.idle.ms = 540000 delivery.timeout.ms = 120000 enable.idempotence = true interceptor.classes = [] key.serializer = class org.apache.kafka.common.serialization.StringSerializer linger.ms = 0 max.block.ms = 60000 max.in.flight.requests.per.connection = 5 max.request.size = 1048576 metadata.max.age.ms = 300000 metadata.max.idle.ms = 300000 metric.reporters = [] metrics.num.samples = 2 metrics.recording.level = INFO metrics.sample.window.ms = 30000 partitioner.class = class org.apache.kafka.clients.producer.internals.DefaultPartitioner receive.buffer.bytes = 32768 reconnect.backoff.max.ms = 1000 reconnect.backoff.ms = 50 request.timeout.ms = 30000 retries = 2147483647 retry.backoff.ms = 100 sasl.client.callback.handler.class = null sasl.jaas.config = null sasl.kerberos.kinit.cmd = /usr/bin/kinit sasl.kerberos.min.time.before.relogin = 60000 sasl.kerberos.service.name = null sasl.kerberos.ticket.renew.jitter = 0.05 sasl.kerberos.ticket.renew.window.factor = 0.8 sasl.login.callback.handler.class = null sasl.login.class = null sasl.login.refresh.buffer.seconds = 300 sasl.login.refresh.min.period.seconds = 60 sasl.login.refresh.window.factor = 0.8 sasl.login.refresh.window.jitter = 0.05 sasl.mechanism = GSSAPI security.protocol = PLAINTEXT security.providers = null send.buffer.bytes = 131072 ssl.cipher.suites = null ssl.enabled.protocols = [TLSv1.2] ssl.endpoint.identification.algorithm = null ssl.key.password = null ssl.keymanager.algorithm = SunX509 ssl.keystore.location = null ssl.keystore.password = null ssl.keystore.type = JKS ssl.protocol = TLSv1.2 ssl.provider = null ssl.secure.random.implementation = null ssl.trustmanager.algorithm = PKIX ssl.truststore.location = null ssl.truststore.password = null ssl.truststore.type = JKS transaction.timeout.ms = 60000 transactional.id = null value.serializer = class org.apache.kafka.common.serialization.StringSerializer
15 Oct 2021 13:58:34.318 [vert.x-worker-thread-5] INFO KafkaProducer [24597690eqId] [Producer clientId=producer-19] Instantiated an idempotent producer.
15 Oct 2021 13:58:34.321 [vert.x-worker-thread-5] INFO KafkaProducer [24597693eqId] [Producer clientId=producer-19] Overriding the default retries config to the recommended value of 2147483647 since the idempotent producer is enabled.
15 Oct 2021 13:58:34.322 [vert.x-worker-thread-5] INFO KafkaProducer [24597694eqId] [Producer clientId=producer-19] Overriding the default acks to all since idempotence is enabled.
15 Oct 2021 13:58:34.324 [vert.x-worker-thread-5] INFO ppInfoParser$AppInfo [24597696eqId] Kafka version: 2.5.0
15 Oct 2021 13:58:34.324 [vert.x-worker-thread-5] INFO ppInfoParser$AppInfo [24597696eqId] Kafka commitId: 66563e712b0b9f84
15 Oct 2021 13:58:34.325 [vert.x-worker-thread-5] INFO ppInfoParser$AppInfo [24597697eqId] Kafka startTimeMs: 1634306314324
15 Oct 2021 13:58:34.417 [kafka-producer-network-thread | producer-19] INFO Metadata [24597789eqId] [Producer clientId=producer-19] Cluster ID: ndvmVewOTKSTf8qW5if9eg
15 Oct 2021 13:58:34.418 [kafka-producer-network-thread | producer-19] INFO TransactionManager [24597790eqId] [Producer clientId=producer-19] ProducerId set to 4045 with epoch 0
15 Oct 2021 13:58:34.528 [vert.x-worker-thread-8] INFO KafkaEventPublisher [24597900eqId] Event with type: DI_INVENTORY_INSTANCE_CREATED_READY_FOR_POST_PROCESSING and correlationId: b995cbba-5ae0-4638-b917-ec1db6d210f4 was sent to the topic metadata-spitfire.Default.diku.DI_INVENTORY_INSTANCE_CREATED_READY_FOR_POST_PROCESSING
15 Oct 2021 13:58:34.529 [vert.x-worker-thread-8] INFO KafkaProducer [24597901eqId] [Producer clientId=producer-19] Closing the Kafka producer with timeoutMillis = 9223372036854775807 ms.
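Note: the KafkaConsumerWrapper entries above subscribe by regex topic pattern rather than by fixed topic name, so one subscription covers every tenant. A minimal illustrative sketch (Python; the topic layout metadata-spitfire.Default.<tenant>.<EVENT_TYPE> is taken from the log, everything else is assumption):

```python
import re

# Subscription pattern exactly as it appears in the log entries above.
# \w{1,} is the tenant segment, e.g. "diku".
PATTERN = re.compile(
    r"metadata-spitfire\.Default\.\w{1,}\.DI_SRS_MARC_BIB_INSTANCE_HRID_SET"
)

def matches(topic: str) -> bool:
    """Return True when a topic name matches the subscription pattern."""
    return PATTERN.fullmatch(topic) is not None
```

Because the tenant segment is `\w{1,}`, a tenant id containing characters outside `[A-Za-z0-9_]` (a hyphen, for instance) would not be picked up by this subscription.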
15 Oct 2021 13:58:35.433 [vert.x-worker-thread-18] INFO KafkaConsumerWrapper [24598805eqId] Consumer - id: 7 subscriptionPattern: SubscriptionDefinition(eventType=DI_SRS_MARC_BIB_RECORD_CREATED, subscriptionPattern=metadata-spitfire\.Default\.\w{1,}\.DI_SRS_MARC_BIB_RECORD_CREATED) Committed offset: 9
15 Oct 2021 13:58:36.188 [vert.x-worker-thread-11] INFO eHridSetKafkaHandler [24599560eqId] Event payload has been received with event type: DI_SRS_MARC_BIB_INSTANCE_HRID_SET and correlationId: b995cbba-5ae0-4638-b917-ec1db6d210f4
15 Oct 2021 13:58:36.208 [vert.x-worker-thread-1] INFO MappingMetadataCache [24599580eqId] MappingMetadata was loaded by jobExecutionId '9862e2d4-26b6-4b70-aef4-b2092be32a9f'
15 Oct 2021 13:58:37.191 [vert.x-worker-thread-13] INFO KafkaConsumerWrapper [24600563eqId] Consumer - id: 4 subscriptionPattern: SubscriptionDefinition(eventType=DI_SRS_MARC_BIB_INSTANCE_HRID_SET, subscriptionPattern=metadata-spitfire\.Default\.\w{1,}\.DI_SRS_MARC_BIB_INSTANCE_HRID_SET) Committed offset: 8
15 Oct 2021 13:58:59.726 [vert.x-worker-thread-16] INFO ebRequestDiagnostics [24623098eqId] Handling GET /inventory/instances/2e740a82-9965-409c-8c12-679298b53637
15 Oct 2021 14:07:38.573 [vert.x-worker-thread-3] INFO ebRequestDiagnostics [25141945eqId] Handling GET /inventory/instances/b6fb94de-311f-4ff9-b95e-b3a90e7960bc
15 Oct 2021 14:07:41.984 [vert.x-kafka-consumer-thread-4] INFO ConsumerCoordinator [25145356eqId] [Consumer clientId=consumer-DI_SRS_MARC_HOLDING_RECORD_CREATED.mod-inventory-17.1.0-5, groupId=DI_SRS_MARC_HOLDING_RECORD_CREATED.mod-inventory-17.1.0] Revoke previously assigned partitions
15 Oct 2021 14:07:41.984 [vert.x-kafka-consumer-thread-4] INFO AbstractCoordinator [25145356eqId] [Consumer clientId=consumer-DI_SRS_MARC_HOLDING_RECORD_CREATED.mod-inventory-17.1.0-5, groupId=DI_SRS_MARC_HOLDING_RECORD_CREATED.mod-inventory-17.1.0] (Re-)joining group
15 Oct 2021 14:07:42.248 [vert.x-kafka-consumer-thread-5] INFO tbeatResponseHandler [25145620eqId] [Consumer clientId=consumer-DI_SRS_MARC_HOLDING_RECORD_CREATED.mod-inventory-17.1.0-6, groupId=DI_SRS_MARC_HOLDING_RECORD_CREATED.mod-inventory-17.1.0] Attempt to heartbeat failed since group is rebalancing
15 Oct 2021 14:07:42.250 [vert.x-kafka-consumer-thread-5] INFO ConsumerCoordinator [25145622eqId] [Consumer clientId=consumer-DI_SRS_MARC_HOLDING_RECORD_CREATED.mod-inventory-17.1.0-6, groupId=DI_SRS_MARC_HOLDING_RECORD_CREATED.mod-inventory-17.1.0] Revoke previously assigned partitions
15 Oct 2021 14:07:42.250 [vert.x-kafka-consumer-thread-5] INFO AbstractCoordinator [25145622eqId] [Consumer clientId=consumer-DI_SRS_MARC_HOLDING_RECORD_CREATED.mod-inventory-17.1.0-6, groupId=DI_SRS_MARC_HOLDING_RECORD_CREATED.mod-inventory-17.1.0] (Re-)joining group
15 Oct 2021 14:07:42.892 [vert.x-kafka-consumer-thread-6] INFO ConsumerCoordinator [25146264eqId] [Consumer clientId=consumer-DI_SRS_MARC_HOLDING_RECORD_CREATED.mod-inventory-17.1.0-7, groupId=DI_SRS_MARC_HOLDING_RECORD_CREATED.mod-inventory-17.1.0] Revoke previously assigned partitions
15 Oct 2021 14:07:42.892 [vert.x-kafka-consumer-thread-6] INFO AbstractCoordinator [25146264eqId] [Consumer clientId=consumer-DI_SRS_MARC_HOLDING_RECORD_CREATED.mod-inventory-17.1.0-7, groupId=DI_SRS_MARC_HOLDING_RECORD_CREATED.mod-inventory-17.1.0] (Re-)joining group
15 Oct 2021 14:07:42.899 [vert.x-kafka-consumer-thread-6] INFO ConsumerCoordinator [25146271eqId] [Consumer clientId=consumer-DI_SRS_MARC_HOLDING_RECORD_CREATED.mod-inventory-17.1.0-7, groupId=DI_SRS_MARC_HOLDING_RECORD_CREATED.mod-inventory-17.1.0] Finished assignment for group at generation 3: {consumer-DI_SRS_MARC_HOLDING_RECORD_CREATED.mod-inventory-17.1.0-6-2f1aa5b8-930c-4703-a7ff-996a9b2c7480=Assignment(partitions=[]), consumer-DI_SRS_MARC_HOLDING_RECORD_CREATED.mod-inventory-17.1.0-5-27a924cb-9dbb-4ae4-a343-9251672fa296=Assignment(partitions=[metadata-spitfire.Default.diku.DI_SRS_MARC_HOLDING_RECORD_CREATED-0]), consumer-DI_SRS_MARC_HOLDING_RECORD_CREATED.mod-inventory-17.1.0-7-b35d6a23-e047-4ad4-baf6-66cf496541ad=Assignment(partitions=[])}
15 Oct 2021 14:07:42.903 [vert.x-kafka-consumer-thread-4] INFO bstractCoordinator$2 [25146275eqId] [Consumer clientId=consumer-DI_SRS_MARC_HOLDING_RECORD_CREATED.mod-inventory-17.1.0-5, groupId=DI_SRS_MARC_HOLDING_RECORD_CREATED.mod-inventory-17.1.0] Successfully joined group with generation 3
15 Oct 2021 14:07:42.903 [vert.x-kafka-consumer-thread-5] INFO bstractCoordinator$2 [25146275eqId] [Consumer clientId=consumer-DI_SRS_MARC_HOLDING_RECORD_CREATED.mod-inventory-17.1.0-6, groupId=DI_SRS_MARC_HOLDING_RECORD_CREATED.mod-inventory-17.1.0] Successfully joined group with generation 3
15 Oct 2021 14:07:42.903 [vert.x-kafka-consumer-thread-5] INFO ConsumerCoordinator [25146275eqId] [Consumer clientId=consumer-DI_SRS_MARC_HOLDING_RECORD_CREATED.mod-inventory-17.1.0-6, groupId=DI_SRS_MARC_HOLDING_RECORD_CREATED.mod-inventory-17.1.0] Adding newly assigned partitions:
15 Oct 2021 14:07:42.903 [vert.x-kafka-consumer-thread-4] INFO ConsumerCoordinator [25146275eqId] [Consumer clientId=consumer-DI_SRS_MARC_HOLDING_RECORD_CREATED.mod-inventory-17.1.0-5, groupId=DI_SRS_MARC_HOLDING_RECORD_CREATED.mod-inventory-17.1.0] Adding newly assigned partitions: metadata-spitfire.Default.diku.DI_SRS_MARC_HOLDING_RECORD_CREATED-0
15 Oct 2021 14:07:42.903 [vert.x-kafka-consumer-thread-6] INFO bstractCoordinator$2 [25146275eqId] [Consumer clientId=consumer-DI_SRS_MARC_HOLDING_RECORD_CREATED.mod-inventory-17.1.0-7, groupId=DI_SRS_MARC_HOLDING_RECORD_CREATED.mod-inventory-17.1.0] Successfully joined group with generation 3
15 Oct 2021 14:07:42.903 [vert.x-kafka-consumer-thread-6] INFO ConsumerCoordinator [25146275eqId] [Consumer clientId=consumer-DI_SRS_MARC_HOLDING_RECORD_CREATED.mod-inventory-17.1.0-7, groupId=DI_SRS_MARC_HOLDING_RECORD_CREATED.mod-inventory-17.1.0] Adding newly assigned partitions:
15 Oct 2021 14:07:42.904 [vert.x-kafka-consumer-thread-4] INFO FetchResponseHandler [25146276eqId] [Consumer clientId=consumer-DI_SRS_MARC_HOLDING_RECORD_CREATED.mod-inventory-17.1.0-5, groupId=DI_SRS_MARC_HOLDING_RECORD_CREATED.mod-inventory-17.1.0] Found no committed offset for partition metadata-spitfire.Default.diku.DI_SRS_MARC_HOLDING_RECORD_CREATED-0
15 Oct 2021 14:07:42.921 [vert.x-kafka-consumer-thread-4] INFO SubscriptionState [25146293eqId] [Consumer clientId=consumer-DI_SRS_MARC_HOLDING_RECORD_CREATED.mod-inventory-17.1.0-5, groupId=DI_SRS_MARC_HOLDING_RECORD_CREATED.mod-inventory-17.1.0] Resetting offset for partition metadata-spitfire.Default.diku.DI_SRS_MARC_HOLDING_RECORD_CREATED-0 to offset 0.
15 Oct 2021 14:07:42.928 [vert.x-worker-thread-11] INFO taImportKafkaHandler [25146300eqId] Data import event payload has been received with event type: DI_SRS_MARC_HOLDING_RECORD_CREATED correlationId: fa07a2db-f218-40f5-a5ac-ca05609440e4
15 Oct 2021 14:07:43.723 [vert.x-worker-thread-9] INFO ProfileSnapshotCache [25147095eqId] JobProfileSnapshot was loaded by id '45156b59-01a0-4c2e-acd8-4cc51d3274ba'
15 Oct 2021 14:07:44.124 [vert.x-worker-thread-18] INFO MappingMetadataCache [25147496eqId] MappingMetadata was loaded by jobExecutionId 'd49313dc-98f7-471f-b96d-309b614acc5f'
15 Oct 2021 14:07:44.536 [vert.x-worker-thread-18] INFO HoldingsEventHandler [25147908eqId] PAYLOAD: {"id":"14d80993-8a8d-4ee0-bba5-d813732eb007","holdingsTypeId":"e6da6c98-6dd0-41bc-8b4b-cfd4bbd9c3ae","formerIds":["445553"],"permanentLocationId":"53cf956f-c1df-410b-8bea-27f712cca7c0","electronicAccess":[],"callNumber":"BR140 .J86","notes":[],"holdingsStatements":[{"statement":"v.54-68 (2003-2017)"}],"holdingsStatementsForIndexes":[{"statement":"v.1/50 (1950/1999)"}],"holdingsStatementsForSupplements":[],"statisticalCodeIds":[],"sourceId":"036ee84a-6afd-4c3c-9ad3-4a12ab875f59"}
15 Oct 2021 14:07:44.537 [vert.x-worker-thread-18] INFO HoldingsEventHandler [25147909eqId] PAYLOAD 2: { "id" : "14d80993-8a8d-4ee0-bba5-d813732eb007", "holdingsTypeId" : "e6da6c98-6dd0-41bc-8b4b-cfd4bbd9c3ae", "formerIds" : [ "445553" ], "permanentLocationId" : "53cf956f-c1df-410b-8bea-27f712cca7c0", "electronicAccess" : [ ], "callNumber" : "BR140 .J86", "notes" : [ ], "holdingsStatements" : [ { "statement" : "v.54-68 (2003-2017)" } ], "holdingsStatementsForIndexes" : [ { "statement" : "v.1/50 (1950/1999)" } ], "holdingsStatementsForSupplements" : [ ], "statisticalCodeIds" : [ ], "sourceId" : "036ee84a-6afd-4c3c-9ad3-4a12ab875f59" }
15 Oct 2021 14:07:45.820 [vert.x-worker-thread-16] INFO HoldingsEventHandler [25149192eqId] Created Holding record
15 Oct 2021 14:07:45.925 [vert.x-worker-thread-16] INFO AbstractConfig [25149297eqId] ProducerConfig values: acks = -1 batch.size = 16384 bootstrap.servers = [kafka:9092] buffer.memory = 33554432 client.dns.lookup = default client.id = producer-20 compression.type = gzip connections.max.idle.ms = 540000 delivery.timeout.ms = 120000 enable.idempotence = true interceptor.classes = [] key.serializer = class org.apache.kafka.common.serialization.StringSerializer linger.ms = 0 max.block.ms = 60000 max.in.flight.requests.per.connection = 5 max.request.size = 1048576 metadata.max.age.ms = 300000 metadata.max.idle.ms = 300000 metric.reporters = [] metrics.num.samples = 2 metrics.recording.level = INFO metrics.sample.window.ms = 30000 partitioner.class = class org.apache.kafka.clients.producer.internals.DefaultPartitioner receive.buffer.bytes = 32768 reconnect.backoff.max.ms = 1000 reconnect.backoff.ms = 50 request.timeout.ms = 30000 retries = 2147483647 retry.backoff.ms = 100 sasl.client.callback.handler.class = null sasl.jaas.config = null sasl.kerberos.kinit.cmd = /usr/bin/kinit sasl.kerberos.min.time.before.relogin = 60000 sasl.kerberos.service.name = null sasl.kerberos.ticket.renew.jitter = 0.05 sasl.kerberos.ticket.renew.window.factor = 0.8 sasl.login.callback.handler.class = null sasl.login.class = null sasl.login.refresh.buffer.seconds = 300 sasl.login.refresh.min.period.seconds = 60 sasl.login.refresh.window.factor = 0.8 sasl.login.refresh.window.jitter = 0.05 sasl.mechanism = GSSAPI security.protocol = PLAINTEXT security.providers = null send.buffer.bytes = 131072 ssl.cipher.suites = null ssl.enabled.protocols = [TLSv1.2] ssl.endpoint.identification.algorithm = null ssl.key.password = null ssl.keymanager.algorithm = SunX509 ssl.keystore.location = null ssl.keystore.password = null ssl.keystore.type = JKS ssl.protocol = TLSv1.2 ssl.provider = null ssl.secure.random.implementation = null ssl.trustmanager.algorithm = PKIX ssl.truststore.location = null ssl.truststore.password = null ssl.truststore.type = JKS transaction.timeout.ms = 60000 transactional.id = null value.serializer = class org.apache.kafka.common.serialization.StringSerializer
15 Oct 2021 14:07:46.013 [vert.x-worker-thread-16] INFO KafkaProducer [25149385eqId] [Producer clientId=producer-20] Instantiated an idempotent producer.
15 Oct 2021 14:07:46.017 [vert.x-worker-thread-16] INFO KafkaProducer [25149389eqId] [Producer clientId=producer-20] Overriding the default retries config to the recommended value of 2147483647 since the idempotent producer is enabled.
15 Oct 2021 14:07:46.017 [vert.x-worker-thread-16] INFO KafkaProducer [25149389eqId] [Producer clientId=producer-20] Overriding the default acks to all since idempotence is enabled.
15 Oct 2021 14:07:46.019 [vert.x-worker-thread-16] INFO ppInfoParser$AppInfo [25149391eqId] Kafka version: 2.5.0
15 Oct 2021 14:07:46.019 [vert.x-worker-thread-16] INFO ppInfoParser$AppInfo [25149391eqId] Kafka commitId: 66563e712b0b9f84
15 Oct 2021 14:07:46.019 [vert.x-worker-thread-16] INFO ppInfoParser$AppInfo [25149391eqId] Kafka startTimeMs: 1634306866018
15 Oct 2021 14:07:46.020 [kafka-producer-network-thread | producer-20] INFO Metadata [25149392eqId] [Producer clientId=producer-20] Cluster ID: ndvmVewOTKSTf8qW5if9eg
15 Oct 2021 14:07:46.021 [kafka-producer-network-thread | producer-20] INFO TransactionManager [25149393eqId] [Producer clientId=producer-20] ProducerId set to 4054 with epoch 0
15 Oct 2021 14:07:46.035 [kafka-producer-network-thread | producer-20] WARN faultMetadataUpdater [25149407eqId] [Producer clientId=producer-20] Error while fetching metadata with correlation id 4 : {metadata-spitfire.Default.diku.DI_INVENTORY_HOLDINGS_CREATED_READY_FOR_POST_PROCESSING=LEADER_NOT_AVAILABLE}
15 Oct 2021 14:07:46.230 [vert.x-worker-thread-11] INFO KafkaEventPublisher [25149602eqId] Event with type: DI_INVENTORY_HOLDINGS_CREATED_READY_FOR_POST_PROCESSING and correlationId: fa07a2db-f218-40f5-a5ac-ca05609440e4 was sent to the topic metadata-spitfire.Default.diku.DI_INVENTORY_HOLDINGS_CREATED_READY_FOR_POST_PROCESSING
15 Oct 2021 14:07:46.232 [vert.x-worker-thread-11] INFO KafkaProducer [25149604eqId] [Producer clientId=producer-20] Closing the Kafka producer with timeoutMillis = 9223372036854775807 ms.
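Note: each short-lived producer above logs that retries and acks are overridden once enable.idempotence=true. A toy sketch of that override behaviour (illustrative only; the real logic lives in the Java client's KafkaProducer, and these overrides apply only when the user did not set the values explicitly):

```python
def apply_idempotence_overrides(config: dict) -> dict:
    """Mimic the 'Overriding the default retries/acks' log messages:
    with idempotence on, unset retries becomes Integer.MAX_VALUE and
    unset acks becomes 'all' (equivalent to the -1 seen in the dump)."""
    cfg = dict(config)
    if cfg.get("enable.idempotence"):
        cfg.setdefault("retries", 2147483647)  # Integer.MAX_VALUE
        cfg.setdefault("acks", "all")
    return cfg
```

This matches the dumped config, where retries = 2147483647 and acks = -1 even though the module presumably only asked for an idempotent producer.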
15 Oct 2021 14:07:46.614 [vert.x-worker-thread-10] INFO KafkaConsumerWrapper [25149986eqId] Consumer - id: 10 subscriptionPattern: SubscriptionDefinition(eventType=DI_SRS_MARC_HOLDING_RECORD_CREATED, subscriptionPattern=metadata-spitfire\.Default\.\w{1,}\.DI_SRS_MARC_HOLDING_RECORD_CREATED) Committed offset: 1
15 Oct 2021 14:08:13.269 [vert.x-kafka-consumer-thread-56] INFO ConsumerCoordinator [25176641eqId] [Consumer clientId=consumer-DI_SRS_MARC_HOLDINGS_HOLDING_HRID_SET.mod-inventory-17.1.0-56, groupId=DI_SRS_MARC_HOLDINGS_HOLDING_HRID_SET.mod-inventory-17.1.0] Revoke previously assigned partitions
15 Oct 2021 14:08:13.270 [vert.x-kafka-consumer-thread-56] INFO AbstractCoordinator [25176642eqId] [Consumer clientId=consumer-DI_SRS_MARC_HOLDINGS_HOLDING_HRID_SET.mod-inventory-17.1.0-56, groupId=DI_SRS_MARC_HOLDINGS_HOLDING_HRID_SET.mod-inventory-17.1.0] (Re-)joining group
15 Oct 2021 14:08:13.625 [vert.x-kafka-consumer-thread-51] INFO tbeatResponseHandler [25176997eqId] [Consumer clientId=consumer-DI_SRS_MARC_HOLDINGS_HOLDING_HRID_SET.mod-inventory-17.1.0-52, groupId=DI_SRS_MARC_HOLDINGS_HOLDING_HRID_SET.mod-inventory-17.1.0] Attempt to heartbeat failed since group is rebalancing
15 Oct 2021 14:08:13.627 [vert.x-kafka-consumer-thread-51] INFO ConsumerCoordinator [25176999eqId] [Consumer clientId=consumer-DI_SRS_MARC_HOLDINGS_HOLDING_HRID_SET.mod-inventory-17.1.0-52, groupId=DI_SRS_MARC_HOLDINGS_HOLDING_HRID_SET.mod-inventory-17.1.0] Revoke previously assigned partitions
15 Oct 2021 14:08:13.627 [vert.x-kafka-consumer-thread-51] INFO AbstractCoordinator [25176999eqId] [Consumer clientId=consumer-DI_SRS_MARC_HOLDINGS_HOLDING_HRID_SET.mod-inventory-17.1.0-52, groupId=DI_SRS_MARC_HOLDINGS_HOLDING_HRID_SET.mod-inventory-17.1.0] (Re-)joining group
15 Oct 2021 14:08:13.919 [vert.x-kafka-consumer-thread-48] INFO ConsumerCoordinator [25177291eqId] [Consumer clientId=consumer-DI_SRS_MARC_HOLDINGS_HOLDING_HRID_SET.mod-inventory-17.1.0-51, groupId=DI_SRS_MARC_HOLDINGS_HOLDING_HRID_SET.mod-inventory-17.1.0] Revoke previously assigned partitions
15 Oct 2021 14:08:13.920 [vert.x-kafka-consumer-thread-48] INFO AbstractCoordinator [25177292eqId] [Consumer clientId=consumer-DI_SRS_MARC_HOLDINGS_HOLDING_HRID_SET.mod-inventory-17.1.0-51, groupId=DI_SRS_MARC_HOLDINGS_HOLDING_HRID_SET.mod-inventory-17.1.0] (Re-)joining group
15 Oct 2021 14:08:13.921 [vert.x-kafka-consumer-thread-48] INFO ConsumerCoordinator [25177293eqId] [Consumer clientId=consumer-DI_SRS_MARC_HOLDINGS_HOLDING_HRID_SET.mod-inventory-17.1.0-51, groupId=DI_SRS_MARC_HOLDINGS_HOLDING_HRID_SET.mod-inventory-17.1.0] Finished assignment for group at generation 3: {consumer-DI_SRS_MARC_HOLDINGS_HOLDING_HRID_SET.mod-inventory-17.1.0-51-d297b598-8be5-46b9-a91e-82471e58a2ba=Assignment(partitions=[metadata-spitfire.Default.diku.DI_SRS_MARC_HOLDINGS_HOLDING_HRID_SET-0]), consumer-DI_SRS_MARC_HOLDINGS_HOLDING_HRID_SET.mod-inventory-17.1.0-56-62159350-65bf-4793-96dc-a64910988ca2=Assignment(partitions=[]), consumer-DI_SRS_MARC_HOLDINGS_HOLDING_HRID_SET.mod-inventory-17.1.0-52-2de60aa5-97f6-489f-ae4e-ff3c75dec223=Assignment(partitions=[])}
15 Oct 2021 14:08:13.923 [vert.x-kafka-consumer-thread-48] INFO bstractCoordinator$2 [25177295eqId] [Consumer clientId=consumer-DI_SRS_MARC_HOLDINGS_HOLDING_HRID_SET.mod-inventory-17.1.0-51, groupId=DI_SRS_MARC_HOLDINGS_HOLDING_HRID_SET.mod-inventory-17.1.0] Successfully joined group with generation 3
15 Oct 2021 14:08:13.923 [vert.x-kafka-consumer-thread-56] INFO bstractCoordinator$2 [25177295eqId] [Consumer clientId=consumer-DI_SRS_MARC_HOLDINGS_HOLDING_HRID_SET.mod-inventory-17.1.0-56, groupId=DI_SRS_MARC_HOLDINGS_HOLDING_HRID_SET.mod-inventory-17.1.0] Successfully joined group with generation 3
15 Oct 2021 14:08:13.923 [vert.x-kafka-consumer-thread-51] INFO bstractCoordinator$2 [25177295eqId] [Consumer clientId=consumer-DI_SRS_MARC_HOLDINGS_HOLDING_HRID_SET.mod-inventory-17.1.0-52, groupId=DI_SRS_MARC_HOLDINGS_HOLDING_HRID_SET.mod-inventory-17.1.0] Successfully joined group with generation 3
15 Oct 2021 14:08:13.923 [vert.x-kafka-consumer-thread-56] INFO ConsumerCoordinator [25177295eqId] [Consumer clientId=consumer-DI_SRS_MARC_HOLDINGS_HOLDING_HRID_SET.mod-inventory-17.1.0-56, groupId=DI_SRS_MARC_HOLDINGS_HOLDING_HRID_SET.mod-inventory-17.1.0] Adding newly assigned partitions:
15 Oct 2021 14:08:13.923 [vert.x-kafka-consumer-thread-51] INFO ConsumerCoordinator [25177295eqId] [Consumer clientId=consumer-DI_SRS_MARC_HOLDINGS_HOLDING_HRID_SET.mod-inventory-17.1.0-52, groupId=DI_SRS_MARC_HOLDINGS_HOLDING_HRID_SET.mod-inventory-17.1.0] Adding newly assigned partitions:
15 Oct 2021 14:08:13.923 [vert.x-kafka-consumer-thread-48] INFO ConsumerCoordinator [25177295eqId] [Consumer clientId=consumer-DI_SRS_MARC_HOLDINGS_HOLDING_HRID_SET.mod-inventory-17.1.0-51, groupId=DI_SRS_MARC_HOLDINGS_HOLDING_HRID_SET.mod-inventory-17.1.0] Adding newly assigned partitions: metadata-spitfire.Default.diku.DI_SRS_MARC_HOLDINGS_HOLDING_HRID_SET-0
15 Oct 2021 14:08:13.924 [vert.x-kafka-consumer-thread-48] INFO FetchResponseHandler [25177296eqId] [Consumer clientId=consumer-DI_SRS_MARC_HOLDINGS_HOLDING_HRID_SET.mod-inventory-17.1.0-51, groupId=DI_SRS_MARC_HOLDINGS_HOLDING_HRID_SET.mod-inventory-17.1.0] Found no committed offset for partition metadata-spitfire.Default.diku.DI_SRS_MARC_HOLDINGS_HOLDING_HRID_SET-0
15 Oct 2021 14:08:13.925 [vert.x-kafka-consumer-thread-48] INFO SubscriptionState [25177297eqId] [Consumer clientId=consumer-DI_SRS_MARC_HOLDINGS_HOLDING_HRID_SET.mod-inventory-17.1.0-51, groupId=DI_SRS_MARC_HOLDINGS_HOLDING_HRID_SET.mod-inventory-17.1.0] Resetting offset for partition metadata-spitfire.Default.diku.DI_SRS_MARC_HOLDINGS_HOLDING_HRID_SET-0 to offset 0.
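Note: the rebalances above end with a single partition assigned to one group member and Assignment(partitions=[]) for the other two. That is expected whenever a topic has fewer partitions than the group has members. A simplified sketch of the arithmetic (not the exact Kafka assignor code; any per-member spread behaves this way with one partition and three members):

```python
def spread_partitions(partitions, members):
    """Distribute partitions across sorted group members one at a time.
    With fewer partitions than members, trailing members get nothing --
    the empty Assignment(partitions=[]) entries seen in the log."""
    members = sorted(members)
    assignment = {m: [] for m in members}
    for i, p in enumerate(sorted(partitions)):
        assignment[members[i % len(members)]].append(p)
    return assignment
```

Consumers with an empty assignment simply idle until a later rebalance or a partition-count increase gives them work.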
15 Oct 2021 14:08:14.019 [vert.x-worker-thread-12] INFO dHridSetKafkaHandler [25177391eqId] Event payload has been received with event type: DI_SRS_MARC_HOLDINGS_HOLDING_HRID_SET and correlationId: fa07a2db-f218-40f5-a5ac-ca05609440e4
15 Oct 2021 14:08:14.046 [vert.x-worker-thread-6] INFO MappingMetadataCache [25177418eqId] MappingMetadata was loaded by jobExecutionId 'd49313dc-98f7-471f-b96d-309b614acc5f'
15 Oct 2021 14:08:15.025 [vert.x-worker-thread-13] INFO KafkaConsumerWrapper [25178397eqId] Consumer - id: 1 subscriptionPattern: SubscriptionDefinition(eventType=DI_SRS_MARC_HOLDINGS_HOLDING_HRID_SET, subscriptionPattern=metadata-spitfire\.Default\.\w{1,}\.DI_SRS_MARC_HOLDINGS_HOLDING_HRID_SET) Committed offset: 1
15 Oct 2021 14:08:53.920 [vert.x-worker-thread-6] INFO KafkaInternalCache [25217292eqId] Clearing cache from outdated events...
15 Oct 2021 14:14:32.581 [vert.x-worker-thread-18] INFO taImportKafkaHandler [25555953eqId] Data import event payload has been received with event type: DI_SRS_MARC_BIB_RECORD_CREATED correlationId: 968f28fe-a175-4347-88a1-a513a8fed03a
15 Oct 2021 14:14:32.615 [vert.x-worker-thread-8] INFO ProfileSnapshotCache [25555987eqId] JobProfileSnapshot was loaded by id 'e1bfd051-111c-43b0-b406-9de86eefcfe3'
15 Oct 2021 14:14:32.836 [vert.x-worker-thread-17] INFO MappingMetadataCache [25556208eqId] MappingMetadata was loaded by jobExecutionId '917dced0-74a5-4b48-b06c-a23fa19a4f76'
15 Oct 2021 14:14:33.331 [vert.x-worker-thread-14] INFO AbstractConfig [25556703eqId] ProducerConfig values: acks = -1 batch.size = 16384 bootstrap.servers = [kafka:9092] buffer.memory = 33554432 client.dns.lookup = default client.id = producer-21 compression.type = gzip connections.max.idle.ms = 540000 delivery.timeout.ms = 120000 enable.idempotence = true interceptor.classes = [] key.serializer = class org.apache.kafka.common.serialization.StringSerializer linger.ms = 0 max.block.ms = 60000 max.in.flight.requests.per.connection = 5 max.request.size = 1048576 metadata.max.age.ms = 300000 metadata.max.idle.ms = 300000 metric.reporters = [] metrics.num.samples = 2 metrics.recording.level = INFO metrics.sample.window.ms = 30000 partitioner.class = class org.apache.kafka.clients.producer.internals.DefaultPartitioner receive.buffer.bytes = 32768 reconnect.backoff.max.ms = 1000 reconnect.backoff.ms = 50 request.timeout.ms = 30000 retries = 2147483647 retry.backoff.ms = 100 sasl.client.callback.handler.class = null sasl.jaas.config = null sasl.kerberos.kinit.cmd = /usr/bin/kinit sasl.kerberos.min.time.before.relogin = 60000 sasl.kerberos.service.name = null sasl.kerberos.ticket.renew.jitter = 0.05 sasl.kerberos.ticket.renew.window.factor = 0.8 sasl.login.callback.handler.class = null sasl.login.class = null sasl.login.refresh.buffer.seconds = 300 sasl.login.refresh.min.period.seconds = 60 sasl.login.refresh.window.factor = 0.8 sasl.login.refresh.window.jitter = 0.05 sasl.mechanism = GSSAPI security.protocol = PLAINTEXT security.providers = null send.buffer.bytes = 131072 ssl.cipher.suites = null ssl.enabled.protocols = [TLSv1.2] ssl.endpoint.identification.algorithm = null ssl.key.password = null ssl.keymanager.algorithm = SunX509 ssl.keystore.location = null ssl.keystore.password = null ssl.keystore.type = JKS ssl.protocol = TLSv1.2 ssl.provider = null ssl.secure.random.implementation = null ssl.trustmanager.algorithm = PKIX ssl.truststore.location = null ssl.truststore.password = null ssl.truststore.type = JKS transaction.timeout.ms = 60000 transactional.id = null value.serializer = class org.apache.kafka.common.serialization.StringSerializer
15 Oct 2021 14:14:33.413 [vert.x-worker-thread-14] INFO KafkaProducer [25556785eqId] [Producer clientId=producer-21] Instantiated an idempotent producer.
15 Oct 2021 14:14:33.417 [vert.x-worker-thread-14] INFO KafkaProducer [25556789eqId] [Producer clientId=producer-21] Overriding the default retries config to the recommended value of 2147483647 since the idempotent producer is enabled.
15 Oct 2021 14:14:33.418 [vert.x-worker-thread-14] INFO KafkaProducer [25556790eqId] [Producer clientId=producer-21] Overriding the default acks to all since idempotence is enabled.
15 Oct 2021 14:14:33.418 [vert.x-worker-thread-14] INFO ppInfoParser$AppInfo [25556790eqId] Kafka version: 2.5.0
15 Oct 2021 14:14:33.419 [vert.x-worker-thread-14] INFO ppInfoParser$AppInfo [25556791eqId] Kafka commitId: 66563e712b0b9f84
15 Oct 2021 14:14:33.421 [vert.x-worker-thread-14] INFO ppInfoParser$AppInfo [25556793eqId] Kafka startTimeMs: 1634307273418
15 Oct 2021 14:14:33.515 [kafka-producer-network-thread | producer-21] INFO Metadata [25556887eqId] [Producer clientId=producer-21] Cluster ID: ndvmVewOTKSTf8qW5if9eg
15 Oct 2021 14:14:33.515 [kafka-producer-network-thread | producer-21] INFO TransactionManager [25556887eqId] [Producer clientId=producer-21] ProducerId set to 4063 with epoch 0
15 Oct 2021 14:14:33.525 [vert.x-worker-thread-18] INFO KafkaEventPublisher [25556897eqId] Event with type: DI_INVENTORY_INSTANCE_CREATED_READY_FOR_POST_PROCESSING and correlationId: 968f28fe-a175-4347-88a1-a513a8fed03a was sent to the topic metadata-spitfire.Default.diku.DI_INVENTORY_INSTANCE_CREATED_READY_FOR_POST_PROCESSING
15 Oct 2021 14:14:33.526 [vert.x-worker-thread-18] INFO KafkaProducer [25556898eqId] [Producer clientId=producer-21] Closing the Kafka producer with timeoutMillis = 9223372036854775807 ms.
15 Oct 2021 14:14:33.716 [vert.x-worker-thread-10] INFO KafkaConsumerWrapper [25557088eqId] Consumer - id: 7 subscriptionPattern: SubscriptionDefinition(eventType=DI_SRS_MARC_BIB_RECORD_CREATED, subscriptionPattern=metadata-spitfire\.Default\.\w{1,}\.DI_SRS_MARC_BIB_RECORD_CREATED) Committed offset: 10
15 Oct 2021 14:14:34.866 [vert.x-worker-thread-9] INFO eHridSetKafkaHandler [25558238eqId] Event payload has been received with event type: DI_SRS_MARC_BIB_INSTANCE_HRID_SET and correlationId: 968f28fe-a175-4347-88a1-a513a8fed03a
15 Oct 2021 14:14:35.077 [vert.x-worker-thread-3] INFO MappingMetadataCache [25558449eqId] MappingMetadata was loaded by jobExecutionId '917dced0-74a5-4b48-b06c-a23fa19a4f76'
15 Oct 2021 14:14:35.880 [vert.x-worker-thread-4] INFO KafkaConsumerWrapper [25559252eqId] Consumer - id: 4 subscriptionPattern: SubscriptionDefinition(eventType=DI_SRS_MARC_BIB_INSTANCE_HRID_SET, subscriptionPattern=metadata-spitfire\.Default\.\w{1,}\.DI_SRS_MARC_BIB_INSTANCE_HRID_SET) Committed offset: 9
15 Oct 2021 14:15:15.985 [vert.x-worker-thread-12] INFO ebRequestDiagnostics [25599357eqId] Handling GET /inventory/instances/e360056f-5804-41d3-9892-4aa99ff51029
15 Oct 2021 14:17:10.979 [vert.x-worker-thread-19] INFO taImportKafkaHandler [25714351eqId] Data import event payload has been received with event type: DI_SRS_MARC_HOLDING_RECORD_CREATED correlationId: a08d8df5-df7d-4ffc-994f-48f9365a6814
15 Oct 2021 14:17:10.993 [vert.x-worker-thread-3] INFO ProfileSnapshotCache [25714365eqId] JobProfileSnapshot was loaded by id '80787eb7-807b-4062-8f7a-5f9de915bb2a'
15 Oct 2021 14:17:11.082 [vert.x-worker-thread-0] INFO MappingMetadataCache [25714454eqId] MappingMetadata was loaded by jobExecutionId '0b1ecf51-420e-4e75-af41-f4003b115f6a'
15 Oct 2021 14:17:11.089 [vert.x-worker-thread-0] INFO HoldingsEventHandler [25714461eqId] PAYLOAD:
{"id":"e0ca9e7d-79d1-416b-b3b3-3845936c359d","holdingsTypeId":"e6da6c98-6dd0-41bc-8b4b-cfd4bbd9c3ae","formerIds":["445553"],"permanentLocationId":"53cf956f-c1df-410b-8bea-27f712cca7c0","electronicAccess":[],"callNumber":"BR140 .J86","notes":[],"holdingsStatements":[{"statement":"v.54-68 (2003-2017)"}],"holdingsStatementsForIndexes":[{"statement":"v.1/50 (1950/1999)"}],"holdingsStatementsForSupplements":[],"statisticalCodeIds":[],"sourceId":"036ee84a-6afd-4c3c-9ad3-4a12ab875f59"}
15 Oct 2021 14:17:11.089 [vert.x-worker-thread-0] INFO HoldingsEventHandler [25714461eqId] PAYLOAD 2:
{
  "id" : "e0ca9e7d-79d1-416b-b3b3-3845936c359d",
  "holdingsTypeId" : "e6da6c98-6dd0-41bc-8b4b-cfd4bbd9c3ae",
  "formerIds" : [ "445553" ],
  "permanentLocationId" : "53cf956f-c1df-410b-8bea-27f712cca7c0",
  "electronicAccess" : [ ],
  "callNumber" : "BR140 .J86",
  "notes" : [ ],
  "holdingsStatements" : [ {
    "statement" : "v.54-68 (2003-2017)"
  } ],
  "holdingsStatementsForIndexes" : [ {
    "statement" : "v.1/50 (1950/1999)"
  } ],
  "holdingsStatementsForSupplements" : [ ],
  "statisticalCodeIds" : [ ],
  "sourceId" : "036ee84a-6afd-4c3c-9ad3-4a12ab875f59"
}
15 Oct 2021 14:17:11.350 [vert.x-worker-thread-17] INFO HoldingsEventHandler [25714722eqId] Created Holding record
15 Oct 2021 14:17:11.352 [vert.x-worker-thread-17] INFO AbstractConfig [25714724eqId] ProducerConfig values:
    acks = -1
    batch.size = 16384
    bootstrap.servers = [kafka:9092]
    buffer.memory = 33554432
    client.dns.lookup = default
    client.id = producer-22
    compression.type = gzip
    connections.max.idle.ms = 540000
    delivery.timeout.ms = 120000
    enable.idempotence = true
    interceptor.classes = []
    key.serializer = class org.apache.kafka.common.serialization.StringSerializer
    linger.ms = 0
    max.block.ms = 60000
    max.in.flight.requests.per.connection = 5
    max.request.size = 1048576
    metadata.max.age.ms = 300000
    metadata.max.idle.ms = 300000
    metric.reporters = []
    metrics.num.samples = 2
    metrics.recording.level = INFO
    metrics.sample.window.ms = 30000
    partitioner.class = class org.apache.kafka.clients.producer.internals.DefaultPartitioner
    receive.buffer.bytes = 32768
    reconnect.backoff.max.ms = 1000
    reconnect.backoff.ms = 50
    request.timeout.ms = 30000
    retries = 2147483647
    retry.backoff.ms = 100
    sasl.client.callback.handler.class = null
    sasl.jaas.config = null
    sasl.kerberos.kinit.cmd = /usr/bin/kinit
    sasl.kerberos.min.time.before.relogin = 60000
    sasl.kerberos.service.name = null
    sasl.kerberos.ticket.renew.jitter = 0.05
    sasl.kerberos.ticket.renew.window.factor = 0.8
    sasl.login.callback.handler.class = null
    sasl.login.class = null
    sasl.login.refresh.buffer.seconds = 300
    sasl.login.refresh.min.period.seconds = 60
    sasl.login.refresh.window.factor = 0.8
    sasl.login.refresh.window.jitter = 0.05
    sasl.mechanism = GSSAPI
    security.protocol = PLAINTEXT
    security.providers = null
    send.buffer.bytes = 131072
    ssl.cipher.suites = null
    ssl.enabled.protocols = [TLSv1.2]
    ssl.endpoint.identification.algorithm = null
    ssl.key.password = null
    ssl.keymanager.algorithm = SunX509
    ssl.keystore.location = null
    ssl.keystore.password = null
    ssl.keystore.type = JKS
    ssl.protocol = TLSv1.2
    ssl.provider = null
    ssl.secure.random.implementation = null
    ssl.trustmanager.algorithm = PKIX
    ssl.truststore.location = null
    ssl.truststore.password = null
    ssl.truststore.type = JKS
    transaction.timeout.ms = 60000
    transactional.id = null
    value.serializer = class org.apache.kafka.common.serialization.StringSerializer
15 Oct 2021 14:17:11.353 [vert.x-worker-thread-17] INFO KafkaProducer [25714725eqId] [Producer clientId=producer-22] Instantiated an idempotent producer.
15 Oct 2021 14:17:11.355 [vert.x-worker-thread-17] INFO KafkaProducer [25714727eqId] [Producer clientId=producer-22] Overriding the default retries config to the recommended value of 2147483647 since the idempotent producer is enabled.
15 Oct 2021 14:17:11.355 [vert.x-worker-thread-17] INFO KafkaProducer [25714727eqId] [Producer clientId=producer-22] Overriding the default acks to all since idempotence is enabled.
15 Oct 2021 14:17:11.356 [vert.x-worker-thread-17] INFO ppInfoParser$AppInfo [25714728eqId] Kafka version: 2.5.0
15 Oct 2021 14:17:11.356 [vert.x-worker-thread-17] INFO ppInfoParser$AppInfo [25714728eqId] Kafka commitId: 66563e712b0b9f84
15 Oct 2021 14:17:11.356 [vert.x-worker-thread-17] INFO ppInfoParser$AppInfo [25714728eqId] Kafka startTimeMs: 1634307431355
15 Oct 2021 14:17:11.357 [kafka-producer-network-thread | producer-22] INFO Metadata [25714729eqId] [Producer clientId=producer-22] Cluster ID: ndvmVewOTKSTf8qW5if9eg
15 Oct 2021 14:17:11.358 [kafka-producer-network-thread | producer-22] INFO TransactionManager [25714730eqId] [Producer clientId=producer-22] ProducerId set to 4072 with epoch 0
15 Oct 2021 14:17:11.427 [vert.x-worker-thread-10] INFO KafkaEventPublisher [25714799eqId] Event with type: DI_INVENTORY_HOLDINGS_CREATED_READY_FOR_POST_PROCESSING and correlationId: a08d8df5-df7d-4ffc-994f-48f9365a6814 was sent to the topic metadata-spitfire.Default.diku.DI_INVENTORY_HOLDINGS_CREATED_READY_FOR_POST_PROCESSING
15 Oct 2021 14:17:11.428 [vert.x-worker-thread-10] INFO KafkaProducer [25714800eqId] [Producer clientId=producer-22] Closing the Kafka producer with timeoutMillis = 9223372036854775807 ms.
15 Oct 2021 14:17:11.982 [vert.x-worker-thread-4] INFO KafkaConsumerWrapper [25715354eqId] Consumer - id: 10 subscriptionPattern: SubscriptionDefinition(eventType=DI_SRS_MARC_HOLDING_RECORD_CREATED, subscriptionPattern=metadata-spitfire\.Default\.\w{1,}\.DI_SRS_MARC_HOLDING_RECORD_CREATED) Committed offset: 2
15 Oct 2021 14:17:12.758 [vert.x-worker-thread-2] INFO dHridSetKafkaHandler [25716130eqId] Event payload has been received with event type: DI_SRS_MARC_HOLDINGS_HOLDING_HRID_SET and correlationId: a08d8df5-df7d-4ffc-994f-48f9365a6814
15 Oct 2021 14:17:12.772 [vert.x-worker-thread-9] INFO MappingMetadataCache [25716144eqId] MappingMetadata was loaded by jobExecutionId '0b1ecf51-420e-4e75-af41-f4003b115f6a'
15 Oct 2021 14:17:13.760 [vert.x-worker-thread-13] INFO KafkaConsumerWrapper [25717132eqId] Consumer - id: 1 subscriptionPattern: SubscriptionDefinition(eventType=DI_SRS_MARC_HOLDINGS_HOLDING_HRID_SET, subscriptionPattern=metadata-spitfire\.Default\.\w{1,}\.DI_SRS_MARC_HOLDINGS_HOLDING_HRID_SET) Committed offset: 2
15 Oct 2021 15:08:53.919 [vert.x-worker-thread-5] INFO KafkaInternalCache [28817291eqId] Clearing cache from outdated events...
15 Oct 2021 16:08:53.919 [vert.x-worker-thread-4] INFO KafkaInternalCache [32417291eqId] Clearing cache from outdated events...
15 Oct 2021 17:08:53.919 [vert.x-worker-thread-14] INFO KafkaInternalCache [36017291eqId] Clearing cache from outdated events...
18 Oct 2021 16:25:54.683 [vert.x-worker-thread-13] INFO taImportKafkaHandler [292638055eqId] Data import event payload has been received with event type: DI_SRS_MARC_BIB_RECORD_CREATED correlationId: d6b764a1-76c1-4011-96d9-09ff23ac571a
18 Oct 2021 16:25:54.717 [vert.x-worker-thread-3] INFO ProfileSnapshotCache [292638089eqId] JobProfileSnapshot was loaded by id '67118dc0-24d6-4222-b5a1-bd5617cc8d1d'
18 Oct 2021 16:25:54.782 [vert.x-worker-thread-8] INFO MappingMetadataCache [292638154eqId] MappingMetadata was loaded by jobExecutionId '810ea89c-9177-40e9-b6b1-54701b033264'
18 Oct 2021 16:25:54.870 [vert.x-worker-thread-3] INFO AbstractConfig [292638242eqId] ProducerConfig values:
    acks = -1
    batch.size = 16384
    bootstrap.servers = [kafka:9092]
    buffer.memory = 33554432
    client.dns.lookup = default
    client.id = producer-23
    compression.type = gzip
    connections.max.idle.ms = 540000
    delivery.timeout.ms = 120000
    enable.idempotence = true
    interceptor.classes = []
    key.serializer = class org.apache.kafka.common.serialization.StringSerializer
    linger.ms = 0
    max.block.ms = 60000
    max.in.flight.requests.per.connection = 5
    max.request.size = 1048576
    metadata.max.age.ms = 300000
    metadata.max.idle.ms = 300000
    metric.reporters = []
    metrics.num.samples = 2
    metrics.recording.level = INFO
    metrics.sample.window.ms = 30000
    partitioner.class = class org.apache.kafka.clients.producer.internals.DefaultPartitioner
    receive.buffer.bytes = 32768
    reconnect.backoff.max.ms = 1000
    reconnect.backoff.ms = 50
    request.timeout.ms = 30000
    retries = 2147483647
    retry.backoff.ms = 100
    sasl.client.callback.handler.class = null
    sasl.jaas.config = null
    sasl.kerberos.kinit.cmd = /usr/bin/kinit
    sasl.kerberos.min.time.before.relogin = 60000
    sasl.kerberos.service.name = null
    sasl.kerberos.ticket.renew.jitter = 0.05
    sasl.kerberos.ticket.renew.window.factor = 0.8
    sasl.login.callback.handler.class = null
    sasl.login.class = null
    sasl.login.refresh.buffer.seconds = 300
    sasl.login.refresh.min.period.seconds = 60
    sasl.login.refresh.window.factor = 0.8
    sasl.login.refresh.window.jitter = 0.05
    sasl.mechanism = GSSAPI
    security.protocol = PLAINTEXT
    security.providers = null
    send.buffer.bytes = 131072
    ssl.cipher.suites = null
    ssl.enabled.protocols = [TLSv1.2]
    ssl.endpoint.identification.algorithm = null
    ssl.key.password = null
    ssl.keymanager.algorithm = SunX509
    ssl.keystore.location = null
    ssl.keystore.password = null
    ssl.keystore.type = JKS
    ssl.protocol = TLSv1.2
    ssl.provider = null
    ssl.secure.random.implementation = null
    ssl.trustmanager.algorithm = PKIX
    ssl.truststore.location = null
    ssl.truststore.password = null
    ssl.truststore.type = JKS
    transaction.timeout.ms = 60000
    transactional.id = null
    value.serializer = class org.apache.kafka.common.serialization.StringSerializer
18 Oct 2021 16:25:54.870 [vert.x-worker-thread-3] INFO KafkaProducer [292638242eqId] [Producer clientId=producer-23] Instantiated an idempotent producer.
18 Oct 2021 16:25:54.915 [vert.x-worker-thread-3] INFO KafkaProducer [292638287eqId] [Producer clientId=producer-23] Overriding the default retries config to the recommended value of 2147483647 since the idempotent producer is enabled.
18 Oct 2021 16:25:54.915 [vert.x-worker-thread-3] INFO KafkaProducer [292638287eqId] [Producer clientId=producer-23] Overriding the default acks to all since idempotence is enabled.
18 Oct 2021 16:25:54.916 [vert.x-worker-thread-3] INFO ppInfoParser$AppInfo [292638288eqId] Kafka version: 2.5.0
18 Oct 2021 16:25:54.916 [vert.x-worker-thread-3] INFO ppInfoParser$AppInfo [292638288eqId] Kafka commitId: 66563e712b0b9f84
18 Oct 2021 16:25:54.916 [vert.x-worker-thread-3] INFO ppInfoParser$AppInfo [292638288eqId] Kafka startTimeMs: 1634574354916
18 Oct 2021 16:25:54.926 [kafka-producer-network-thread | producer-23] INFO Metadata [292638298eqId] [Producer clientId=producer-23] Cluster ID: ndvmVewOTKSTf8qW5if9eg
18 Oct 2021 16:25:54.926 [kafka-producer-network-thread | producer-23] INFO TransactionManager [292638298eqId] [Producer clientId=producer-23] ProducerId set to 4083 with epoch 0
18 Oct 2021 16:25:55.031 [vert.x-worker-thread-19] INFO KafkaEventPublisher [292638403eqId] Event with type: DI_INVENTORY_INSTANCE_CREATED_READY_FOR_POST_PROCESSING and correlationId: d6b764a1-76c1-4011-96d9-09ff23ac571a was sent to the topic metadata-spitfire.Default.diku.DI_INVENTORY_INSTANCE_CREATED_READY_FOR_POST_PROCESSING
18 Oct 2021 16:25:55.031 [vert.x-worker-thread-19] INFO KafkaProducer [292638403eqId] [Producer clientId=producer-23] Closing the Kafka producer with timeoutMillis = 9223372036854775807 ms.
18 Oct 2021 16:25:55.685 [vert.x-worker-thread-4] INFO KafkaConsumerWrapper [292639057eqId] Consumer - id: 7 subscriptionPattern: SubscriptionDefinition(eventType=DI_SRS_MARC_BIB_RECORD_CREATED, subscriptionPattern=metadata-spitfire\.Default\.\w{1,}\.DI_SRS_MARC_BIB_RECORD_CREATED) Committed offset: 11
18 Oct 2021 16:25:56.058 [vert.x-worker-thread-12] INFO eHridSetKafkaHandler [292639430eqId] Event payload has been received with event type: DI_SRS_MARC_BIB_INSTANCE_HRID_SET and correlationId: d6b764a1-76c1-4011-96d9-09ff23ac571a
18 Oct 2021 16:25:56.175 [vert.x-worker-thread-7] INFO MappingMetadataCache [292639547eqId] MappingMetadata was loaded by jobExecutionId '810ea89c-9177-40e9-b6b1-54701b033264'
18 Oct 2021 16:25:57.060 [vert.x-worker-thread-12] INFO KafkaConsumerWrapper [292640432eqId] Consumer - id: 4 subscriptionPattern: SubscriptionDefinition(eventType=DI_SRS_MARC_BIB_INSTANCE_HRID_SET, subscriptionPattern=metadata-spitfire\.Default\.\w{1,}\.DI_SRS_MARC_BIB_INSTANCE_HRID_SET) Committed offset: 10
18 Oct 2021 16:26:11.277 [vert.x-worker-thread-2] INFO ebRequestDiagnostics [292654649eqId] Handling GET /inventory/instances/594fa7a6-7c01-4640-9b63-a5eda2d781c5
18 Oct 2021 16:35:06.186 [vert.x-worker-thread-9] INFO taImportKafkaHandler [293189558eqId] Data import event payload has been received with event type: DI_SRS_MARC_HOLDING_RECORD_CREATED correlationId: 092189d6-856c-453a-a7e8-6573f12a0fc4
18 Oct 2021 16:35:06.226 [vert.x-worker-thread-16] INFO ProfileSnapshotCache [293189598eqId] JobProfileSnapshot was loaded by id 'bfafdedb-7438-45f6-82bc-1558adb676dd'
18 Oct 2021 16:35:06.378 [vert.x-worker-thread-8] INFO MappingMetadataCache [293189750eqId] MappingMetadata was loaded by jobExecutionId '8303f7ea-c811-4247-886f-5f48188eb52d'
18 Oct 2021 16:35:06.386 [vert.x-worker-thread-8] INFO HoldingsEventHandler [293189758eqId] PAYLOAD:
{"id":"5c0fb6b8-2245-4075-b424-c38e58967e31","holdingsTypeId":"e6da6c98-6dd0-41bc-8b4b-cfd4bbd9c3ae","formerIds":["445541"],"permanentLocationId":"53cf956f-c1df-410b-8bea-27f712cca7c0","electronicAccess":[],"callNumber":"BR140 .J86","notes":[],"holdingsStatements":[{"statement":"v.54-68 (2003-2017)"}],"holdingsStatementsForIndexes":[{"statement":"v.1/50 (1950/1999)"}],"holdingsStatementsForSupplements":[],"statisticalCodeIds":[],"sourceId":"036ee84a-6afd-4c3c-9ad3-4a12ab875f59"}
18 Oct 2021 16:35:06.386 [vert.x-worker-thread-8] INFO HoldingsEventHandler [293189758eqId] PAYLOAD 2:
{
  "id" : "5c0fb6b8-2245-4075-b424-c38e58967e31",
  "holdingsTypeId" : "e6da6c98-6dd0-41bc-8b4b-cfd4bbd9c3ae",
  "formerIds" : [ "445541" ],
  "permanentLocationId" : "53cf956f-c1df-410b-8bea-27f712cca7c0",
  "electronicAccess" : [ ],
  "callNumber" : "BR140 .J86",
  "notes" : [ ],
  "holdingsStatements" : [ {
    "statement" : "v.54-68 (2003-2017)"
  } ],
  "holdingsStatementsForIndexes" : [ {
    "statement" : "v.1/50 (1950/1999)"
  } ],
  "holdingsStatementsForSupplements" : [ ],
  "statisticalCodeIds" : [ ],
  "sourceId" : "036ee84a-6afd-4c3c-9ad3-4a12ab875f59"
}
18 Oct 2021 16:35:06.580 [vert.x-worker-thread-16] INFO HoldingsEventHandler [293189952eqId] Created Holding record
18 Oct 2021 16:35:06.581 [vert.x-worker-thread-16] INFO AbstractConfig [293189953eqId] ProducerConfig values:
    acks = -1
    batch.size = 16384
    bootstrap.servers = [kafka:9092]
    buffer.memory = 33554432
    client.dns.lookup = default
    client.id = producer-24
    compression.type = gzip
    connections.max.idle.ms = 540000
    delivery.timeout.ms = 120000
    enable.idempotence = true
    interceptor.classes = []
    key.serializer = class org.apache.kafka.common.serialization.StringSerializer
    linger.ms = 0
    max.block.ms = 60000
    max.in.flight.requests.per.connection = 5
    max.request.size = 1048576
    metadata.max.age.ms = 300000
    metadata.max.idle.ms = 300000
    metric.reporters = []
    metrics.num.samples = 2
    metrics.recording.level = INFO
    metrics.sample.window.ms = 30000
    partitioner.class = class org.apache.kafka.clients.producer.internals.DefaultPartitioner
    receive.buffer.bytes = 32768
    reconnect.backoff.max.ms = 1000
    reconnect.backoff.ms = 50
    request.timeout.ms = 30000
    retries = 2147483647
    retry.backoff.ms = 100
    sasl.client.callback.handler.class = null
    sasl.jaas.config = null
    sasl.kerberos.kinit.cmd = /usr/bin/kinit
    sasl.kerberos.min.time.before.relogin = 60000
    sasl.kerberos.service.name = null
    sasl.kerberos.ticket.renew.jitter = 0.05
    sasl.kerberos.ticket.renew.window.factor = 0.8
    sasl.login.callback.handler.class = null
    sasl.login.class = null
    sasl.login.refresh.buffer.seconds = 300
    sasl.login.refresh.min.period.seconds = 60
    sasl.login.refresh.window.factor = 0.8
    sasl.login.refresh.window.jitter = 0.05
    sasl.mechanism = GSSAPI
    security.protocol = PLAINTEXT
    security.providers = null
    send.buffer.bytes = 131072
    ssl.cipher.suites = null
    ssl.enabled.protocols = [TLSv1.2]
    ssl.endpoint.identification.algorithm = null
    ssl.key.password = null
    ssl.keymanager.algorithm = SunX509
    ssl.keystore.location = null
    ssl.keystore.password = null
    ssl.keystore.type = JKS
    ssl.protocol = TLSv1.2
    ssl.provider = null
    ssl.secure.random.implementation = null
    ssl.trustmanager.algorithm = PKIX
    ssl.truststore.location = null
    ssl.truststore.password = null
    ssl.truststore.type = JKS
    transaction.timeout.ms = 60000
    transactional.id = null
    value.serializer = class org.apache.kafka.common.serialization.StringSerializer
18 Oct 2021 16:35:06.582 [vert.x-worker-thread-16] INFO KafkaProducer [293189954eqId] [Producer clientId=producer-24] Instantiated an idempotent producer.
18 Oct 2021 16:35:06.584 [vert.x-worker-thread-16] INFO KafkaProducer [293189956eqId] [Producer clientId=producer-24] Overriding the default retries config to the recommended value of 2147483647 since the idempotent producer is enabled.
18 Oct 2021 16:35:06.584 [vert.x-worker-thread-16] INFO KafkaProducer [293189956eqId] [Producer clientId=producer-24] Overriding the default acks to all since idempotence is enabled.
18 Oct 2021 16:35:06.585 [vert.x-worker-thread-16] INFO ppInfoParser$AppInfo [293189957eqId] Kafka version: 2.5.0
18 Oct 2021 16:35:06.585 [vert.x-worker-thread-16] INFO ppInfoParser$AppInfo [293189957eqId] Kafka commitId: 66563e712b0b9f84
18 Oct 2021 16:35:06.585 [vert.x-worker-thread-16] INFO ppInfoParser$AppInfo [293189957eqId] Kafka startTimeMs: 1634574906585
18 Oct 2021 16:35:06.587 [kafka-producer-network-thread | producer-24] INFO Metadata [293189959eqId] [Producer clientId=producer-24] Cluster ID: ndvmVewOTKSTf8qW5if9eg
18 Oct 2021 16:35:06.587 [kafka-producer-network-thread | producer-24] INFO TransactionManager [293189959eqId] [Producer clientId=producer-24] ProducerId set to 4092 with epoch 0
18 Oct 2021 16:35:06.618 [vert.x-worker-thread-6] INFO KafkaEventPublisher [293189990eqId] Event with type: DI_INVENTORY_HOLDINGS_CREATED_READY_FOR_POST_PROCESSING and correlationId: 092189d6-856c-453a-a7e8-6573f12a0fc4 was sent to the topic metadata-spitfire.Default.diku.DI_INVENTORY_HOLDINGS_CREATED_READY_FOR_POST_PROCESSING
18 Oct 2021 16:35:06.621 [vert.x-worker-thread-6] INFO KafkaProducer [293189993eqId] [Producer clientId=producer-24] Closing the Kafka producer with timeoutMillis = 9223372036854775807 ms.
18 Oct 2021 16:35:07.192 [vert.x-worker-thread-16] INFO KafkaConsumerWrapper [293190564eqId] Consumer - id: 10 subscriptionPattern: SubscriptionDefinition(eventType=DI_SRS_MARC_HOLDING_RECORD_CREATED, subscriptionPattern=metadata-spitfire\.Default\.\w{1,}\.DI_SRS_MARC_HOLDING_RECORD_CREATED) Committed offset: 3
18 Oct 2021 16:35:07.462 [vert.x-worker-thread-19] INFO dHridSetKafkaHandler [293190834eqId] Event payload has been received with event type: DI_SRS_MARC_HOLDINGS_HOLDING_HRID_SET and correlationId: 092189d6-856c-453a-a7e8-6573f12a0fc4
18 Oct 2021 16:35:07.477 [vert.x-worker-thread-12] INFO MappingMetadataCache [293190849eqId] MappingMetadata was loaded by jobExecutionId '8303f7ea-c811-4247-886f-5f48188eb52d'
18 Oct 2021 16:35:08.465 [vert.x-worker-thread-0] INFO KafkaConsumerWrapper [293191837eqId] Consumer - id: 1 subscriptionPattern: SubscriptionDefinition(eventType=DI_SRS_MARC_HOLDINGS_HOLDING_HRID_SET, subscriptionPattern=metadata-spitfire\.Default\.\w{1,}\.DI_SRS_MARC_HOLDINGS_HOLDING_HRID_SET) Committed offset: 3