Project: data-import-processing-core
Issue: MODDICORE-178

Use Spring Kafka template to replace Vertx Kafka producers


Details

    • Folijet
    • Nolana R3 2022

    Description

      Currently, data-import-processing-core creates a Kafka producer for each invocation and closes it each time.
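The per-invocation pattern described above can be sketched roughly as follows. This is a hypothetical illustration, not the module's actual code: the class name, topic, and broker address are made up, and the Vert.x 4 Kafka client API is assumed.

```java
import java.util.HashMap;
import java.util.Map;

import io.vertx.core.Vertx;
import io.vertx.kafka.client.producer.KafkaProducer;
import io.vertx.kafka.client.producer.KafkaProducerRecord;

public class PerInvocationSend {

  // Hypothetical sketch of the pattern the ticket wants to replace:
  // a new Vert.x KafkaProducer is created, used once, and closed
  // for every single event that is sent.
  static void sendEvent(Vertx vertx, String topic, String key, String payload) {
    Map<String, String> config = new HashMap<>();
    config.put("bootstrap.servers", "localhost:9092");
    config.put("key.serializer",
        "org.apache.kafka.common.serialization.StringSerializer");
    config.put("value.serializer",
        "org.apache.kafka.common.serialization.StringSerializer");

    // Creating a producer per call pays TCP connection setup/teardown
    // and cluster metadata fetch on every event, which is the overhead
    // this ticket aims to eliminate.
    KafkaProducer<String, String> producer = KafkaProducer.create(vertx, config);
    producer.send(KafkaProducerRecord.create(topic, key, payload))
        .onComplete(ar -> producer.close());
  }
}
```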

      Previously we ran a spike, MODDATAIMP-499 (SPIKE: Use active Kafka producer from the pool for sending messages). Based on its results, it was decided to use Spring Kafka Template for sending messages to topics, replacing the Vertx Kafka producers rather than implementing a custom pool of producers.

      This approach abstracts us from implementation details and is easy to use. By default it is tuned with appropriate settings, and at the same time it is configurable, so we can tune it for our needs. It will simplify developers' work, letting them concentrate on business requirements instead of maintaining their own pool of connections. Also, our plan is to migrate our modules to Spring one by one, and using Spring Kafka can be a good starting point; moreover, our modules already use the Spring context for dependency injection, which will simplify introducing Spring Kafka.
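A minimal sketch of the proposed replacement is shown below. It wires a Spring Kafka `KafkaTemplate` over a `DefaultKafkaProducerFactory`, which keeps a long-lived producer under the hood and reuses it across sends. The class name, topic, broker address, and payload are illustrative assumptions, not taken from the module.

```java
import java.util.HashMap;
import java.util.Map;

import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.common.serialization.StringSerializer;
import org.springframework.kafka.core.DefaultKafkaProducerFactory;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.kafka.core.ProducerFactory;

public class KafkaTemplateSketch {

  // Build the producer factory once (e.g. as a Spring bean); KafkaTemplate
  // reuses the underlying producer across send() calls instead of
  // creating and closing one per invocation.
  static ProducerFactory<String, String> producerFactory(String bootstrapServers) {
    Map<String, Object> config = new HashMap<>();
    config.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, bootstrapServers);
    config.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
    config.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
    return new DefaultKafkaProducerFactory<>(config);
  }

  public static void main(String[] args) {
    KafkaTemplate<String, String> template =
        new KafkaTemplate<>(producerFactory("localhost:9092"));

    // Asynchronous send; the template manages the producer lifecycle,
    // so callers never open or close connections themselves.
    template.send("hypothetical-topic", "record-key", "{\"event\":\"payload\"}");
  }
}
```

In a module that already uses a Spring context, the factory and template would typically be declared as beans so the same producer is shared by all event-sending classes.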

      PR for reference, implemented in the spike: https://github.com/folio-org/mod-source-record-manager/pull/492

      These changes can be tested in mod-inventory, because some classes in mod-inventory use data-import-processing-core to send events to Kafka.

       


      People

        Assignee: Unassigned
        Reporter: Serhii Nosko
