This repository has been archived by the owner on Apr 22, 2022. It is now read-only.

Pushing records to ALL kafka sinks #483

Open
josephwibowo opened this issue Feb 9, 2021 · 6 comments

Comments

josephwibowo commented Feb 9, 2021

I'm a little confused as to the behavior of Kafka sinks. It seems that when an event is created, the event is pushed to all sinks. In the case of Kafka, this pushes the event to all Kafka sinks even though the event is only relevant to one sink. Am I missing something? Why wouldn't it be smart enough to push the event only to the relevant Kafka sink?

Example:

mappings {
    t1 = {
      sources = [browser]
      sinks = [t1]
      confluent_id = 1
    }

    t2 = {
      schema_file = "/opt/divolte/divolte-collector/conf/avro/t2.avsc"
      mapping_script_file = "/opt/divolte/divolte-collector/conf/mapping.groovy"
      sources = [browser]
      sinks = [t2]
      confluent_id = 2
    }
  }

sinks {
    t1 {
      type = kafka
      mode = confluent
      topic = t1
    }

    t2 {
      type = kafka
      mode = confluent
      topic = t2
    }
  }

t1 events will write to t1 and t2 sinks. t2 events will write to t1 and t2 sinks.

asnare commented Feb 26, 2021

This is a bit of a head-scratcher. Indeed, events from mapping.t2 should only end up on topic t2.

Are you sure they're on t1?

A colleague mocked this up on a demonstration project and couldn't reproduce this: Divolte Shop, MR#45

@OneCricketeer

even though the event is only relevant to one sink

Can you clarify that? Your source for both mappings is the browser. Are you filtering the events somehow? If not, both sinks will receive the source events.
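If the intent is for different events to end up on different sinks, one option is to give each mapping its own source rather than sharing a single browser source. This is only a sketch based on Divolte's support for multiple named browser sources with distinct endpoint prefixes; the source names and prefixes below are hypothetical, so check them against the configuration documentation:

```
sources {
  // Hypothetical: one browser source per mapping, each with its own
  // endpoint prefix, so each page signals to a specific source.
  browser_t1 {
    type = browser
    prefix = /t1
  }
  browser_t2 {
    type = browser
    prefix = /t2
  }
}

mappings {
  t1 = {
    sources = [browser_t1]
    sinks = [t1]
  }
  t2 = {
    sources = [browser_t2]
    sinks = [t2]
  }
}
```

With this layout, events arriving on the /t1 endpoint only flow through mapping t1 and therefore only reach sink t1.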

@soufianeodf

Hey, I'm running into the same issue. Has anyone found a solution for this?

prk2331 commented Jan 24, 2022

We are also facing this issue; can you please provide some help here? We are integrating this, and in our testing we found a bug: messages are being pushed to all linked Kafka sinks.

prk2331 commented Jan 25, 2022

Hi @OneCricketeer, can you please let me know:
I want to send a group of fields defined in the Avro schema to a particular Kafka sink,
filtered by the event name that we pass as the first parameter of the Divolte signal.
Can we achieve this?
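For what it's worth, Divolte's Groovy mapping DSL can branch on the event type passed as the first argument to divolte.signal(...). A minimal sketch (the Avro field names here are hypothetical; verify the DSL calls against the mapping documentation):

```groovy
mapping {
  // The first argument of divolte.signal('t1', {...}) arrives as the event type.
  map eventType() onto 'eventType'

  // Only populate these fields for 't1' events; for other event names
  // they remain unset in the produced record.
  when eventType().equalTo('t1') apply {
    map location() onto 'location'
    map referer() onto 'referer'
  }
}
```

Note that this controls which fields get populated, not which sink receives the record: each mapping still writes every record it produces to all of its configured sinks, so per-event routing would still need separate mappings (and sources) per sink.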

@OneCricketeer

I've never used Divolte
