
All Questions

0 votes
1 answer
178 views

Confluent JDBC source connector issue with table whitelist option

We are using the Confluent Kafka JDBC source connector to fetch data from AWS RDS (Oracle DB). The connector works with select * from the table name, but it does not work with the table whitelist option. It is ...
Adhish • 87
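A minimal JDBC source sketch using table.whitelist (connection details and table names here are hypothetical). Oracle stores unquoted identifiers in upper case, so a lower-case whitelist entry silently matches nothing — a common cause of this symptom:

```json
{
  "name": "oracle-source",
  "config": {
    "connector.class": "io.confluent.connect.jdbc.JdbcSourceConnector",
    "connection.url": "jdbc:oracle:thin:@myhost.rds.amazonaws.com:1521/ORCL",
    "connection.user": "myuser",
    "connection.password": "mypassword",
    "table.whitelist": "MY_TABLE",
    "mode": "incrementing",
    "incrementing.column.name": "ID",
    "topic.prefix": "oracle-"
  }
}
```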
0 votes
0 answers
138 views

Problem with TimestampConverter from Confluent

I'm new to Kafka and I'm struggling with a timestamp field in my JDBC sink connector. To make a long story short, I'm using Docker to test a source PostgreSQL connector to copy the data of ...
leandrofita
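The usual tool for this is the built-in TimestampConverter SMT on the sink. A sketch, where the field name and format are assumptions, not taken from the question:

```json
"transforms": "convertTs",
"transforms.convertTs.type": "org.apache.kafka.connect.transforms.TimestampConverter$Value",
"transforms.convertTs.field": "created_at",
"transforms.convertTs.target.type": "Timestamp",
"transforms.convertTs.format": "yyyy-MM-dd HH:mm:ss"
```

The format property only applies when converting to or from the string representation; target.type can also be Date, Time, unix, or string.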
1 vote
3 answers
4k views

Kafka JDBC sink connector creates data types that do not match the original

I am using Kafka and Kafka Connect to replicate an MS SQL Server database to MySQL using the Debezium SQL Server CDC source connector and the Confluent JDBC sink connector. The "auto.create" is set to ...
hhe • 313
0 votes
2 answers
749 views

How to modify/update the data before sending it downstream

I have a topic with data in the format { before: {...}, after: {...}, source: {...}, op: 'u' }. The data was produced by Debezium. I want to send the data to a SQL Server table, so I ...
Naman Kumar
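For Debezium's before/after envelope, the standard way to keep only the "after" state before a JDBC sink is Debezium's ExtractNewRecordState SMT. A sketch of the relevant sink-config fragment:

```json
"transforms": "unwrap",
"transforms.unwrap.type": "io.debezium.transforms.ExtractNewRecordState",
"transforms.unwrap.drop.tombstones": "false"
```

Keeping tombstones (drop.tombstones=false) matters if the sink should also process deletes.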
0 votes
1 answer
2k views

Kafka JDBC sink connector throws org.apache.kafka.connect.errors.DataException (Struct schema's field name not specified properly) when inserting into a Postgres table

I am using kafka-jdbc-sink-connector for a project where I need to post some JSON to a Kafka topic (kafka_subs) and then, using the JDBC sink connector, insert that record into the Postgres table (...
Suvendu Ghosh
2 votes
0 answers
4k views

Kafka Connect + JDBC Source connector + JDBC Sink connector + MSSQL SQL Server = IDENTITY_INSERT issues

I am trying to figure out why I am receiving an "IDENTITY_INSERT" error when trying to use a JDBC sink connector to sink data into a SQL Server database off of a topic that is also written ...
Joe G • 43
0 votes
1 answer
314 views

Read the current incrementing value of a Confluent JDBC Kafka source connector via REST?

I have a Kafka source connector using the io.confluent.connect.jdbc.JdbcSourceConnector class. It runs in incrementing mode. I can access this connector via the REST interface. To examine a problem ...
Harold L. Brown
0 votes
1 answer
322 views

Kafka Connect JDBC Source running even when query fails

I'm running a JDBC source connector and trying to monitor its status via the exposed JMX metrics and a Prometheus exporter. However, the status of the connector and all its tasks is still in the ...
m-kay • 302
0 votes
0 answers
1k views

Kafka sink connector: how to insert the record_key into a table when record_value is null?

I have a topic where the data is only in the record_key and the record_value is null. I need to write the fields from the record_key to the table through the sink connector, but I get an error. How do I ...
Леонид Дубравский
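The JDBC sink builds its columns from the record value, so with a null value the only built-in action is a delete. A hedged sketch (topic, URL, and key field are hypothetical):

```json
{
  "connector.class": "io.confluent.connect.jdbc.JdbcSinkConnector",
  "topics": "my_topic",
  "connection.url": "jdbc:postgresql://localhost:5432/mydb",
  "pk.mode": "record_key",
  "pk.fields": "id",
  "insert.mode": "upsert",
  "delete.enabled": "true"
}
```

To actually insert key fields as a row, the key would have to be copied into the value upstream (for example with an SMT); the stock connector cannot build an insert from a key alone.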
0 votes
1 answer
846 views

How can Kafka fetch data only when a new row is inserted or an old row is updated in a MySQL database, using Kafka streaming?

I am using the Confluent Platform JDBC connector; it streams data from MySQL to a Kafka consumer, and an application inserts the data into another database for reporting. The problem is that it is ...
Bimal Kumar Dalei
1 vote
1 answer
185 views

Insert error while trying to load data into Postgres from Oracle using the Kafka Connect API

I'm trying to transfer data from Oracle to Postgres using the Kafka JDBC Connect API. I don't want to create a new table in Postgres automatically, so I created my own table in Postgres before creating a ...
vigneshwar reddy
0 votes
1 answer
2k views

Is there a way for Kafka Connect to ignore one field in the schema file and read the other fields into the DB?

I am trying to sink this schema file into the DB, where one of the field types is a map of boolean values. Reservation table: create table "table-reservation" ( "sessionId" varchar, ...
user14560049
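Dropping one unsupported field before the sink writes to the DB is what the built-in ReplaceField SMT is for. A sketch; the field name is hypothetical, and note the property is called blacklist in older Kafka versions and exclude in newer ones:

```json
"transforms": "dropField",
"transforms.dropField.type": "org.apache.kafka.connect.transforms.ReplaceField$Value",
"transforms.dropField.blacklist": "unsupportedMapField"
```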
0 votes
0 answers
155 views

Reg: Kafka connector configuration

I am using a Kafka source connector configuration to produce data from a source table (MariaDB) into a Kafka topic. For some tables, I am using mode='timestamp' in the Connect config. I have two fields ...
Sathish
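Worth noting that timestamp.column.name accepts a comma-separated list of columns; the connector combines them with SQL COALESCE to detect new or modified rows. A sketch with hypothetical connection details and column names:

```json
{
  "connector.class": "io.confluent.connect.jdbc.JdbcSourceConnector",
  "connection.url": "jdbc:mariadb://localhost:3306/mydb",
  "mode": "timestamp",
  "timestamp.column.name": "created_at,updated_at",
  "table.whitelist": "my_table",
  "topic.prefix": "mariadb-"
}
```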
0 votes
0 answers
265 views

JDBC source connector holding too many active sessions in Kafka Connect for Redshift?

We are using the Kafka JDBC connector to connect to a Redshift database. We created three connectors to pull data from three different tables into three different topics, and we observed that each connector ...
RandomUser
0 votes
0 answers
309 views

Duplicate records when consuming from MS SQL database via JDBC source connector

I am using a JDBC source connector to consume from a MS SQL database table. Attaching a picture for connector configurations. I am getting duplicate records while using these configurations. Data gets ...
Nan-7 • 81
1 vote
2 answers
2k views

How to convert table name to uppercase using JDBC sink connector

I am implementing CDC based replication between Postgres and Oracle databases. For the source connector I am using Debezium's Postgres Connector and for the sink connector Confluent's JDBC Sink ...
Nomad • 353
2 votes
1 answer
2k views

How do we reset the state associated with a Kafka Connect source connector?

We are working with Kafka Connect 2.5. We are using the Confluent JDBC source connector (although I think this question is mostly agnostic to the connector type) and are consuming some data from an ...
Andrew Ferrier
0 votes
1 answer
510 views

Confluent JDBC connector: where is the information about the last read ID and timestamp kept?

I am running Confluent JDBC connector on my local machine and trying to figure out where it is storing the information about last read ID and timestamp. As per https://www.confluent.io/blog/kafka-...
JDev • 1,802
0 votes
0 answers
29 views

Facing an issue while storing the data in SQL DB using JDBC sink connector

Using a partially open content model in JSON Schema. This means there are some mandatory fields (defined in properties) and non-mandatory fields (as defined in patternProperties) accepted by ...
Raj Bhavsar
1 vote
0 answers
235 views

JDBC Source Connector - table.whitelist - 5 Tables in List - What is The Order of Execution

I have 5 tables listed in the JDBC source connector's table.whitelist. Are they all synced from PostgreSQL to Kafka in parallel, or one table after another? If it is one after another, what is the ...
user1409708 • 1,043
0 votes
1 answer
866 views

How to write Kafka Connect query to query data between two specific dates using timestamp mode

I am writing a Kafka JDBC source connector to query between two dates (a start and an end date) in a Postgres table, and I want the connector to run the query every day, starting from the end date to the ...
JokerBean • 423
2 votes
1 answer
789 views

Kafka connector: How to perform query with OR operator and incrementing column?

I am using Kafka Connect to source data from an Oracle DB to a Kafka topic. I want to fetch data from one table based on an increasing ID or a status column, like the SQL statement: SELECT * from TABLE_X where ID &...
luk-perski
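One approach is query mode with the extra condition in a subquery, while incrementing tracking stays on the ID column. A sketch with hypothetical names:

```json
{
  "connector.class": "io.confluent.connect.jdbc.JdbcSourceConnector",
  "connection.url": "jdbc:oracle:thin:@dbhost:1521/ORCL",
  "mode": "incrementing",
  "incrementing.column.name": "ID",
  "query": "SELECT * FROM (SELECT * FROM TABLE_X WHERE STATUS = 'NEW') T",
  "topic.prefix": "table_x"
}
```

Caveat: incrementing mode only re-reads rows with a higher ID, so a later change to a mutable STATUS column is not picked up once that ID has been passed.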
0 votes
1 answer
619 views

How to combine/concat two jdbc sink connector Transforms?

In my Avro schema for a topic, there are two String fields, field1 and field2. Is it currently possible to use built-in Transforms on a JDBC Sink Connector to combine these two Avro fields into a ...
qlangman
2 votes
1 answer
1k views

Kafka connect database source connector : how to copy data from foreign key

I am using the database source connector to move data from my Postgres database table to a Kafka topic. I have an orders table with a foreign key to the customers table via the customerNumber field. Below ...
ajkush • 589
2 votes
0 answers
645 views

Confluent Kafka Connector throws "No suitable driver found for jdbc:postgresql"

I am working on a multi-node Confluent Kafka setup with 3 Kafka brokers, where there is a separate node for KSQL and Schema Registry, and the connector runs on another separate node. All the mapping between those ...
Jayasree
3 votes
0 answers
553 views

Kafka JDBC sink connector creates invalid data type

I am using Kafka to replicate a MySQL database from one instance to another using Debezium CDC. Below is my sink connector configuration, where I use "auto.create": true, which actually creates the ...
Kamboh • 185
1 vote
1 answer
444 views

Does the Kafka JDBC sink connector keep track of data being loaded into the destination database?

I am using the Kafka JDBC sink connector, which inserts data from a topic into a SQL Server database. Due to an issue with MSSQL, the connector stopped with the error: Caused by: org.apache.kafka.connect.errors....
Joseph N • 560
3 votes
3 answers
361 views

Unable to set CLASSPATH with confluent CLI : java.sql.SQLException: No suitable driver found for jdbc:oracle:thin

I want to use the JDBC connector on Confluent. It doesn't work when I start Connect with the Confluent CLI (confluent local start connect); it gives this error: Caused by: java.sql.SQLException: No ...
CompEng • 7,366
4 votes
1 answer
4k views

Kafka Connect JDBC Sink Connector: How to delete a record that doesn't have a NULL value?

Is there a (recommended) way to delete a record from a Kafka Connect JDBC Sink Connector where the record's value is not NULL? For example, if my JSON configuration includes the following: ... "...
bmoe24x • 131
1 vote
1 answer
2k views

MySql query fails in Kafka-connect

I am using Kafka connect (confluentinc/cp-kafka-connect:5.4.0) and have MySQL connector installed in it. Basically the following Dockerfile: FROM confluentinc/cp-kafka-connect:5.4.0 RUN echo "===>...
Amit Yadav • 4,944
3 votes
0 answers
983 views

Kafka JDBC Sink Connector : optional fields in schema

I have a topic that transmits records consumed by a Kafka JDBC sink connector (into Postgres). These records are upserted into a database and might be produced without all fields, especially ...
Rocel • 1,037
1 vote
0 answers
407 views

Kafka not retrieving data from ClickHouse

I have to push data from ClickHouse to Kafka topics, so I tried to use the Confluent JDBC connector. I am following this tutorial, which uses MySQL instead of ClickHouse. Here is my configuration and its ...
Mostafa Sadeghi
0 votes
0 answers
2k views

Kafka Connect cannot connect to Oracle database

I am trying to create a topic in Kafka. When I send a POST request to Kafka Connect to create a topic, the connector is created but the topic is not. When I checked the Kafka Connect log, I saw the below ...
omery • 33
3 votes
1 answer
2k views

kafka-connect org.apache.avro.SchemaParseException: Illegal character $

I'm using the JDBC source connector and getting the below error when I try to connect to a table that has "$" in its name: org.apache.avro.SchemaParseException: Illegal character <<tablename>>$...
Sahas • 3,186
0 votes
1 answer
491 views

kafka-connect-jdbc source connector OOM

I am using kafka-connect-jdbc:5.3.1 and when I am running the connector it kicks me out with Exception: java.lang.OutOfMemoryError thrown from the UncaughtExceptionHandler in thread "Thread-11" I ...
talkdatatome
2 votes
1 answer
3k views

Kafka Connect JDBC sink: error flattening JSON records

I'm using the Kafka Connect JDBC sink connector to store data from topics into a SQL Server table. The data needs to be flattened. I've created a SQL Server table and a JSON record based on the ...
Jaime Caffarel
1 vote
0 answers
75 views

Confluent Platform: Source Connector (jdbc) keeps pulling old data

I am using Confluent Platform 4.1.1 for transferring data from sql-server to Oracle database. For this, I am using timestamp+incrementing mode and confluent JDBC. But sometimes I see that source ...
Sajal Srivastava
1 vote
2 answers
2k views

Kafka connect JDBC source connector not working

Hello everyone, I am using the Kafka JDBC source connector for Postgres. Following is my connector configuration. Somehow it is not bringing in any data. What is wrong in this ...
RajData • 127
3 votes
2 answers
4k views

Kafka JDBC source connector: timestamp mode is not working for sqlite3

I set up a database with a table that has a timestamp column. I am trying to use timestamp mode to capture incremental changes in the DB, but kafka-connect-jdbc is not reading any data from ...
kurian • 191
1 vote
1 answer
7k views

How to create a JDBC sink connector for multiple topics using the topic regex option

Created a JDBC source connector with catalog.pattern = test_01. Source connector configuration: { "name": "jdbcsource", "connector.class": "io.confluent.connect.jdbc.JdbcSourceConnector", "...
hepzi • 445
1 vote
1 answer
353 views

Is it possible to use a file-source as input AND jdbc-sink as output with kafka?

I am currently working on a Kafka project, and while I am able to read a file with the file-source connector and store the data in a topic, my issue is the next step. My configuration: connector.class=...
Camille SAURY
1 vote
2 answers
1k views

JDBC confluent connector mode [closed]

I am using a custom query in the JDBC Kafka source connector. Can anyone tell me which mode to use with a custom query? If I use bulk mode, it will ...
santoXme • 802
4 votes
1 answer
1k views

How to convert a topic name to lower case using the transform method available in Confluent properties

I am using the JDBC Kafka connector to take data from Oracle to Kafka topics. I have topic names defined in lower case, but since by default identifiers are quoted, Oracle is taking ...
user147296
1 vote
1 answer
2k views

Kafka Connect JDBC sink for Microsoft SQL Server: works with multiple keys for record_value, but this error pops up for record_key

I was using the JDBC sink driver from Kafka Connect. It allows creating a table with one primary key; when I try to add two pk.fields, it gives me the error: java.lang.NullPointerException at io....
Kumar Pinumalla
1 vote
1 answer
589 views

table.whitelist is case-sensitive even after specifying quote.sql.identifiers=NEVER

I have used the JDBC source connector to ingest data from Oracle into Kafka topics. My Kafka topics are created in lower case, so I have to specify table.whitelist=table_name (in lower case). Since by ...
user147296
1 vote
1 answer
63 views

Issue using incrementing ingest in the JDBC connector

I'm trying to use an incrementing ingest to produce a message to a topic on update of a table in MySQL. It works using timestamp mode but doesn't seem to work using incrementing column mode. When I ...
user3310115 • 1,460
0 votes
1 answer
1k views

Kafka Connect JDBC sink - mapping nested JSON to multiple rows

As part of a requirement, we are going ahead with Kafka Connect to push data to our database. What I have read so far is that there is a 1:1 mapping between message and DB row, i.e. for a single message ...
Abhash Upadhyaya
10 votes
0 answers
2k views

How to use the Kafka Connect JDBC to source PostgreSQL with multiple schemas that contain tables with the same name?

I need to source data from a PostgreSQL database with ~2000 schemas. All schemas contain the same tables (it is a multi-tenant application). The connector is configured as following: { "name": "...
Daniel Ferreira Jorge
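The connector can be scoped to one schema with schema.pattern, but it has no per-schema topic routing, so a multi-tenant layout like this generally means one (typically script-generated) connector per schema. A sketch for a single hypothetical tenant:

```json
{
  "name": "source-tenant_a",
  "config": {
    "connector.class": "io.confluent.connect.jdbc.JdbcSourceConnector",
    "connection.url": "jdbc:postgresql://dbhost:5432/mydb",
    "schema.pattern": "tenant_a",
    "table.whitelist": "orders,customers",
    "mode": "incrementing",
    "incrementing.column.name": "id",
    "topic.prefix": "tenant_a."
  }
}
```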
0 votes
1 answer
939 views

Messages saved to Kafka topic not saved correctly via Kafka connector

So I have a Confluent Kafka JDBC connector set up. First I start a schema registry, e.g. ./bin/schema-registry-start ./etc/schema-registry/schema-registry.properties. This is the schema-...
anonuser1234
4 votes
1 answer
2k views

Run Confluent JDBC Connector at specific time instead of using polling interval?

We are trying to copy data from a database table into Kafka using the Confluent JDBC source connector. The problem is that the data in that table gets updated exactly once every night, so we would like ...
Nikos Epping