I have Debezium running in a container, capturing all changes to PostgreSQL database records. In addition, I have a Confluent JDBC sink container that writes all changes to another database.
The source connector definition captures multiple tables; some of these tables have primary keys and some do not.
In the sink connector, the configuration specifies pk.mode as follows:

    "insert.mode": "upsert",
    "delete.enabled": "true",
    "pk.mode": "record_key",
But because some tables in the source database have no primary key, the sink connector throws the following error:
Caused by: org.apache.kafka.connect.errors.ConnectException: PK mode for table 'contex_str_dealer_branch_address' is RECORD_KEY, but record key schema is missing
Normally there should be a few options:
- to exclude tables without primary keys from the source
- to exclude tables without primary keys from the sink
- to use the primary key for tables that have one, and all other columns as the key for tables that do not
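For the first option, I would expect something like Debezium's `table.exclude.list` property on the source connector to work. A minimal sketch (the table name is just the one from the error above; in reality there would be several keyless tables to list):

```json
{
  "table.exclude.list": "public.contex_str_dealer_branch_address"
}
```

As far as I understand, excluded tables would then never produce change events, so the sink would never see them.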
Is there any way to exclude these tables from processing entirely?
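For the third option, Debezium's `message.key.columns` setting might be relevant, assuming it can define a surrogate key for a keyless table. A sketch, where `col_a` and `col_b` are hypothetical column names standing in for the columns I would want to use as the key:

```json
{
  "message.key.columns": "public.contex_str_dealer_branch_address:col_a,col_b"
}
```

I am not sure whether this is the intended use of that property, or whether listing every column of a keyless table this way is practical.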