JdbcMetadataStore for filtering

I have a flow in which I want to use a JdbcMetadataStore in the filters. I was using SimpleMetadataStore(), but that is causing issues because it is an in-memory store; I need a shared metadata store instead. I have a Postgres DB installed and I can see that the JDBC store supports it. I declared a bean that returns a JdbcMetadataStore, as per the documentation, but I'm not sure how to use it in the filters, and despite a lot of searching I could not find an example. Note that I'm using FileSystemPersistentAcceptOnceFileListFilter, and my datasource for Postgres is all set up in my application properties. I have pasted my code below; can anyone guide me on how to move forward?



private DataSource dataSource;

public IntegrationFlow localToFtpFlow(Branch myBranch) {

    return IntegrationFlows.from(Files.inboundAdapter(new File(myBranch.getBranchCode()))
                    .filter(new ChainFileListFilter<File>()
                            .addFilter(new RegexPatternFileListFilter("final" + myBranch.getBranchCode() + ".csv"))
                            .addFilter(new FileSystemPersistentAcceptOnceFileListFilter(new SimpleMetadataStore(), "foo"))),
            e -> e.poller(Pollers.fixedDelay(10_000)))
            .transform(p -> {
                LOG1.info("Sending file " + p + " to FTP branch " + myBranch.getBranchCode());
                return p;
            })
            .log()
            .handle(Ftp.outboundAdapter(createNewFtpSessionFactory(myBranch), FileExistsMode.REPLACE)
                    .useTemporaryFileName(true)
                    .autoCreateDirectory(false)
                    .remoteDirectory(myBranch.getFolderPath()))
            .get();
}

public DefaultFtpSessionFactory createNewFtpSessionFactory(Branch branch) {
    final DefaultFtpSessionFactory factory = new DefaultFtpSessionFactory();
    factory.setHost(branch.getHost());
    factory.setUsername(branch.getUsern());
    factory.setPort(branch.getFtpPort());
    factory.setPassword(branch.getPassword());
    return factory;
}

@Bean
public MetadataStore metadataStore(final DataSource dataSource) {
    return new JdbcMetadataStore(dataSource);
}
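
For reference, the kind of wiring I am after looks something like the sketch below; I am not sure it is correct, and the bean method and the "foo" prefix are just placeholders based on my code above:

import javax.sql.DataSource;

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.integration.file.filters.FileSystemPersistentAcceptOnceFileListFilter;
import org.springframework.integration.jdbc.metadata.JdbcMetadataStore;
import org.springframework.integration.metadata.ConcurrentMetadataStore;

@Configuration
public class MetadataStoreConfig {

    // JdbcMetadataStore implements ConcurrentMetadataStore, which is the
    // type the persistent accept-once filters expect in their constructors.
    @Bean
    public ConcurrentMetadataStore metadataStore(DataSource dataSource) {
        return new JdbcMetadataStore(dataSource);
    }

    // Build the filter from the shared store instead of new SimpleMetadataStore().
    @Bean
    public FileSystemPersistentAcceptOnceFileListFilter acceptOnceFilter(ConcurrentMetadataStore store) {
        return new FileSystemPersistentAcceptOnceFileListFilter(store, "foo");
    }
}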


This error arose before I created the table manually; shouldn't the table be created automatically in Postgres, since that database is supported?



org.springframework.messaging.MessagingException: nested exception is org.springframework.jdbc.BadSqlGrammarException: PreparedStatementCallback; bad SQL grammar [INSERT INTO INT_METADATA_STORE(METADATA_KEY, METADATA_VALUE, REGION) SELECT ?, ?, ? FROM INT_METADATA_STORE WHERE METADATA_KEY=? AND REGION=? HAVING COUNT(*)=0]; nested exception is org.postgresql.util.PSQLException: ERROR: relation "int_metadata_store" does not exist
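
As it turned out, the table is not created automatically for a standalone database; by default the Spring Integration schema is only initialized for embedded databases. Assuming Spring Boot, either of the following should work; the DDL below is my paraphrase of the schema-postgresql.sql script shipped in the spring-integration-jdbc jar, so check that file for the authoritative version:

# application.properties: have Boot run the schema scripts on startup
spring.integration.jdbc.initialize-schema=always

-- or create the table by hand; Postgres folds unquoted identifiers to
-- lowercase, which still matches the queries the store issues:
CREATE TABLE INT_METADATA_STORE (
    METADATA_KEY   VARCHAR(255) NOT NULL,
    METADATA_VALUE VARCHAR(4000),
    REGION         VARCHAR(100) NOT NULL,
    CONSTRAINT INT_METADATA_STORE_PK PRIMARY KEY (METADATA_KEY, REGION)
);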


Logging info regarding the issue: when I add another flow for the second server, it triggers the first flow's handling method and sends the data to the FTP server for both:



Saved Branch : BEY
Hibernate: select branch0_._id as _id1_0_0_, branch0_.branch_code as branch_c2_0_0_, branch0_.folder_path as folder_p3_0_0_, branch0_.ftp_port as ftp_port4_0_0_, branch0_.host as host5_0_0_, branch0_.password as password6_0_0_, branch0_.usern as usern7_0_0_ from branch branch0_ where branch0_._id=?
BEY
2019-01-07 15:11:25.816 INFO 12940 --- [nio-8081-exec-5] o.s.integration.channel.DirectChannel : Channel 'application.intermediateChannel' has 2 subscriber(s).
2019-01-07 15:11:25.817 INFO 12940 --- [nio-8081-exec-5] o.s.i.endpoint.EventDrivenConsumer : started 1.org.springframework.integration.config.ConsumerEndpointFactoryBean#1
2019-01-07 15:11:25.817 INFO 12940 --- [nio-8081-exec-5] o.s.i.endpoint.EventDrivenConsumer : Adding {transformer} as a subscriber to the '1.channel#0' channel
2019-01-07 15:11:25.817 INFO 12940 --- [nio-8081-exec-5] o.s.integration.channel.DirectChannel : Channel 'application.1.channel#0' has 1 subscriber(s).
2019-01-07 15:11:25.817 INFO 12940 --- [nio-8081-exec-5] o.s.i.endpoint.EventDrivenConsumer : started 1.org.springframework.integration.config.ConsumerEndpointFactoryBean#0
2019-01-07 15:11:25.829 INFO 12940 --- [nio-8081-exec-5] o.s.i.e.SourcePollingChannelAdapter : started stockInboundPoller
BEY
2019-01-07 15:11:25.984 INFO 12940 --- [nio-8081-exec-5] o.s.i.endpoint.EventDrivenConsumer : Adding {message-handler} as a subscriber to the '1o.channel#2' channel
2019-01-07 15:11:25.984 INFO 12940 --- [nio-8081-exec-5] o.s.integration.channel.DirectChannel : Channel 'application.1o.channel#2' has 1 subscriber(s).
2019-01-07 15:11:25.984 INFO 12940 --- [nio-8081-exec-5] o.s.i.endpoint.EventDrivenConsumer : started 1o.org.springframework.integration.config.ConsumerEndpointFactoryBean#1
2019-01-07 15:11:25.984 INFO 12940 --- [nio-8081-exec-5] o.s.i.endpoint.EventDrivenConsumer : Adding {transformer} as a subscriber to the '1o.channel#0' channel
2019-01-07 15:11:25.984 INFO 12940 --- [nio-8081-exec-5] o.s.integration.channel.DirectChannel : Channel 'application.1o.channel#0' has 1 subscriber(s).
2019-01-07 15:11:25.984 INFO 12940 --- [nio-8081-exec-5] o.s.i.endpoint.EventDrivenConsumer : started 1o.org.springframework.integration.config.ConsumerEndpointFactoryBean#0
2019-01-07 15:11:25.984 INFO 12940 --- [nio-8081-exec-5] o.s.i.e.SourcePollingChannelAdapter : started 1o.org.springframework.integration.config.SourcePollingChannelAdapterFactoryBean#0
2019-01-07 15:11:42.655 INFO 12940 --- [ask-scheduler-4] o.s.integration.ftp.session.FtpSession : File has been successfully transferred from: /ftp/erbranch/EDMS/FEFO/FEFOexportBEY.csv
Hibernate: select branch0_._id as _id1_0_, branch0_.branch_code as branch_c2_0_, branch0_.folder_path as folder_p3_0_, branch0_.ftp_port as ftp_port4_0_, branch0_.host as host5_0_, branch0_.password as password6_0_, branch0_.usern as usern7_0_ from branch branch0_
Hibernate: insert into branch (branch_code, folder_path, ftp_port, host, password, usern) values (?, ?, ?, ?, ?, ?)
Hibernate: select currval('branch__id_seq')
Saved Branch : JNB
Hibernate: select branch0_._id as _id1_0_0_, branch0_.branch_code as branch_c2_0_0_, branch0_.folder_path as folder_p3_0_0_, branch0_.ftp_port as ftp_port4_0_0_, branch0_.host as host5_0_0_, branch0_.password as password6_0_0_, branch0_.usern as usern7_0_0_ from branch branch0_ where branch0_._id=?
JNB
2019-01-07 15:13:36.099 INFO 12940 --- [nio-8081-exec-7] o.s.integration.channel.DirectChannel : Channel 'application.intermediateChannel' has 3 subscriber(s).
2019-01-07 15:13:36.099 INFO 12940 --- [nio-8081-exec-7] o.s.i.endpoint.EventDrivenConsumer : started 2.org.springframework.integration.config.ConsumerEndpointFactoryBean#1
2019-01-07 15:13:36.099 INFO 12940 --- [nio-8081-exec-7] o.s.i.endpoint.EventDrivenConsumer : Adding {transformer} as a subscriber to the '2.channel#0' channel
2019-01-07 15:13:36.099 INFO 12940 --- [nio-8081-exec-7] o.s.integration.channel.DirectChannel : Channel 'application.2.channel#0' has 1 subscriber(s).
2019-01-07 15:13:36.099 INFO 12940 --- [nio-8081-exec-7] o.s.i.endpoint.EventDrivenConsumer : started 2.org.springframework.integration.config.ConsumerEndpointFactoryBean#0
2019-01-07 15:13:36.099 INFO 12940 --- [nio-8081-exec-7] o.s.i.e.SourcePollingChannelAdapter : started stockInboundPoller
JNB
2019-01-07 15:13:36.130 INFO 12940 --- [nio-8081-exec-7] o.s.i.endpoint.EventDrivenConsumer : Adding {message-handler} as a subscriber to the '2o.channel#2' channel
2019-01-07 15:13:36.135 INFO 12940 --- [nio-8081-exec-7] o.s.integration.channel.DirectChannel : Channel 'application.2o.channel#2' has 1 subscriber(s).
2019-01-07 15:13:36.135 INFO 12940 --- [nio-8081-exec-7] o.s.i.endpoint.EventDrivenConsumer : started 2o.org.springframework.integration.config.ConsumerEndpointFactoryBean#1
2019-01-07 15:13:36.135 INFO 12940 --- [nio-8081-exec-7] o.s.i.endpoint.EventDrivenConsumer : Adding {transformer} as a subscriber to the '2o.channel#0' channel
2019-01-07 15:13:36.135 INFO 12940 --- [nio-8081-exec-7] o.s.integration.channel.DirectChannel : Channel 'application.2o.channel#0' has 1 subscriber(s).
2019-01-07 15:13:36.135 INFO 12940 --- [nio-8081-exec-7] o.s.i.endpoint.EventDrivenConsumer : started 2o.org.springframework.integration.config.ConsumerEndpointFactoryBean#0
2019-01-07 15:13:36.135 INFO 12940 --- [nio-8081-exec-7] o.s.i.e.SourcePollingChannelAdapter : started 2o.org.springframework.integration.config.SourcePollingChannelAdapterFactoryBean#0
2019-01-07 15:13:40.981 INFO 12940 --- [ask-scheduler-1] o.s.integration.ftp.session.FtpSession : File has been successfully transferred from: /ftp/erbranch/EDMS/FEFO/FEFOexportJNB.csv
2019-01-07 15:13:46.085 INFO 12940 --- [ask-scheduler-7] o.s.i.file.FileReadingMessageSource : Created message: [GenericMessage [payload=BEYfinalBEY.csv, headers={file_originalFile=BEYfinalBEY.csv, id=42a97889-7bfb-8f77-75d8-4e7988a368f9, file_name=finalBEY.csv, file_relativePath=finalBEY.csv, timestamp=1546866826085}]]
2019-01-07 15:13:46.086 INFO 12940 --- [ask-scheduler-7] o.s.integration.handler.LoggingHandler : GenericMessage [payload=BEYfinalBEY.csv, headers={file_originalFile=BEYfinalBEY.csv, id=108a92b0-db42-620e-1c46-90652a071220, file_name=finalBEY.csv, file_relativePath=finalBEY.csv, timestamp=1546866826086}]
2019-01-07 15:13:46.160 INFO 12940 --- [ask-scheduler-8] o.s.i.file.FileReadingMessageSource : Created message: [GenericMessage [payload=JNBfinalJNB.csv, headers={file_originalFile=JNBfinalJNB.csv, id=d3b2c6a0-2e9c-42a8-c224-0ed9cbbfaabb, file_name=finalJNB.csv, file_relativePath=finalJNB.csv, timestamp=1546866826160}]]
2019-01-07 15:13:46.161 INFO 12940 --- [ask-scheduler-8] o.s.integration.handler.LoggingHandler : GenericMessage [payload=JNBfinalJNB.csv, headers={file_originalFile=JNBfinalJNB.csv, id=e34070c2-e6ff-e5e1-8c64-4af697ab1032, file_name=finalJNB.csv, file_relativePath=finalJNB.csv, timestamp=1546866826161}]
2019-01-07 15:13:47.129 INFO 12940 --- [ask-scheduler-7] o.s.integration.ftp.session.FtpSession : File has been successfully transferred to: /ftp/erbranch/EDMS/FEFO/finalBEY.csv.writing
2019-01-07 15:13:47.534 INFO 12940 --- [ask-scheduler-7] o.s.integration.ftp.session.FtpSession : File has been successfully renamed from: /ftp/erbranch/EDMS/FEFO/finalBEY.csv.writing to /ftp/erbranch/EDMS/FEFO/finalBEY.csv
2019-01-07 15:13:49.772 INFO 12940 --- [ask-scheduler-8] o.s.integration.ftp.session.FtpSession : File has been successfully transferred to: /ftp/erbranch/EDMS/FEFO/finalJNB.csv.writing
2019-01-07 15:13:50.757 INFO 12940 --- [ask-scheduler-8] o.s.integration.ftp.session.FtpSession : File has been successfully renamed from: /ftp/erbranch/EDMS/FEFO/finalJNB.csv.writing to /ftp/erbranch/EDMS/FEFO/finalJNB.csv


You can find my app here:
https://github.com/EliasKhattar/Spring-Integration-Project/tree/master/spring4ftpappftp
spring-integration

asked Jan 4 at 13:26 by Elias Khattar, edited Jan 7 at 13:22

1 Answer

Just use metadataStore(dataSource) instead of new SimpleMetadataStore() in the filter constructor.



          EDIT



          I just copied your flow into a new app (made a few changes, but not to the filter) and everything works fine for me...



@SpringBootApplication
public class So54039852Application {

    public static void main(String[] args) {
        SpringApplication.run(So54039852Application.class, args);
    }

    @Bean
    public IntegrationFlow localToFtpFlow(DataSource dataSource) {

        return IntegrationFlows.from(Files.inboundAdapter(new File("/tmp/foo"))
                        .filter(new ChainFileListFilter<File>()
                                .addFilter(new RegexPatternFileListFilter(".*\\.csv"))
                                .addFilter(new FileSystemPersistentAcceptOnceFileListFilter(metadataStore(dataSource), "foo"))),
                e -> e.poller(Pollers.fixedDelay(10_000)))
                .log()
                .get();
    }

    @Bean
    public ConcurrentMetadataStore metadataStore(final DataSource dataSource) {
        return new JdbcMetadataStore(dataSource);
    }

}


          and



          $ touch /tmp/foo/foo.csv
          ...
          $ touch /tmp/foo/bar.csv


          and



          2019-01-09 12:46:26.332  INFO 43329 --- [ask-scheduler-2] o.s.integration.handler.LoggingHandler   : GenericMessage [payload=/tmp/foo/foo.csv, headers={file_originalFile=/tmp/foo/foo.csv, id=e0613529-a657-fbd3-5e67-8bb53a58b5ca, file_name=foo.csv, file_relativePath=foo.csv, timestamp=1547055986330}]
          2019-01-09 12:47:26.487 INFO 43329 --- [ask-scheduler-5] o.s.integration.handler.LoggingHandler : GenericMessage [payload=/tmp/foo/bar.csv, headers={file_originalFile=/tmp/foo/bar.csv, id=4feb74b6-d711-f028-70c7-83cdfcd0aeec, file_name=bar.csv, file_relativePath=bar.csv, timestamp=1547056046487}]


          and



mysql> select * from INT_METADATA_STORE;
+---------------------+----------------+---------+
| METADATA_KEY        | METADATA_VALUE | REGION  |
+---------------------+----------------+---------+
| foo/tmp/foo/bar.csv | 1547056039000  | DEFAULT |
| foo/tmp/foo/foo.csv | 1547055980000  | DEFAULT |
+---------------------+----------------+---------+
2 rows in set (0.00 sec)


          I don't see the file again even after a restart, but if I change the date on one of the files...



          $ touch /tmp/foo/bar.csv


          and



          2019-01-09 12:51:58.534  INFO 44430 --- [ask-scheduler-2] o.s.integration.handler.LoggingHandler   : GenericMessage [payload=/tmp/foo/bar.csv, headers={file_originalFile=/tmp/foo/bar.csv, id=f92d6b36-c948-37cc-ca56-9ef28de336f2, file_name=bar.csv, file_relativePath=bar.csv, timestamp=1547056318532}]

• After I added metadataStore(dataSource) in the filter, I had to change the bean's return type to ConcurrentMetadataStore, because the filter does not accept the MetadataStore type; I also had to declare a DataSource variable to use in that method. After testing the code I hit the error I included above: it seems the table is not being created in Postgres, although the JDBC connection settings all look correct when debugging. @Gary Russell

  – Elias Khattar
  Jan 7 at 7:52











• I created the table in Postgres with a lowercase name, since the DB folds unquoted identifiers to lowercase; after that it stored the info in the table. But the same issue is still there: if I have two files that were already processed and I drop one file on one server, it sends both again at the same time, not only the new one. Am I missing anything in the filters? Also, if I create the initial flow that is supposed to pick up the file, create the new one, and send it, it only pulls the file from FTP, and I have to add another FTP flow so that it sends the new files back. @Gary Russell

  – Elias Khattar
  Jan 7 at 10:58













• The filter should be an FtpPersistentAcceptOnceFileListFilter; there is also a localFilter, which needs a FileSystemPersistentAcceptOnceFileListFilter.

  – Gary Russell
  Jan 7 at 14:16











• Do you mean that I should replace FileSystemPersistentAcceptOnceFileListFilter with FtpPersistentAcceptOnceFileListFilter? I tried that, but it gives a filter type error, maybe because I'm using ChainFileListFilter; should that be replaced too? Also, where should the localFilter go? Thank you. @Gary Russell

  – Elias Khattar
  Jan 7 at 20:26











• The inbound channel adapter has 2 filters (.filter() and .localFilter()). The first one determines which files are fetched from FTP and copied to the local file system; the second determines which files on the file system are sent as messages. If you can upload a simplified project to GitHub, one of us can take a look.

  – Gary Russell
  Jan 7 at 20:31
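
To illustrate that last comment, here is a minimal sketch of an FTP inbound adapter with both filters wired to the shared store; the session factory, the directories, and the "ftp"/"local" prefixes are placeholders, not taken from the original project:

import java.io.File;

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.integration.dsl.IntegrationFlow;
import org.springframework.integration.dsl.IntegrationFlows;
import org.springframework.integration.dsl.Pollers;
import org.springframework.integration.file.filters.FileSystemPersistentAcceptOnceFileListFilter;
import org.springframework.integration.ftp.dsl.Ftp;
import org.springframework.integration.ftp.filters.FtpPersistentAcceptOnceFileListFilter;
import org.springframework.integration.ftp.session.DefaultFtpSessionFactory;
import org.springframework.integration.metadata.ConcurrentMetadataStore;

@Configuration
public class FtpTwoFilterSketch {

    @Bean
    public IntegrationFlow ftpToLocalFlow(DefaultFtpSessionFactory sessionFactory,
                                          ConcurrentMetadataStore store) {
        return IntegrationFlows
                .from(Ftp.inboundAdapter(sessionFactory)
                                // remote filter: which files are fetched from the FTP server
                                .filter(new FtpPersistentAcceptOnceFileListFilter(store, "ftp"))
                                .remoteDirectory("/remote/dir")
                                .localDirectory(new File("/local/dir"))
                                // local filter: which fetched files are emitted as messages
                                .localFilter(new FileSystemPersistentAcceptOnceFileListFilter(store, "local")),
                        e -> e.poller(Pollers.fixedDelay(10_000)))
                .log()
                .get();
    }
}

With this arrangement the remote filter prevents re-fetching files from the server, while the local filter prevents re-emitting files that are already on disk.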












          Your Answer






          StackExchange.ifUsing("editor", function () {
          StackExchange.using("externalEditor", function () {
          StackExchange.using("snippets", function () {
          StackExchange.snippets.init();
          });
          });
          }, "code-snippets");

          StackExchange.ready(function() {
          var channelOptions = {
          tags: "".split(" "),
          id: "1"
          };
          initTagRenderer("".split(" "), "".split(" "), channelOptions);

          StackExchange.using("externalEditor", function() {
          // Have to fire editor after snippets, if snippets enabled
          if (StackExchange.settings.snippets.snippetsEnabled) {
          StackExchange.using("snippets", function() {
          createEditor();
          });
          }
          else {
          createEditor();
          }
          });

          function createEditor() {
          StackExchange.prepareEditor({
          heartbeatType: 'answer',
          autoActivateHeartbeat: false,
          convertImagesToLinks: true,
          noModals: true,
          showLowRepImageUploadWarning: true,
          reputationToPostImages: 10,
          bindNavPrevention: true,
          postfix: "",
          imageUploader: {
          brandingHtml: "Powered by u003ca class="icon-imgur-white" href="https://imgur.com/"u003eu003c/au003e",
          contentPolicyHtml: "User contributions licensed under u003ca href="https://creativecommons.org/licenses/by-sa/3.0/"u003ecc by-sa 3.0 with attribution requiredu003c/au003e u003ca href="https://stackoverflow.com/legal/content-policy"u003e(content policy)u003c/au003e",
          allowUrls: true
          },
          onDemand: true,
          discardSelector: ".discard-answer"
          ,immediatelyShowMarkdownHelp:true
          });


          }
          });














          draft saved

          draft discarded


















          StackExchange.ready(
          function () {
          StackExchange.openid.initPostLogin('.new-post-login', 'https%3a%2f%2fstackoverflow.com%2fquestions%2f54039852%2fjdbcmetadatastore-for-filtering%23new-answer', 'question_page');
          }
          );

          Post as a guest















          Required, but never shown

























          1 Answer
          1






          active

          oldest

          votes








          1 Answer
          1






          active

          oldest

          votes









          active

          oldest

          votes






          active

          oldest

          votes









          0














          Just use metadataSource(dataSource) instead of new SimpleMetadataStore() in the filter constructor.



          EDIT



          I just copied your flow into a new app (made a few changes, but not to the filter) and everything works fine for me...



          @SpringBootApplication
          public class So54039852Application {

          public static void main(String args) {
          SpringApplication.run(So54039852Application.class, args);
          }

          @Bean
          public IntegrationFlow localToFtpFlow(DataSource dataSource) {

          return IntegrationFlows.from(Files.inboundAdapter(new File("/tmp/foo"))

          .filter(new ChainFileListFilter<File>()
          .addFilter(new RegexPatternFileListFilter(".*\.csv"))
          .addFilter(new FileSystemPersistentAcceptOnceFileListFilter(metadataStore(dataSource), "foo"))),
          e -> e.poller(Pollers.fixedDelay(10_000)))
          .log()
          .get();
          }

          @Bean
          public ConcurrentMetadataStore metadataStore(final DataSource dataSource) {
          return new JdbcMetadataStore(dataSource);
          }

          }


          and



          $ touch /tmp/foo/foo.csv
          ...
          $ touch /tmp/foo/bar.csv


          and



          2019-01-09 12:46:26.332  INFO 43329 --- [ask-scheduler-2] o.s.integration.handler.LoggingHandler   : GenericMessage [payload=/tmp/foo/foo.csv, headers={file_originalFile=/tmp/foo/foo.csv, id=e0613529-a657-fbd3-5e67-8bb53a58b5ca, file_name=foo.csv, file_relativePath=foo.csv, timestamp=1547055986330}]
          2019-01-09 12:47:26.487 INFO 43329 --- [ask-scheduler-5] o.s.integration.handler.LoggingHandler : GenericMessage [payload=/tmp/foo/bar.csv, headers={file_originalFile=/tmp/foo/bar.csv, id=4feb74b6-d711-f028-70c7-83cdfcd0aeec, file_name=bar.csv, file_relativePath=bar.csv, timestamp=1547056046487}]


          and



          mysql> select * from INT_METADATA_STORE;
          +---------------------+----------------+---------+
          | METADATA_KEY | METADATA_VALUE | REGION |
          +---------------------+----------------+---------+
          | foo/tmp/foo/bar.csv | 1547056039000 | DEFAULT |
          | foo/tmp/foo/foo.csv | 1547055980000 | DEFAULT |
          +---------------------+----------------+---------+
          2 rows in set (0.00 sec)


          I don't see the file again even after a restart, but if I change the date on one of the files...



          $ touch /tmp/foo/bar.csv


          and



          2019-01-09 12:51:58.534  INFO 44430 --- [ask-scheduler-2] o.s.integration.handler.LoggingHandler   : GenericMessage [payload=/tmp/foo/bar.csv, headers={file_originalFile=/tmp/foo/bar.csv, id=f92d6b36-c948-37cc-ca56-9ef28de336f2, file_name=bar.csv, file_relativePath=bar.csv, timestamp=1547056318532}]





          share|improve this answer


























          • After I added metadataStore(dataSource) in the filter I had to change the metadataStore to ConcurrentMetadataStore as this filter does not take MetaDataStore type, then I had to declare a Datasource variable to be used in the method in the filter, after testing the code I'm facing an error which I'm including above in the code, seems that it is not creating the table in the postgres although the jdbc connections are all showing correct as per debugging@Gary Russell

            – Elias Khattar
            Jan 7 at 7:52











          • I created a table in postgres with the name in small char as the DB does not read capital letters, after that it worked to parse the info in the table, but the same issue is still there, if I have two files that are already processed and I drop one file in one server then it sends both again at the same time not only the new one, Am I missing anything in the filters? Also if I create the initial flow which suppose to pick the file,creates new and sends it, it is only pulling the file from FTP and I have to add another ftp flow so that it sends back the new files@Gary russell

            – Elias Khattar
            Jan 7 at 10:58













          • The filter should be an FtpPersistentAcceptOnceFileListFilter; there is also a localFilter, which needs the FileSystemPersistentFileListFilter.

            – Gary Russell
            Jan 7 at 14:16











          • Do you mean that I should replace FileSystemPersistentAcceptOnceFileListFilter with FtpPersistentAcceptOnceFileListFilter? tried so but it is giving filter type issue maybe cause I'm using ChainFileListFilter? should that be replaced?also about the localFilter where do you mean that should be entered?thank you@Gary Russell

            – Elias Khattar
            Jan 7 at 20:26











          • The inbound channel adapter has two filters: .filter() and .localFilter(). The first determines which files are fetched from FTP and copied to the local file system; the second determines which files on the local file system are sent as messages. If you can upload a simplified project to GitHub, one of us can take a look.

            – Gary Russell
            Jan 7 at 20:31

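
          To make the two-filter arrangement concrete, here is a minimal sketch of an FTP inbound flow along the lines described in the comment above. The session factory, directories, and key prefixes are placeholders; both filters can share the same JdbcMetadataStore bean, using distinct prefixes so remote and local entries do not collide:

          import java.io.File;

          import org.springframework.context.annotation.Bean;
          import org.springframework.integration.dsl.IntegrationFlow;
          import org.springframework.integration.dsl.IntegrationFlows;
          import org.springframework.integration.dsl.Pollers;
          import org.springframework.integration.file.filters.FileSystemPersistentAcceptOnceFileListFilter;
          import org.springframework.integration.ftp.dsl.Ftp;
          import org.springframework.integration.ftp.filters.FtpPersistentAcceptOnceFileListFilter;
          import org.springframework.integration.ftp.session.DefaultFtpSessionFactory;
          import org.springframework.integration.metadata.ConcurrentMetadataStore;

          @Bean
          public IntegrationFlow ftpToLocalFlow(DefaultFtpSessionFactory ftpSessionFactory,
                  ConcurrentMetadataStore metadataStore) {

              return IntegrationFlows
                      .from(Ftp.inboundAdapter(ftpSessionFactory)
                              .preserveTimestamp(true)
                              .remoteDirectory("/remote/dir")          // placeholder
                              // remote filter: which files are fetched from the FTP server
                              .filter(new FtpPersistentAcceptOnceFileListFilter(metadataStore, "ftp_"))
                              .localDirectory(new File("/tmp/local"))  // placeholder
                              // local filter: which downloaded files become messages
                              .localFilter(new FileSystemPersistentAcceptOnceFileListFilter(metadataStore, "local_")),
                          e -> e.poller(Pollers.fixedDelay(10_000)))
                      .log()
                      .get();
          }

          With this arrangement a file is fetched from FTP once and emitted as a message once, and both decisions are recorded in the shared store, so a restart (or a second instance against the same database) does not reprocess old files.
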