File Adapter processes the same file twice in a clustered environment

I am using the File Adapter to deliver a file to a target system. When I deployed my composite in a clustered environment, both nodes of the cluster started processing the same file, causing duplicate records in the target system.

To resolve this issue, we need to use the HA File Adapter and set the singleton property in composite.xml.

1. HA File Adapter –> 

Change the connection factory to the HA File Adapter in the .jca file, as shown below.

<adapter-config name="MyFile" adapter="File Adapter"
wsdlLocation="MyFile.wsdl"
xmlns="http://platform.integration.oracle/blocks/adapter/fw/metadata">

<connection-factory location="eis/HAFileAdapter"/>

</adapter-config>

The HAFileAdapter uses a database behind the scenes to create a mutex, ensuring that only one node processes a given file.
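Note that the eis/HAFileAdapter connection pool must also be configured on the server itself, typically through the WebLogic console (FileAdapter deployment → Configuration → Outbound Connection Pools). A key setting there is controlDir, which should point to a folder accessible to all cluster nodes. A sketch of the resulting weblogic-ra.xml entry, where the /shared/fileadapter/control path is only an illustrative example:

```xml
<connection-instance>
  <!-- JNDI name referenced by the connection-factory location in the .jca file -->
  <jndi-name>eis/HAFileAdapter</jndi-name>
  <connection-properties>
    <properties>
      <property>
        <!-- Directory for the adapter's control/bookkeeping files;
             must be on shared storage visible to every node -->
        <name>controlDir</name>
        <value>/shared/fileadapter/control</value>
      </property>
    </properties>
  </connection-properties>
</connection-instance>
```

After changing the connection pool, update (redeploy) the FileAdapter deployment so the new pool takes effect on all nodes.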

2. Singleton property in composite.xml –>

<binding.jca config="MyFile_file.jca">

<property name="singleton">true</property>

</binding.jca>

The singleton property keeps the inbound adapter endpoint active on only one node, so the file is picked up by one server at a time.
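For context, the binding.jca element shown above sits inside the service definition of composite.xml. A minimal sketch, where the service name, WSDL, and port type are illustrative placeholders:

```xml
<!-- Inbound file service in composite.xml (names are examples only) -->
<service name="MyFile" ui:wsdlLocation="MyFile.wsdl">
  <interface.wsdl interface="http://xmlns.oracle.com/example/MyFile#wsdl.interface(Read_ptt)"/>
  <binding.jca config="MyFile_file.jca">
    <!-- Activate this inbound endpoint on only one cluster node at a time -->
    <property name="singleton">true</property>
  </binding.jca>
</service>
```

Redeploy the composite after adding the property for it to take effect.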
