team
Posted: Thu Sep 06, 2007 1:56 pm    Post subject: Number of Messages and DB

Centurion
Joined: 03 Nov 2006    Posts: 108
Hi,
I am facing a problem when logging a number of messages into the DB.
Flow 1: insert record A into the DB.
Flow 2: read record A inserted above, and update another table.
If I run one message it works well; if I run a few messages it also works well.
But when I attempt to run about 800 to 900 messages, about 15 of them fail. They fail in the attempt to read from the DB in Flow 2, because according to the error message the insert has not yet happened.
I fail to understand this: it is sequential processing, so how can this happen? The same messages, if I take them and run them singly, work fine.
Any help would be appreciated.
Thanks
kevinf2349
Posted: Thu Sep 06, 2007 3:33 pm    Post subject:

Grand Master
Joined: 28 Feb 2003    Posts: 1311    Location: USA
Are you hitting a database uncommitted UOW limit, maybe?
team
Posted: Thu Sep 06, 2007 3:45 pm    Post subject:

Centurion
Joined: 03 Nov 2006    Posts: 108
What would that mean, Kevin?
team
Posted: Tue Sep 25, 2007 6:18 am    Post subject:

Centurion
Joined: 03 Nov 2006    Posts: 108
Would anyone have any ideas regarding this?
The transaction is automatic, so I would not expect this behaviour.
jefflowrey
Posted: Tue Sep 25, 2007 6:44 am    Post subject:

Grand Poobah
Joined: 16 Oct 2002    Posts: 19981
If you haven't committed the insert, then you shouldn't expect a subsequent select to succeed.
You don't specify how Flow 2 gets started. There are a lot of different reasons why you could be seeing what you are seeing.
And it's not reasonable to assume that two generic flows in Broker will ever be executed in any particular sequence, unless you have taken specific steps to make them execute in that sequence.
_________________
I am *not* the model of the modern major general.
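A minimal sketch of the point above, using Python and SQLite purely as stand-ins for the broker flows and the real database (none of this is from the thread): a row inserted under an open unit of work is invisible to a second connection until the inserting connection commits.

Code:
import sqlite3, os, tempfile

# one database file shared by a "Flow 1" connection and a "Flow 2" connection
dbfile = os.path.join(tempfile.mkdtemp(), "demo.db")

flow1 = sqlite3.connect(dbfile)
flow1.execute("CREATE TABLE record (id INTEGER PRIMARY KEY, payload TEXT)")
flow1.commit()

# Flow 1 inserts record A but has NOT committed its unit of work yet
flow1.execute("INSERT INTO record VALUES (1, 'A')")

flow2 = sqlite3.connect(dbfile)
print(flow2.execute("SELECT * FROM record").fetchall())   # [] -> Flow 2 cannot see it

flow1.commit()                                             # unit of work committed
print(flow2.execute("SELECT * FROM record").fetchall())   # [(1, 'A')]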
team
Posted: Tue Sep 25, 2007 6:45 am    Post subject:

Centurion
Joined: 03 Nov 2006    Posts: 108
Thanks for your response, Jeff.
The output queue of Flow 1 is the input queue for Flow 2, so the messages flow in that order: Flow 1 --> Flow 2.
I'd also like to mention that this is a problem only when I play a large number of messages.
jefflowrey
Posted: Tue Sep 25, 2007 6:55 am    Post subject:

Grand Poobah
Joined: 16 Oct 2002    Posts: 19981
You didn't show any output queues... and we only have the information you present.
So, presumably, what shows up on the input queue of Flow 2 is a unique identifier for a particular record in the database?
Are you using two-phase commit / globally coordinated transactions?
You could be running into a timing issue between when the MQ output from Flow 1 commits the output message and when the database commits the insert transaction.
_________________
I am *not* the model of the modern major general.
team
Posted: Tue Sep 25, 2007 7:13 am    Post subject:

Centurion
Joined: 03 Nov 2006    Posts: 108
Hi,
"You could be running into a timing issue between when the MQ output from flow 1 commits the output message, and when the database commits the insert transaction."
I had the same thought, but could you please let me know the order of the commit and the 'put' to the queue?
If the 'put' happens after the commit, then it's fine.
But if the 'put' happens before the commit, then how does one deal with this timing issue?
Also, I am just using the properties as defined on the node: Transaction = Yes.
Thanks in advance.
jefflowrey
Posted: Tue Sep 25, 2007 8:12 am    Post subject:

Grand Poobah
Joined: 16 Oct 2002    Posts: 19981
Unless you have configured your broker for globally coordinated flows, configured your qmgr to provide the necessary XA support, and configured your flow at deployment time to run in a global transaction...
Then Broker is only going to guarantee that the two separate transactions in Flow 1 - the one for the database and the one for the MQ output - will each be committed. But they won't be tied together. So it could be that the MQ output gets committed, Flow 2 starts and tries to do the select, and only then is the database insert committed.
As you say, in small batches everything works fine. Under high volume, it doesn't necessarily.
You can look at tuning the MQ output to increase the number of messages it keeps in a transaction. If you increase this (but not by much!), then you may be able to give the database more time to commit the inserts.
This would be less work than configuring global transactions.
_________________
I am *not* the model of the modern major general.
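A small simulation of the race described above, again in plain Python rather than anything broker-specific (the names, counts and timings are invented for illustration): because the "MQ put" and the "database insert" commit independently, under volume the consumer occasionally dequeues a message before the matching row is visible.

Code:
import queue, random, threading, time

db = {}                       # committed rows of the logging table
db_lock = threading.Lock()
mq = queue.Queue()            # the queue between Flow 1 and Flow 2
missing = []

def flow1(msg_id):
    mq.put(msg_id)                         # transaction 1: MQ output commits first
    time.sleep(random.uniform(0, 0.002))   # small delay before...
    with db_lock:
        db[msg_id] = "record %d" % msg_id  # ...transaction 2: DB insert commits

def flow2():
    while True:
        msg_id = mq.get()
        if msg_id is None:
            break
        with db_lock:
            found = msg_id in db           # the SELECT in Flow 2
        if not found:
            missing.append(msg_id)         # same symptom as the failures reported above

consumer = threading.Thread(target=flow2)
consumer.start()
for i in range(900):
    flow1(i)
mq.put(None)
consumer.join()
print("reads that beat the insert:", len(missing), "of 900")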
team
Posted: Tue Sep 25, 2007 9:11 am    Post subject:

Centurion
Joined: 03 Nov 2006    Posts: 108
Hi Jeff,
Thanks a lot for your responses.
I am looking to see:
1) How can I fine-tune the output node to do this? I don't see an option in the properties to set this; I am still looking for a way.
2) Why do you say 'not by much!'?
Thanks
jefflowrey
Posted: Tue Sep 25, 2007 9:25 am    Post subject:

Grand Poobah
Joined: 16 Oct 2002    Posts: 19981
Hrm. Okay, the property isn't on the MQOutput node... it's the "Commit Count" on the flow itself in the BAR file.
You don't want to make it too big, because then you'll be making the transactions very large, which will use up a lot of log space and either cause things to fail or slow them down.
You probably actually want to change the node that does the insert to use Transaction = "No", so that it always commits.
_________________
I am *not* the model of the modern major general.
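For reference, "Commit Count" is a deployment-time (BAR file) property of the flow, so it is usually changed with a BAR override rather than in the flow editor. The sketch below is only an assumption to verify against your broker version's mqsiapplybaroverride documentation; the flow name, file names and value are placeholders.

Code:
# overrides.properties  ("Flow1" is a placeholder flow name)
Flow1#commitCount=10

# apply the override to the BAR file before deploying
mqsiapplybaroverride -b myflows.bar -p overrides.properties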
elvis_gn
Posted: Tue Sep 25, 2007 10:14 am    Post subject:

Padawan
Joined: 08 Oct 2004    Posts: 1905    Location: Dubai
Hi all,
jefflowrey wrote:
    Then Broker is only going to guarantee that the two separate transactions in Flow 1 - the one for the database and the one for the MQ output - will each be committed. But they won't be tied together. So it could be that the MQ output gets committed, Flow 2 starts and tries to do the select, and only then is the database insert committed.
Wouldn't it be simpler to set 'Commit' on the Compute node doing the insert? That should force the flow to complete the database transaction before the MQOutput transaction...
I'm not sure if I've missed some business restriction to doing that...
Regards.
jefflowrey
Posted: Tue Sep 25, 2007 11:08 am    Post subject:

Grand Poobah
Joined: 16 Oct 2002    Posts: 19981
No, you're right. I just didn't think of it until after I'd made that post.
Unless there is a business requirement to send the output message only if the insert succeeds.
_________________
I am *not* the model of the modern major general.
team
Posted: Tue Sep 25, 2007 11:40 am    Post subject:

Centurion
Joined: 03 Nov 2006    Posts: 108
Hi,
Regarding the suggestions:
1) I don't want to put Commit on the particular node that does the insert, because if an exception occurs later in the flow (in downstream nodes), that insert cannot be rolled back - it has already been committed.
2) There is no business requirement as such; however, the DB inserts are done to log the path of a message through the system. Flow 2 reads the value inserted by Flow 1 (which ideally should already be there). Hence I need the commit to happen before the message leaves the input node of Flow 2.
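The concern in point 1 can be shown with another small Python/SQLite sketch (an illustration only, not broker code): once the insert is committed on its own, a later exception in the flow no longer rolls it back.

Code:
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE audit (id INTEGER PRIMARY KEY, step TEXT)")
conn.commit()

def flow_with_eager_commit():
    conn.execute("INSERT INTO audit VALUES (1, 'logged in flow 1')")
    conn.commit()                          # insert committed on its own
    raise RuntimeError("downstream node failed")

try:
    flow_with_eager_commit()
except RuntimeError:
    conn.rollback()                        # the flow rolls back, but the commit above already stuck

print(conn.execute("SELECT * FROM audit").fetchall())   # [(1, 'logged in flow 1')]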
sunny_30
Posted: Sat Nov 17, 2007 7:46 am    Post subject:

Master
Joined: 03 Oct 2005    Posts: 258
Hi,
I have the same problem as above.
Flow 1 does a DB insert.
The output from Flow 1 is the input to Flow 2.
Flow 2 updates the row inserted by Flow 1.
I don't want to commit the DB insert in the Flow 1 Compute node, because I want to roll back the whole transaction if the put to the Flow 1 MQOutput fails.
But when I set the Compute node's transaction in Flow 1 to Automatic, the update in Flow 2 occasionally fails. The put and insert succeed in Flow 1, but the update in Flow 2 sometimes fails to find the record inserted by Flow 1.
What is the solution for this problem? Do we need to have everything globally coordinated? Is there any other way?
Thanks,
Sunny.