Author |
Message
|
2swild |
Posted: Tue Nov 22, 2011 12:47 pm Post subject: Propagating Large Data Results from DB Query |
|
|
Newbie
Joined: 30 Sep 2005 Posts: 7
|
I have a large set of data returned from a stored procedure; I need to propagate a message for each record to an output queue. The query is returning ~350,000 records and will continue to grow. It is taking nearly 4 hours to propagate all the messages to a queue. Can anyone suggest an improvement on the code below?
CREATE COMPUTE MODULE EAI_PRISM_MDR_TRIGGER_GetMktData
CREATE FUNCTION Main() RETURNS BOOLEAN
BEGIN
    DECLARE currentSet, i INTEGER;
    DECLARE tick CHAR;
    SET tick = '''';
    CALL CopyMessageHeaders();
    SET InputLocalEnvironment.SQL.select_stmt1 = 'Execute prismuser.sp_bh_eu_mkt_get_a' || tick || '2011' || tick || ',' || tick || 'NONE' || tick;
    SET Environment.DATA.ROW[] = PASSTHRU(InputLocalEnvironment.SQL.select_stmt1);
    SET currentSet = CARDINALITY(Environment.DATA.ROW[]);
    SET i = 1;
    WHILE (i <= currentSet) DO
        SET OutputRoot.MQMD = NULL;
        SET OutputRoot.MQMD.ReplyToQ = NULL;
        SET OutputRoot.MQMD.Persistence = 0;
        SET OutputRoot.MQMD.Format = 'MQSTR';
        SET OutputRoot.MQMD.CodedCharSetId = 1208;
        SET OutputRoot.XMLNSC = NULL;
        SET OutputRoot.XMLNSC.REQUEST.CONTROL.ORIGINATOR = 'EAI_PRISM_MDR_TRIGGER';
        SET OutputRoot.XMLNSC.REQUEST.CONTROL.VERSION = '2.0';
        SET OutputRoot.XMLNSC.REQUEST.CONTROL.SERVICE = 'PRISM_MDR_DATA';
        SET OutputRoot.XMLNSC.REQUEST.CONTROL.DESTINATION.INSTANCE = 'DEF';
        SET OutputRoot.XMLNSC.REQUEST.CONTROL.DESTINATION.CLIENT = '999';
        SET OutputRoot.XMLNSC.REQUEST.MDR.DATA = Environment.DATA.ROW[i];
        SET i = i + 1;
        PROPAGATE;
    END WHILE;
    RETURN FALSE;
END; |
|
lancelotlinc |
Posted: Tue Nov 22, 2011 12:50 pm Post subject: |
|
|
Jedi Knight
Joined: 22 Mar 2010 Posts: 4941 Location: Bloomington, IL USA
|
Create an intermediate table. Populate the result set to the intermediate table. Use two dozen message flows to read the intermediate table and publish messages. As each record gets read, mark a time stamp or status code so it is not duplicated. Spread the processing out among the message flows by each flow taking a group of records. Make sure each message flow locks a different part of the table when reading/updating so that you don't have the flows waiting for DB locks to clear. _________________ http://leanpub.com/IIB_Tips_and_Tricks
Save $20: Coupon Code: MQSERIES_READER |
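In ESQL the claim step might be sketched like this (the staging table MDR_STAGE, its columns, the batch size, and the per-flow label are all hypothetical, and the TOP syntax assumes SQL Server):
Code: |
-- Each flow instance claims a distinct batch of rows inside one unit of
-- work, then reads back only the rows it tagged, so instances never collide.
DECLARE flowId CHARACTER 'FLOW_01'; -- a different label per deployed flow copy
PASSTHRU('UPDATE prismuser.MDR_STAGE SET STATUS = ''CLAIMED'', CLAIMED_BY = ? WHERE ROW_ID IN (SELECT TOP 1000 ROW_ID FROM prismuser.MDR_STAGE WHERE STATUS = ''NEW'')', flowId);
SET Environment.DATA.ROW[] = PASSTHRU('SELECT ROW_ID, PAYLOAD FROM prismuser.MDR_STAGE WHERE CLAIMED_BY = ? AND STATUS = ''CLAIMED''', flowId);
-- after each row is propagated, mark its STATUS = 'DONE' so it is never re-read |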
|
smdavies99 |
Posted: Tue Nov 22, 2011 1:09 pm Post subject: |
|
|
Jedi Council
Joined: 10 Feb 2003 Posts: 6076 Location: Somewhere over the Rainbow this side of Never-never land.
|
lancelotlinc wrote: |
Create an intermediate table. Populate the result set to the intermediate table. Use two dozen message flows to read the intermediate table and publish messages. As each record gets read, mark a time stamp or status code so it is not duplicated. Spread the processing out among the message flows by each flow taking a group of records. Make sure each message flow locks a different part of the table when reading/updating so that you don't have the flows waiting for DB locks to clear. |
fine in principle, BUT what if there is some affinity between the records, so that they need to be presented at the destination in a certain order?
Also, running one thread for 4 hours is IMHO asking for trouble.
I'd probably do something like you suggested, BUT read a defined number of records, propagate them, update the db, and then re-cycle the flow until there is no more data to read.
By recycle, I mean that when the whole current block is done, I'd drop a message onto the input queue for the flow with an MQ output node. Then I'd let that thread end.
The next message on the queue starts it all off again.
However, I can't help thinking that there must be a better way to do this than using broker. There does seem to be a trend recently for large datasets, whether from a file or a DB, being put through broker. _________________ WMQ User since 1999
MQSI/WBI/WMB/'Thingy' User since 2002
Linux user since 1995
Every time you reinvent the wheel the more square it gets (anon). If in doubt think and investigate before you ask silly questions. |
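The recycle step at the end of Main() might look something like this (the terminal name, trigger format, and flag are assumptions; 'out1' would be wired to an MQOutput node pointing at the flow's own input queue):
Code: |
IF moreRowsRemain THEN
    -- moreRowsRemain: hypothetical flag set while reading the current block
    SET OutputRoot = NULL;
    SET OutputRoot.MQMD.Format = 'MQSTR';
    SET OutputRoot.XMLNSC.TRIGGER.NEXTBATCH = 'Y';
    PROPAGATE TO TERMINAL 'out1'; -- wired back to the flow's input queue
END IF;
RETURN FALSE; |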
|
mqjeff |
Posted: Tue Nov 22, 2011 1:49 pm Post subject: |
|
|
Grand Master
Joined: 25 Jun 2008 Posts: 17447
|
4 hours?
Change
Code: |
Set currentSet = CARDINALITY(Environment.DATA.ROW[]); |
To use a reference variable and loop with MOVE ... NEXTSIBLING and WHILE LASTMOVE instead of indexing Environment.DATA.ROW[i], and you'll likely drop to 4 minutes. |
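In outline, the reference-variable version of the loop would be (a sketch against the OP's tree; the per-message MQMD and CONTROL fields still get rebuilt each iteration, exactly as in the original):
Code: |
DECLARE rowRef REFERENCE TO Environment.DATA.ROW[1];
WHILE LASTMOVE(rowRef) DO
    -- rebuild OutputRoot.MQMD and OutputRoot.XMLNSC.REQUEST.CONTROL here,
    -- as in the original loop, then attach the current row:
    SET OutputRoot.XMLNSC.REQUEST.MDR.DATA = rowRef;
    PROPAGATE;
    MOVE rowRef NEXTSIBLING REPEAT TYPE NAME; -- O(1) step, no re-scan from ROW[1]
END WHILE; |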
|
schroederms |
Posted: Tue Nov 22, 2011 2:28 pm Post subject: |
|
|
Disciple
Joined: 21 Jul 2003 Posts: 169 Location: IA
|
Just process the first occurrence, Environment.DATA.ROW[1].
Then null it out:
Set Environment.DATA.ROW[1] = NULL;
Continue through the loop.
That way you will not be parsing through the Environment tree over and over: indexing ROW[i] walks the siblings from the start on every iteration, so the original loop is effectively O(n²).
Guaranteed positive results!! |
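Spelled out, the changed loop is just this (a sketch; the MQMD and CONTROL fields from the original loop still need to be rebuilt before each PROPAGATE):
Code: |
WHILE EXISTS(Environment.DATA.ROW[]) DO
    -- rebuild OutputRoot headers as in the original loop, then:
    SET OutputRoot.XMLNSC.REQUEST.MDR.DATA = Environment.DATA.ROW[1];
    PROPAGATE;
    SET Environment.DATA.ROW[1] = NULL; -- detach the row just processed
END WHILE; |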
|
2swild |
Posted: Tue Nov 22, 2011 2:31 pm Post subject: |
|
|
Newbie
Joined: 30 Sep 2005 Posts: 7
|
schroederms wrote: |
Just process the first occurrence, Environment.DATA.ROW[1].
Then null it out:
Set Environment.DATA.ROW[1] = NULL;
Continue through the loop.
That way you will not be parsing through the Environment tree over and over: indexing ROW[i] walks the siblings from the start on every iteration, so the original loop is effectively O(n²).
Guaranteed positive results!! |
Nulling out the Environment variable worked great! I processed 386,600+ records in about 2 minutes! |
|
mqjeff |
Posted: Tue Nov 22, 2011 2:33 pm Post subject: |
|
|
Grand Master
Joined: 25 Jun 2008 Posts: 17447
|
2swild wrote: |
Nulling out the Environment variable worked great! I processed 386,600+ records in about 2 minutes! |
Same thing, mostly, as using a reference variable.
This is destructive to your tree. A reference variable is not.
If you don't need the tree later, this is better. If you do need the tree later, it's not. |
|
|