bhaski
Posted: Mon Mar 05, 2012 10:29 pm    Post subject: larger data into environment variables
Voyager
Joined: 13 Sep 2006    Posts: 78    Location: USA
SET Environment.XMLNS.OUTREC.TABLEOUT[] = PASSTHRU('{call dbo.GET_TBLE_RECORD(?,?,?)}', TRIM(PolicyNumber), TRIM(ModNumber), TRIM(PolicyPrefix));
The stored procedure returns more than 8000 records. When WMB calls it, the output is not brought into SET Environment.XMLNS.OUTREC.TABLEOUT[].
But I am able to get the output when there are fewer than 4000 records. What is the limit on how much data WMB can store via SET Environment.XMLNS.OUTREC.TABLEOUT[]?
If anyone can clear up this doubt it would be very helpful, or suggest any other idea for handling larger data. I need to process all the data, with different conditions, within 10 minutes.
Sad day... make it happy, friends.
Thanks in advance,
Bhaskar Raj.
_________________
Thanks and Regards
Bhaski
WebSphere MQ Admin Certified
WebSphere WMB Admin Certified
cociu_2012
Posted: Tue Mar 06, 2012 12:13 am    Post subject: Re: larger data into environment variables
Acolyte
Joined: 06 Jan 2012    Posts: 72
bhaski wrote:
If anyone can clear up this doubt it would be very helpful, or suggest any other idea for handling larger data. I need to process all the data, with different conditions, within 10 minutes.
For the Environment, I'll let the experts who know the answer respond. All I can say is that I have stored large XML data in the Environment without any problems.
Other solutions:
Use the JDBC adapters if you have the platform.
Split your DB result call into smaller chunks (take two or more steps).
mqsiuser
Posted: Tue Mar 06, 2012 12:51 am    Post subject: Re: larger data into environment variables
Yatiri
Joined: 15 Apr 2008    Posts: 637    Location: Germany
Try to put the data in a shared row variable,
or try the output root.
You might also have a problem with the database itself: issue the same call in a DB admin tool.
Rewrite your SQL (PASSTHRU) statement so that it is "string only": do not use any "?" parameter markers (see the sketch below).
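A minimal sketch of the "string only" approach, using the same variables as the original post. With the values concatenated in as literals, quoting/escaping (and SQL injection) become your own responsibility:
Code:
DECLARE sqlStmt CHARACTER;
-- build the CALL as one literal string so that PASSTHRU receives no '?' markers
-- NB: the values are pasted in as literals; beware embedded quotes in the data
SET sqlStmt = '{call dbo.GET_TBLE_RECORD(''' || TRIM(PolicyNumber) || ''',''' ||
              TRIM(ModNumber) || ''',''' || TRIM(PolicyPrefix) || ''')}';
SET Environment.XMLNS.OUTREC.TABLEOUT[] = PASSTHRU(sqlStmt);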
_________________
Just use REFERENCEs
kimbert
Posted: Tue Mar 06, 2012 1:33 am
Jedi Council
Joined: 29 Jul 2003    Posts: 5542    Location: Southampton
There is no documented limit on the size of the environment tree. All message trees are allocated from the heap, so if you create large enough trees you will eventually run out of heap.
Quote:
The stored procedure returns more than 8000 records. When WMB calls it...
Can you modify the stored procedure to return (say) a maximum of 1000 records at a time?
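For illustration, a hypothetical sketch of that approach. It assumes a paged variant of the procedure (dbo.GET_TBLE_RECORD_PAGED, an invented name, taking an extra offset and page size) which does not exist in the original:
Code:
DECLARE pageSize INTEGER 1000;
DECLARE pageOffset INTEGER 0;
DECLARE done BOOLEAN FALSE;
WHILE NOT done DO
  -- fetch one page of at most pageSize rows from the (hypothetical) paged proc
  SET Environment.XMLNS.OUTREC.TABLEOUT[] =
    PASSTHRU('{call dbo.GET_TBLE_RECORD_PAGED(?,?,?,?,?)}',
             TRIM(PolicyNumber), TRIM(ModNumber), TRIM(PolicyPrefix),
             pageOffset, pageSize);
  IF CARDINALITY(Environment.XMLNS.OUTREC.TABLEOUT[]) = 0 THEN
    SET done = TRUE;
  ELSE
    -- process this page here, then discard it before fetching the next one
    DELETE FIELD Environment.XMLNS.OUTREC;
    SET pageOffset = pageOffset + pageSize;
  END IF;
END WHILE;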
Quote:
the output is not brought into SET Environment.XMLNS.OUTREC.TABLEOUT[].
Please tell us what it *is* doing. Have you taken a user trace to find out what is going wrong?
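For reference, user trace is normally captured with the standard broker commands; a sketch with placeholder names (MYBROKER, MYEG and MYFLOW are not from this thread):
Code:
mqsichangetrace MYBROKER -u -e MYEG -f MYFLOW -l debug -r
... (send a message through the flow) ...
mqsireadlog MYBROKER -u -e MYEG -o usertrace.xml
mqsiformatlog -i usertrace.xml -o usertrace.txt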
Quote:
I need to process all the data, with different conditions...
Why did you tell us that? Were you expecting us to understand that sentence?
mqsiuser
Posted: Tue Mar 06, 2012 1:43 am
Yatiri
Joined: 15 Apr 2008    Posts: 637    Location: Germany
Thank you, kimbert. @OP: cross out my first two suggestions.
Quote:
Can you modify the stored procedure to return (say) a maximum of 1000 records at a time?
Can I say that it is a good idea to process row by row, and that this is ideal with Broker, MQ and transactions (where possible)? I'd always try to process chunks as small as possible (small messages). Failing that, I try to process small logical chunks (e.g. all the rows of one order; don't chop your message at random points). That said, there is also good support for large message processing in ESQL (deleting previous siblings as you go; see the sketch below).
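A minimal sketch of that delete-as-you-go pattern, assuming the rows have already been fetched into the Environment as in the original post:
Code:
DECLARE rec REFERENCE TO Environment.XMLNS.OUTREC.TABLEOUT[1];
WHILE LASTMOVE(rec) DO
  -- process the current row here ...
  MOVE rec NEXTSIBLING REPEAT TYPE NAME;
  IF LASTMOVE(rec) THEN
    -- discard the row just processed so the tree never grows while we walk it
    DELETE PREVIOUSSIBLING OF rec;
  END IF;
END WHILE;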
_________________
Just use REFERENCEs
kimbert
Posted: Tue Mar 06, 2012 1:58 am
Jedi Council
Joined: 29 Jul 2003    Posts: 5542    Location: Southampton
Quote:
Can I say that it is a good idea to process row by row... I'd always try to process chunks as small as possible.
I agree - but we don't know what the OP means by 'I need to process all the data, with different conditions'. It's just possible that he needs to process a set of rows all at the same time. Maybe the 8000 rows contain many smaller groups of related rows. Then again, maybe not. We just don't know yet.
Quote:
there is also good support for large message processing in ESQL (deleting previous siblings as you go).
Those techniques usually exploit on-demand parsing to avoid creating huge message trees. With DB access you need to control the size of the result set yourself (unless someone wants to correct me on that). Hence my comment about modifying the stored procedure.
marko.pitkanen
Posted: Tue Mar 06, 2012 2:43 am
Chevalier
Joined: 23 Jul 2008    Posts: 440    Location: Jamsa, Finland
Hi,
Have you checked that the transaction logs etc. are big enough to process all those 8000 records in one transaction?
--
Marko
mqsiuser
Posted: Tue Mar 06, 2012 3:41 am
Yatiri
Joined: 15 Apr 2008    Posts: 637    Location: Germany
marko.pitkanen wrote:
Have you checked that the transaction logs etc. are big enough to process all those 8000 records in one transaction?
That is actually a really good point: the transactional capabilities of your DB (and the size of the DB transaction logs).
@OP: We all agree that you should look at the performance of the stored procedure. Change it to return small chunks, or make sure (with a DB admin tool) that it performs well on 8000 records.
On troubleshooting memory (heap) issues in Broker: you can clearly see the memory usage of the execution group if your flow does nothing but fetch and parse the data (make sure to trigger a full parse when messages come in).
_________________
Just use REFERENCEs