MQSeries.net Forum Index » WebSphere Message Broker (ACE) Support » Facing issue when processing huge set of records

new2MB
PostPosted: Mon Jun 09, 2014 10:10 am    Post subject: Facing issue when processing huge set of records

Novice

Joined: 04 Jun 2014
Posts: 11

We have a requirement where we need to process lakhs (hundreds of thousands) of records.
We have implemented the logic as follows:
1st flow: pick the records in bundles of 500, do some internal processing to map the records as per the requirement, and put them on the queue
2nd flow: read each message from the queue and write it to the file in append mode
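As a point of reference, flow 1's bundling step might look something like this minimal ESQL sketch. Table, column, queue, and terminal names here are invented for illustration, not taken from the actual flow:

```esql
-- Hypothetical sketch of flow 1's Compute node; the bundle size of 500
-- comes from the thread, everything else (names, SQL) is invented.
CREATE COMPUTE MODULE PrepareBundle_Compute
	CREATE FUNCTION Main() RETURNS BOOLEAN
	BEGIN
		DECLARE i INTEGER 1;
		-- Fetch one bundle of 500; FETCH FIRST syntax is DB-dependent.
		SET Environment.Variables.batch[] =
			PASSTHRU('SELECT * FROM B_RECORD WHERE STATUS = ''start'' FETCH FIRST 500 ROWS ONLY');
		FOR rec AS Environment.Variables.batch[] DO
			-- Map each row onto the output structure (SET copies the subtree).
			SET OutputRoot.XMLNSC.Records.Record[i] = rec;
			SET i = i + 1;
		END FOR;
		-- Send the bundle as one MQ message via the attached MQOutput node.
		PROPAGATE TO TERMINAL 'out' DELETE NONE;
		RETURN FALSE;
	END;
END MODULE;
```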

Now we are not even able to process a few thousand records.
The flow abruptly terminates after processing 27K records out of 40K.
I am not able to identify the root cause.
Is there some timeout occurring, or some memory issue?
Please suggest some way to identify the issue and how to resolve it.

Suggestions are much appreciated
Vitor
PostPosted: Mon Jun 09, 2014 10:16 am    Post subject: Re: Facing issue when processing huge set of records

Grand High Poobah

Joined: 11 Nov 2005
Posts: 26093
Location: Texas, USA

new2MB wrote:
We have a requirement where we need to process lakhs (hundreds of thousands) of records.
We have implemented the logic as follows:


Why?

new2MB wrote:
1st flow: pick the records in bundles of 500, do some internal processing to map the records as per the requirement, and put them on the queue
2nd flow: read each message from the queue and write it to the file in append mode


How did you come up with 500? What actually groups these records? What do they have in common? How do you pick them off?

new2MB wrote:
I am not able to identify the root cause.
Is there some timeout occurring, or some memory issue?
Please suggest some way to identify the issue and how to resolve it.


What have you tried? What does your flow actually do? You've provided almost no details on either your design or your code.

new2MB wrote:
Suggestions are much appreciated


Investigate. Diagnose. Trace.

Review your design. If you're reading the file in chunks of 500*<record length> bytes I've found your problem.
_________________
Honesty is the best policy.
Insanity is the best defence.
Vitor
PostPosted: Mon Jun 09, 2014 10:22 am    Post subject:

Grand High Poobah

Joined: 11 Nov 2005
Posts: 26093
Location: Texas, USA

And for the record, 40K is not a "huge set of records" for WMB.
new2MB
PostPosted: Mon Jun 09, 2014 10:25 pm    Post subject:

Novice

Joined: 04 Jun 2014
Posts: 11

Sorry for not giving the details.
Explanation in detail:

We need to process a batch which consists of, say, 2 lakh (200,000) records.
These records are spread over 3 tables.

We need to pick 1 batch from table A, find the matching 2 lakh records in table B, and then for these B_records find the matching C_records, say 2.5 lakh (250,000) records.
So we have to loop 2 times.

Just to ease the flow, we pick 500 B_records based on some status, say 'start' (the status is on the C_record).
For each of these B_records we pick the C_records and update the picked records with status 'picked'.

Now for these C_records we prepare a structure based on an XSD and then write it to the file.
Let me know if I should give more details.
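For what it's worth, the pick-and-mark step described above could be sketched in ESQL roughly as follows. Only C_record and the 'start'/'picked' statuses come from the post; the column names and exact SQL are invented:

```esql
-- Hypothetical pick-and-mark sketch. Both statements run inside the
-- flow's unit of work, so a failure rolls the status updates back
-- together with the message.
-- Select one bundle of C_records still at 'start' (join to the
-- current B_records omitted for brevity).
SET Environment.Variables.cRecords[] =
	PASSTHRU('SELECT * FROM C_RECORD WHERE STATUS = ''start'' FETCH FIRST 500 ROWS ONLY');
-- Mark each selected record as 'picked' so the next bundle skips it.
DECLARE c REFERENCE TO Environment.Variables.cRecords[1];
WHILE LASTMOVE(c) DO
	PASSTHRU('UPDATE C_RECORD SET STATUS = ''picked'' WHERE ID = ?' VALUES(c.ID));
	MOVE c NEXTSIBLING REPEAT TYPE NAME;
END WHILE;
```

Updating row by row like this is simple but slow; a single set-based UPDATE over the selected keys would do less round-tripping.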

Quote:

Review your design. If you're reading the file in chunks of 500*<record length> bytes I've found your problem.


Can you please help me understand what issue you are suspecting?
Esa
PostPosted: Mon Jun 09, 2014 10:44 pm    Post subject: Re: Facing issue when processing huge set of records

Grand Master

Joined: 22 May 2008
Posts: 1387
Location: Finland

new2MB wrote:

do some internal processing to map the records as per the requirement, and put them on the queue


Please elaborate. Java?

Have you run a test with flow 2 stopped to determine if the problem is caused by flow 1 or 2?
new2MB
PostPosted: Mon Jun 09, 2014 11:50 pm    Post subject:

Novice

Joined: 04 Jun 2014
Posts: 11

It's a simple mapping in ESQL from the result set retrieved from the database, setting the values on OutputRoot to prepare the structure required to write into the file.

When I added some logging into the flow I could identify that it is the 1st flow which terminates; there seems to be no issue with the 2nd flow.

As I mentioned above, this mapping is done for 500 B_records in 1 loop, which requires retrieving the matching C_records and also updating the status of the C_records.
The procedure which implements this retrieval, status update and mapping takes around 13 seconds for each 500 records.
McueMart
PostPosted: Tue Jun 10, 2014 12:36 am    Post subject:

Chevalier

Joined: 29 Nov 2011
Posts: 490
Location: UK...somewhere

What version of WMB (full version including fixpack)? Are you using SHARED variables to maintain any state?

Have you taken a user trace? Do you have anything in the broker event log (Windows event viewer, or syslog on Linux)?
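For reference, a user trace on WMB 8 can be taken along these lines; the broker, execution group, and flow names below are placeholders:

```shell
# Placeholders: MYBROKER, MYEG, MYFLOW. Start a debug-level user trace,
# resetting the trace log first (-r):
mqsichangetrace MYBROKER -u -e MYEG -f MYFLOW -l debug -r
# ...re-run the failing batch, then retrieve and format the log:
mqsireadlog MYBROKER -u -e MYEG -f -o usertrace.xml
mqsiformatlog -i usertrace.xml -o usertrace.txt
# Turn tracing off again when done:
mqsichangetrace MYBROKER -u -e MYEG -f MYFLOW -l none
```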
Esa
PostPosted: Tue Jun 10, 2014 12:51 am    Post subject:

Grand Master

Joined: 22 May 2008
Posts: 1387
Location: Finland

new2MB wrote:
It's a simple mapping in ESQL from the result set retrieved from the database, setting the values on OutputRoot to prepare the structure required to write into the file.


You do all this in one single Main() function? Or do you call a procedure for each record for doing the db retrieval and mapping?

Why do you construct a 500 record MQ message and use another flow for appending it to a file?

A better performing approach would be to append each record directly to the file -- from flow 1.
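Esa's suggestion could look roughly like this sketch. The directory, file name, and terminal wiring are invented, and it assumes a FileOutput node wired to 'out' with a record-based (append) record definition:

```esql
-- Hypothetical sketch: append each record straight to the file from
-- flow 1. The Compute node's "Compute mode" must include
-- LocalEnvironment for the destination override to be propagated.
CREATE COMPUTE MODULE AppendRecords_Compute
	CREATE FUNCTION Main() RETURNS BOOLEAN
	BEGIN
		FOR rec AS InputRoot.XMLNSC.Records.Record[] DO
			SET OutputLocalEnvironment.Destination.File.Directory = '/data/out';
			SET OutputLocalEnvironment.Destination.File.Name = 'batch_output.xml';
			SET OutputRoot.XMLNSC.Record = rec;
			-- One propagate per record; the FileOutput node appends it.
			PROPAGATE TO TERMINAL 'out' DELETE NONE;
		END FOR;
		RETURN FALSE;
	END;
END MODULE;
```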
new2MB
PostPosted: Tue Jun 10, 2014 1:25 am    Post subject:

Novice

Joined: 04 Jun 2014
Posts: 11

McueMart wrote:
What version of WMB (full version including fixpack)? Are you using SHARED variables to maintain any state?


Version 8.0.0.2.
new2MB
PostPosted: Tue Jun 10, 2014 1:28 am    Post subject:

Novice

Joined: 04 Jun 2014
Posts: 11

Quote:
You do all this in one single Main() function? Or do you call a procedure for each record for doing the db retrieval and mapping?


Not all in Main; we call a procedure for the DB retrieval and mapping.
Quote:

Why do you construct a 500 record MQ message and use another flow for appending it to a file?

A better performing approach would be to append each record directly to the file -- from flow 1.


Using 2 flows helps with parallel processing: flow 1 prepares the structure and flow 2 writes to the file.