MQSeries.net Forum Index » WebSphere Message Broker (ACE) Support » Read 1000 records from a csv file at a time

sumitha.mp
PostPosted: Wed Jan 16, 2013 7:56 am    Post subject: Read 1000 records from a csv file at a time

Newbie

Joined: 21 Aug 2012
Posts: 9

Hi,

Using the file input node, how do I read 1000 records from a CSV at a time?
I do not want to process record by record, nor take the whole-file approach; instead, it should pick up 1000 records from the file at a time and process each batch.

Can you please suggest an approach for this?

Thanks
lancelotlinc
PostPosted: Wed Jan 16, 2013 8:00 am

Jedi Knight

Joined: 22 Mar 2010
Posts: 4941
Location: Bloomington, IL USA

>> Can you please suggest an approach for this?

Read the InfoCentre. Take a training class. Ask a WMB developer.
_________________
http://leanpub.com/IIB_Tips_and_Tricks
Save $20: Coupon Code: MQSERIES_READER
Vitor
PostPosted: Wed Jan 16, 2013 8:01 am    Post subject: Re: Read 1000 records from a csv file at a time

Grand High Poobah

Joined: 11 Nov 2005
Posts: 26093
Location: Texas, USA

sumitha.mp wrote:
I do not want to process record by record, nor take the whole-file approach; instead, it should pick up 1000 records from the file at a time and process each batch.


Why?
_________________
Honesty is the best policy.
Insanity is the best defence.
mqjeff
PostPosted: Wed Jan 16, 2013 8:05 am    Post subject: Re: Read 1000 records from a csv file at a time

Grand Master

Joined: 25 Jun 2008
Posts: 17447

sumitha.mp wrote:
Hi,

Using the file input node, how do I read 1000 records from a CSV at a time?

Change the definition of a "record".
Vitor
PostPosted: Wed Jan 16, 2013 8:19 am    Post subject: Re: Read 1000 records from a csv file at a time

Grand High Poobah

Joined: 11 Nov 2005
Posts: 26093
Location: Texas, USA

mqjeff wrote:
sumitha.mp wrote:
Hi,

Using the file input node, how do I read 1000 records from a CSV at a time?

Change the definition of a "record".

If you absolutely must do this. I still question why.
_________________
Honesty is the best policy.
Insanity is the best defence.
smdavies99
PostPosted: Wed Jan 16, 2013 10:48 am    Post subject: Re: Read 1000 records from a csv file at a time

Jedi Council

Joined: 10 Feb 2003
Posts: 6076
Location: Somewhere over the Rainbow this side of Never-never land.

Vitor wrote:
If you absolutely must do this. I still question why.


Probably because he has been told to do it that way. Hence my thread titled

"And the requirement is..."
_________________
WMQ User since 1999
MQSI/WBI/WMB/'Thingy' User since 2002
Linux user since 1995

Every time you reinvent the wheel the more square it gets (anon). If in doubt think and investigate before you ask silly questions.
Vitor
PostPosted: Wed Jan 16, 2013 11:11 am    Post subject: Re: Read 1000 records from a csv file at a time

Grand High Poobah

Joined: 11 Nov 2005
Posts: 26093
Location: Texas, USA

smdavies99 wrote:
Vitor wrote:
If you absolutely must do this. I still question why.

Probably because he has been told to do it that way. Hence my thread titled

"And the requirement is..."

_________________
Honesty is the best policy.
Insanity is the best defence.
sumitha.mp
PostPosted: Thu Jan 17, 2013 9:37 am

Newbie

Joined: 21 Aug 2012
Posts: 9

Requirement is to read 1000 records per file read and do some computation on the 1000 records.

Last edited by sumitha.mp on Thu Jan 17, 2013 9:41 am; edited 2 times in total
lancelotlinc
PostPosted: Thu Jan 17, 2013 9:39 am

Jedi Knight

Joined: 22 Mar 2010
Posts: 4941
Location: Bloomington, IL USA

sumitha.mp wrote:
Requirement is to read 1000 records per file read and do some computation on the 1000 records.


Very well. What's your plan to accomplish this?
_________________
http://leanpub.com/IIB_Tips_and_Tricks
Save $20: Coupon Code: MQSERIES_READER
sumitha.mp
PostPosted: Thu Jan 17, 2013 9:51 am

Newbie

Joined: 21 Aug 2012
Posts: 9

Currently I could find only approaches to read record by record or the whole file at once.
lancelotlinc
PostPosted: Thu Jan 17, 2013 9:54 am

Jedi Knight

Joined: 22 Mar 2010
Posts: 4941
Location: Bloomington, IL USA

sumitha.mp wrote:
Currently I could find only approaches to read record by record or the whole file at once.


So, you will read record-by-record * 1,000 then process the group of 1,000. Nothing hard about that...
_________________
http://leanpub.com/IIB_Tips_and_Tricks
Save $20: Coupon Code: MQSERIES_READER
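Outside Broker, the read-then-batch loop lancelotlinc describes can be sketched in plain Python. This is only an illustration of the pattern, not a Broker implementation; the file path, the `process` computation, and the batch size are hypothetical stand-ins.

```python
# Sketch of the accumulate-then-process approach: read one record
# at a time, buffer until 1000 are held, then run the computation
# on the batch. The trailing partial batch is processed as well.
import csv

BATCH_SIZE = 1000

def process(batch):
    # Placeholder for the per-1000-record computation.
    return len(batch)

def run(path):
    results = []
    batch = []
    with open(path, newline="") as f:
        for record in csv.reader(f):
            batch.append(record)
            if len(batch) == BATCH_SIZE:
                results.append(process(batch))
                batch = []
    if batch:  # fewer than 1000 records left at end of file
        results.append(process(batch))
    return results
```

The same buffering could be done in ESQL with a shared ROW variable, but the logic is identical: append, test the count, flush.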
sumitha.mp
PostPosted: Thu Jan 17, 2013 10:01 am

Newbie

Joined: 21 Aug 2012
Posts: 9

The reason for reading 1000 records is that the computation needs to be done per 1000 records. If I go record by record, I will have to read until 1000 records have been read and store each of them in memory. Is there a better way to do this?
lancelotlinc
PostPosted: Thu Jan 17, 2013 10:03 am

Jedi Knight

Joined: 22 Mar 2010
Posts: 4941
Location: Bloomington, IL USA

sumitha.mp wrote:
The reason for reading 1000 records is that the computation needs to be done per 1000 records. If I go record by record, I will have to read until 1000 records have been read and store each of them in memory. Is there a better way to do this?


No.
_________________
http://leanpub.com/IIB_Tips_and_Tricks
Save $20: Coupon Code: MQSERIES_READER
nathanw
PostPosted: Thu Jan 17, 2013 10:05 am

Knight

Joined: 14 Jul 2004
Posts: 550

Out of curiosity, what happens if there are not 1000 records in a file?

Or not a multiple of 1000?
_________________
Who is General Failure and why is he reading my hard drive?

Artificial Intelligence stands no chance against Natural Stupidity.

Only the User Trace Speaks The Truth
mqjeff
PostPosted: Thu Jan 17, 2013 10:10 am

Grand Master

Joined: 25 Jun 2008
Posts: 17447

lancelotlinc wrote:
sumitha.mp wrote:
The reason for reading 1000 records is that the computation needs to be done per 1000 records. If I go record by record, I will have to read until 1000 records have been read and store each of them in memory. Is there a better way to do this?


No.


YES.

Alter the message model to include a record structure that contains up to 1000 records.

Tell the FileInput node that *THAT* is a "record", rather than the structure that holds one record.
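For anyone following along outside Broker, the effect of this message-model change (the parser hands you a "record" that is really a block of up to 1000 CSV rows) can be mimicked in plain Python with a generator. Illustrative only; the names are made up, and in Broker itself this is a modeling change, not code.

```python
# Sketch of the redefined "record": each item the consumer receives
# is a chunk holding up to 1000 CSV rows. When the file is not a
# multiple of 1000, the final chunk is simply shorter, which also
# covers nathanw's question about short files.
import csv
from itertools import islice

def batched_records(path, size=1000):
    """Yield lists of up to `size` CSV rows; the last list may be shorter."""
    with open(path, newline="") as f:
        reader = csv.reader(f)
        while True:
            chunk = list(islice(reader, size))
            if not chunk:
                return
            yield chunk
```

The consumer then loops over whole batches rather than individual rows, which is exactly what the FileInput node does once its record definition is the 1000-row structure.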
