sumitha.mp
Posted: Wed Jan 16, 2013 7:56 am Post subject: Read 1000 records from a csv file at a time
Newbie
Joined: 21 Aug 2012 Posts: 9
Hi,
Using the FileInput node, how do I read 1000 records from a CSV file at a time?
I do not want to process record by record, nor use the whole-file approach; instead, the flow should pick up 1000 records from the file at a time and process them.
Can you please suggest an approach for this?
Thanks
lancelotlinc
Posted: Wed Jan 16, 2013 8:00 am Post subject:
Jedi Knight
Joined: 22 Mar 2010 Posts: 4941 Location: Bloomington, IL USA
>> Can you please suggest an approach for this?
Read the InfoCentre. Take a training class. Ask a WMB developer.
_________________
http://leanpub.com/IIB_Tips_and_Tricks
Save $20: Coupon Code: MQSERIES_READER
Vitor
Posted: Wed Jan 16, 2013 8:01 am Post subject: Re: Read 1000 records from a csv file at a time
Grand High Poobah
Joined: 11 Nov 2005 Posts: 26093 Location: Texas, USA
sumitha.mp wrote:
    I do not want to process record by record, nor use the whole-file approach; instead, the flow should pick up 1000 records from the file at a time and process them.
Why?
_________________
Honesty is the best policy.
Insanity is the best defence.
mqjeff
Posted: Wed Jan 16, 2013 8:05 am Post subject: Re: Read 1000 records from a csv file at a time
Grand Master
Joined: 25 Jun 2008 Posts: 17447
sumitha.mp wrote:
    Hi,
    Using the FileInput node, how do I read 1000 records from a CSV file at a time?
Change the definition of a "record".
Vitor
Posted: Wed Jan 16, 2013 8:19 am Post subject: Re: Read 1000 records from a csv file at a time
Grand High Poobah
Joined: 11 Nov 2005 Posts: 26093 Location: Texas, USA
mqjeff wrote:
    sumitha.mp wrote:
        Hi,
        Using the FileInput node, how do I read 1000 records from a CSV file at a time?
    Change the definition of a "record".
If you absolutely must do this, that's the way. I still question why.
_________________
Honesty is the best policy.
Insanity is the best defence.
smdavies99
Posted: Wed Jan 16, 2013 10:48 am Post subject: Re: Read 1000 records from a csv file at a time
Jedi Council
Joined: 10 Feb 2003 Posts: 6076 Location: Somewhere over the Rainbow this side of Never-never land.
Vitor wrote:
    If you absolutely must do this, that's the way. I still question why.
Probably because he has been told to do it that way. Hence my thread titled "And the requirement is...".
_________________
WMQ User since 1999
MQSI/WBI/WMB/'Thingy' User since 2002
Linux user since 1995
Every time you reinvent the wheel the more square it gets (anon). If in doubt, think and investigate before you ask silly questions.
Vitor
Posted: Wed Jan 16, 2013 11:11 am Post subject: Re: Read 1000 records from a csv file at a time
Grand High Poobah
Joined: 11 Nov 2005 Posts: 26093 Location: Texas, USA
smdavies99 wrote:
    Vitor wrote:
        If you absolutely must do this, that's the way. I still question why.
    Probably because he has been told to do it that way. Hence my thread titled "And the requirement is...".
_________________
Honesty is the best policy.
Insanity is the best defence.
sumitha.mp
Posted: Thu Jan 17, 2013 9:37 am Post subject:
Newbie
Joined: 21 Aug 2012 Posts: 9
The requirement is to read 1000 records per file read and do some computation on those 1000 records.
Last edited by sumitha.mp on Thu Jan 17, 2013 9:41 am; edited 2 times in total
lancelotlinc
Posted: Thu Jan 17, 2013 9:39 am Post subject:
Jedi Knight
Joined: 22 Mar 2010 Posts: 4941 Location: Bloomington, IL USA
sumitha.mp wrote:
    The requirement is to read 1000 records per file read and do some computation on those 1000 records.
Very well. What's your plan to accomplish this?
_________________
http://leanpub.com/IIB_Tips_and_Tricks
Save $20: Coupon Code: MQSERIES_READER
sumitha.mp
Posted: Thu Jan 17, 2013 9:51 am Post subject:
Newbie
Joined: 21 Aug 2012 Posts: 9
Currently I could only find approaches to read record by record or the whole file at once.
lancelotlinc
Posted: Thu Jan 17, 2013 9:54 am Post subject:
Jedi Knight
Joined: 22 Mar 2010 Posts: 4941 Location: Bloomington, IL USA
sumitha.mp wrote:
    Currently I could only find approaches to read record by record or the whole file at once.
So, you will read record by record 1,000 times, then process the group of 1,000. Nothing hard about that...
_________________
http://leanpub.com/IIB_Tips_and_Tricks
Save $20: Coupon Code: MQSERIES_READER
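A minimal ESQL sketch of that accumulate-then-process idea, assuming the FileInput node propagates one parsed CSV record at a time in the DFDL domain, and assuming the Environment tree survives between the record propagations for one file (worth verifying on your broker version; a SHARED ROW variable is an alternative if it does not). The names Batch, BatchCount and the output structure are made up for illustration only:
Code:
CREATE COMPUTE MODULE CollectBatchOf1000
    CREATE FUNCTION Main() RETURNS BOOLEAN
    BEGIN
        -- Count this record; the running count is kept in the Environment tree
        DECLARE n INTEGER COALESCE(Environment.Variables.BatchCount, 0) + 1;
        SET Environment.Variables.BatchCount = n;

        -- Append a copy of the parsed CSV record to the batch being built up
        SET Environment.Variables.Batch.Record[n] = InputRoot.DFDL;

        IF n = 1000 THEN
            -- 1000 records collected: send them on as a single message
            SET OutputRoot.Properties = InputRoot.Properties;
            SET OutputRoot.DFDL.Batch = Environment.Variables.Batch;
            PROPAGATE TO TERMINAL 'out' DELETE NONE;

            -- Reset for the next group of 1000
            SET Environment.Variables.Batch = NULL;
            SET Environment.Variables.BatchCount = 0;
        END IF;

        -- Nothing is propagated until a full batch is ready; any remainder
        -- smaller than 1000 still has to be flushed, for example from logic
        -- wired to the FileInput node's End of Data terminal
        RETURN FALSE;
    END;
END MODULE;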
sumitha.mp
Posted: Thu Jan 17, 2013 10:01 am Post subject:
Newbie
Joined: 21 Aug 2012 Posts: 9
The reason for reading 1000 records is that the computation needs to be done per 1000 records. If I go record by record, I will have to read record by record until 1000 records are read and store each of them in memory. Is there a better way to do this?
lancelotlinc
Posted: Thu Jan 17, 2013 10:03 am Post subject:
Jedi Knight
Joined: 22 Mar 2010 Posts: 4941 Location: Bloomington, IL USA
sumitha.mp wrote:
    The reason for reading 1000 records is that the computation needs to be done per 1000 records. If I go record by record, I will have to read record by record until 1000 records are read and store each of them in memory. Is there a better way to do this?
No.
_________________
http://leanpub.com/IIB_Tips_and_Tricks
Save $20: Coupon Code: MQSERIES_READER
nathanw
Posted: Thu Jan 17, 2013 10:05 am Post subject:
Knight
Joined: 14 Jul 2004 Posts: 550
Out of curiosity, what happens if there are not 1000 records in a file, or not a multiple of 1000?
_________________
Who is General Failure and why is he reading my hard drive?
Artificial Intelligence stands no chance against Natural Stupidity.
Only the User Trace Speaks The Truth
mqjeff
Posted: Thu Jan 17, 2013 10:10 am Post subject:
Grand Master
Joined: 25 Jun 2008 Posts: 17447
lancelotlinc wrote:
    sumitha.mp wrote:
        The reason for reading 1000 records is that the computation needs to be done per 1000 records. If I go record by record, I will have to read record by record until 1000 records are read and store each of them in memory. Is there a better way to do this?
    No.
YES.
Alter the message model to include a record structure that contains up to 1000 records.
Tell the FileInput node that *THAT* is a "record", rather than the structure that holds one record.
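A minimal sketch of what the downstream Compute node might look like once the model treats a block of up to 1000 CSV lines as one "record" (for example, a repeating element with maxOccurs of 1000 and the FileInput node's record detection set to Parsed Record Sequence). The DFDL domain and the names Batch and Line are illustrative assumptions, not values from this thread:
Code:
CREATE COMPUTE MODULE ProcessBatchOf1000
    CREATE FUNCTION Main() RETURNS BOOLEAN
    BEGIN
        -- One input message now carries up to 1000 CSV lines
        SET OutputRoot.Properties = InputRoot.Properties;

        DECLARE total INTEGER CARDINALITY(InputRoot.DFDL.Batch.Line[]);
        DECLARE i INTEGER 1;
        WHILE i <= total DO
            -- Per-line computation goes here; this sketch just copies each line across
            SET OutputRoot.DFDL.Batch.Line[i] = InputRoot.DFDL.Batch.Line[i];
            SET i = i + 1;
        END WHILE;

        RETURN TRUE;
    END;
END MODULE;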