wbi_telecom (Disciple, Joined: 15 Feb 2006, Posts: 188, Location: Harrisburg, PA)
Posted: Fri Mar 06, 2009 8:46 am    Post subject: File processing nodes question
We have a requirement for which we are considering the file processing nodes in WBI 6.1. I have never used them before, hence this question. The requirement goes like this:
I will get a flat file containing fixed-length records that belong to 3 different copybooks. I am supposed to construct one big XML message from these records. The records in the flat file will be in sequential order, so I can build the XML as I read them. The problem is: once I get the first record from the file, do the MRM-to-XML transformation, and build some part of OutputRoot.XML, what will trigger the second record in the file to be passed to me without propagating the OutputRoot?
So far I have been dealing with messages on queues. The flow gets the next message from the queue only after the first message is processed, the unit of work ends, and OutputRoot is gone. In this case I will have to preserve the OutputRoot until I am done with all the records in the file.
Is there a setting on the file processing nodes that will help me?
Cheers,
elvis_gn (Padawan, Joined: 08 Oct 2004, Posts: 1905, Location: Dubai)
Posted: Fri Mar 06, 2009 8:52 am
Hi wbi_telecom,
You can use the file node to pick up the file record by record, or to pick up the entire file in one go. I think you need the second option, unless your file is very large, in which case you would need a database to store the records.
Regards.
wbi_telecom (Disciple, Joined: 15 Feb 2006, Posts: 188, Location: Harrisburg, PA)
Posted: Fri Mar 06, 2009 9:05 am
The file contains records belonging to 3 different copybooks, so even if I pick it up as one single message I will need to substring the records and then parse each one based on the copybook it belongs to. Again, the question is what will prompt the propagation of the input records, because I will not be propagating the OutputRoot until the entire file is processed.
I read about a Collector node in this version of the broker. Will that help with this requirement?
Cheers,
cnurao_008 (Apprentice, Joined: 25 Nov 2008, Posts: 30)
Posted: Fri Mar 06, 2009 10:25 am
Hello. There should be a delimiter to distinguish the 3 different record types, so you can read the records in order by using the delimiter or the length of each record, and then perform your operations until the records run out.
wbi_telecom (Disciple, Joined: 15 Feb 2006, Posts: 188, Location: Harrisburg, PA)
Posted: Fri Mar 06, 2009 10:49 am
That's not an issue; the records are fixed length. The problem is how I would keep collecting records without sending the OutputRoot. In the message flows that I have worked with so far, I get a message, transform it, and send out one or more messages. In this case I am supposed to drain all the records from a file and send out a single message.
Cheers,
elvis_gn (Padawan, Joined: 08 Oct 2004, Posts: 1905, Location: Dubai)
Posted: Fri Mar 06, 2009 11:39 am
Hi wbi_telecom,
Where does the need for collecting arise if you have all three copybook formats parsed and fetched as InputRoot? You would only need to create a message set that handles the three structures in one definition.
Regards.
napier (Apprentice, Joined: 09 Oct 2007, Posts: 48, Location: USA)
Posted: Fri Mar 06, 2009 11:59 am
If the records are fixed length and arrive in a single file, then what is the problem with handling them in a single message set definition?
wbi_telecom (Disciple, Joined: 15 Feb 2006, Posts: 188, Location: Harrisburg, PA)
Posted: Fri Mar 06, 2009 12:39 pm
I guess I was not clear in stating the problem. I have a file containing 15000 records: 5000 belong to format A, 5000 belong to format B, and 5000 belong to format C. The sequence of the records can be random, for example:
A
B
B
B
C
C
C
C
As output I need to come up with a single XML message in which each record is a complex type under a root element.
Cheers,
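For what it's worth, the splitting itself is mechanical once the record length is fixed. A minimal Python sketch, outside the broker, of cutting such a flat file into records and tallying them by the format character in column 1. The 10-byte record length and the sample data are invented for illustration; the real file would use the copybook record lengths.

```python
# Illustrative only: split a flat file of fixed-length records and count
# them by the format character in column 1. RECORD_LEN is an assumption.
RECORD_LEN = 10

def split_records(data: bytes, record_len: int = RECORD_LEN):
    """Yield (format_char, record) pairs for each fixed-length record."""
    for offset in range(0, len(data), record_len):
        record = data[offset:offset + record_len]
        yield chr(record[0]), record

# A toy file: one A record, two B records, one C record, 10 bytes each.
flat_file = b"A" + b"1" * 9 + b"B" + b"2" * 9 + b"B" + b"3" * 9 + b"C" + b"4" * 9

counts = {}
for fmt, _ in split_records(flat_file):
    counts[fmt] = counts.get(fmt, 0) + 1
print(counts)  # {'A': 1, 'B': 2, 'C': 1}
```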
mqjeff (Grand Master, Joined: 25 Jun 2008, Posts: 17447)
Posted: Fri Mar 06, 2009 2:30 pm
Okay, so just because the records are *described* by three separate copybooks, it doesn't mean you can't create a single message definition that describes all three record types.
The records just have to be distinguishable from each other in some useful way.
The other question you are asking, which nobody else seems to be answering, is "how do I collect multiple inputs into a single output?"
In your case, you have two choices. One (mqpaul's least favorite): read the whole file at once; then you only have one input, not multiple inputs. Two, the better choice: read the file record by record, parse each one into a partial XML document, and pass it to the (you may have guessed already) Collector node. You can then use various triggers to signal that the collection is complete, and you will get all of the partial XML records in one Collection tree, which you can assemble into a single XML document and send out.
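The assemble-at-the-end step described above can be mimicked outside the broker. A minimal Python sketch of appending collected record fragments under a single root element; the `Batch` and `Record` element names are invented stand-ins, not the poster's actual message set.

```python
import xml.etree.ElementTree as ET

# Illustrative only: each record arrives as a small XML fragment; once the
# collection is complete, the fragments are appended under one root element.
def record_to_fragment(fmt: str, payload: str) -> ET.Element:
    frag = ET.Element("Record", attrib={"format": fmt})
    frag.text = payload
    return frag

# Pretend these are the fragments gathered in the Collection tree.
collected = [record_to_fragment(f, p)
             for f, p in [("A", "a1"), ("B", "b1"), ("B", "b2")]]

root = ET.Element("Batch")
for frag in collected:
    root.append(frag)

print(ET.tostring(root, encoding="unicode"))
```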
wbi_telecom (Disciple, Joined: 15 Feb 2006, Posts: 188, Location: Harrisburg, PA)
Posted: Sat Mar 07, 2009 3:34 am
Jeff,
Thank you so much for your reply. I have already started a proof of concept that involves the FileInput node and the Collector node. I have configured the FileInput node to send me fixed-length chunks of data from the file, which I take in as a BLOB and, based on the first character, parse with the appropriate copybook. I then send the messages to the Collector node.
I have currently connected the End of Data terminal of the FileInput node to the Control terminal of the Collector node, hoping that is how I will stop collecting at the end of the file. Then I have a Compute node to manipulate the XML the way I want it and send it out as a single message.
Keeping my fingers crossed.
Cheers,
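The "parse the BLOB with the copybook chosen by the first character" step can be sketched as a dispatch table. This is a Python illustration only; the field layouts below are invented stand-ins for the real copybooks.

```python
import struct

# Illustrative only: choose a fixed-length field layout based on the first
# character of the record. Layouts and field names are invented.
LAYOUTS = {
    "A": ("5s4s", ("id", "amount")),
    "B": ("3s6s", ("code", "name")),
    "C": ("9s",   ("note",)),
}

def parse_record(record: bytes) -> dict:
    """Unpack one 10-byte record using the layout selected by its tag."""
    fmt_char = chr(record[0])
    fmt, names = LAYOUTS[fmt_char]
    values = struct.unpack(fmt, record[1:])
    return {"type": fmt_char, **{n: v.decode() for n, v in zip(names, values)}}

print(parse_record(b"A12345WXYZ"))  # {'type': 'A', 'id': '12345', 'amount': 'WXYZ'}
```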
chids (Novice, Joined: 09 Oct 2006, Posts: 22, Location: Stockholm, Sweden)
Posted: Mon Mar 09, 2009 5:25 am
wbi_telecom wrote:
I already started a proof of concept that involves File Input node and collector node. I have configured the fileinput node to send me fixed length of data from the file which I take in as a BLOB, and based on the first character parse it with the appropriate copybook. I then send the messages to the collector node.
This is what I'd do as well.
Since you're using both FileInput for local files and the Collector node, I assume you're in a single-broker environment. Even though you might be fine with that, keep in mind that if you were to horizontally scale your environment to an active/active setup, this solution would most likely be affected. In that event you'd probably need to figure out a way to make sure only one broker reads the file. Also remember, in the case of an active/active environment, that the Collector node keeps its state on the local queue manager.
_________________ /mårten.
-- http://marten.gustafson.pp.se/
-- marten.gustafson@gmail.com
wbi_telecom (Disciple, Joined: 15 Feb 2006, Posts: 188, Location: Harrisburg, PA)
Posted: Mon Mar 09, 2009 7:27 am
Thanks for your input. We do not have hardware clustering for HA, but we do have multiple brokers. The way I plan to design it: the files will be FTPed to one of the production brokers, and all of them will be picked up by the file processing node on that broker and processed. If the node on which that broker runs is down, the FTP will fail and the files will be sent to the other broker node, where all the files will be processed.
This is the first (and only) flow that is going to use the file processing nodes. All other flows use HTTP or MQ nodes; they are load balanced using an edge server for HTTP and MQ clusters for MQ messages.
Cheers,
mqjeff (Grand Master, Joined: 25 Jun 2008, Posts: 17447)
Posted: Mon Mar 09, 2009 8:07 am
Again, if your record types are identifiable by the first character (and you said they are), you should be able to create a single message in the TDS domain that includes a choice of the three record types, with a tagged/delimited or tagged/fixed-length composition. TDS will then figure out what type of record you have and parse it automatically, and you won't have to deal with the BLOB yourself.
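The TDS idea above, a choice resolved by a one-character tag where each branch has its own fixed length, can be sketched in Python. The per-type record lengths are invented for illustration; TDS does this resolution for you from the message set definition.

```python
# Illustrative only: a choice resolved by a one-character tag, where each
# record type has its own fixed length. Lengths per type are invented.
TYPE_LENGTHS = {"A": 6, "B": 4, "C": 8}

def parse_stream(data: bytes):
    """Walk the byte stream, letting the tag select the next record's length."""
    out, pos = [], 0
    while pos < len(data):
        tag = chr(data[pos])
        length = TYPE_LENGTHS[tag]  # the choice is resolved by the tag
        out.append((tag, data[pos:pos + length]))
        pos += length
    return out

print(parse_stream(b"A12345B123C1234567"))
# [('A', b'A12345'), ('B', b'B123'), ('C', b'C1234567')]
```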
wbi_telecom (Disciple, Joined: 15 Feb 2006, Posts: 188, Location: Harrisburg, PA)
Posted: Mon Mar 09, 2009 11:56 am
Hi Jeff,
I will certainly consider using it. One of the developers here already has the message sets created based on the copybooks, so I decided to use them for the POC. This POC is more about the file processing nodes and the Collector node. Once we get that part working as expected, we will change the way we convert the flat files to XML.
Cheers,