very large messages
Gerard
Posted: Fri Dec 10, 2004 6:04 am Post subject: very large messages
Newbie
Joined: 10 Dec 2004 Posts: 2
Hi,
I have a requirement that the broker should be able to receive and process messages of a certain type that contain a repeating group. The maximum size expected is around 2 GB, and these messages will be received via FTP. Other messages can be very small (a few KB, if the repeating group contains just one item, for instance) and will be received via e-mail.
Can we use the broker to deal with both situations in the same way?
Is it possible to parse very large messages, and what's the size limit?
What is best practice in this case? Is the broker able to use segmented messages and parse the message as it becomes available?
Any ideas?
Thanks in advance
JT
Posted: Mon Dec 13, 2004 4:50 pm Post subject:
Padawan
Joined: 27 Mar 2003 Posts: 1564 Location: Hartford, CT.
From v5.0 FAQs:
Quote:
What size of WebSphere MQ message do the Brokers support?
On distributed platforms (e.g. NT and Unix platforms) WebSphere MQ supports messages with an individual size of 100 MB. These messages can also be chained together as segmented messages, so that any size of message can be passed as a logically single message. Also, if the product receives a segmented WebSphere MQ message, it can, if required, reassemble all message segments before processing the message. However, large messages can become extremely slow to parse. For example, a 100 MB XML message could take a long time to process; this is not a limitation of the broker, and the same would be true of any XML parser.
WebSphere MQ provides features such as segmented messages and groups of messages. Do the Brokers support messages using these features?
As described above, the Brokers have full support for processing segmented messages if selected by customizing the MQInput node in the relevant message flow. There is also a setting for supporting WebSphere MQ message groups, whereby processing of a message that is part of a group can be delayed until all messages in that group are available, and then processing of the messages can take place in sequence.
Quote:
Can we use the broker to deal with both situations in the same way?
Yes. A varying number of occurrences of a repeating group with the same message structure is not a problem.
Quote:
Is it possible to parse very large messages
Possible... yes.
Realistic... well, that depends on a number of factors, e.g. server configuration, expected response time (if any), and so on.
Quote:
what's the size limit?
I believe the maximum size limit is tied to the server configuration. According to the following thread, PGoodhart is adequately parsing 700 MB messages: http://www.mqseries.net/phpBB2/viewtopic.php?t=18629
Quote:
Is the broker able to use segmented messages and parse the message as it becomes available?
Yes. See the excerpt at the top of this post and review the documentation on configuring the node to handle message groups.
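By way of illustration (this is not from the FAQ), here is a minimal ESQL sketch of how the group information shows up inside a flow: each physical message in an MQ group carries its position in its MQMD, so a Compute node can record it for later nodes. The module and variable names are made up, and the Compute node's Compute mode has to include LocalEnvironment for the copy to be visible downstream.
Code:
CREATE COMPUTE MODULE RecordGroupInfo_Compute
  CREATE FUNCTION Main() RETURNS BOOLEAN
  BEGIN
    -- Pass the message through unchanged
    SET OutputRoot = InputRoot;

    -- Group/segment position travels in the MQMD of each physical message;
    -- stash it where a later node in the flow can see it
    SET OutputLocalEnvironment.Variables.GroupId      = InputRoot.MQMD.GroupId;
    SET OutputLocalEnvironment.Variables.MsgSeqNumber = InputRoot.MQMD.MsgSeqNumber;
    SET OutputLocalEnvironment.Variables.MsgFlags     = InputRoot.MQMD.MsgFlags;

    RETURN TRUE;
  END;
END MODULE;

MsgFlags is where the MQMF_* bits (for example MQMF_LAST_MSG_IN_GROUP) end up if you need to spot the final member yourself; with the MQInput node configured to wait for the whole group, the broker has already done that for you.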
gerasale
Posted: Tue Dec 14, 2004 6:33 am Post subject: large messages
Newbie
Joined: 14 Dec 2004 Posts: 3
Full parsing of messages around 100 MB may be so slow as to be impractical (several hours). You need to look at what actually needs to be parsed.
There is a way in ESQL to partially parse a message (that is, if you only need to address or change some part of it); this can considerably reduce processing time (see the sketch below). _________________ Sasha
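To make the partial-parsing idea concrete, here is a minimal sketch, assuming an XMLNS message with made-up Batch/Header/MessageType element names: a Filter node that routes on a single header field. With on-demand parsing, only the path down to that element has to be built, so the (possibly huge) repeating body is never walked.
Code:
-- Filter node ESQL: the routing decision touches one header field only,
-- so the parser does not have to build the tree for the large repeating section
CREATE FILTER MODULE RouteOnHeader_Filter
  CREATE FUNCTION Main() RETURNS BOOLEAN
  BEGIN
    -- Element names (Batch, Header, MessageType) are placeholders
    RETURN Root.XMLNS.Batch.Header.MessageType = 'BULK';
  END;
END MODULE;

The same principle applies in a Compute node: navigate with a REFERENCE to just the part you need to address or change, rather than walking or copying the whole tree.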
ChrisThomas
Posted: Tue Dec 14, 2004 1:54 pm Post subject: HTTP Input Nodes
Apprentice
Joined: 14 May 2003 Posts: 29 Location: Wisconsin
If you are going to try to receive the message via HTTP (using the HTTP Input node), the limit is 8 MB. We are currently running into this issue, where our messages were supposed to average 8 MB.
The problem lies with the biphttplistener, which imposes the 8 MB limit. My understanding is that the biphttplistener puts the HTTP message on an internal MQ queue, and the HTTP Input node then picks it up from there. I am in contact with IBM to understand why there is an 8 MB limit when MQ can handle much larger messages.
kirani
Posted: Tue Dec 14, 2004 11:38 pm Post subject:
Jedi Knight
Joined: 05 Sep 2001 Posts: 3779 Location: Torrance, CA, USA
It'd be better to consider your UOW scope when processing very large messages. You might not want to reject the complete message if one of the records in the file is bad; in that case you need to make sure your error handling is capable of dealing with individual bad records (see the sketch below). _________________ Kiran
IBM Cert. Solution Designer & System Administrator - WBIMB V5
IBM Cert. Solutions Expert - WMQI
IBM Cert. Specialist - WMQI, MQSeries
IBM Cert. Developer - MQSeries
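For what it's worth, here is a sketch of one way to do that in ESQL, with made-up XMLNS element names (Batch, Record): each record of the repeating group is propagated as its own output message, and a handler absorbs the failure of an individual record so the rest of the file still flows. Treat it as an outline rather than production code; exact behaviour after PROPAGATE varies by broker version.
Code:
CREATE COMPUTE MODULE SplitBatch_Compute
  CREATE FUNCTION Main() RETURNS BOOLEAN
  BEGIN
    -- Walk the repeating group one record at a time
    DECLARE rec REFERENCE TO InputRoot.XMLNS.Batch.Record[1];

    WHILE LASTMOVE(rec) DO
      BEGIN
        -- If this record fails, count it and carry on with the rest
        -- instead of rolling back the whole file
        DECLARE CONTINUE HANDLER FOR SQLSTATE LIKE '%'
        BEGIN
          SET Environment.Variables.BadRecordCount =
              COALESCE(Environment.Variables.BadRecordCount, 0) + 1;
        END;

        CALL CopyMessageHeaders();          -- headers must be rebuilt after each PROPAGATE
        SET OutputRoot.XMLNS.Record = rec;  -- copy just this record's subtree
        PROPAGATE;                          -- one output message per record
      END;
      MOVE rec NEXTSIBLING REPEAT TYPE NAME;
    END WHILE;

    RETURN FALSE;                           -- nothing left to send at the end
  END;

  CREATE PROCEDURE CopyMessageHeaders() BEGIN
    -- Standard header-copy procedure from the default Compute node template
    DECLARE I INTEGER 1;
    DECLARE J INTEGER CARDINALITY(InputRoot.*[]);
    WHILE I < J DO
      SET OutputRoot.*[I] = InputRoot.*[I];
      SET I = I + 1;
    END WHILE;
  END;
END MODULE;

The handler is also the place to capture enough of the failing record to report or requeue it later, rather than just counting it.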
Gerard
Posted: Tue Dec 21, 2004 12:18 am Post subject: does the entire message have to be in memory before parsing?
Newbie
Joined: 10 Dec 2004 Posts: 2
Thanks for the replies.
Regarding the amount of memory needed to parse a very large message, I've been told that the entire message will be loaded into memory before parsing starts. Is that really true?
The idea of on-demand parsers in WBIMB, only parsing enough of the input message to satisfy the current request, seems to imply that the entire message does not need to be in memory, or is that wishful thinking on my part?
The idea would be to read the large file and put the content into segmented MQ messages for the broker, which would then parse them in a streaming fashion. Is this a valid option or not?
Thanks in advance for your comments.
Gerard