
MQSeries.net Forum Index » WebSphere Message Broker (ACE) Support » Need solution approach !

afroz11031
PostPosted: Sat Jan 07, 2017 11:43 pm    Post subject: Need solution approach !

Apprentice

Joined: 28 Jan 2014
Posts: 36

Dear IIB Experts,

I have a requirement as below:

1. XML data is put onto an input queue (WMQ) by the consumer.
2. The input queue may receive data from several customers, and requests from the same customer do not arrive in sequence; for example, the 1st, 3rd, and 6th requests may all belong to the same customer.
3. For each customer there can be a number of XML requests on the queue.

The client requirement is: once the messages reach IIB, retrieve all request messages belonging to the same customer and club them into one single message per customer (i.e. if we receive 10 request messages from 2 different customers, only 2 messages need to be created). The reason is that we then need to hit the database only twice in this case, so I do not need to open 10 connections for 10 requests.

Please note that this sorting by customer and clubbing into one single request is not possible on the client side.
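The clubbing being asked for is essentially a group-by on the customer id. A minimal sketch of the logic only (in Python rather than ESQL, with hypothetical `customer_id`/`payload` fields standing in for the real XML schema):

```python
from collections import defaultdict

def club_by_customer(requests):
    """Collapse many requests into one combined message per customer."""
    grouped = defaultdict(list)
    for req in requests:
        grouped[req["customer_id"]].append(req["payload"])
    return dict(grouped)

# 10 requests from 2 customers collapse into 2 combined messages,
# so the final database is hit twice instead of 10 times.
reqs = [{"customer_id": "cust%d" % (i % 2), "payload": "<req>%d</req>" % i}
        for i in range(10)]
combined = club_by_customer(reqs)
```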

Please advise the best solution to implement.

FYI, I planned to do this as below:

1. MQInput node ---> DB (insert every record into a staging table of the database as soon as it arrives on the queue; however, this seems likely to have a performance impact).
2. TimerFlow node ---> Compute node (the timer flow will run at some specified interval and retrieve all the requests from the DB (bearing in mind the performance impact if there are many requests), then filter them by customer and create a single request per customer. Each combined request is then sent to the final database in one shot).
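The two flows above can be sketched end to end. This is only a model of the logic: an in-memory SQLite table stands in for the real staging table, and plain functions stand in for the MQInput and timer flows:

```python
import sqlite3
from collections import defaultdict

conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE staging (
    id          INTEGER PRIMARY KEY,
    customer_id TEXT NOT NULL,
    xml         TEXT NOT NULL)""")

def on_message(customer_id, xml):
    # Flow 1: MQInput -> insert each arriving request into the staging table.
    conn.execute("INSERT INTO staging (customer_id, xml) VALUES (?, ?)",
                 (customer_id, xml))
    conn.commit()

def timer_tick():
    # Flow 2: the timer fires, drains the staging table, and clubs the
    # requests per customer into one combined batch each.
    rows = conn.execute("SELECT id, customer_id, xml FROM staging").fetchall()
    grouped = defaultdict(list)
    for row_id, cust, xml in rows:
        grouped[cust].append(xml)
    # Delete only the rows just picked up; messages arriving after the
    # SELECT stay behind for the next tick.
    conn.executemany("DELETE FROM staging WHERE id = ?",
                     [(r[0],) for r in rows])
    conn.commit()
    return dict(grouped)

# 10 requests from 2 customers -> 2 combined batches, empty staging table.
for i in range(10):
    on_message("cust%d" % (i % 2), "<req>%d</req>" % i)
batches = timer_tick()
```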


Thanks
Afroz
smdavies99
PostPosted: Sat Jan 07, 2017 11:59 pm    Post subject:

Jedi Council

Joined: 10 Feb 2003
Posts: 6076
Location: Somewhere over the Rainbow this side of Never-never land.

So how do you identify which customer each message's data belongs to?
How much data per day are we talking about?
_________________
WMQ User since 1999
MQSI/WBI/WMB/'Thingy' User since 2002
Linux user since 1995

Every time you reinvent the wheel the more square it gets (anon). If in doubt think and investigate before you ask silly questions.
afroz11031
PostPosted: Sun Jan 08, 2017 12:25 am    Post subject:

Apprentice

Joined: 28 Jan 2014
Posts: 36

Thanks for your quick reply.
1. There is a customer id in each request; we identify the messages based on the customer id.

2. There will be a maximum of 50 requests per minute.


Thanks
Afroz
smdavies99
PostPosted: Sun Jan 08, 2017 2:19 am    Post subject:

Jedi Council

Joined: 10 Feb 2003
Posts: 6076
Location: Somewhere over the Rainbow this side of Never-never land.

afroz11031 wrote:
Thanks for your quick reply.
1. There is a customer id in each request; we identify the messages based on the customer id.

2. There will be a maximum of 50 requests per minute.


Thanks
Afroz


Your proposed solution seems good, but you need to work the DB performance issue through with your DBAs. At these volumes you aren't really loading any decent RDBMS at all.
_________________
WMQ User since 1999
MQSI/WBI/WMB/'Thingy' User since 2002
Linux user since 1995

Every time you reinvent the wheel the more square it gets (anon). If in doubt think and investigate before you ask silly questions.
afroz11031
PostPosted: Sun Jan 08, 2017 6:07 am    Post subject:

Apprentice

Joined: 28 Jan 2014
Posts: 36

Thanks, mate, for your valuable input. I will talk to the DBA about the performance.

Cheers
Afroz
ruimadaleno
PostPosted: Mon Jan 09, 2017 1:30 am    Post subject:

Master

Joined: 08 May 2014
Posts: 274

Do you plan to store the XML directly in the database, maybe in a BLOB/CLOB/binary column? Or do you plan to parse the XML and extract the id and some other info to store in specific database table columns? That might help with the DB performance problems: get the info from the XML (you need to parse it, and if the XML is big you may hit a message flow performance problem), store that info in a DB table so the record can be found quickly, and then extract the XML itself.

just my two cents
_________________
Best regards

Rui Madaleno
smdavies99
PostPosted: Mon Jan 09, 2017 1:52 am    Post subject:

Jedi Council

Joined: 10 Feb 2003
Posts: 6076
Location: Somewhere over the Rainbow this side of Never-never land.

ruimadaleno wrote:
Do you plan to store the XML directly in the database, maybe in a BLOB/CLOB/binary column? Or do you plan to parse the XML and extract the id and some other info to store in specific database table columns? That might help with the DB performance problems: get the info from the XML (you need to parse it, and if the XML is big you may hit a message flow performance problem), store that info in a DB table so the record can be found quickly, and then extract the XML itself.

just my two cents


I've done this sort of thing many times over the years. I use two tables: one contains the data plus an index and a lookup key; the other contains the timestamp of the first record in a group/set together with that lookup key.
The OP will probably have to parse the input message to extract the key identifiers.
So I'd probably read the message as a BLOB, then parse the BLOB to get the bits needed to identify it, and write the BLOB + identifiers + a UUID to the table.

Not rocket science really, but designing the tables properly is key.
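As a sketch, the two-table design described above might look like this; SQLite stands in for the real RDBMS, and the table and column names (`msg_data`, `msg_group`, `lookup_key`) are illustrative, not prescriptive:

```python
import sqlite3
import time
import uuid

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE msg_data (
    msg_uuid   TEXT PRIMARY KEY,
    lookup_key TEXT NOT NULL,   -- e.g. the customer id parsed out of the BLOB
    payload    BLOB NOT NULL);
CREATE INDEX idx_lookup ON msg_data (lookup_key);

CREATE TABLE msg_group (
    lookup_key TEXT PRIMARY KEY,
    first_seen REAL NOT NULL);  -- timestamp of the first record in the group
""")

def store(lookup_key, payload):
    # The first message for a key also opens the group record; later
    # messages leave the original first_seen timestamp untouched.
    conn.execute("INSERT OR IGNORE INTO msg_group (lookup_key, first_seen) "
                 "VALUES (?, ?)", (lookup_key, time.time()))
    conn.execute("INSERT INTO msg_data (msg_uuid, lookup_key, payload) "
                 "VALUES (?, ?, ?)", (str(uuid.uuid4()), lookup_key, payload))
    conn.commit()

# Three messages from two customers -> two open groups, three data rows.
store("cust1", b"<req>1</req>")
store("cust1", b"<req>2</req>")
store("cust2", b"<req>3</req>")
```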
_________________
WMQ User since 1999
MQSI/WBI/WMB/'Thingy' User since 2002
Linux user since 1995

Every time you reinvent the wheel the more square it gets (anon). If in doubt think and investigate before you ask silly questions.
afroz11031
PostPosted: Mon Jan 09, 2017 8:58 pm    Post subject:

Apprentice

Joined: 28 Jan 2014
Posts: 36

ruimadaleno wrote:
Do you plan to store the XML directly in the database, maybe in a BLOB/CLOB/binary column? Or do you plan to parse the XML and extract the id and some other info to store in specific database table columns? That might help with the DB performance problems: get the info from the XML (you need to parse it, and if the XML is big you may hit a message flow performance problem), store that info in a DB table so the record can be found quickly, and then extract the XML itself.

just my two cents

As I mentioned earlier, this is what I have planned.

Quote:
1. MQInput node ---> DB (insert every record into a staging table of the database as soon as it arrives on the queue; however, this seems likely to have a performance impact).
2. TimerFlow node ---> Compute node (the timer flow will run at some specified interval and retrieve all the requests from the DB (bearing in mind the performance impact if there are many requests), then filter them by customer and create a single request per customer).

Once the above is done I will remove the selected records, or mark them as read (using a flag column), so that the next run picks up from the next record onwards.

In all of this I am using just a single table.
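The "mark as read" variant can be sketched with a hypothetical `processed` flag column (again with SQLite standing in for the real database); deleting the picked-up rows instead would work the same way:

```python
import sqlite3
from collections import defaultdict

conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE staging (
    id          INTEGER PRIMARY KEY,
    customer_id TEXT NOT NULL,
    xml         TEXT NOT NULL,
    processed   INTEGER NOT NULL DEFAULT 0)""")
conn.executemany("INSERT INTO staging (customer_id, xml) VALUES (?, ?)",
                 [("cust1", "<a/>"), ("cust2", "<b/>"), ("cust1", "<c/>")])

def assemble():
    # Pick up only unread rows, club them per customer, then flip the
    # flag so the next timer run starts from the following record.
    picked = conn.execute("SELECT id, customer_id, xml FROM staging "
                          "WHERE processed = 0").fetchall()
    grouped = defaultdict(list)
    for row_id, cust, xml in picked:
        grouped[cust].append(xml)
    conn.executemany("UPDATE staging SET processed = 1 WHERE id = ?",
                     [(p[0],) for p in picked])
    conn.commit()
    return dict(grouped)

first = assemble()    # two combined requests, one per customer
second = assemble()   # nothing left unread
```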
ruimadaleno
PostPosted: Tue Jan 10, 2017 1:45 am    Post subject:

Master

Joined: 08 May 2014
Posts: 274

After a handful of months running your system in production you may have some gigabytes of XML.

How do you perform the cleanup? If you decide to retain the last 6 months of XML data, how do you delete/discard the older XML?
_________________
Best regards

Rui Madaleno
afroz11031
PostPosted: Tue Jan 10, 2017 5:13 am    Post subject:

Apprentice

Joined: 28 Jan 2014
Posts: 36

As per the requirement we club the messages based on the customer id and then send the result to the next component, so once that process is complete I do not need to keep the information in the staging table; we clean up the records in parallel. In other words, I don't need to worry about messages that have already been processed, and there is no requirement for us to retain the data.

Thanks
Afroz
inMo
PostPosted: Wed Jan 11, 2017 6:31 am    Post subject:

Master

Joined: 27 Jun 2009
Posts: 216
Location: NY

Quote:
There will be a maximum of 50 requests per minute.


Is this every minute, 24/7, or in random bursts? What is the size of each XML message on the queue? How often will your timer process trigger an assembly?

What would be the consequence of inserting the XML as-is into staging 50 times in 1 minute?
afroz11031
PostPosted: Wed Jan 11, 2017 10:07 am    Post subject:

Apprentice

Joined: 28 Jan 2014
Posts: 36

Quote:
Is this every minute, 24/7, or in random bursts?


Every minute, 24/7, with a maximum of 50 requests per minute.

Quote:
What is the size of each XML message on the queue?


The messages are very small, around 2 KB per XML.

Quote:
How often will your timer process trigger an assembly?


It will trigger every minute.

Quote:
What would be the consequence of inserting the XML as-is into staging 50 times in 1 minute?


As soon as a message arrives on the queue it will be inserted into the table.
mqjeff
PostPosted: Wed Jan 11, 2017 10:18 am    Post subject:

Grand Master

Joined: 25 Jun 2008
Posts: 17447

afroz11031 wrote:
As soon as a message arrives on the queue it will be inserted into the table.


Then why do you need to use a trigger?

If you need to assemble a set of messages into a single message, that's message Aggregation.
_________________
chmod -R ugo-wx /