MQSeries.net Forum Index » WebSphere Message Broker (ACE) Support » potential complications with extensive dependency on DB tables

vickas
PostPosted: Sun May 08, 2016 12:57 am    Post subject: potential complications with extensive dependency on DB tables

Centurion

Joined: 18 Aug 2013
Posts: 126

Hi all,
I have to implement some flows on IIB v9.0.0.2 that need to talk extensively to a database (Oracle).
The logic of the flows is based entirely on results fetched from tables, followed by inserts into their respective tables.
The flows are inter-dependent: records inserted into one table by one flow are fetched by another flow for its processing.

With all the flows running at the same time, what impact (something like deadlocks) can this have on the server, flow performance and the database?

Knowing that DB calls are always costly, what steps can be taken on the DB side (memory, sessions) and the broker side (connections, deadlocks) to avoid being prone to such issues?

The tables are updated every second, either by the flows or by a GUI, so I can't put the table data in the broker cache.

So please list some precautionary measures to keep things running smoothly.
Your inputs are highly appreciated.
fjb_saper
PostPosted: Sun May 08, 2016 3:24 am    Post subject: Re: potential complications with extensive dependency on DB tables

Grand High Poobah

Joined: 18 Nov 2003
Posts: 20756
Location: LI,NY

vickas wrote:
The tables get updated every second either by the flows or by some GUI, so can't put the table data on the broker cache.

Have you thought about DB Caching of the Table / Table rows??
_________________
MQ & Broker admin
vickas
PostPosted: Sun May 08, 2016 11:23 pm    Post subject: Re: potential complications with extensive dependency on DB tables

Centurion

Joined: 18 Aug 2013
Posts: 126

fjb_saper wrote:
Have you thought about DB Caching of the Table / Table rows??

As I said, the tables get updated every second, either by the flows or by a GUI, so I can't put the table data onto the broker cache.
If we knew the data in the tables would stay constant for at least a certain period, we could cache it.
Here I see hundreds of records being updated every second.
mqjeff
PostPosted: Mon May 09, 2016 4:41 am

Grand Master

Joined: 25 Jun 2008
Posts: 17447

A prepared query/statement or a stored procedure can probably help performance.

The broker does cache open DB handles for some period of inactivity (a minute?), so if you are using the handles constantly they shouldn't be dropped and recreated.

PASSTHRU might be more efficient than not.
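As an illustration, a parameterised PASSTHRU call in ESQL looks something like this (the table, columns and DSN name are invented for the example; the ? markers are what let the broker prepare the statement once and reuse it):

Code:
-- Hypothetical reference lookup; ORDERS, STATUS and DSN1 are invented names.
-- The ? parameter markers allow the prepared statement to be reused across messages.
SET Environment.Variables.Rows[] =
    PASSTHRU('SELECT ORDER_ID, STATUS FROM ORDERS WHERE STATUS = ?'
        TO Database.DSN1
        VALUES (InputRoot.XMLNSC.Request.Status));

Building the SQL string by concatenating values instead of using ? defeats the statement cache, since every distinct string has to be prepared afresh.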

There might also be Oracle-specific tuning that can be done with an Oracle client - you would need to install the Oracle client and adjust the ODBC data source to use it.

Generally, this is a question to work with your DBAs on.
_________________
chmod -R ugo-wx /
mayheminMQ
PostPosted: Wed May 18, 2016 7:25 am

Voyager

Joined: 04 Sep 2012
Posts: 77
Location: UK beyond the meadows of RocknRoll

Quote:
The complete logic of the flows are based on the results fetched from the tables & then insert into their respective tables .
The flows are inter-dependent on each other , i.e the records inserted into one table by one flow will be used/fetched by another flow for its processing .


Is this one of those designs I have seen where a massive orchestration is built from a tonne of smaller flows, each performing a minuscule operation like routing or one particular transformation?

If the orchestration logic can be grouped into one flow, cache the data as suggested and perform a single update when complete (green path or failure). Instead of querying the table, you query your cache variable, and once processing completes you send through a single update.
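As a rough sketch of that idea in ESQL (all names here are invented; note that a SHARED variable is scoped to the flow's additional instances in one execution group, so it still needs its own locking and refresh logic):

Code:
-- Flow-scope working copy; CacheRow, ORDERS and the field names are invented.
DECLARE CacheRow SHARED ROW;

CREATE PROCEDURE CompleteOrder() BEGIN
    -- Work against the in-memory copy during the orchestration...
    SET CacheRow.Order.Status = 'COMPLETE';
    -- ...then write back to the database once, at the end (green path or failure).
    UPDATE Database.ORDERS AS O
        SET STATUS = CacheRow.Order.Status
        WHERE O.ORDER_ID = CacheRow.Order.Id;
END;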
_________________
A Colorblind man may appear disadvantaged but he always sees more than just colors...
Vitor
PostPosted: Wed May 18, 2016 7:38 am    Post subject: Re: potential complications with extensive dependency on DB tables

Grand High Poobah

Joined: 11 Nov 2005
Posts: 26093
Location: Texas, USA

vickas wrote:
fjb_saper wrote:
Have you thought about DB Caching of the Table / Table rows??

As said, the tables get updated every second either by the flows or by some GUI, so can't put the table data onto the broker cache.

I believe my most worthy associate was talking about caching on the Oracle side, not the IIB side, which I agree could help you.
_________________
Honesty is the best policy.
Insanity is the best defence.
smdavies99
PostPosted: Wed May 18, 2016 10:00 pm

Jedi Council

Joined: 10 Feb 2003
Posts: 6076
Location: Somewhere over the Rainbow this side of Never-never land.

And, to add to the mix:

You need to understand Oracle's page/row locking; you can then adjust any reference selects (selects not intended for an update) to be done READ ONLY and thus not incur any DB locking. In my experience this is one of the major causes of deadlocks.

We had lots of locking/deadlock issues until we went through every DB interaction and, wherever it was a reference select, changed it to PASSTHRU and made the query READ ONLY. The deadlocks went away as if by magic.
YMMV though, and different DBs use different locking methodologies.
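For example, a reference select routed through PASSTHRU with a read-only clause might look like this (table and DSN names invented; the exact clause - FOR READ ONLY, FOR FETCH ONLY, an isolation hint - is DB-specific, so check what your database accepts):

Code:
-- Reference-data lookup: no update intended, so tell the database not to lock.
-- REF_CODES and DSN1 are invented names; the read-only clause varies by DB.
SET Environment.Variables.Ref[] =
    PASSTHRU('SELECT CODE, DESCRIPTION FROM REF_CODES FOR READ ONLY'
        TO Database.DSN1);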
_________________
WMQ User since 1999
MQSI/WBI/WMB/'Thingy' User since 2002
Linux user since 1995

Every time you reinvent the wheel the more square it gets (anon). If in doubt think and investigate before you ask silly questions.
Vitor
PostPosted: Thu May 19, 2016 5:05 am

Grand High Poobah

Joined: 11 Nov 2005
Posts: 26093
Location: Texas, USA

smdavies99 wrote:
You need to understand the Oracle page/row locking and you can then adjust any reference selects (selects not for an update) to be done READONLY and thus not incur any DB locking. This is one of the major causes of deadlocks in my experience.

smdavies99 wrote:
YMMV though. and different DB's use different locking methodologies.

_________________
Honesty is the best policy.
Insanity is the best defence.
vickas
PostPosted: Tue May 24, 2016 12:12 am

Centurion

Joined: 18 Aug 2013
Posts: 126

Hi experts, thanks for your suggestions.

Some of my trusted colleagues have been talking about caching.
I just want to know one thing about caching at the ESB level: what will the impact on memory consumption be if we cache the DB records?
Suppose I have to maintain the last 10 days' records (around 15,000 records) - what will the impact on resources be?

I have used the global cache in some of my projects, where we stored very few records and refreshed them every day.
Here, if I have to cache some thousands of records (15k), is this a good approach? Please advise.
mqjeff
PostPosted: Tue May 24, 2016 4:09 am

Grand Master

Joined: 25 Jun 2008
Posts: 17447

The impact will be the memory used to hold the cache.

And potentially the CPU time to access a given record in the cache, depending on how you access it, design it and write it.

If you use an ESQL Row and access it using [], you will take an inordinate amount of time each time you have to access anything much past about [3], because index access walks the element list from the start every time.
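That cost can be avoided by walking the cached tree with a REFERENCE instead of indexing into it. A sketch, with invented field names:

Code:
-- Each MOVE ... NEXTSIBLING is a single step, whereas Cache.Item[i]
-- walks from the first sibling every time [i] is evaluated.
DECLARE r REFERENCE TO Environment.Variables.Cache.Item[1];
WHILE LASTMOVE(r) DO
    -- process r here, e.g. compare r.Key against the incoming message
    MOVE r NEXTSIBLING REPEAT TYPE NAME;
END WHILE;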
_________________
chmod -R ugo-wx /
Vitor
PostPosted: Tue May 24, 2016 4:28 am

Grand High Poobah

Joined: 11 Nov 2005
Posts: 26093
Location: Texas, USA

vickas wrote:
suppose if i have to maintain last 10 days records ( which will be around 15000 ( 15k) records ) what will be the impact on resources ?


It depends on whether the records are 1 KB, 1 MB or 1 GB each.

Either way, it's going to eat a lot of memory, and a fair number of cycles to maintain. This is one of the reasons I (and others) suggested caching at the DB level; it's much more optimized and performant for this kind of solution.

The other problem with an ESB-side cache is the one you yourself laid out in your original post. If you're updating records in the ESB cache very frequently, you're rapidly going to get seriously out of step with the actual DB. In the best case, that means at some point you'll need a massive update exercise to re-sync the data. In the worst case, there's a problem with the execution group, the cache gets reset, and you lose all your updates.

vickas wrote:
is it good to go for this approach ?


No. For the reasons given.
_________________
Honesty is the best policy.
Insanity is the best defence.
Theme by Dustin Baccetti
Powered by phpBB © 2001, 2002 phpBB Group

Copyright © MQSeries.net. All rights reserved.