vickas
Posted: Sun May 08, 2016 12:57 am    Post subject: potential complications with extensive dependency on DB tables
Centurion
Joined: 18 Aug 2013    Posts: 126
Hi all,
I have to implement some flows on IIB v9.0.0.2 that talk extensively to a database (Oracle DB).
The entire logic of the flows is driven by results fetched from the tables, with each flow then inserting into its respective tables.
The flows are inter-dependent, i.e. records inserted into one table by one flow are fetched by another flow for its processing.
With all the flows running at the same time, what could the impact be (something like deadlock) on the server, flow performance, and the database?
Knowing that DB calls are always costly, what steps can be taken on the DB side (memory, sessions) and the server side (connections, deadlocks) to avoid being prone to such issues?
The tables are updated every second, either by the flows or by a GUI, so I can't put the table data in the Broker cache.
So please list some precautionary measures for smooth running.
Your inputs are highly appreciated.
fjb_saper
Posted: Sun May 08, 2016 3:24 am    Post subject: Re: potential complications with extensive dependency on DB tables
Grand High Poobah
Joined: 18 Nov 2003    Posts: 20756    Location: LI, NY
vickas wrote:
Hi all,
I have to implement some flows on IIB v9.0.0.2 that talk extensively to a database (Oracle DB).
The entire logic of the flows is driven by results fetched from the tables, with each flow then inserting into its respective tables.
The flows are inter-dependent, i.e. records inserted into one table by one flow are fetched by another flow for its processing.
With all the flows running at the same time, what could the impact be (something like deadlock) on the server, flow performance, and the database?
Knowing that DB calls are always costly, what steps can be taken on the DB side (memory, sessions) and the server side (connections, deadlocks) to avoid being prone to such issues?
The tables are updated every second, either by the flows or by a GUI, so I can't put the table data in the Broker cache.
So please list some precautionary measures for smooth running.
Your inputs are highly appreciated.
Have you thought about DB caching of the table / table rows?
_________________
MQ & Broker admin
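For what it's worth, a sketch of what DB-side caching can look like in Oracle (11g and later) is the server-side result cache, enabled per query with a hint. The table and column names below are hypothetical, not from this thread:

```sql
-- Sketch: Oracle server-side result cache (assumes RESULT_CACHE_MODE
-- is at its default of MANUAL). Oracle caches the result set and
-- invalidates it automatically when the underlying table changes,
-- so frequent writers don't serve stale data.
SELECT /*+ RESULT_CACHE */ code, description
FROM   lookup_tbl
WHERE  region = :region;
```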
vickas
Posted: Sun May 08, 2016 11:23 pm    Post subject: Re: potential complications with extensive dependency on DB tables
Centurion
Joined: 18 Aug 2013    Posts: 126
fjb_saper wrote:
Have you thought about DB caching of the table / table rows?
As said, the tables are updated every second, either by the flows or by a GUI, so I can't put the table data into the Broker cache.
Yes, if we knew the data in the tables would stay constant for at least a certain period, we could cache it.
Here I see hundreds of records being updated every second.
mqjeff
Posted: Mon May 09, 2016 4:41 am    Post subject:
Grand Master
Joined: 25 Jun 2008    Posts: 17447
A prepared query/statement or a stored procedure can probably help performance.
The Broker does cache open DB handles for some period of inactivity (a minute?), so if you are using the handles constantly, they shouldn't be dropped and recreated.
PASSTHRU might be more efficient than the generated statements.
There might also be Oracle-specific tuning that can be done with an Oracle client; you would need to install the Oracle client and adjust the ODBC data source to use it.
Generally, this is a question to work on with your DBAs.
_________________
chmod -R ugo-wx /
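To sketch what a prepared PASSTHRU statement might look like in ESQL (the table, columns, and input fields below are hypothetical, not from this thread):

```esql
-- Sketch: parameterized PASSTHRU select. The ? markers let the broker
-- and the database reuse the prepared statement across messages
-- instead of re-parsing the SQL each time. ORDERS and its columns
-- are made up for illustration.
SET Environment.Variables.Result.Row[] = PASSTHRU(
    'SELECT order_id, status FROM ORDERS WHERE status = ? AND created_on > ?',
    InputRoot.XMLNSC.Request.Status,
    InputRoot.XMLNSC.Request.Since);
```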
mayheminMQ
Posted: Wed May 18, 2016 7:25 am    Post subject:
Voyager
Joined: 04 Sep 2012    Posts: 77    Location: UK beyond the meadows of RocknRoll
Quote:
The entire logic of the flows is driven by results fetched from the tables, with each flow then inserting into its respective tables.
The flows are inter-dependent, i.e. records inserted into one table by one flow are fetched by another flow for its processing.
Is this one of those designs I have seen where a massive orchestration is built from a tonne of smaller flows, each performing a minuscule operation such as routing or one particular transformation?
If the orchestration logic can be grouped into one flow, cache the records as suggested and perform a single update when complete (green path or failure). Instead of querying the table, you query your cache variable, and once complete you send through a single update.
_________________
A colorblind man may appear disadvantaged, but he always sees more than just colors...
Vitor
Posted: Wed May 18, 2016 7:38 am    Post subject: Re: potential complications with extensive dependency on DB tables
Grand High Poobah
Joined: 11 Nov 2005    Posts: 26093    Location: Texas, USA
vickas wrote:
fjb_saper wrote:
Have you thought about DB caching of the table / table rows?
As said, the tables are updated every second, either by the flows or by a GUI, so I can't put the table data into the Broker cache.
I believe my most worthy associate was talking about caching on the Oracle side, not the IIB side, which I agree could help you.
_________________
Honesty is the best policy.
Insanity is the best defence.
smdavies99
Posted: Wed May 18, 2016 10:00 pm    Post subject:
Jedi Council
Joined: 10 Feb 2003    Posts: 6076    Location: Somewhere over the Rainbow this side of Never-never land.
And, to add to the mix:
You need to understand Oracle page/row locking; you can then adjust any reference selects (selects not for an update) to be done READONLY and thus not incur any DB locking. In my experience this locking is one of the major causes of deadlocks.
We had lots of locking/deadlock issues until we went through every DB interaction and, wherever it was a reference select, changed it to PASSTHRU and made the query READONLY. The deadlocks went away as if by magic.
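As a sketch of that pattern (the exact read-only clause varies by database, and LOOKUP_TBL and its columns are hypothetical), a reference select issued through PASSTHRU might look like:

```esql
-- Sketch: a reference select via PASSTHRU with an explicit read-only
-- clause so it takes no row locks. The clause spelling differs per
-- database (FOR READ ONLY / FOR FETCH ONLY / WITH UR), so check what
-- your DB and ODBC driver accept.
SET Environment.Variables.Ref.Row[] = PASSTHRU(
    'SELECT code, description FROM LOOKUP_TBL WHERE code = ? FOR READ ONLY',
    InputRoot.XMLNSC.Request.Code);
```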
YMMV though, and different DBs use different locking methodologies.
_________________
WMQ User since 1999
MQSI/WBI/WMB/'Thingy' User since 2002
Linux user since 1995
Every time you reinvent the wheel, the more square it gets (anon). If in doubt, think and investigate before you ask silly questions.
Vitor
Posted: Thu May 19, 2016 5:05 am    Post subject:
Grand High Poobah
Joined: 11 Nov 2005    Posts: 26093    Location: Texas, USA
smdavies99 wrote:
You need to understand Oracle page/row locking; you can then adjust any reference selects (selects not for an update) to be done READONLY and thus not incur any DB locking. In my experience this is one of the major causes of deadlocks.
smdavies99 wrote:
YMMV though, and different DBs use different locking methodologies.
_________________
Honesty is the best policy.
Insanity is the best defence.
vickas
Posted: Tue May 24, 2016 12:12 am    Post subject:
Centurion
Joined: 18 Aug 2013    Posts: 126
Hi experts, thanks for your suggestions.
Some of my trusted colleagues have been talking about caching.
I just want to know one thing about caching at the ESB level: what will the impact on memory consumption be if we cache the DB records?
Suppose I have to maintain the last 10 days' records, which will be around 15,000 (15k) records; what will the impact on resources be?
I have used the global cache in some of my projects, where we stored very few records, refreshed every day.
Here, if I have to cache some thousands of records (15k), is this a good approach to take? Please advise.
mqjeff
Posted: Tue May 24, 2016 4:09 am    Post subject:
Grand Master
Joined: 25 Jun 2008    Posts: 17447
The impact will be the memory used to hold the cache.
And potentially the CPU time to access a given record in the cache, depending on how you access it, design it, and write it.
If you use an ESQL Row and access it using [], you will take an inordinate amount of time each time you have to access anything much past about [3].
_________________
chmod -R ugo-wx /
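A sketch of the difference (field names here are hypothetical): message-tree fields form a linked list, so `Cache.Item[n]` walks from the first element on every access, while a REFERENCE steps one sibling at a time:

```esql
-- Slow for large n: each Environment.Cache.Item[i] access walks the
-- linked list from the first child, so a full scan costs O(n^2).

-- Faster: iterate with a REFERENCE, moving one sibling per step.
DECLARE ptr REFERENCE TO Environment.Cache.Item[1];
WHILE LASTMOVE(ptr) DO
    -- process ptr.someField here
    MOVE ptr NEXTSIBLING REPEAT TYPE NAME;
END WHILE;
```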
Vitor
Posted: Tue May 24, 2016 4:28 am    Post subject:
Grand High Poobah
Joined: 11 Nov 2005    Posts: 26093    Location: Texas, USA
vickas wrote:
Suppose I have to maintain the last 10 days' records, which will be around 15,000 (15k) records; what will the impact on resources be?
It depends on whether the records are 1 KB, 1 MB or 1 GB.
Either way, it's going to eat a lot of memory, and a fair number of cycles to maintain. This is one of the reasons I (and others) suggested caching at the DB level; it's much better optimized and more performant for this kind of solution.
The other problem with an ESB-side cache is the one you yourself laid out in your original post. If you're updating records in the ESB cache very frequently, you're rapidly going to get seriously out of step with the actual DB. In the best case, that means at some point you'll need a massive update exercise to re-sync the data. In the worst case, there's a problem with the EG and the cache gets reset, losing all your updates.
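A back-of-envelope sketch of that point (record sizes and the overhead factor below are assumptions; the real figures would have to be measured):

```python
# Rough memory estimate for caching 15,000 records at several assumed
# per-record sizes. The overhead factor stands in for tree/object
# overhead on top of the raw payload; it is a guess, not a measurement.
def cache_footprint_mb(n_records, record_bytes, overhead_factor=2.0):
    """Estimated cache memory in MB for n_records of record_bytes each."""
    return n_records * record_bytes * overhead_factor / (1024 * 1024)

for size in (1_024, 10_240, 1_048_576):  # 1 KB, 10 KB, 1 MB per record
    print(f"{size:>9} B/record -> {cache_footprint_mb(15_000, size):,.0f} MB")
```

Even at a modest 10 KB per record, 15k cached records with overhead land in the hundreds of megabytes, which supports pushing the cache down to the database.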
vickas wrote:
is it good to go for this approach?
No, for the reasons given.
_________________
Honesty is the best policy.
Insanity is the best defence.