Best practices for DB interaction
Muhammad Haider
Posted: Sun Jan 21, 2018 10:53 pm    Post subject: Best practices for DB interaction
Apprentice. Joined: 14 Mar 2011, Posts: 43
Hi All,
What is the best practice for the following two scenarios?
1. Inserting a large number of records (e.g. 70K) into the DB: would a stored procedure or a loop using PASSTHRU/INSERT be the better approach?
2. Reading a large number of records from the database and writing them to a file: should we read e.g. 10K records at a time and append them to the file in a loop, since reading all 70K records into the Environment tree may consume a lot of memory?
Members, can you kindly share some best practices from your past experience with IIB and databases, or any interesting scenarios?
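For context, the loop-plus-PASSTHRU approach mentioned in scenario 1 would look roughly like this in ESQL. This is a minimal sketch: the table DB.BATCH_TABLE and the Record fields are hypothetical names, not from the original post.

```esql
-- Minimal sketch of a looped insert via PASSTHRU.
-- DB.BATCH_TABLE and the Record field names are hypothetical.
DECLARE recRef REFERENCE TO InputRoot.XMLNSC.Records.Record[1];
WHILE LASTMOVE(recRef) DO
    -- Parameter markers avoid building 70K literal SQL statements
    PASSTHRU 'INSERT INTO DB.BATCH_TABLE (ID, PAYLOAD) VALUES (?, ?)'
        VALUES (recRef.Id, recRef.Payload);
    MOVE recRef NEXTSIBLING REPEAT NAME 'Record';
END WHILE;
```

Note that each statement is a round trip to the database; the inserts normally commit as part of the flow's transaction rather than one by one.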
abhi_thri
Posted: Mon Jan 22, 2018 12:59 am
Knight. Joined: 17 Jul 2017, Posts: 516, Location: UK
Regarding reading records from the DB, I've seen an approach work very well where a stored procedure retrieves a set number of records (the batch size being a configurable parameter).
Say the flow calls the stored procedure with the batch size set to 100 and the DB holds 1000 records: the stored procedure returns the first 100 records and marks them as extracted (using a flag column), and the flow then re-triggers itself (by sending a message to itself) so that the stored procedure is called repeatedly until all the data has been retrieved.
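That pattern might be sketched in ESQL roughly as follows. Everything named here is hypothetical (the procedure GET_BATCH, the terminal wiring, the field names); the stored procedure itself would select the first N unflagged rows and update their flag column in the same call.

```esql
-- Hypothetical procedure: returns up to batchSize rows as a result
-- set, flagging them as extracted in the same call.
CREATE PROCEDURE GET_BATCH (IN batchSize INTEGER)
    LANGUAGE DATABASE
    DYNAMIC RESULT SETS 1
    EXTERNAL NAME "MYSCHEMA.GET_BATCH";

CREATE COMPUTE MODULE BatchReader
    CREATE FUNCTION Main() RETURNS BOOLEAN
    BEGIN
        DECLARE batchSize INTEGER 100;  -- configurable batch size
        CALL GET_BATCH(batchSize, Environment.Variables.Batch.Record[]);

        IF CARDINALITY(Environment.Variables.Batch.Record[]) > 0 THEN
            -- Append this batch to the file (FileOutput wired to 'out')
            SET OutputRoot.XMLNSC.Records = Environment.Variables.Batch;
            PROPAGATE TO TERMINAL 'out';
            -- Re-trigger: a second terminal wired via MQOutput back to
            -- the flow's own input queue, so the next batch is fetched
            PROPAGATE TO TERMINAL 'out1';
        END IF;
        RETURN FALSE;
    END;
END MODULE;
```

When the stored procedure returns an empty result set, no message is propagated and the loop stops by itself.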
Regarding the insert scenario, calling a stored procedure along with PROPAGATE works fine, although the case I saw wasn't dealing with massive volumes.
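A minimal sketch of the stored-procedure style for the insert side, assuming a hypothetical single-row procedure INSERT_RECORD (in practice the procedure could equally accept a whole batch rather than one row):

```esql
-- Hypothetical single-row insert procedure
CREATE PROCEDURE INSERT_RECORD (IN id INTEGER, IN payload CHARACTER)
    LANGUAGE DATABASE
    EXTERNAL NAME "MYSCHEMA.INSERT_RECORD";

CREATE COMPUTE MODULE InsertRecords
    CREATE FUNCTION Main() RETURNS BOOLEAN
    BEGIN
        DECLARE recRef REFERENCE TO InputRoot.XMLNSC.Records.Record[1];
        WHILE LASTMOVE(recRef) DO
            CALL INSERT_RECORD(CAST(recRef.Id AS INTEGER),
                               recRef.Payload);
            MOVE recRef NEXTSIBLING REPEAT NAME 'Record';
        END WHILE;
        -- Propagate a summary once all rows have been inserted
        SET OutputRoot.XMLNSC.Done.Count =
            CARDINALITY(InputRoot.XMLNSC.Records.Record[]);
        PROPAGATE TO TERMINAL 'out';
        RETURN FALSE;
    END;
END MODULE;
```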
Vitor
Posted: Mon Jan 22, 2018 6:02 am
Grand High Poobah. Joined: 11 Nov 2005, Posts: 26093, Location: Texas, USA
How much data is in these 70K records? 1K each? 10K each? 1MB each?
IMHO:
1) I'd use a stored procedure, not because of the size but to prevent table locking.
2) Depending on the amount of data pulled back, you might have to limit the number of rows. You'd certainly need to follow the guidelines in the InfoCenter for handling large data.
Though frankly, I'd question using IIB for this at all. Unless you're doing something more than a lift-and-shift of 70K records into or out of a database, there are better ETL tools you can use.
You shouldn't use IIB just because you have IIB. It's a really good screwdriver, but if all you want to do is bang in nails, a hammer is better & cheaper. _________________ Honesty is the best policy.
Insanity is the best defence.
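One simple way to limit rows at source, should you stay in IIB, is in the SELECT itself. A sketch, with hypothetical table and column names; FETCH FIRST is the Db2/standard-SQL form, while other databases use LIMIT or TOP:

```esql
-- Pull back at most 10000 unprocessed rows in one go
SET Environment.Variables.Batch.Record[] =
    PASSTHRU('SELECT ID, PAYLOAD FROM DB.BATCH_TABLE
              WHERE EXTRACTED = ''N''
              ORDER BY ID
              FETCH FIRST 10000 ROWS ONLY');
```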