MQSeries.net Forum Index » WebSphere Message Broker Support » Best practices for DB interaction

Muhammad Haider
Posted: Sun Jan 21, 2018 10:53 pm    Post subject: Best practices for DB interaction

Apprentice

Joined: 14 Mar 2011
Posts: 39

Hi all,

What is the best practice for the following two scenarios?

1. Inserting a large number of records (e.g. 70K) into the database: would a stored procedure or a loop issuing PASSTHRU INSERT statements be the better approach?

2. Reading a large number of records from the database and writing them to a file: should we read, say, 10K records at a time and append them to the file in a loop? Reading all 70K records into an Environment variable may consume a lot of memory.

Members, can you kindly share best practices from your past experience with IIB and databases, or any interesting scenarios?
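For reference, the loop-with-PASSTHRU approach mentioned in scenario 1 would look roughly like this in a Compute node (the table RECORDS_TBL and the XMLNSC message shape are made up for illustration, not taken from a real system):

```esql
-- Sketch only: assumes the input message carries a repeating Record element.
DECLARE rec REFERENCE TO InputRoot.XMLNSC.Batch.Record[1];
WHILE LASTMOVE(rec) DO
    -- one parameterised INSERT per record; each ? is bound positionally
    PASSTHRU('INSERT INTO RECORDS_TBL (ID, PAYLOAD) VALUES (?, ?)',
             rec.Id, rec.Payload);
    MOVE rec NEXTSIBLING REPEAT TYPE NAME;
END WHILE;
```

The reference variable avoids the O(n²) cost of indexed access like Record[i] on large trees, but note that 70K single-row INSERTs still means 70K round trips to the database, which is part of why a stored procedure is often preferred at this volume.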
abhi_thri
Posted: Mon Jan 22, 2018 12:59 am

Disciple

Joined: 17 Jul 2017
Posts: 155
Location: UK

Regarding reading records from the DB, I've seen an approach work very well where a stored procedure retrieves a set number of records from the DB (the number being a configurable parameter).

Say the flow calls the stored procedure with the record count set to 100 and the DB holds 1000 records: the stored procedure returns the first 100 records and marks them as extracted (using a flag column), and the flow re-triggers itself (by sending a message to itself) so that the stored procedure is called repeatedly until all the data has been retrieved.
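A minimal sketch of that pattern in a Compute node. The procedure name, schema, queue name, and result-set shape are all hypothetical:

```esql
-- Hypothetical names throughout (GET_EXTRACT_BATCH, EXTRACT.TRIGGER.IN).
-- The DB procedure returns up to maxRows unextracted rows as a result set
-- and flags them as extracted in the same call.
CREATE PROCEDURE GetExtractBatch(IN maxRows INTEGER)
    LANGUAGE DATABASE
    DYNAMIC RESULT SETS 1
    EXTERNAL NAME "MYSCHEMA.GET_EXTRACT_BATCH";

CREATE FUNCTION Main() RETURNS BOOLEAN
BEGIN
    -- the batch size would normally come from a configurable parameter
    DECLARE maxRows INTEGER 100;
    CALL GetExtractBatch(maxRows, Environment.Variables.Batch.Row[]);
    -- ... build OutputRoot from Environment.Variables.Batch.Row[] here ...
    IF CARDINALITY(Environment.Variables.Batch.Row[]) = maxRows THEN
        -- a full batch suggests more rows remain: re-trigger the flow
        -- by routing a message back to its own input queue
        SET OutputLocalEnvironment.Destination.MQ.DestinationData[1].queueName
            = 'EXTRACT.TRIGGER.IN';
    END IF;
    RETURN TRUE;
END;
```

Having the procedure both return and flag the rows in one call keeps the "extracted" bookkeeping atomic on the database side rather than in the flow.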

Regarding the insert scenario, calling a stored procedure along with PROPAGATE works fine, though the case I saw wasn't dealing with massive volumes.
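For the insert side, the propagate idea amounts to splitting the input and sending one record (or chunk) at a time downstream, so the full 70K set is never held in the output tree at once. A sketch, again with made-up element names:

```esql
-- Sketch: emit one record per PROPAGATE; the downstream node performs the
-- insert (e.g. via a stored-procedure call) on each small message.
CREATE FUNCTION Main() RETURNS BOOLEAN
BEGIN
    DECLARE rec REFERENCE TO InputRoot.XMLNSC.Batch.Record[1];
    WHILE LASTMOVE(rec) DO
        -- headers must be re-copied because PROPAGATE clears OutputRoot
        SET OutputRoot.Properties = InputRoot.Properties;
        SET OutputRoot.MQMD = InputRoot.MQMD;
        SET OutputRoot.XMLNSC.Record = rec;
        PROPAGATE;
        MOVE rec NEXTSIBLING REPEAT TYPE NAME;
    END WHILE;
    RETURN FALSE; -- everything was already propagated in the loop
END;
```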
Vitor
Posted: Mon Jan 22, 2018 6:02 am

Grand High Poobah

Joined: 11 Nov 2005
Posts: 24972
Location: Ohio, USA

How much data is in these 70K records? 1 KB each? 10 KB each? 1 MB each?

IMHO:

1) I'd use a stored procedure, not because of the size but to prevent table locking

2) Depending on the amount of data pulled back, you might have to limit the number of rows. You'd certainly need to follow the guidelines in the InfoCenter for handling large data.
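As an illustration of limiting the rows pulled back per read, a Compute node can page through the table with a row-limit clause (DB2-style syntax shown; the clause varies by database, and all table/column names here are assumed):

```esql
-- Sketch only: pull at most a fixed number of rows per invocation
-- instead of the whole table.
SET Environment.Variables.Batch.Row[] =
    PASSTHRU('SELECT ID, PAYLOAD FROM RECORDS_TBL
              WHERE EXTRACTED = 0
              ORDER BY ID
              FETCH FIRST 10000 ROWS ONLY');
-- ... write the rows to the file, then free the tree before the next chunk
DELETE FIELD Environment.Variables.Batch;
```

Deleting the Environment subtree once each chunk is written is the kind of housekeeping the large-message guidelines call for, since Environment is not cleared between propagations.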

Though frankly, I'd question using IIB for this at all. Unless you're doing something more than a lift-and-shift of 70K records into or out of a database, there are better ETL tools you can use.

You shouldn't use IIB just because you have IIB. It's a really good screwdriver, but if all you want to do is bang in nails, a hammer is better and cheaper.
_________________
Honesty is the best policy.
Insanity is the best defence.
Copyright MQSeries.net. All rights reserved.