Posted: Tue Feb 01, 2011 3:16 am Post subject: Processing a file of 50 MB in broker
Voyager
Joined: 13 Jun 2010 Posts: 78
Hi,
Is broker the right choice to support a 50 MB file containing 50,000 transactions? All of these transactions have to be in the environment for validation checks and duplicate checks.
Can we achieve good performance with this kind of bulk processing in broker?
We are suggesting that the client break the file into smaller pieces before sending it to us for processing.
Quote:
Do they? Can't you just keep a subset of the data to do the duplicate checking?
Yes, all of these have to be in the environment, as there is some complex business logic for each transaction and each has to go into the database.
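On the "keep a subset" suggestion above: a minimal sketch of what that could look like is to keep only the transaction IDs in memory for duplicate detection, rather than the full parsed message tree. The field name `txn_id` and the record shape are assumptions for illustration, not anything stated in this thread.

```python
# Hypothetical sketch: duplicate checking that keeps only a set of
# transaction IDs in memory, not all 50,000 full transactions.
# The field name "txn_id" is an assumed identifier, not from the thread.

def find_duplicates(transactions):
    """Yield (index, txn_id) for each transaction whose ID was already seen."""
    seen = set()
    for i, txn in enumerate(transactions):
        txn_id = txn["txn_id"]
        if txn_id in seen:
            yield i, txn_id
        else:
            seen.add(txn_id)

# Only the ID set grows with the file; for 50,000 short IDs that is
# a few MB at most, regardless of how large each transaction body is.
dups = list(find_duplicates([{"txn_id": "A1"}, {"txn_id": "B2"}, {"txn_id": "A1"}]))
print(dups)  # [(2, 'A1')]
```

The point of the sketch is that duplicate checking needs only the keys, so it does not by itself force the whole file into the environment.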
Currently we support a file size of 5 MB and provide 25 TPS (transactions per second) at peak load, but I really doubt we can achieve the same with this data load.
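For context, the back-of-envelope arithmetic from the numbers quoted in this thread (50 MB, 50,000 transactions, 25 TPS) works out as follows:

```python
# Back-of-envelope figures using only the numbers quoted in this thread.
file_mb = 50
txn_count = 50_000
tps = 25

avg_txn_kb = file_mb * 1024 / txn_count   # average transaction size, ~1 KiB
minutes_per_file = txn_count / tps / 60   # wall-clock time at 25 TPS

print(round(avg_txn_kb, 2), round(minutes_per_file, 1))  # 1.02 33.3
```

So each transaction averages about 1 KiB, and at the current 25 TPS the whole file would take roughly half an hour, which is why the batch-vs-TPS question below matters.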
Joined: 14 Apr 2010 Posts: 522 Location: Craggy Island
saurabh867 wrote:
We are suggesting that the client break the file into smaller pieces before sending it to us for processing.
OK. But how will that affect the ability to check for duplicates?
saurabh867 wrote:
Yes, all of these have to be in the environment, as there is some complex business logic for each transaction and each has to go into the database.
Just because the processing of each 'transaction' is complex doesn't mean you have to have all of them in the environment at the same time. You'll need to be a bit more specific about why you are doing this.
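To make the point above concrete: even complex per-transaction logic can be applied while streaming the file one record at a time, so peak memory stays at one record rather than the whole 50 MB. This is a hypothetical sketch, not broker code; the one-record-per-line format and the `process_transaction` stub are assumptions.

```python
# Hypothetical sketch of streaming the file record by record instead of
# holding all 50,000 transactions in the environment at once.
# Assumes one transaction per line; process_transaction is a stand-in
# for the validation / business logic / database insert per record.
import io

def process_transaction(record):
    # Placeholder for the per-transaction business logic.
    return record.upper()

def process_file(stream):
    """Process records one at a time; memory holds one record, not the file."""
    results = []
    for line in stream:
        record = line.strip()
        if record:
            results.append(process_transaction(record))
    return results

print(process_file(io.StringIO("txn1\ntxn2\n")))  # ['TXN1', 'TXN2']
```

Combined with an ID-only duplicate check, this is the usual pattern for large-file batch flows: stream the records, keep only the small state (seen IDs, running totals) in the environment.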
saurabh867 wrote:
Currently we support a file size of 5 MB and provide 25 TPS (transactions per second) at peak load, but I really doubt we can achieve the same with this data load.
But if it's a batch process, i.e. a file of transactions, then is TPS really the metric that matters to you? _________________ Never let the facts get in the way of a good theory.