pdmenon
Posted: Thu Jan 13, 2011 2:13 am    Post subject: Optimisation while converting data into an xml
Voyager
Joined: 05 Apr 2010    Posts: 80
Dear Experts,
We are retrieving data from DB2 and converting it into an XML file using MB 6.1.0.4 with a Compute node. This activity is taking a lot of time.
How can I optimize it?
Thanks
mqjeff
Posted: Thu Jan 13, 2011 3:01 am
Grand Master
Joined: 25 Jun 2008    Posts: 17447
Step 1: Update to 6.1.0.8.
Step 2: Analyze the performance of your flow.
Step 3: Adjust your flow to improve the performance.
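Step 2 is usually done with the broker's built-in message flow statistics. From memory of the 6.1 toolset (the broker, execution group and flow names below are placeholders, and the exact flags are an assumption; verify them against the Info Center), turning on snapshot statistics for one flow and reading them back looks something like:

```
# Assumed names MYBROKER / MyExecGroup / MyFlow; check flag details
# against the WMB 6.1 Info Center before running.
mqsichangeflowstats MYBROKER -s -e MyExecGroup -f MyFlow -c active -n advanced
mqsireportflowstats MYBROKER -s -e MyExecGroup -f MyFlow
```

The per-node CPU and elapsed times in the snapshot data show whether the time is going in the Compute node, the database, or the output node.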
fatherjack
Posted: Thu Jan 13, 2011 3:01 am    Post subject: Re: Optimisation while converting data into an xml
Knight
Joined: 14 Apr 2010    Posts: 522    Location: Craggy Island
pdmenon wrote:
We are retrieving data from DB2 and converting it into an XML file using MB 6.1.0.4 with a Compute node. This activity is taking a lot of time.

Which activity is taking a lot of time? Retrieving from the database, the data conversion, or maybe writing the data out?
It would also help to have a bit more detail about your environment, your message flows, the volume of data and the size of the XML messages, what testing you have done to identify the problem area, and what you mean by 'a lot of time'.
_________________
Never let the facts get in the way of a good theory.
pdmenon
Posted: Thu Jan 13, 2011 3:15 am
Voyager
Joined: 05 Apr 2010    Posts: 80
Dear fatherjack,
Fetching the data from the DB happens in milliseconds; converting the fetched rows into XML is what takes a huge amount of time. Around 15,000 rows need to be converted, and the file generated for them is around 9 MB.
It is taking 5-6 minutes to convert this into an XML file.
pdmenon
Posted: Thu Jan 13, 2011 3:17 am
Voyager
Joined: 05 Apr 2010    Posts: 80
Moreover, we can't upgrade to 6.1.0.8 at present: we have time constraints, and we would need more approvals from higher management.
fatherjack
Posted: Thu Jan 13, 2011 3:25 am
Knight
Joined: 14 Apr 2010    Posts: 522    Location: Craggy Island
What do your monitoring tools say is happening on the box?
_________________
Never let the facts get in the way of a good theory.
pdmenon
Posted: Thu Jan 13, 2011 3:42 am
Voyager
Joined: 05 Apr 2010    Posts: 80
fatherjack wrote:
What do your monitoring tools say is happening on the box?

We are not using any monitoring tools.
fatherjack
Posted: Thu Jan 13, 2011 3:53 am
Knight
Joined: 14 Apr 2010    Posts: 522    Location: Craggy Island
pdmenon wrote:
We are not using any monitoring tools.

What, none at all? Not even the ones that come with the OS, e.g. Task Manager on Windows?
_________________
Never let the facts get in the way of a good theory.
mgk
Posted: Thu Jan 13, 2011 4:40 am
Padawan
Joined: 31 Jul 2003    Posts: 1642
How are you doing the extract and transform? Can you post your code? If you can do the extract and transform in a single SELECT directly into the OutputRoot, you will get the best performance possible. If you SELECT into the Environment or LocalEnvironment and then transform afterwards, this will take longer...
Regards,
_________________
MGK
The postings I make on this site are my own and don't necessarily represent IBM's positions, strategies or opinions.
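For illustration, a minimal sketch of this single-SELECT approach. The table and column names are placeholders borrowed from the code posted later in the thread; this is not a tested flow:

```
-- Sketch only: map database rows straight into the output tree in
-- one statement, instead of SELECTing into Environment and then
-- copying field by field.
SET OutputRoot.XMLNSC.StructuredData.Body.TBL.Row[] =
    (SELECT T.column1, T.column2, T.column3
       FROM Database.TBL_Table1 AS T
      WHERE T.column1 = Environment.Variables.userId);
```

The broker builds the whole Row list in a single pass, so there is no repeated tree navigation and no intermediate copy of the result set.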
Vitor
Posted: Thu Jan 13, 2011 5:21 am
Grand High Poobah
Joined: 11 Nov 2005    Posts: 26093    Location: Texas, USA
pdmenon wrote:
converting the fetched rows into xml is taking huge time.

What, broadly, are you doing with this data? Is it a straight conversion, or is the data being processed in some way? Do you have the word CARDINALITY anywhere in the Compute node other than in CopyMessageHeaders?
More specifically, are you accessing any of these 15,000 rows with an [index]? Because that's going to cost you.
_________________
Honesty is the best policy.
Insanity is the best defence.
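To spell out why the [index] costs: a subscript like Row[i] is resolved by walking the siblings from the start of the list on every evaluation, so filling n rows that way does work proportional to n squared, while a reference simply steps to the next sibling. A contrived sketch (field names are made up for illustration):

```
-- Slow: each assignment walks from Row[1] up to Row[i] again.
SET OutputRoot.XMLNSC.Body.Row[i].value = i;

-- Fast: keep a reference to the tree and extend it at the end.
DECLARE outRow REFERENCE TO OutputRoot.XMLNSC.Body;
CREATE LASTCHILD OF outRow AS outRow NAME 'Row';
SET outRow.value = i;
```

Over 15,000 rows the difference between the two shapes is exactly the kind of minutes-versus-seconds gap described above.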
nimisha.parihar
Posted: Sun Jan 16, 2011 9:43 pm
Apprentice
Joined: 03 Nov 2010    Posts: 41
Here's a sample of the code we are using:
Code:
SET columnRecordCounter = columnRecordCounter + 1;
SET recordCounter = 1;
SET Environment.Variables.QueryRowCountCheck[] = PASSTHRU('SELECT * FROM TBL_Table1 WHERE TimeStamp >= ? AND Code1 = ? FETCH FIRST 1 ROWS ONLY', Environment.Variables.TS, Environment.Variables.userId);
SET OutputRoot.XMLNSC.StructuredData.Body.TBL[columnRecordCounter].(XMLNSC.Attribute)Name = 'Table 1';
IF (CARDINALITY(Environment.Variables.QueryRowCountCheck[]) = 0) THEN
    SET OutputRoot.XMLNSC.StructuredData.Body.TBL[columnRecordCounter].Header.Flush = 'No';
ELSE
    SET OutputRoot.XMLNSC.StructuredData.Body.TBL[columnRecordCounter].Header.Flush = 'Yes';
END IF;
IF (CARDINALITY(Environment.Variables.QueryRowCountCheck[]) > 0) THEN
    SET Environment.Variables.Query1[] = PASSTHRU('SELECT DISTINCT column1, column2, column3, column4, column5, column6, column7, column8 FROM TBL_Table1 WHERE column1 = ? GROUP BY column1, column2, column3, column4, column5, column6, column7, column8 FOR READ ONLY', Environment.Variables.userId);
    SET rowCount = CARDINALITY(Environment.Variables.Query1[]);
    SET OutputRoot.XMLNSC.StructuredData.Body.TBL[columnRecordCounter].Header.NoOfRecords = rowCount;
    IF (rowCount > 0) THEN
        DECLARE myref2 REFERENCE TO Environment.Variables.Query1[1];
        WHILE LASTMOVE(myref2) DO
            SET OutputRoot.XMLNSC.StructuredData.Body.TBL[columnRecordCounter].Row[recordCounter].(XMLNSC.Attribute)Item = recordCounter;
            SET OutputRoot.XMLNSC.StructuredData.Body.TBL[columnRecordCounter].Row[recordCounter].column1 = myref2.column1;
            SET OutputRoot.XMLNSC.StructuredData.Body.TBL[columnRecordCounter].Row[recordCounter].column2 = myref2.column2;
            SET OutputRoot.XMLNSC.StructuredData.Body.TBL[columnRecordCounter].Row[recordCounter].column3 = myref2.column3;
            SET OutputRoot.XMLNSC.StructuredData.Body.TBL[columnRecordCounter].Row[recordCounter].column4 = myref2.column4;
            SET OutputRoot.XMLNSC.StructuredData.Body.TBL[columnRecordCounter].Row[recordCounter].column5 = myref2.column5;
            SET OutputRoot.XMLNSC.StructuredData.Body.TBL[columnRecordCounter].Row[recordCounter].column6 = myref2.column6;
            SET OutputRoot.XMLNSC.StructuredData.Body.TBL[columnRecordCounter].Row[recordCounter].column7 = myref2.column7;
            SET OutputRoot.XMLNSC.StructuredData.Body.TBL[columnRecordCounter].Row[recordCounter].column8 = myref2.column8;
            SET recordCounter = recordCounter + 1;
            MOVE myref2 NEXTSIBLING;
        END WHILE;
    END IF;
END IF;

Also:
1. We have 17 such queries in the compute node that need to be converted to XML.
2. Each of these 17 queries returns multiple rows that need to be captured in the XML.
3. We have already used indexing on all the queries.
fjb_saper
Posted: Sun Jan 16, 2011 10:24 pm
Grand High Poobah
Joined: 18 Nov 2003    Posts: 20756    Location: LI,NY
The very poor performance comes from your
Code:
OutputRoot.XMLNSC.StructuredData.Body.TBL[columnRecordCounter].Row[recordCounter]

You were smart and declared an input reference to assign the values from.
Now it's time to declare output references as well, like
Code:
DECLARE mytbl REFERENCE TO OutputRoot.XMLNSC.StructuredData.Body.TBL[1];
DECLARE myrow REFERENCE TO mytbl.Row[1];
SET myrow.column1 = myref2.column1;

And to move on to the next row you can use
Code:
CREATE NEXTSIBLING OF myrow AS myrow NAME 'Row';

This should speed up your code considerably.
Have fun!
_________________
MQ & Broker admin
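Putting that advice together, the earlier WHILE loop might be rewritten along these lines. This is an untested sketch reusing the names from the posted code; only two columns are shown:

```
-- Untested sketch: replace the Row[recordCounter] subscripts with
-- references that extend the output tree one sibling at a time.
DECLARE mytbl REFERENCE TO OutputRoot.XMLNSC.StructuredData.Body;
CREATE LASTCHILD OF mytbl AS mytbl NAME 'TBL';
SET mytbl.(XMLNSC.Attribute)Name = 'Table 1';
DECLARE myrow REFERENCE TO mytbl;
DECLARE myref2 REFERENCE TO Environment.Variables.Query1[1];
DECLARE recordCounter INTEGER 1;
WHILE LASTMOVE(myref2) DO
    -- append a new Row element and point myrow at it
    CREATE LASTCHILD OF mytbl AS myrow NAME 'Row';
    SET myrow.(XMLNSC.Attribute)Item = recordCounter;
    SET myrow.column1 = myref2.column1;
    SET myrow.column2 = myref2.column2;
    -- ... columns 3 to 8 assigned the same way
    SET recordCounter = recordCounter + 1;
    MOVE myref2 NEXTSIBLING;
END WHILE;
```

Because both the input and output positions are carried by references, each iteration does a constant amount of navigation regardless of how many rows have already been written.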
nimisha.parihar
Posted: Sun Jan 16, 2011 10:57 pm
Apprentice
Joined: 03 Nov 2010    Posts: 41
Thanks... Let me just try that out.
Will keep everyone posted.
pdmenon
Posted: Thu Jan 27, 2011 11:27 pm
Voyager
Joined: 05 Apr 2010    Posts: 80
Quote:
Dear Experts,
We are retrieving data from DB2 and converting it into an XML file using MB 6.1.0.4 with a Compute node. This activity is taking a lot of time.
How can I optimize it?
Thanks

Quote:
Step 1: Update to 6.1.0.8.

Dear Experts,
Are we likely to see improved performance on our XML pain point if we upgrade to MB 7.0?
Vitor
Posted: Fri Jan 28, 2011 5:06 am
Grand High Poobah
Joined: 11 Nov 2005    Posts: 26093    Location: Texas, USA
pdmenon wrote:
Are we likely to see improved performance on our XML pain point if we upgrade to MB 7.0?

You're likely to see some.
I'd expect you to see more if you use a reference instead of an index, as suggested a few posts back. As I said earlier, an [index] construction will cost you with a large structure.
_________________
Honesty is the best policy.
Insanity is the best defence.