fr3dl33
Posted: Thu Nov 14, 2002 2:25 pm Post subject: [Solved] Dynamic Arrays: Can you do this?!
Newbie
Joined: 14 Nov 2002 Posts: 6
I'm trying to make a data structure in Buildtime grow dynamically. I know that you can set an array to a fixed size (e.g. 20), but I want it to "grow" depending on the message I receive.
So... how do I do this?
jmac
Posted: Thu Nov 14, 2002 2:52 pm Post subject:
Jedi Knight
Joined: 27 Jun 2001 Posts: 3081 Location: EmeriCon, LLC
Unfortunately, you can't... You will have to dimension the array to the maximum number of entries you will ever need.
_________________
John McDonald
RETIRED
fr3dl33
Posted: Thu Nov 14, 2002 2:58 pm Post subject:
Newbie
Joined: 14 Nov 2002 Posts: 6
So, what's stopping me from setting the array to 1,000,000?
jmac
Posted: Thu Nov 14, 2002 3:39 pm Post subject:
Jedi Knight
Joined: 27 Jun 2001 Posts: 3081 Location: EmeriCon, LLC
You are currently limited to a maximum of 512 members in a data structure.
_________________
John McDonald
RETIRED
Ratan
Posted: Thu Nov 14, 2002 5:08 pm Post subject:
Grand Master
Joined: 18 Jul 2002 Posts: 1245
And that is 512 members in total. That is, your array will hold fewer than 512 elements, depending on how many other members you have.
-Laze
Vladimir
Posted: Thu Nov 14, 2002 7:24 pm Post subject:
Acolyte
Joined: 14 Nov 2002 Posts: 73 Location: USA, CA, Bay Area
We had the same problem in our project. Our arrays are dynamic and can range from 0 to several thousand elements.
We had a pretty long discussion with the WF development team in Germany, and they suggested we keep our data structures as small as possible. Literally small. And yes, 512 members is a limit, but there is another limit in the import/export utility: it will stop loading your data structure definitions when one of those definitions reaches 8K (or 10K; I don't remember the exact number, but it is not big). This is because the utility translates your FDL into an internal text language, and there is a limit on statement size in that language.
Another limitation is performance. They said WF performance degrades significantly as the amount of data you put in the data structures grows, and this is my biggest concern. Even if IBM can fix the previous issues, this problem doesn't seem to be on their plate these days.
So my suggestion is: do not plan on your data structures being big or growing significantly later. Keep them small.
The way we solved this problem: in our data structures where we have dynamic arrays, we defined only one member, a pointer into our own database, where the actual text of the dynamic array is stored in XML form. And we have a layer of code that stores and retrieves these arrays to and from our database.
I know it is not a perfect solution, because you have to write some code, keep your database running (even if you have it running anyway), and XML input validation will not be applied (if you use the XML interface).
But for me it seems to be the only solution. At least for now.
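A minimal sketch of this pattern, with all names illustrative and a Map standing in for the application database table: the MQWF container would carry only the key, while the variable-length array is serialized to XML and stored outside workflow.

```java
import java.util.*;

// Sketch of the "pointer in the container, array in our own DB" pattern.
// A HashMap stands in for the single-column database table; values are
// assumed not to contain XML markup (no escaping is done here).
public class DynamicArrayStore {
    private final Map<String, String> table = new HashMap<>();

    // Serialize the array to a flat XML document and store it under the key.
    public void save(String key, List<String> values) {
        StringBuilder xml = new StringBuilder("<array>");
        for (String v : values) {
            xml.append("<item>").append(v).append("</item>");
        }
        xml.append("</array>");
        table.put(key, xml.toString());
    }

    // Retrieve the XML and parse it back into a list; an unknown key
    // yields an empty list (an array of size zero).
    public List<String> load(String key) {
        List<String> out = new ArrayList<>();
        String xml = table.get(key);
        if (xml == null) return out;
        int i = 0;
        while ((i = xml.indexOf("<item>", i)) >= 0) {
            int j = xml.indexOf("</item>", i);
            out.add(xml.substring(i + "<item>".length(), j));
            i = j;
        }
        return out;
    }
}
```

In the real system the save/load layer would sit in front of the database, and the workflow container would hold only the key string.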
jmac
Posted: Fri Nov 15, 2002 5:21 am Post subject:
Jedi Knight
Joined: 27 Jun 2001 Posts: 3081 Location: EmeriCon, LLC
Vladimir is of course right... The data structures should be as small as possible for performance reasons. The database footprint will grow as each new instance is created.
For example: I have a process with 5 program activities. Every container in the diagram is assigned to the structure "TestContainer", which holds 100 bytes of data. So, for each instance you will have 1,200 bytes of container data (100 for each of the 12 containers). Now, if you have 100 instances active concurrently, that is 120,000 bytes. Granted, this is not much, but I am just trying to illustrate the effect that container size has on your database footprint.
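The arithmetic above can be checked with a quick back-of-the-envelope calculation; this is a sketch only, and does not model MQWF's actual per-row database overhead.

```java
// Container bytes scale linearly with container size, container count,
// and concurrent instance count.
public class ContainerFootprint {
    public static long footprintBytes(int containers, int bytesPerContainer, int instances) {
        return (long) containers * bytesPerContainer * instances;
    }

    public static void main(String[] args) {
        // jmac's example: 12 containers x 100 bytes x 100 concurrent instances
        System.out.println(footprintBytes(12, 100, 100)); // 120000
    }
}
```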
REMEMBER: the only data you need in the process diagram is the data that is necessary for process model navigation, plus the KEYS to obtain any other "application" data from your application database. Do not abuse the MQWF database by putting unnecessary data into it.
GOOD LUCK
_________________
John McDonald
RETIRED
fr3dl33
Posted: Fri Nov 15, 2002 7:32 am Post subject:
Newbie
Joined: 14 Nov 2002 Posts: 6
So, I guess you are talking about storing the messages in a separate DB, and only passing the data required for navigating the process model into MQWF. That's fine; with some thought, I can architect that.
But that still doesn't solve my issue of a dynamically growing set of data. I too have the issue of zero or possibly many sets of data, and I have no way of knowing how many will be in a particular incoming message.
Hmm... I'll have to think about this more. If anyone else has some thoughts, I'd appreciate it.
Vladimir
Posted: Fri Nov 15, 2002 3:24 pm Post subject:
Acolyte
Joined: 14 Nov 2002 Posts: 73 Location: USA, CA, Bay Area
As I said, we are storing these "dynamic arrays" in our own database (actually one single field in one single table), and we did this first of all to solve the problem of arrays of unknown size. It reduced the size of our data structures too, but that is just a logical side effect and not the primary goal we were aiming for.
So we solved this "dynamic arrays" problem, we are using it in production, and everybody is quite happy with the solution.
ucbus1
Posted: Sat Nov 23, 2002 8:24 am Post subject:
Knight
Joined: 30 Jan 2002 Posts: 560
Just to support what Vladimir and John put forth: we have a separate database in DB2 (an application database) where we store the information under a unique key, and the key is part of the instance name. So we just pass the key along through each of the activities and use stored procedures to retrieve the data with the key as and when needed.
Now, after viewing the discussion, I realize the logic behind it.
Thanks
educos
Posted: Fri Jan 03, 2003 1:08 pm Post subject:
Apprentice
Joined: 18 Jul 2001 Posts: 34 Location: Salt Lake City, UT
If you never have to reference any value in your array in the process model or activity description, and as long as you keep the size of your array under control, you could consider using a binary container element where you store your array (or any serializable object, Java or otherwise) as a serialized object.
Besides the fact that passing a serialized business object through your process instance data flow offers interesting possibilities (again, as long as you know you can stay reasonable size-wise), this approach is very flexible, and in the case of a serialized array it allows you to dynamically grow or shrink it as needed.
If all your activity implementations are Java-based, then you can use the Java serialization mechanism throughout; you can even compress your byte array to save space prior to storing it in the binary container element. Otherwise, you'd have to come up with your own serialization scheme (XDR can be used in C and C++).
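A sketch of that combination, assuming all readers and writers of the container are Java-based: serialize any Serializable object, gzip the bytes, and Base64-encode the result so it can travel through a text transport. Class and method names here are illustrative, not part of any MQWF API.

```java
import java.io.*;
import java.util.Base64;
import java.util.zip.*;

// Serialize -> compress -> Base64-encode, and the reverse. The Base64
// step is only needed when the bytes must ride in a text payload.
public class BinaryContainerCodec {

    public static String encode(Serializable obj) throws IOException {
        ByteArrayOutputStream bytes = new ByteArrayOutputStream();
        try (ObjectOutputStream out = new ObjectOutputStream(new GZIPOutputStream(bytes))) {
            out.writeObject(obj); // closing the stream writes the gzip trailer
        }
        return Base64.getEncoder().encodeToString(bytes.toByteArray());
    }

    @SuppressWarnings("unchecked")
    public static <T extends Serializable> T decode(String base64)
            throws IOException, ClassNotFoundException {
        byte[] raw = Base64.getDecoder().decode(base64);
        try (ObjectInputStream in = new ObjectInputStream(
                new GZIPInputStream(new ByteArrayInputStream(raw)))) {
            return (T) in.readObject();
        }
    }
}
```

Because the payload round-trips through Java serialization, both ends must share compatible class versions; that is the usual caveat of this design.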
By the way, and this could be a problem: I have never dealt with binary container elements in UPES activities, so I don't know how one would look in a UPES XML message... Maybe someone knows? I guess you could Base64-encode the bytes in your data container element... hmm, maybe this is more trouble than it's worth.
_________________
Eric Ducos
EmeriCon, LLC.
Phone: (801) 789-4348
e-Mail: Eric.Ducos@EmeriCon.com
Website: www.EmeriCon.com
nwhi
Posted: Tue Jan 07, 2003 4:10 am Post subject:
Apprentice
Joined: 19 Dec 2002 Posts: 25 Location: UK
It's an interesting discussion topic: whether to pass information through workflow even if it isn't key indexing data or data used for workflow control.
Sometimes it is just a convenient way of passing information from system A to system B, but in general I try to avoid any such data in a workflow.
If you're using the dynamic arrays somehow in the workflow, perhaps to call an external system with each value in turn, then you'll still need another program call to increment the array counter (at least). Why not store the dynamic array outside workflow and make the program increment the counter and retrieve one value at the same time? Then you just use a single value in workflow, understanding that the value is maintained by means outside of workflow control.
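That "increment and fetch in one call" idea can be sketched as follows; names are illustrative, and a Map stands in for the external store. The workflow container would hold only the key and the single counter value.

```java
import java.util.*;

// The array lives outside workflow. One program call reads the element
// at the current counter; the caller then writes counter + 1 back into
// the single-valued workflow container.
public class ExternalArrayCursor {
    private final Map<String, List<String>> store = new HashMap<>();

    public void put(String key, List<String> values) {
        store.put(key, new ArrayList<>(values));
    }

    // Returns the element at position `counter`, or null once the array
    // is exhausted (which the process model can use as its exit condition).
    public String next(String key, int counter) {
        List<String> values = store.get(key);
        if (values == null || counter < 0 || counter >= values.size()) {
            return null;
        }
        return values.get(counter);
    }
}
```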
As for workflow performance: I believe the operation of MQWF is optimized for small data containers, say less than 1K (I don't have the exact size to hand). I think these are passed around and accessed internally as a whole, but if the container exceeds a certain size, it takes another level of indirection (i.e. another database table join) to access the data. Note that the calculations for the size of a data container include the element names; there's a SupportPac performance document that gives some insight into this... but it's not for the faint-hearted!
_________________
Nick Whittle
IBM Certified Solutions Designer -
WebSphere MQ Workflow V3.4
MQSolutions (UK) Ltd