maven
Posted: Thu Nov 13, 2014 9:44 am Post subject: datatype in DFDL to be dependent on database
We have a requirement to validate a flat file against data types that are dynamic.
We have a database in which the user sets the data type of each column required in the output file (for example: string, 999, 9.99, 99.999), and when the file created in broker is validated, the data in each column has to be validated against the data type set for that column in the database.
Is it possible to accommodate such a requirement in WMB, and is it a good approach?
Tools used:
-------------
MB v8
SQL Server 2008

Vitor
Posted: Thu Nov 13, 2014 10:13 am Post subject: Re: datatype in DFDL to be dependent on database
maven wrote:
Is it possible to accommodate such a requirement in WMB
No.

maven wrote:
also is it a good approach.
No.

mqjeff
Posted: Thu Nov 13, 2014 10:23 am Post subject: Re: datatype in DFDL to be dependent on database
Vitor wrote:
maven wrote:
Is it possible to accommodate such a requirement in WMB
No.
Not using DFDL or an XMLNSC schema, or any of the predefined parsers.
Can you write code to do this? Yes.

Vitor wrote:
maven wrote:
also is it a good approach.
No.

kimbert
Posted: Thu Nov 13, 2014 3:14 pm
Quote:
We have a database in which the user sets the data type of each column required in the output file (for example: string, 999, 9.99, 99.999), and when the file created in broker is validated, the data in each column has to be validated against the data type set for that column in the database.
If the datatypes can change, then so will the processing logic in the message flow. So it will be easier to:
- model all of the columns as xs:string
- whenever a datatype changes, adjust the message flow logic (if necessary) to cope with the change in the data format.
That is much simpler, since you are only changing one part of the system instead of two.
But I have to ask why you want this flexibility. How often do you expect the datatypes to change? How can you safely write integration logic when the input data changes on a regular basis?

Vitor
Posted: Fri Nov 14, 2014 5:42 am
kimbert wrote:
But I have to ask why you want this flexibility. How often do you expect the datatypes to change? How can you safely write integration logic when the input data changes on a regular basis?

akil
Posted: Fri Nov 14, 2014 9:53 am
My guess (from having seen similar requirements):
1. Business users will define file formats via a user interface.
2. Files that come in get validated against the defined format.
3. On a successful validation, the elements are inserted into a table (all columns are strings, to accommodate any data type).
4. Further processing (reading records and doing something) is via predefined database functions.

maven
Posted: Tue Nov 18, 2014 10:13 am
Thanks, guys, for your replies.
Well, the datatype will not change frequently, but this is required because the option was provided by the old application.
We were using DFDL with the current datatypes, but since the datatypes are dynamic we have now modelled them all as string. However, I am not sure how to validate the data against the datatype in ESQL.
Akil, based on your questions I believe you have faced similar requirements:
1. Business users will define file formats via a user interface.
-> Yes.
2. Files that come in get validated against the defined format.
-> Yes.
3. On a successful validation, the elements are inserted into a table (all columns are strings, to accommodate any data type).
-> On successful validation the data will be written to the file.
4. Further processing (reading records and doing something) is via predefined database functions.
-> Read the data from the DB, validate the data against the datatype stored in the DB by the user interface, and if successful write it to the file.
Please do let me know if there is an efficient way to write this in ESQL. Thanks again.
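
A minimal sketch of what that check could look like in ESQL, assuming the column types read from the database are the picture-style strings quoted above ('string', '999', '9.99', ...). The function name, the lack of sign handling and the simple digit-count rules are illustrative only, not a definitive implementation:
Code:
-- Hypothetical helper: returns TRUE if a string value conforms to a
-- simple picture-style datatype such as 'string', '999' or '9.99'.
CREATE FUNCTION MatchesType(IN val CHARACTER, IN typeSpec CHARACTER) RETURNS BOOLEAN
BEGIN
    -- 'string' accepts any value
    IF typeSpec = 'string' THEN
        RETURN TRUE;
    END IF;

    -- The value must at least be numeric: attempt a CAST and trap any failure.
    DECLARE isNumeric BOOLEAN TRUE;
    BEGIN
        DECLARE CONTINUE HANDLER FOR SQLSTATE LIKE '%' SET isNumeric = FALSE;
        DECLARE dummy DECIMAL CAST(val AS DECIMAL);
    END;
    IF NOT isNumeric THEN
        RETURN FALSE;
    END IF;

    -- Compare the digits allowed by the picture with the digits actually present.
    DECLARE dotSpec INTEGER POSITION('.' IN typeSpec);
    DECLARE dotVal  INTEGER POSITION('.' IN val);
    DECLARE intAllowed INTEGER;
    DECLARE decAllowed INTEGER;
    DECLARE intActual  INTEGER;
    DECLARE decActual  INTEGER;
    IF dotSpec = 0 THEN
        SET intAllowed = LENGTH(typeSpec);
        SET decAllowed = 0;
    ELSE
        SET intAllowed = dotSpec - 1;
        SET decAllowed = LENGTH(typeSpec) - dotSpec;
    END IF;
    IF dotVal = 0 THEN
        SET intActual = LENGTH(val);
        SET decActual = 0;
    ELSE
        SET intActual = dotVal - 1;
        SET decActual = LENGTH(val) - dotVal;
    END IF;
    RETURN intActual <= intAllowed AND decActual <= decAllowed;
END;

The per-column type strings would be read once per file (for example with a SELECT against the table maintained by the user interface) and the function called for every field while looping over the records.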

mqjeff
Posted: Tue Nov 18, 2014 10:16 am
Perhaps you could simply let the "user interface" with which the business users define the format be the Toolkit with its DFDL editor.

Vitor
Posted: Tue Nov 18, 2014 10:23 am
maven wrote:
Well, the datatype will not change frequently, but this is required because the option was provided by the old application.
How did the old application handle the change of integration logic occasioned by the change in data type?
How often was the option that the old application provided actually used? Does the frequency of change justify the cost of including it in IIB?
What business need does this option satisfy? Is the business stakeholder happy to foot the bill for this work?

kimbert
Posted: Tue Nov 18, 2014 11:18 am
If this guess from Akil is correct:
Quote:
1. Business users will define file formats via a user interface.
2. Files that come in get validated against the defined format.
3. On a successful validation, the elements are inserted into a table (all columns are strings, to accommodate any data type).
4. Further processing (reading records and doing something) is via predefined database functions.
...then I don't think you gain anything by trying to write a flexible 'validation' message flow. It would be simpler to put the custom logic that performs the validation into one more 'database function'.
I am not recommending that approach - I am just pointing out that using WMB or IIB in this way gets you the worst of both worlds.

akil
Posted: Tue Nov 18, 2014 11:52 pm
Hi,
I think it would meet the purpose if the following is done:
1. Give users a front end that creates a DFDL schema (this is possible; the Java libraries to validate the DFDL are provided with the ESB installation).
2. Use ESQL ASBITSTREAM to validate (you can't use the Validate or input nodes to validate, as the domain and model names in those nodes aren't configurable programmatically).
This still doesn't work very well, however, because the validation exceptions raised aren't easy to parse and display in a meaningful way to users. It works, but you'll get complaints that the exceptions could be written better (for example, the element that fails validation is buried in an 'Insert/Text' that has to be parsed out).
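
On that last point, one way to get something more readable out of the broker exception tree is to walk the ExceptionList and collect the Insert values. A minimal sketch, assuming the flow catches the validation failure (for example on a Catch terminal) so that InputExceptionList is populated; the procedure name and the simple concatenation of the inserts are illustrative only:
Code:
-- Hypothetical helper: walks an exception tree and appends the Insert text
-- of every nested exception to a single displayable string.
CREATE PROCEDURE CollectInserts(IN exRef REFERENCE, INOUT msg CHARACTER)
BEGIN
    DECLARE child REFERENCE TO exRef;
    MOVE child FIRSTCHILD;
    WHILE LASTMOVE(child) DO
        IF FIELDNAME(child) = 'Insert' THEN
            -- each Insert carries one substitution value of the error message
            SET msg = msg || COALESCE(CAST(child.Text AS CHARACTER), '') || ' ';
        ELSEIF FIELDNAME(child) LIKE '%Exception' THEN
            -- nested exceptions describe the more specific, deeper cause
            CALL CollectInserts(child, msg);
        END IF;
        MOVE child NEXTSIBLING;
    END WHILE;
END;

-- Usage, for example in a Compute node wired to a Catch terminal:
-- DECLARE userMsg CHARACTER '';
-- CALL CollectInserts(InputExceptionList, userMsg);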

maven
Posted: Wed Nov 26, 2014 10:21 am
How did the old application handle the change of integration logic occasioned by the change in data type?
-> New functionality was required in broker.
How often was the option that the old application provided actually used? Does the frequency of change justify the cost of including it in IIB?
-> It is a case where there is no clarity about the process, but this is what is required. All that is known is that it needs to be done for a few fields whose datatype is dynamic.
What business need does this option satisfy? Is the business stakeholder happy to foot the bill for this work?
-> To give the user the flexibility to change the data type of a few important fields at will.
Give users a front end that creates a DFDL schema (this is possible; the Java libraries to validate the DFDL are provided with the ESB installation).
-> But a change to the DFDL would require a deployment, and that is difficult to do every time on the Prod environment. A front end exists, but it is not integrated with broker.
Use ESQL ASBITSTREAM to validate (you can't use the Validate or input nodes to validate, as the domain and model names in those nodes aren't configurable programmatically).
-> Not sure how to do this step.
Can I use pattern and subpattern to get this working?
http://www-01.ibm.com/support/knowledgecenter/SSKM8N_8.0.0/com.ibm.etools.mft.doc/ak05615_.htm
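
On the ASBITSTREAM step: the general idea is to own the record tree in the DFDL domain and then force validation from ESQL, either while parsing an incoming bitstream (CREATE ... PARSE) or while serializing an outgoing one (ASBITSTREAM), with the validation options switched on. A minimal sketch for the outgoing direction, assuming a deployed DFDL schema whose root element is called 'record'; the element and variable names are illustrative, and how the DFDL parser selects the message model for your particular schema should be checked against the Knowledge Center:
Code:
CREATE COMPUTE MODULE BuildAndValidateRecord
    CREATE FUNCTION Main() RETURNS BOOLEAN
    BEGIN
        -- Build the output record in the DFDL domain
        -- ('record', 'col1' and 'col2' are illustrative names only).
        CREATE LASTCHILD OF OutputRoot DOMAIN('DFDL') NAME 'DFDL';
        SET OutputRoot.DFDL.record.col1 = Environment.Variables.DbRow.COL1;
        SET OutputRoot.DFDL.record.col2 = Environment.Variables.DbRow.COL2;

        -- Serialize with validation switched on; ValidateException makes any
        -- failure throw, so it can be caught and reported back to the user.
        DECLARE validateOptions INTEGER BITOR(ValidateContentAndValue, ValidateException);
        DECLARE discard BLOB ASBITSTREAM(OutputRoot.DFDL OPTIONS validateOptions CCSID 1208);

        -- Reaching this point means the record matched the model and can be
        -- propagated to the FileOutput node to be written.
        RETURN TRUE;
    END;
END MODULE;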

kimbert
Posted: Wed Nov 26, 2014 2:31 pm
Quote:
How did the old application handle the change of integration logic occasioned by the change in data type?
-> New functionality was required in broker.
I am puzzled by your answers. Most of them do not appear to answer the questions.
Please can you explain your requirements more clearly, using examples? Thanks.