wilsonho3
Posted: Mon May 04, 2009 2:26 am    Post subject: How to resolve this strange issue
Voyager
Joined: 20 Nov 2001    Posts: 98    Location: Hong Kong
I have a job to clear messages from the dead-letter queue, and it fails with the following errors:
CSQU950I CSQUDLQ IBM WebSphere MQ for z/OS V6
CSQU200I CSQUDLQ Dead-letter Queue Handler Utility - 2009-05-01 18:33:56
CSQU220E Unable to connect to queue manager QS1A, MQCC=2 MQRC=2058 <-- Q mgr name error,
CSQU221E Unable to open queue manager, MQCC=2 MQRC=2018 <-- Hconn error
CSQU202I Dead-letter queue handler ending. Successful actions: 0 retries, 0 for
How can these two errors appear together? I cannot control the MQOPEN inside the dead-letter queue handler utility. How do I solve this? Any ideas, MQ experts on the forum?
wilson
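For context, a minimal sketch of the batch JCL the CSQUDLQ utility is typically submitted with (this is not from the original post; the thlqual.* dataset names and the queue names are installation-specific placeholders). The rules table on SYSIN names the queue manager via INPUTQM, and the 2058 means that name does not match any MQ subsystem on the LPAR where the job actually runs:

```
//DLQH     EXEC PGM=CSQUDLQ
//STEPLIB  DD DISP=SHR,DSN=thlqual.SCSQANLE
//         DD DISP=SHR,DSN=thlqual.SCSQAUTH
//         DD DISP=SHR,DSN=thlqual.SCSQLOAD
//SYSOUT   DD SYSOUT=*
//SYSIN    DD *
INPUTQM(QS1A) INPUTQ(SYSTEM.DEAD.LETTER.QUEUE)
ACTION(RETRY)
/*
```

The first SYSIN line is the control-data entry (which queue manager to connect to and which queue to process); the lines after it are the pattern/action rules, here a catch-all retry shown purely for illustration.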
fjb_saper
Posted: Mon May 04, 2009 2:32 am    Post subject:
Grand High Poobah
Joined: 18 Nov 2003    Posts: 20756    Location: LI, NY
Where do you execute the DLQ handler? What is the environment? If it is CICS, you can only have one queue manager connected to a CICS region.
I would suggest you make sure you submit the DLQ handler from the right environment...
Have fun
_________________
MQ & Broker admin
Vitor
Posted: Mon May 04, 2009 2:42 am    Post subject:
Grand High Poobah
Joined: 11 Nov 2005    Posts: 26093    Location: Texas, USA
fjb_saper wrote:
Where do you execute the DLQ handler? What is the environment? If it is CICS you can only have one qmgr connected to a CICS environment.
And if it's in JCL, check that QS1A is accessible from where (and how) the job is run.
A 2058, even on z/OS, means the same as always. Diagnose it as such.
_________________
Honesty is the best policy.
Insanity is the best defence.
Mr Butcher
Posted: Mon May 04, 2009 5:59 am    Post subject:
Padawan
Joined: 23 May 2005    Posts: 1716
I agree with Vitor, and bet my $$$ on a job running in the wrong LPAR.
_________________
Regards, Butcher
zhanghz
Posted: Thu May 07, 2009 6:04 pm    Post subject:
Disciple
Joined: 17 Jun 2008    Posts: 186
Just to clarify, running a job from the wrong LPAR will give 2059 (MQRC_Q_MGR_NOT_AVAILABLE) instead of 2058.
Mr Butcher
Posted: Thu May 07, 2009 9:52 pm    Post subject:
Padawan
Joined: 23 May 2005    Posts: 1716
Haha, true. I should not bet $$$ on anything.
_________________
Regards, Butcher
kevinf2349
Posted: Mon May 11, 2009 5:20 am    Post subject:
Grand Master
Joined: 28 Feb 2003    Posts: 1311    Location: USA
zhanghz wrote:
Just to clarify, running a job from the wrong LPAR will give 2059 (MQRC_Q_MGR_NOT_AVAILABLE) instead of 2058.
Not so.
I have just run a test job for a queue manager on the wrong LPAR and got:
CSQU120I Connecting to MQPR
CSQU009E MQCONN failed for MQPR. MQCC= 2 MQRC=2058
Running a job on an LPAR with an inactive queue manager yields:
CSQU120I Connecting to MQQA
CSQU009E MQCONN failed for MQQA. MQCC= 2 MQRC=2059
Just to clarify:
The job that ran on the wrong LPAR ran on a system that doesn't have MQPR defined as a subsystem at all.
The job that yielded the 2059 has MQQA defined as a subsystem, but it isn't started.
zhanghz
Posted: Mon May 11, 2009 5:43 pm    Post subject:
Disciple
Joined: 17 Jun 2008    Posts: 186
Hi Kevin, thanks for the clarification. I agree. I ran the test again, trying to connect to a non-existent queue manager, and yes, it gave 2058. So Mr Butcher's money is safe.
In my previous test, I submitted the job on the wrong LPAR, but within the same sysplex, and the queue manager is defined as a subsystem there. The test was not exhaustive, my bad.
bruce2359
Posted: Tue May 12, 2009 4:58 am    Post subject:
Poobah
Joined: 05 Jan 2008    Posts: 9469    Location: US: west coast, almost. Otherwise, enroute.
Quote:
In my previous test, I submitted a job in the wrong LPAR, but within the same SYSPLEX
All MQCONNects (the exception being the client) are local - in the same o/s image (same LPAR). Did you mean Parallel Sysplex? Since the DLQ handler identifies a real local queue (the DLQ), connecting to the QSG is not a good option.
Quote:
and the QMGR is defined as a sub-system. The test was not exhaustive, my bad.
Qmgrs must be defined in the subsystem name table.
_________________
I like deadlines. I like to wave as they pass by.
ב''ה
Lex Orandi, Lex Credendi, Lex Vivendi. As we Worship, So we Believe, So we Live.
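To illustrate the subsystem name table point: a queue manager is made known to an LPAR through an entry in the IEFSSNxx parmlib member, along these lines (a sketch in keyword format; the command prefix !QS1A and the scope value shown are illustrative, so check your installation's actual values):

```
SUBSYS SUBNAME(QS1A)
       INITRTN(CSQ3INI)
       INITPARM('CSQ3EPX,!QS1A,S')
```

A job submitted on an LPAR whose IEFSSNxx has no such entry for the named queue manager gets the MQRC 2058 on MQCONN discussed above, regardless of what is running elsewhere in the sysplex.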
zhanghz
Posted: Wed May 13, 2009 8:26 pm    Post subject:
Disciple
Joined: 17 Jun 2008    Posts: 186
bruce2359 wrote:
Quote:
In my previous test, I submitted a job in the wrong LPAR, but within the same SYSPLEX
All MQCONNects (the exception being the client) are local - in the same o/s image (same LPAR). Did you mean Parallel Sysplex? Since the DLQ handler identifies a real local queue (the DLQ), connecting to the QSG is not a good option.
...
We have 2 LPARs, sharing DASD etc. I have heard people call it a sysplex; I don't know if they are referring to a Parallel Sysplex. I am not familiar with Sysplex.
bruce2359
Posted: Wed May 13, 2009 8:36 pm    Post subject:
Poobah
Joined: 05 Jan 2008    Posts: 9469    Location: US: west coast, almost. Otherwise, enroute.
Some use "sysplex" and "Parallel Sysplex" to mean the same thing. The big difference is whether your configuration uses a Coupling Facility (CF) to store database tables and MQ shared queues. If you have a CF, then you are a Parallel Sysplex.
If you are a Parallel Sysplex AND have implemented shared queues, any qmgr in the queue-sharing group (QSG) can get or put to shared queues in the CF, and an application can MQCONNect either to a specific qmgr OR to any qmgr in the QSG.
This last item is where your attempt to run the DLQ handler failed. Since each DLQ is unique to a specific qmgr, your batch JCL must identify which qmgr to MQCONNect to and the name of the DLQ you want to process.
_________________
I like deadlines. I like to wave as they pass by.
ב''ה
Lex Orandi, Lex Credendi, Lex Vivendi. As we Worship, So we Believe, So we Live.