MQ Exception 2009 and 2019
robiijohn
Posted: Mon Oct 20, 2008 5:56 pm    Post subject: MQ Exception 2009 and 2019


Hi,
I have been working on an MDB lately and ran into the following errors while trying to send a message from my system to MQ.

All the connections are established when the service is brought up in WebSphere. My application is hosted on WebSphere Application Server 6.1.0.11. For every incoming message on a particular queue, my program consumes it, and in onMessage() another message is sent to a different queue. The program consumes the first message fine, but it fails to write to the other queue.
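The send path inside onMessage() has roughly the shape sketched below. This is only an outline of what the bean does, not the actual code, and the JNDI names (jms/MyQCF, jms/OutboundQueue) are placeholders, not the real ones:

    public void onMessage(javax.jms.Message inbound) {
        javax.jms.QueueConnection conn = null;
        try {
            javax.naming.InitialContext ctx = new javax.naming.InitialContext();
            javax.jms.QueueConnectionFactory qcf =
                    (javax.jms.QueueConnectionFactory) ctx.lookup("jms/MyQCF");
            javax.jms.Queue outQueue = (javax.jms.Queue) ctx.lookup("jms/OutboundQueue");

            conn = qcf.createQueueConnection();
            javax.jms.QueueSession session =
                    conn.createQueueSession(false, javax.jms.Session.AUTO_ACKNOWLEDGE);
            javax.jms.QueueSender sender = session.createSender(outQueue);
            // This send is where the failure shows up (see the log extract below).
            sender.send(session.createTextMessage("response payload"));
        } catch (javax.jms.JMSException je) {
            // Logged as "MQJE001: Completion Code 2, Reason 2019"
            je.printStackTrace();
        } catch (javax.naming.NamingException ne) {
            ne.printStackTrace();
        } finally {
            if (conn != null) {
                try { conn.close(); } catch (javax.jms.JMSException ignore) { }
            }
        }
    }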

I'm getting two errors: first MQ reason code 2009 and then MQ reason code 2019.

I did some searching on the error below, which is an extract from the log my program writes. People say it is related to an MQ problem or something external rather than to my application. It is apparently a known issue, and there is even a fix for MQ on AIX, but sadly I couldn't find one for Windows.

Can anyone help me by letting me know how to handle this programmatically? As far as the connection objects are concerned, nothing is being lost or coming back null; everything looks OK. Rather, I believe there is some problem with the physical connection between MQ and WebSphere, but I'm not sure. Please share your suggestions.

I tried changing the WebSphere connection pool and session pool purge policy to EntirePool and so on, but no luck!

LOG EXTRACT FOR THE ISSUE


--------------------------------------------------------------------------------
FINE: Time : 06/10/2008 05:43:55.921 Inside SendSyncMessage
Oct 6, 2008 5:43:55 PM ejbs.MAPListenerBean
FINE: Time : 06/10/2008 05:43:55.921 IP Queue Name for Request Message : UIST.ONL.RQ.SVRG1.EWSS.BW
Oct 6, 2008 5:43:55 PM ejbs.MAPListenerBean
INFO: Reason Code : 2019
Oct 6, 2008 5:43:55 PM ejbs.MAPListenerBean
INFO: Reason Code : MQJE001: Completion Code 2, Reason 2019
Oct 6, 2008 5:43:55 PM ejbs.MAPListenerBean
SEVERE: MQException Occurred MQJE001: Completion Code 2, Reason 2019


--------------------------------------------------------------------------------

Thanks,
Robins John
Gaya3
Posted: Mon Oct 20, 2008 8:25 pm


Reason Code 2019 x'7E3'
MQRC_HOBJ_ERROR

The object handle Hobj is not valid. If the handle is a shareable handle, the handle may have been made invalid by another thread issuing the MQCLOSE call using that handle. If the handle is a nonshareable handle, the call may have been issued by a thread that did not create the handle. This reason also occurs if the parameter pointer is not valid, or (for the MQOPEN call) points to read-only storage. (It is not always possible to detect parameter pointers that are not valid; if not detected, unpredictable results occur.)

Corrective action: Ensure that a successful MQOPEN call is performed for this object, and that an MQCLOSE call has not already been performed for it. For MQGET and MQPUT calls, also ensure that the handle represents a queue object. Ensure that the handle is being used within its valid scope.
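As a rough illustration with the WebSphere MQ base classes for Java (queue manager and queue names here are just placeholders), the handle must come from a successful open, be used within the same scope, and be closed only once:

    import com.ibm.mq.*;

    public class HandleScopeExample {
        public static void main(String[] args) throws MQException, java.io.IOException {
            MQQueueManager qmgr = new MQQueueManager("QM1");            // connect
            int openOptions = MQC.MQOO_OUTPUT | MQC.MQOO_FAIL_IF_QUIESCING;
            MQQueue queue = qmgr.accessQueue("MY.QUEUE", openOptions);  // MQOPEN
            try {
                MQMessage msg = new MQMessage();
                msg.writeString("test message");
                queue.put(msg, new MQPutMessageOptions());              // MQPUT with a valid handle
            } finally {
                queue.close();                                          // MQCLOSE once, in the same scope
                qmgr.disconnect();
            }
        }
    }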


Reason Code 2009 x'7D9'
MQRC_CONNECTION_BROKEN
_________________
Regards
Gayathri
-----------------------------------------------
Do Something Before you Die
robiijohn
Posted: Mon Oct 20, 2008 8:59 pm


Hi Gayathri,

Thanks for your reply.
The sad part is that the same application runs fine in the SIT environment and has problems only in UAT, so there is very little chance of a code error.
I need a solution, since the ones suggested so far are not working.

Below are extracts from IBM technotes on these errors.

JMS connections fail with Reason Code 2019
Technote (FAQ)

Problem
An application running in WebSphere® Application Server V5 or V6 may receive failures when sending messages to, or receiving messages from, a WebSphere MQ or Embedded Messaging queue. The MQ reason code associated with the error is 2019. For example:

javax.jms.JMSException: MQJMS2002: failed to get message from MQ queue
at com.ibm.mq.jms.services.ConfigEnvironment.newException(ConfigEnvironment.java:540)
at com.ibm.mq.jms.MQSession.consume(MQSession.java:2950)
at com.ibm.mq.jms.MQSession.run(MQSession.java:1484)
at com.ibm.ejs.jms.JMSSessionHandle.run(JMSSessionHandle.java:924)
at com.ibm.ejs.jms.listener.ServerSession.connectionConsumerOnMessage(ServerSession.java:752)
...
---- Begin backtrace for Nested Throwables
com.ibm.mq.MQException: MQJE001: Completion Code 2, Reason 2019
at com.ibm.mq.jms.MQSession.consume(MQSession.java:2924)
at com.ibm.mq.jms.MQSession.run(MQSession.java:1484)
at com.ibm.ejs.jms.JMSSessionHandle.run(JMSSessionHandle.java:924)
at com.ibm.ejs.jms.listener.ServerSession.connectionConsumerOnMessage(ServerSession.java:752)
...

Note that the cause of the JMSException can be determined by the MQ reason code that appears in the backtrace. In this case, it is reason code 2019.

Cause
Reason code 2019 usually occurs after a connection broken error (reason code 2009) occurs. You would see a JMSException with reason code 2009 preceding reason code 2019 in the SystemOut.log. Reason code 2009 indicates that the connection to the MQ queue manager is no longer valid, usually due to a network or firewall issue.
Reason code 2019 errors will occur when invalid connections remain in the connection pool after the reason code 2009 error occurs. The next time that the application tries to use one of these connections, the reason code 2019 occurs.


Solution
To resolve the problem, change the Purge Policy for the connection and session pools used by your queue connection factory (QCF) or topic connection factory (TCF) from its default value of FailingConnectionOnly to EntirePool. With this setting, the entire pool of connections will be purged when the reason code 2009 error occurs and no broken connections will remain in the pool.

To do this:
1. In the Administration Console, select the QCF or TCF that your application is using.
2. Under Additional Properties, select Connection Pool and set the Purge Policy to EntirePool.
3. Then select Session Pools and set the Purge Policy to EntirePool.
4. After making these changes, save your configuration.
5. Restart the application server for the changes to take effect.


How to resolve JMSException due to com.ibm.mq.MQException: MQJE001: Completion Code 2, Reason 2009
Technote (FAQ)

Problem
The IBM® WebSphere® MQ Reason Code 2009 (MQRC_CONNECTION_BROKEN) may occur when an application installed in WebSphere Application Server V5 tries to connect to a WebSphere MQ or Embedded Messaging queue manager. Here are some examples of errors that are caused by Reason Code 2009:

The following exception was logged javax.jms.JMSException:
MQJMS2008: failed to open MQ queue
com.ibm.mq.MQException: MQJE001: Completion Code 2, Reason 2009

javax.jms.JMSException: MQJMS2005: failed to create MQQueueManager for 'mynode:WAS_mynode_server1'
at com.ibm.mq.jms.services.ConfigEnvironment.newException(ConfigEnvironment.java:556)
at com.ibm.mq.jms.MQConnection.createQM(MQConnection.java:1736)
...
com.ibm.mq.MQException: MQJE001: An MQException occurred: Completion Code 2, Reason 2009
MQJE003: IO error transmitting message buffer
at com.ibm.mq.MQManagedConnectionJ11.<init>(MQManagedConnectionJ11.java:239)
...

WMSG0019E: Unable to start MDB Listener MyMessageDrivenBean, JMSDestination
jms/MyQueue : javax.jms.JMSException: MQJMS2005: failed to create
MQQueueManager for 'mynode:WAS_mynode_server1'
at com.ibm.mq.jms.services.ConfigEnvironment.newException(ConfigEnvironment.java:556)
at com.ibm.mq.jms.MQConnection.createQM(MQConnection.java:1736)
...

Cause
The connection may be broken for a number of different reasons; the 2009 return code indicates that something prevented a successful connection to the Queue Manager. The most common causes for this are the following:

1. A firewall that is terminating the connection.
2. An IOException that causes the socket to be closed.
3. An explicit action to cause the socket to be closed by one end.
4. The queue manager is offline.
5. The maximum number of channels allowed by the queue manager is already in use.
6. A configuration problem in the Queue Connection Factory (QCF).

Solution
Preventing the firewall from terminating connections
Configure the Connection Pool and Session Pool settings for the QCF that is configured in WebSphere Application Server so that WebSphere can remove connections from the pool before they are dropped by the firewall. Change the value of Min Connections to 0 and set the Unused Timeout to half the firewall timeout, in seconds. For example, if the firewall times out connections after 15 minutes (900 seconds), set the Unused Timeout to 450 seconds.

Configuring to minimize the possibility of an IOException
On a UNIX® system, configure the TCP stanza of the qm.ini for your queue manager to contain this entry:
KeepAlive=YES
This setting causes TCP/IP to check periodically that the other end of the connection is still available. If it is not, the channel is closed.
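For example, the TCP stanza of qm.ini would then contain (illustrative snippet; the rest of the file is left unchanged):

    TCP:
       KeepAlive=YES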

Also follow the instructions in Tuning operating systems in the WebSphere Application Server Info Center. These have you adjust the operating system's TCP/IP configuration to help prevent sockets that are in use from being closed unexpectedly. For example, on Solaris you set TCP_KEEPALIVE_INTERVAL on the WebSphere MQ machine to a value lower than the firewall timeout. If TCP_KEEPALIVE_INTERVAL is not lower than the firewall timeout, the keepalive packets will not be sent frequently enough to keep the connection open between WebSphere Application Server and MQ.
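On Solaris, for instance, this keepalive interval is normally adjusted with ndd; the value is in milliseconds and should be kept below the firewall timeout (300000 ms, i.e. 5 minutes, is only an example value):

    ndd -set /dev/tcp tcp_keepalive_interval 300000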

NOTE: You must be sure that the firewall is configured to allow keepalive packets to pass through. A connection broken error could be caused by the firewall not letting the keepalive packets through.

An explicit action can cause this
An action such as stopping the queue manager or restarting the queue manager would also cause Reason Code 2009. There are also some MQ defects that could result in unexpected 2009 errors. When this document was written, APARs that addressed these defects included IY59675, IC42636, PQ87316, and PQ93130. It is a good idea to install the latest available Fix Pack for WebSphere MQ or Interim Fix for Embedded Messaging.

The maximum number of channels has been reached
This could be because the number of channels configured for the JMS provider is not large enough, or because errors are leaving channels open so that they cannot be reused. For additional information, refer to the technotes "MQ Manager Stops Responding To JMS Requests" and "WebSphere Application Server and MQ do not agree on the number of JMS connections".

A QCF Configuration problem
This problem could also occur because of a QCF configuration problem. If the Queue Manager, Host, Port, and Channel properties are not set correctly, a Reason Code 2009 would occur when an application uses the QCF to try to connect to the queue manager.

Other best practices
1. Set the Purge Policy of the QCF Connection Pool and Session Pool to EntirePool. The default value is FailingConnectionOnly. When the Purge Policy is set to EntirePool, the WebSphere connection pool manager will flush the entire connection pool when a fatal connection error, such as Reason Code 2009, occurs. This will prevent the application from getting other bad connections from the pool.

2. If the Reason Code 2009 error occurs when a message-driven bean (MDB) tries to connect to the queue manager, configure the MAX.RECOVERY.RETRIES and RECOVERY.RETRY.INTERVAL properties so that the message listener service will retry the connection. See Message listener service custom properties for more information on these properties.

3. If you are not using an MDB, but the Reason Code 2009 error occurs for an application that sends messages to a queue, the application should have logic to retry the connection when the error occurs. See Developing a J2EE application to use JMS for information on how to program your application to use a JMS connection. Also see Tips for troubleshooting WebSphere Messaging for additional details.
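A minimal sketch of such retry logic, assuming a pooled QCF looked up from JNDI (the JNDI names jms/MyQCF and jms/OutboundQueue, the retry count, and the backoff are all placeholders to adapt):

    import javax.jms.*;
    import javax.naming.InitialContext;

    public class RetryingSender {
        private static final int MAX_RETRIES = 3;

        public void send(String text) throws Exception {
            InitialContext ctx = new InitialContext();
            QueueConnectionFactory qcf = (QueueConnectionFactory) ctx.lookup("jms/MyQCF");
            Queue queue = (Queue) ctx.lookup("jms/OutboundQueue");

            JMSException lastError = null;
            for (int attempt = 1; attempt <= MAX_RETRIES; attempt++) {
                QueueConnection conn = null;
                try {
                    // Take a fresh connection each time; with the purge policy set to
                    // EntirePool, broken connections are discarded rather than reused.
                    conn = qcf.createQueueConnection();
                    QueueSession session = conn.createQueueSession(false, Session.AUTO_ACKNOWLEDGE);
                    QueueSender sender = session.createSender(queue);
                    sender.send(session.createTextMessage(text));
                    return;                                   // success
                } catch (JMSException je) {
                    lastError = je;
                    // Reason codes such as 2009 or 2019 appear in the linked MQException.
                    Exception linked = je.getLinkedException();
                    System.err.println("Send attempt " + attempt + " failed: "
                            + (linked != null ? linked.getMessage() : je.getMessage()));
                    Thread.sleep(1000L * attempt);            // simple backoff before retrying
                } finally {
                    if (conn != null) {
                        try { conn.close(); } catch (JMSException ignore) { }
                    }
                }
            }
            throw lastError;
        }
    }

Combined with the EntirePool purge policy described above, a retry like this picks up a fresh connection once the broken ones have been flushed from the pool.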










Thanks,
Arun Prithviraj
squidward
Posted: Wed Sep 01, 2010 12:44 pm


I know this post is pretty old, but we are having the same problem and this thread is the first thing that comes up on Google. It looks like the issue is addressed in this APAR:
http://www-01.ibm.com/support/docview.wss?rs=180&uid=swg1PK83875
Gaya3
Posted: Wed Sep 01, 2010 12:56 pm


This post is two years old; you could have started a new one.

What MQ version are you on?
Is the application running on an application server or standalone?
_________________
Regards
Gayathri
-----------------------------------------------
Do Something Before you Die