Global Cache for handling large data
chaitanyauk
PostPosted: Thu May 18, 2017 2:23 am    Post subject: Global Cache for handling large data

Apprentice

Joined: 16 Apr 2017
Posts: 30

Hi folks,
We are trying to use the global cache to store a large amount of data (around 70,000 records from a DB).
Is there an efficient way to fetch that data?
Can a hashmap be used for indexing in the global cache?
Any input, resources, or code samples would be great.

Thanks
bruce2359
PostPosted: Thu May 18, 2017 5:28 am    Post subject:

Poobah

Joined: 05 Jan 2008
Posts: 9394
Location: US: west coast, almost. Otherwise, enroute.

Is your question related to IBM's message broker? Or something else?
_________________
I like deadlines. I like to wave as they pass by.
ב''ה
Lex Orandi, Lex Credendi, Lex Vivendi. As we Worship, So we Believe, So we Live.
Vitor
PostPosted: Thu May 18, 2017 5:51 am    Post subject: Re: Global Cache for handling large data

Grand High Poobah

Joined: 11 Nov 2005
Posts: 26093
Location: Texas, USA

chaitanyauk wrote:
We are trying to use the global cache to store a large amount of data (around 70,000 records from a DB).
Is there an efficient way to fetch that data?


Yes - database forums are full of them.

chaitanyauk wrote:
Can a hashmap be used for indexing in the global cache?


If you bothered to read the documentation, you'd observe that the global cache is a hashmap.
_________________
Honesty is the best policy.
Insanity is the best defence.
chaitanyauk
PostPosted: Thu May 25, 2017 11:23 pm    Post subject:

Apprentice

Joined: 16 Apr 2017
Posts: 30

OK, thank you, and sorry for the delayed reply.

I have a new problem now.
I am trying to fetch 800,000 records from the DB and insert them into the global cache.
My Java code:
Code:
import com.ibm.broker.plugin.MbException;
import com.ibm.broker.plugin.MbGlobalMap;

// Inserts or updates a key/value pair in the named global cache map.
// Returns false if the cache write fails.
static public Boolean insertCache(String cacheName, String key, String value) {
    try {
        MbGlobalMap map = MbGlobalMap.getGlobalMap(cacheName);
        String test = (String) map.get(key);
        if (test == null) {
            map.put(key, value);    // key not cached yet: insert
        } else {
            map.update(key, value); // key already cached: overwrite
        }
    } catch (MbException e) {
        System.err.println("error writing into cache");
        return false;
    }
    return true;
}

My ESQL code:

Code:
CREATE LASTCHILD OF Environment.Variables DOMAIN('XMLNSC') NAME 'Data';
-- SET cQuery = 'Query to fetch records';
SET Environment.Variables.Data.Result[] = PASSTHRU(cQuery);
-- 'options' must be declared earlier, e.g. DECLARE options INTEGER BITOR(FolderBitStream, ValidateNone);
DECLARE SubConfig BLOB ASBITSTREAM(Environment.Variables.Data OPTIONS options CCSID 1208);
DECLARE KEY CHARACTER;
SET KEY = 'SOME_ID';
-- the CAST needs a CCSID, otherwise the BLOB is rendered as a hex string
DECLARE isCached BOOLEAN insertCache('LookupCache', KEY, CAST(SubConfig AS CHARACTER CCSID 1208));
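For reference, calling a static Java method like this from ESQL needs a function declaration along these lines (a minimal sketch; the fully qualified class name com.example.CacheUtils is an assumption):

Code:
CREATE FUNCTION insertCache(IN cacheName CHARACTER, IN key CHARACTER, IN value CHARACTER)
  RETURNS BOOLEAN
  LANGUAGE JAVA
  EXTERNAL NAME "com.example.CacheUtils.insertCache";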

When I try to insert this many records into the cache, I get the error:
'java.lang.ClassCastException: java.lang.OutOfMemoryError incompatible with java.lang.Exception' (CHARACTER)

I have increased my broker JVM heap size to 1 GB.
I am using IIB 10.


Then I tried with only 200,000 records, which inserted successfully the first time. When I tried to insert a second time, it showed the same error. Then I reduced the record count to 100,000; after a couple of inserts, the same error. Then I reduced it to 50,000, and the pattern continues.

Sorry for the lengthy story!
Need some help on this.
Thanks
sumit
PostPosted: Tue May 30, 2017 12:39 pm    Post subject:

Partisan

Joined: 19 Jan 2006
Posts: 398

chaitanyauk wrote:

Then I tried with only 200,000 records, which inserted successfully the first time. When I tried to insert a second time, it showed the same error. Then I reduced the record count to 100,000; after a couple of inserts, the same error. Then I reduced it to 50,000, and the pattern continues.

Does your code clear the cache after each run?
_________________
Regards
Sumit
mqjeff
PostPosted: Wed May 31, 2017 3:35 am    Post subject:

Grand Master

Joined: 25 Jun 2008
Posts: 17447

Do you still need
Code:
SET Environment.Variables.Data.Result[] = PASSTHRU(cQuery);

after your compute node is done?

If not, specifically DELETE it.
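
In ESQL that would be something like this (a sketch, assuming the tree built in the earlier post):

Code:
-- remove the result tree from Environment once the cache is populated
DELETE FIELD Environment.Variables.Data;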

You might also want to wrap your call to the insertCache method in an ESQL atomic block...
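
Something like this (a minimal sketch; it assumes KEY and SubConfig exist as in the earlier post):

Code:
-- assumes DECLARE isCached BOOLEAN FALSE; earlier in the module
BEGIN ATOMIC
  SET isCached = insertCache('LookupCache', KEY, CAST(SubConfig AS CHARACTER CCSID 1208));
END;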
_________________
chmod -R ugo-wx /
chaitanyauk
PostPosted: Mon Jun 12, 2017 1:01 am    Post subject:

Apprentice

Joined: 16 Apr 2017
Posts: 30

sumit wrote:

Does your code clear the cache after each run?


No.

Now I am clearing the Environment variable holding the data, but it's still no use.
Code:
SET SubConfig = NULL;
SET Environment.Variables = NULL;

When I try to insert a large number of records (200,000), it throws the following error:

java.lang.ClassCastException: java.lang.OutOfMemoryError incompatible with java.lang.Exception

My JVM max heap (of both the execution group and the broker) is set to 1 GB.

Thanks.
mqjeff
PostPosted: Mon Jun 12, 2017 4:47 am    Post subject:

Grand Master

Joined: 25 Jun 2008
Posts: 17447

chaitanyauk wrote:
sumit wrote:

Does your code clear the cache after each run?


No.

Does it need to stick around?


chaitanyauk wrote:
Now I am clearing the Environment variable holding the data, but it's still no use.
Code:
SET SubConfig = NULL;
SET Environment.Variables = NULL;

When I try to insert a large number of records (200,000), it throws the following error:

That is not the same as
Code:
DELETE FIELD Environment.Variables;


chaitanyauk wrote:
My JVM max heap (of both the execution group and the broker) is set to 1 GB.

Is that sufficient? Are your query results always going to be reasonably less than that?
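
For instance (illustrative numbers only, not from your post): at roughly 2 KB per serialised record, 800,000 records come to about 1.6 GB before any JVM overhead, which already exceeds a 1 GB heap.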
_________________
chmod -R ugo-wx /