chaitanyauk
Posted: Thu May 18, 2017 2:23 am    Post subject: Global Cache for handling large data

Apprentice
Joined: 16 Apr 2017    Posts: 30
Hi folks,
We are trying to use the global cache to store a large amount of data (around 70,000 records from a DB).
Is there an efficient way to fetch that data?
Can a hashmap be used for indexing in the global cache?
Any input, resources, or code samples would be great.
Thanks

bruce2359
Posted: Thu May 18, 2017 5:28 am    Post subject:

Poobah
Joined: 05 Jan 2008    Posts: 9469    Location: US: west coast, almost. Otherwise, enroute.
Is your question related to IBM's message broker? Or something else?
_________________
I like deadlines. I like to wave as they pass by.
ב''ה
Lex Orandi, Lex Credendi, Lex Vivendi. As we Worship, So we Believe, So we Live.

Vitor
Posted: Thu May 18, 2017 5:51 am    Post subject: Re: Global Cache for handling large data

Grand High Poobah
Joined: 11 Nov 2005    Posts: 26093    Location: Texas, USA
chaitanyauk wrote:
    We are trying to use the global cache to store a large amount of data (around 70,000 records from a DB).
    Is there an efficient way to fetch that data?

Yes - database forums are full of them.

chaitanyauk wrote:
    Can a hashmap be used for indexing in the global cache?

If you bother to read the documentation, you'll observe that the global cache is a hashmap.
_________________
Honesty is the best policy.
Insanity is the best defence.
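
For illustration, a minimal sketch of what "the global cache is a hashmap" means in a JavaCompute class: one cache entry per record, keyed by its ID, so a lookup fetches only the record it needs. The map name, key prefix, and class name below are assumptions for the sketch, not anything prescribed in this thread.
Code:
import com.ibm.broker.plugin.MbException;
import com.ibm.broker.plugin.MbGlobalMap;

// Sketch: one cache entry per record, keyed by its ID, so a consumer
// fetches only the record it needs instead of the whole data set.
public class CacheLookupSketch {

    private static final String MAP_NAME = "LookupCache";  // assumed map name

    public static void putRecord(String id, String recordXml) throws MbException {
        MbGlobalMap map = MbGlobalMap.getGlobalMap(MAP_NAME);
        String key = "CUSTOMER:" + id;                      // assumed key format
        if (map.get(key) == null) {
            map.put(key, recordXml);                        // new entry
        } else {
            map.update(key, recordXml);                     // replace existing entry
        }
    }

    public static String getRecord(String id) throws MbException {
        MbGlobalMap map = MbGlobalMap.getGlobalMap(MAP_NAME);
        return (String) map.get("CUSTOMER:" + id);
    }
}

Keyed this way, 70,000 records become 70,000 small entries rather than one very large value.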

chaitanyauk
Posted: Thu May 25, 2017 11:23 pm    Post subject:

Apprentice
Joined: 16 Apr 2017    Posts: 30
OK, thank you. Sorry for the delayed reply.
I have a new problem now.
I am trying to fetch 800,000 records from the DB and insert them into the global cache.
My Java code:
Code:
// Imports needed in the JavaCompute class that provides this method
import com.ibm.broker.plugin.MbException;
import com.ibm.broker.plugin.MbGlobalMap;

public static Boolean insertCache(String cacheName, String key, String value) {
    try {
        MbGlobalMap map = MbGlobalMap.getGlobalMap(cacheName);
        // put() creates a new entry; update() replaces an existing one
        String existing = (String) map.get(key);
        if (existing == null) {
            map.put(key, value);
        } else {
            map.update(key, value);
        }
    } catch (MbException e) {
        System.err.println("error writing into cache: " + e);
        return false;
    }
    return true;
}
My ESQL code:
Code:
CREATE LASTCHILD OF Environment.Variables DOMAIN('XMLNSC') NAME 'Data';
-- SET cQuery = 'Query to fetch records';
SET Environment.Variables.Data.Result[] = PASSTHRU(cQuery);

-- Serialise the whole result tree into one BLOB
DECLARE SubConfig BLOB ASBITSTREAM(Environment.Variables.Data OPTIONS options CCSID 1208);

DECLARE KEY CHARACTER;
SET KEY = 'SOME_ID';
-- CCSID 1208 on the CAST so the BLOB is decoded with the same code page it was serialised in
DECLARE isCached BOOLEAN insertCache('LookupCache', KEY, CAST(SubConfig AS CHARACTER CCSID 1208));
When I try to insert that many records into the cache I get this error:
'java.lang.ClassCastException: java.lang.OutOfMemoryError incompatible with java.lang.Exception' (CHARACTER)
I have increased my broker JVM heap size to 1 GB. Using IIB 10.
Then I tried with only 200,000 records, which insert successfully the first time.
When I try to insert them a second time, I get the same error.
Then I reduced the record count to 100,000; after a couple of inserts, the same error. Then I reduced it to 50,000, and it continues.
Sorry for the lengthy story!
Need some help on this.
Thanks

sumit
Posted: Tue May 30, 2017 12:39 pm    Post subject:

Partisan
Joined: 19 Jan 2006    Posts: 398
chaitanyauk wrote:
    Then I tried with only 200,000 records, which insert successfully the first time.
    When I try to insert them a second time, I get the same error.
    Then I reduced the record count to 100,000; after a couple of inserts, the same error. Then I reduced it to 50,000, and it continues.

Does your code clear the cache after each run?
_________________
Regards,
Sumit
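
For reference, a minimal sketch of clearing entries between runs, using the same MbGlobalMap API as the code earlier in the thread; the map name and the idea of removing per-key are assumptions about how the cache is being used, not something stated here.
Code:
import com.ibm.broker.plugin.MbException;
import com.ibm.broker.plugin.MbGlobalMap;

// Sketch: drop an entry once a run is finished so repeated loads
// do not keep accumulating data in the cache between runs.
public class CacheCleanupSketch {

    public static void removeEntry(String key) throws MbException {
        MbGlobalMap map = MbGlobalMap.getGlobalMap("LookupCache");  // assumed map name
        if (map.get(key) != null) {
            map.remove(key);
        }
    }
}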

mqjeff
Posted: Wed May 31, 2017 3:35 am    Post subject:

Grand Master
Joined: 25 Jun 2008    Posts: 17447
Do you still need
Code:
SET Environment.Variables.Data.Result[] = PASSTHRU(cQuery);

after your compute node is done?
If not, specifically DELETE it.
You might also want to wrap your call to the insertCache method in an ESQL ATOMIC block...
_________________
chmod -R ugo-wx /
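
A rough ESQL sketch of both suggestions, reusing the variable and procedure names from the earlier post; treat it as illustrative rather than tested.
Code:
-- Keep concurrent flow instances from writing the cache at the same time
DECLARE isCached BOOLEAN;
BEGIN ATOMIC
    SET isCached = insertCache('LookupCache', KEY, CAST(SubConfig AS CHARACTER CCSID 1208));
END;

-- Once the value has been built and cached, free the query results
DELETE FIELD Environment.Variables.Data;

The ATOMIC block stops concurrent instances of the flow from running the cache write at the same time, and DELETE FIELD discards the result tree once the BLOB has been built from it.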

chaitanyauk
Posted: Mon Jun 12, 2017 1:01 am    Post subject:

Apprentice
Joined: 16 Apr 2017    Posts: 30
sumit wrote:
    Does your code clear the cache after each run?

No.
I am now clearing the Environment variable holding the data, but it still doesn't help.
Code:
SET SubConfig = NULL;
SET Environment.Variables = NULL;

When I try to insert a large number of records (200,000) it throws the following error:
java.lang.ClassCastException: java.lang.OutOfMemoryError incompatible with java.lang.Exception
My JVM max heap (for both the execution group and the broker) is set to 1 GB.
Thanks.

mqjeff
Posted: Mon Jun 12, 2017 4:47 am    Post subject:

Grand Master
Joined: 25 Jun 2008    Posts: 17447
chaitanyauk wrote:
    sumit wrote:
        Does your code clear the cache after each run?

    No.

Does it need to stick around?

chaitanyauk wrote:
    I am now clearing the Environment variable holding the data, but it still doesn't help.
    Code:
    SET SubConfig = NULL;
    SET Environment.Variables = NULL;

    When I try to insert a large number of records (200,000) it throws the following error.

That is not the same as
Code:
DELETE FIELD Environment.Variables;

chaitanyauk wrote:
    My JVM max heap (for both the execution group and the broker) is set to 1 GB.

Is that sufficient? Are your query results always going to be comfortably smaller than that?
_________________
chmod -R ugo-wx /
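
As a rough illustration of that sizing question (the per-record figure is purely an assumption, not something stated in the thread): if each of the 200,000 records serialises to around 5 KB of XML, the bitstream alone is roughly 200,000 × 5 KB ≈ 1 GB, before the CHARACTER copy and the cache entry itself are counted, so a 1 GB heap on either side would not be enough.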