Caused by org.postgresql.util.PSQLException: ERROR: out of memory
ERROR: out of shared memory — From: "Sorin N. Ciolofan" (pgsql mailing lists)
org.postgresql.util.PSQLException: ERROR: out of shared memory
(asked on Stack Overflow: http://stackoverflow.com/questions/7634871/org-postgresql-util-psqlexception-error-out-of-shared-memory)

Q: I am calling a function containing more than 200 DROP TABLE statements from Java, and I am getting org.postgresql.util.PSQLException: ERROR: out of shared memory. What approach should I follow to avoid running out of shared memory? PS: The restriction is that I cannot change any PostgreSQL parameters. (A similar report on the pgsql mailing lists: https://www.postgresql.org/message-id/20070326123127.55F808E40FC@mailhost.ics.forth.gr) [tags: java, postgresql — asked Oct 3 '11 by Abhishek Parikh]

Comment: By "function" do you mean a server-side function or a Java method? Is the cause of the exception on the server side or on the client side? The little information given is not enough even for guessing. –A.H., Oct 3 '11
Comment: By "function" I mean a PostgreSQL function (procedure). The exception is raised on the Java side, where I call the function. –Abhishek Parikh, Oct 3 '11

Answer: If the cause of the error is on the server side: in PostgreSQL a function is always executed inside a transaction. DO blocks are anonymous functions and are handled the same way.
And because even DDL commands like CREATE or DROP are transactional in PostgreSQL, these commands also stress the usual resources used for ROLLBACK and COMMIT. My guess is that dropping a huge number of large tables eats too much memory. So if you don't need transactional behaviour in your function, the easiest way is to split the large function into several smaller ones and call each function in a separate transaction. –A.H., answered Oct 3 '11
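The advice above — split the work and commit between batches — can be sketched in plain JDBC. For background: "out of shared memory" while dropping many tables in a single transaction typically means the server's lock table is exhausted, since each DROP TABLE takes a lock that is held until commit and the lock table is sized by max_locks_per_transaction. Committing between batches releases those locks, which matters here because the asker cannot change PostgreSQL parameters. The class and method names below are illustrative, not from the question:

```java
import java.sql.Connection;
import java.sql.SQLException;
import java.sql.Statement;
import java.util.ArrayList;
import java.util.List;

public class DropInBatches {

    // Split a list into consecutive batches of at most batchSize elements.
    static List<List<String>> partition(List<String> items, int batchSize) {
        List<List<String>> batches = new ArrayList<>();
        for (int i = 0; i < items.size(); i += batchSize) {
            batches.add(items.subList(i, Math.min(i + batchSize, items.size())));
        }
        return batches;
    }

    // Drop each batch of tables in its own transaction, so the locks taken
    // by DROP TABLE are released at every commit instead of accumulating
    // across all 200 tables.
    static void dropAll(Connection conn, List<String> tables, int batchSize)
            throws SQLException {
        conn.setAutoCommit(false);
        for (List<String> batch : partition(tables, batchSize)) {
            try (Statement st = conn.createStatement()) {
                for (String t : batch) {
                    // Quote the identifier to survive mixed-case names.
                    st.executeUpdate("DROP TABLE IF EXISTS \""
                            + t.replace("\"", "\"\"") + "\"");
                }
            }
            conn.commit(); // releases this batch's locks
        }
    }
}
```

With 200 tables and a batch size of 25, each transaction holds locks on at most 25 tables instead of all 200.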
A related report, this time a plain "out of memory" while inserting a 256 MB binary file into a bytea column:

"I want a way to insert files (anywhere from 1 byte to 2 GB), byte arrays, or binary streams into a PostgreSQL bytea field without causing org.postgresql.util.PSQLException: out of memory. The details follow. If anybody knows about this, please write to me. Thanks in advance!"

Table definition:

    create table image_bytea(t_id int, t_image bytea);

Major code:

    String sql = "insert into image_bytea(t_id, t_image) values (?, ?)";
    ps = conn.prepareStatement(sql);
    ps.setInt(1, 88);
    File file = new File("d://1.jpg");   // was "new file(...)" in the original, which does not compile
    InputStream in = new BufferedInputStream(new FileInputStream(file));
    ps.setBinaryStream(2, in, (int) file.length());
    System.out.println("set");
    ps.executeUpdate();

Error detail:

    org.postgresql.util.PSQLException: ERROR: out of memory
      Detail: Failed on request of size 268443660.
        at org.postgresql.core.v3.QueryExecutorImpl.receiveErrorResponse(QueryExecutorImpl.java:2157)
        at org.postgresql.core.v3.QueryExecutorImpl.processResults(QueryExecutorImpl.java:1886)
        at org.postgresql.core.v3.QueryExecutorImpl.execute(QueryExecutorImpl.java:255)
        at org.postgresql.jdbc2.AbstractJdbc2Statement.execute(AbstractJdbc2Statement.java:555)
        at org.postgresql.jdbc2.AbstractJdbc2Statement.executeWithFlags(AbstractJdbc2Statement.java:417)
        at org.postgresql.jdbc2.AbstractJdbc2Statement.execute(AbstractJdbc2Statement.java:410)
        at com.highgo.hgdbadmin.migrationassistant.controller.MigrationController.executeInsert(MigrationController.java:1400)
        at com.highgo.hgdbadmin.migrationassistant.controller.MigrationController.insertDate2HG(MigrationController.java:1143)
        at com.highgo.hgdbadmin.migrationassistant.controller.MigrationController.migrateTable(MigrationController.java:898)
        at com.highgo.hgdbadmin.migrationassistant.controller.MigrationController.migrate(MigrationController…

(A further report, from the Mendix forum: https://mxforum.mendix.com/questions/2819/Delete--500000-records--out-of-memory)
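The "Failed on request of size 268443660" detail is the server failing to allocate roughly 256 MB in one piece: a bytea value is materialized whole. For files approaching 2 GB, the usual alternative is PostgreSQL Large Objects, which are written through a stream in small chunks. Below is a minimal chunk-copy helper that keeps at most one buffer in memory; the driver-specific wiring (org.postgresql's Large Object API) is shown only as a comment, since it needs the driver jar and a live connection, and those identifiers should be verified against the driver documentation:

```java
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;

public class StreamCopy {

    // Copy in fixed-size chunks so at most bufSize bytes are buffered at once.
    // Returns the total number of bytes copied.
    static long copy(InputStream in, OutputStream out, int bufSize)
            throws IOException {
        byte[] buf = new byte[bufSize];
        long total = 0;
        int n;
        while ((n = in.read(buf)) != -1) {
            out.write(buf, 0, n);
            total += n;
        }
        return total;
    }

    /* Hypothetical wiring with the PostgreSQL JDBC driver's Large Object API
       (a sketch only — requires the org.postgresql driver on the classpath;
       check the names against the driver docs):

       PGConnection pg = conn.unwrap(PGConnection.class);
       LargeObjectManager lom = pg.getLargeObjectAPI();
       conn.setAutoCommit(false);        // large objects must be used in a transaction
       long oid = lom.createLO(LargeObjectManager.READ | LargeObjectManager.WRITE);
       try (LargeObject lo = lom.open(oid, LargeObjectManager.WRITE);
            InputStream in = new BufferedInputStream(new FileInputStream(file))) {
           copy(in, lo.getOutputStream(), 8192);  // streamed, never whole-file in memory
       }
       conn.commit();
       // store `oid` in an oid column instead of a bytea column
    */
}
```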
"[…] the app store. We however face the problem that a lot of entries (over 500,000) are written to the db. Clicking the 'delete all' button shows the 'in progress' screen, but after about 30 seconds the screen freezes and the logging shows the exception below. After restarting m2ee and trying again, deleting doesn't work (same problem) or only partially works. I pasted the stacktrace and the delete-Java-action code below. My question is how I can fix the out-of-memory error. Is the batch size (10,000) simply too high? Or would it be altogether impossible to delete this number of records with one button press (meaning the action should be changed, e.g. to delete only 50,000 records at a time)?"

Stacktrace (every line carries the same syslog prefix, shown here only once):

    Sep 12 16:38:22 127.0.0.1 greprod: ERROR - Connector:
      An error has occurred while handling the request. [User 'gcoroberto@greencloudsonline.com' with roles 'GreenClouds_Operations']
      (1/102) com.mendix.core.CoreException: Exception occurred in action 'Microflow [Logging.DeleteMessages]', all database changes executed by this action were rolled back
      (2/102) at com.mendix.core.actionmanagement.CoreAction.d(SourceFile:553)
      (3/102) Caused by: com.mendix.core.CoreException: Exception occurred in microflow 'Logging.DeleteMessages' for activity 'batch delete all records
      (4/102) ', all database changes executed by this microflow were rolled back
      (5/102) at kM.b(SourceFile:252)
      (6/102) Caused by: com.mendix.core.CoreException: af: An exception has occurred for the following request(s):
      (7/102) jF (depth = 0, amount = 10000): //Logging.Message
      (8/102) at it.b(SourceFile:167)
      (9/102) Caused by: af: An exception has occurred for the following request(s):
      (10/102) jF (depth = 0, amount = 10000): //Logging.Message
      (11/102) Caused by: af: Exception occurred while retrieving data. (SQL State: 53200, Error Code: 0) Detail Message: java.lang.OutOfMemoryError: Java heap space, org.postgresql.util.PSQLException: Ran out of memory retrieving query results., java.lang.OutOfMemoryError: Java hea…
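In this last case the heap error ("Ran out of memory retrieving query results") comes from materializing a huge batch of objects client-side. At the SQL level the same purge can be done in bounded rounds, committing after each so neither the Java heap nor the transaction grows with table size. This is a sketch, not the actual Mendix action: the table name logging.message is guessed from the //Logging.Message request in the stacktrace, and PostgreSQL's ctid system column is used to bound each round:

```java
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.SQLException;

public class BatchDelete {

    // How many delete rounds a full purge needs (pure helper, ceiling division).
    static int rounds(long totalRows, int batchSize) {
        return (int) ((totalRows + batchSize - 1) / batchSize);
    }

    // Delete everything from the table in bounded batches, committing after
    // each batch. Each round deletes at most batchSize rows, so client memory
    // and lock/undo pressure stay constant regardless of table size.
    static void deleteAll(Connection conn, int batchSize) throws SQLException {
        conn.setAutoCommit(false);
        String sql = "DELETE FROM logging.message WHERE ctid IN "
                   + "(SELECT ctid FROM logging.message LIMIT ?)";
        try (PreparedStatement ps = conn.prepareStatement(sql)) {
            ps.setInt(1, batchSize);
            int deleted;
            do {
                deleted = ps.executeUpdate();
                conn.commit();              // release work before the next round
            } while (deleted == batchSize); // the final round deletes fewer rows
        }
    }
}
```

At the original volume, 500,000 rows with a batch size of 10,000 would take 50 rounds; dropping the batch size only increases the round count, never the per-round memory.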