Python Memory Error Linux
Python Memory Error (Unix vs Windows)
http://stackoverflow.com/questions/8267142/python-memory-error-unix-vs-windows

I am developing an application which loads big JPEG2000 images and converts them to TIF. Later on it extracts the pixel data and does some processing. It works perfectly except in one case: when I load the biggest image and extract the data (a 10956x10956 array), the program crashes, but only on Windows, never on Linux. (I'd like a portable application.) I tracked the problem down to a memory error when interpolating a 20x20 grid up to the image sampling:

    f = RectBivariateSpline(x, y, Sun_angles)
    xnew = numpy.linspace(x.min(), x.max(), rows)
    ynew = numpy.linspace(y.min(), y.max(), columns)
    Sun_angles_new = f(xnew, ynew)  # here it crashes

I also tried interp2d and map_coordinates and got the same result. Python is supposed to be completely portable, but I've got the impression it is optimised for Unix systems (see also: Running python on a Windows machine vs Linux). Note: I ran the program on two different computers, but with the same amount of RAM.

So what could be the reason? And is there another interpolation method with less memory consumption? Note: one workaround is to divide the image into tiles and interpolate each tile.

python linux memory – edited Nov 25 '11 at 9:45 by Ferdinand Beyer, asked Nov 25 '11 at 9:41 by gorro

Comments:
Which architecture (32-bit vs 64-bit) is each OS? –Marcelo Cantos Nov 25 '11 at 9:48
I checked again and the memory is the same, but the processors are different. On Windows it is a 32-bit Core2Duo and on Linux a 64-bit Intel i5... sorry. –gorro Nov 25 '11 at 10:08
Possibly there is a different instruction set or other causes, but the dynamic memory is still the same... –gorro Nov 25 '11 at 10:09
All Core 2 CPUs are 64-bit, but that doesn't mean much. In Windows, you can check the actual running architecture by looking…
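The asker's own closing note (divide the image into tiles) is usually the right fix for this crash: the full 10956x10956 float64 output alone is roughly 0.9 GiB, which can exhaust the ~2 GB address space of a 32-bit Windows process once temporaries are added. A minimal sketch of blockwise evaluation, using synthetic stand-ins for the question's x, y, and Sun_angles and a smaller output size so it runs quickly:

```python
import numpy as np
from scipy.interpolate import RectBivariateSpline

# Coarse 20x20 grid (stand-ins for the question's x, y, Sun_angles).
x = np.linspace(0.0, 1.0, 20)
y = np.linspace(0.0, 1.0, 20)
sun_angles = np.outer(np.sin(x), np.cos(y))

rows, columns = 1096, 1096          # smaller than 10956 so the demo is fast
f = RectBivariateSpline(x, y, sun_angles)
xnew = np.linspace(x.min(), x.max(), rows)
ynew = np.linspace(y.min(), y.max(), columns)

# Preallocate the result once (float32 halves the footprint), then fill it
# in row blocks so no single spline call materialises the whole output.
out = np.empty((rows, columns), dtype=np.float32)
block = 256
for start in range(0, rows, block):
    stop = min(start + block, rows)
    out[start:stop, :] = f(xnew[start:stop], ynew)

print(out.shape)
```

The peak extra memory per iteration is one block x columns array instead of rows x columns, and the block size can be tuned down further on tight 32-bit processes.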
What are the workaround options for python out of memory error?
http://stackoverflow.com/questions/19960166/what-are-the-workaround-options-for-python-out-of-memory-error

I am reading an x,y,z point file (LAS) into Python and have run into memory errors. I am interpolating unknown points between known points for a project I am working on. I began working with small files (< 5,000,000 points) and was able to read/write to a numpy array and Python lists with no problem. I have now received more data to work with (> 50,000,000 points), and my code fails with a MemoryError.

What are some options for handling such large amounts of data? I do not have to load all the data into memory at once, but I will need to look at neighboring points using a scipy kd-tree. I am using Python 2.7 32-bit on a 64-bit Windows XP OS. Thanks in advance.

EDIT: Code is posted below. I took out the code for long calculations and variable definitions.
    from liblas import file
    import numpy as np

    f = file.File(las_file, mode='r')
    num_points = int(f.__len__())
    dt = [('x', 'f4'), ('y', 'f4'), ('z', 'f4'),
          ('i', 'u2'), ('c', 'u1'), ('t', 'datetime64[us]')]
    xyzict = np.empty(shape=(num_points,), dtype=dt)
    counter = 0
    for p in f:
        newrow = (p.x, p.y, p.z, p.intensity, p.classification, p.time)
        xyzict[counter] = newrow
        counter += 1

    dropoutList = []
    counter = 0
    for i in np.nditer(xyzict):
        # code to define P1x, P1y, P1z, P1t
        if counter != 0:
            # code to calculate n, tDiff, and seconds
            if n > 1 and n < scanN:
                # code to find v and vD
                for d in range(1, int(n - 1)):
                    # code to interpolate x, y, z for points between P0 and P1
                    # append tuple of x, y, and z to dropoutList
                    dropoutList.append(vD)
        # code to set x, y, z, t for next iteration
        counter += 1

python numpy scipy out-of-memory – edited Nov 13 '13 at 19:01, asked Nov 13 '13 at 17:15 by Barbarossa

Comments:
Can you show the code that is giving the error? (Or a small snippet that reproduces the problem?) There may be a way to make it more efficient, but it's impossible to tell without the code. –David Robinson Nov 13 '13 at 17:17
Are you using np.loadtxt or np.genfromtxt? If so, they're quite inefficient for large fi…
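One commonly suggested workaround for this pattern (my sketch, not an answer from the thread) is to keep the structured point array on disk with numpy.memmap, so only the pages actually touched occupy RAM, and to build the kd-tree over the coordinate columns alone rather than the full records. The file name, simplified dtype, point count, and random fill below are illustrative stand-ins; note also that on the asker's 32-bit interpreter the process stays capped at roughly 2 GB of address space, so moving to 64-bit Python is the more fundamental fix:

```python
import numpy as np
from scipy.spatial import cKDTree

# Simplified record layout (the question's 't' field is omitted here).
dt = np.dtype([('x', 'f4'), ('y', 'f4'), ('z', 'f4'),
               ('i', 'u2'), ('c', 'u1')])
num_points = 1_000_000                  # stand-in for the real 50M+

# Disk-backed array: only touched pages are resident in memory.
pts = np.memmap('points.dat', dtype=dt, mode='w+', shape=(num_points,))

# Fill in chunks (random data here; the real code would read LAS chunks).
chunk = 100_000
rng = np.random.default_rng(0)
for start in range(0, num_points, chunk):
    stop = min(start + chunk, num_points)
    n = stop - start
    pts['x'][start:stop] = rng.random(n, dtype=np.float32)
    pts['y'][start:stop] = rng.random(n, dtype=np.float32)
    pts['z'][start:stop] = rng.random(n, dtype=np.float32)
pts.flush()

# Neighbour lookups: build the kd-tree on x,y only (8 bytes/point
# instead of the full 15-byte record).
xy = np.column_stack((np.asarray(pts['x']), np.asarray(pts['y'])))
tree = cKDTree(xy)
dist, idx = tree.query(xy[:10], k=3)
print(idx.shape)
```

The kd-tree itself must still fit in memory, so on a genuinely constrained machine the query side would also need to be tiled, but this keeps the heavyweight record array out of RAM entirely.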
Numpy Memory Error on Linux Server but not Mac
http://stackoverflow.com/questions/32723597/numpy-memory-error-on-linux-server-but-not-mac

I know there are a ton of numpy memory error topics, so I hope I haven't duplicated anything. I'm trying to create an np array using np.zeros((500000, 10000)). This works fine on my Mac with 16G of memory, but on a Linux server with 28G of RAM it fails instantly with a MemoryError. I've verified that I'm running the 64-bit versions of Ubuntu and Python, and I'm on Numpy 1.9.3. The only difference I noticed between the systems (apart from the obvious) is the output of ulimit -a:

    Linux: max locked memory (kbytes, -l) 64
    Mac:   max locked memory (kbytes, -l) unlimited

Could this be the reason I can't run this command? If not, is there some other configuration option I'm missing?

python linux osx numpy – asked Sep 22 '15 at 17:51 by jpavs

Comments:
Is the server yours, or a host you are just using, administered by someone else? –Shawn Mehan Sep 22 '15 at 17:55
It's on AWS, but I have sudo access. –jpavs Sep 22 '15 at 18:01
Well, you will want to read something like this. You seem to have been locked down by the admin. I don't know if you can affect this yourself or need to ask for the hand of Bezos, but you were on the right track. Good luck. –Shawn Mehan Sep 22 '15 at 18:04
@ShawnMehan This is a few months later, but it turned out that I had no virtual memory when I tried to do this. I had to cajole my sysadmin into giving me around 20GB, but it worked after that :) –jpavs Jan 21 at 15:18
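For context (my arithmetic, not from the thread): the requested allocation dwarfs the RAM of both machines, so whether np.zeros appears to succeed depends on the OS overcommitting untouched zero pages rather than on having enough memory, which is consistent with the asker's virtual-memory fix. A quick sanity check:

```python
import numpy as np

shape = (500_000, 10_000)
nbytes = np.prod(shape) * np.dtype(np.float64).itemsize
print(nbytes / 2**30)   # about 37.3 GiB requested, vs 16G/28G of RAM

# On Linux, /proc/sys/vm/overcommit_memory governs this: 0 (heuristic
# default) or 1 (always overcommit) can let the allocation succeed as
# long as the pages are never written; 2 (strict) fails it up front
# unless RAM plus swap can cover the commitment.
try:
    with open('/proc/sys/vm/overcommit_memory') as fh:
        print('overcommit_memory =', fh.read().strip())
except FileNotFoundError:
    print('not Linux; /proc not available')
```

Under strict accounting (or with overcommit effectively constrained, as on the asker's server), adding tens of GB of swap/virtual memory is what makes a 37 GiB zeros allocation admissible, matching the reported resolution.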