IOError: [Errno 24] Too many open files

Having issues with your DietPi installation, or, found a bug? Post it here.
annlee
Posts: 11
Joined: Wed Nov 28, 2018 4:19 pm

IOError: [Errno 24] Too many open files

Post by annlee »

Hi there, I have a script that must run for hours, writing a row to a CSV file every few seconds...

Code: Select all

f = open('/bin/stats.csv', 'at')
writer = csv.writer(f)
writer.writerow(['Date', 'Light', 'Temperature', 'Humidity', 'Air Quality', 'Loudness'])
writer.writerow([campo_csv_reloj, campo_csv_luz, campo_csv_temp, campo_csv_hum, campo_csv_air, campo_csv_sound])
f.close()
I used this other way too:

Code: Select all

with open('/bin/stats.csv', 'at') as f:
    writer = csv.writer(f)
    writer.writerow(['Date', 'Light', 'Temperature', 'Humidity', 'Air Quality', 'Loudness'])
    writer.writerow([campo_csv_reloj, campo_csv_luz, campo_csv_temp, campo_csv_hum, campo_csv_air, campo_csv_sound])
...but after around 30 minutes it throws this error and stops:
IOError: [Errno 24] Too many open files
It must be an issue with DietPi, because I have a similar program running on a much older DietPi distro and never had this problem.

If I type ulimit it says "unlimited". Do you have any solution? What should I modify in the OS?

Thanks in advance
MichaIng
Site Admin
Posts: 2293
Joined: Sat Nov 18, 2017 6:21 pm

Re: IOError: [Errno 24] Too many open files

Post by MichaIng »

Hmm, I don't know Python syntax well, but in the first script at least I see an f.close(), which looks like you close the file after writing to it. It is also only a single file (right?), and it would be bad if Python didn't release a file that was opened via f = open. So in theory there should indeed be only one file open at any time.

But as I said, I am no Python expert; perhaps it creates temporary files or such, and indeed you are facing the issue.

How do you execute the script? Does it run in the background, executing the code in a loop, or is it executed via an external trigger, cron, systemd timer or such?

Please check htop to see whether more than one instance of the script is running in parallel. In that case you need to increase the execution interval. Generally I would advise running the script only once at start and doing the file writes in a loop every X seconds. Then it is also not required to declare, open and close the file each time; just do this once at script start.
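That advice can be sketched like this (a minimal example, not the original script: the path, the row count and the read_sensors() stand-in are assumptions, since the real script reads actual sensor values):

```python
import csv
import time

def read_sensors():
    # Placeholder for the real readings (clock, light, temp, hum, air, sound).
    return ['2019-01-16 13:37:00', 512, 21.5, 40, 12, 33]

def log_stats(path, rows=3, interval=0.0):
    # Open the file and create the writer exactly once, before the loop.
    with open(path, 'a', newline='') as f:
        writer = csv.writer(f)
        writer.writerow(['Date', 'Light', 'Temperature', 'Humidity',
                         'Air Quality', 'Loudness'])
        for _ in range(rows):        # the long-running loop
            writer.writerow(read_sensors())
            f.flush()                # push each row to disk between writes
            time.sleep(interval)
```

This way the process holds a single file descriptor for its whole lifetime instead of one per iteration.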
annlee
Posts: 11
Joined: Wed Nov 28, 2018 4:19 pm

Re: IOError: [Errno 24] Too many open files

Post by annlee »

Hi, there are no other instances in htop; it runs at start-up in an infinite loop.

The point is that it never closes the file, so in the end it reaches the limit of 1024 I see in ulimit.

I did a temporary workaround, which is running the loop once per minute instead of every 2 seconds as before, so in the 8 hours it needs to run it won't reach the limit of 1024, so no crash.

Apparently someone had the same issue and solved it this way: https://stackoverflow.com/questions/520 ... e-csv-file. I still don't understand why.
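The crash can be reproduced deliberately by lowering the per-process descriptor limit and opening files without closing them (a Linux-only sketch; note that plain ulimit in bash reports the max file size, which is why it says "unlimited", while ulimit -n reports the 1024 open-files limit):

```python
import errno
import os
import resource

# RLIMIT_NOFILE is what `ulimit -n` shows. Drop the soft limit to 32
# so the leak hits the wall quickly, then restore it afterwards.
soft, hard = resource.getrlimit(resource.RLIMIT_NOFILE)
resource.setrlimit(resource.RLIMIT_NOFILE, (32, hard))

files, err = [], None
try:
    for _ in range(64):
        files.append(open(os.devnull))   # opened but never closed: the leak
except OSError as e:
    err = e.errno                        # errno.EMFILE == 24 on Linux
finally:
    for f in files:
        f.close()
    resource.setrlimit(resource.RLIMIT_NOFILE, (soft, hard))
```

Opening once per loop iteration without closing does exactly this, just spread over 30 minutes instead of a split second.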
MichaIng
Site Admin
Posts: 2293
Joined: Sat Nov 18, 2017 6:21 pm

Re: IOError: [Errno 24] Too many open files

Post by MichaIng »

Interesting, so it seems that f.close() does not really close the file, or at least does not reduce the count for the limit.
EDIT: Or is it opened once per line, and that is the issue? We would need to see the whole code with the loop to check whether this is the case, as it seems to be for the guy in the linked topic.

Yep, then the solution is as I thought above: open/declare the file and writer once before the loop starts, and then just write to it within the loop.

Perhaps there is also some sort of garbage collector that only really frees the file descriptors some time after f.close(), but the above sounds way cleaner, especially when writing at second intervals.
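For what it's worth, CPython releases the descriptor as soon as f.close() returns; no garbage-collector pass is needed (only files dropped without being closed are cleaned up lazily). A minimal Linux-only check, counting entries in /proc/self/fd:

```python
import os

def fd_count():
    # Number of open file descriptors of this process (Linux-only).
    return len(os.listdir('/proc/self/fd'))

baseline = fd_count()
f = open(os.devnull)       # one extra descriptor while the file is open
while_open = fd_count()
f.close()                  # the descriptor is released immediately here
after_close = fd_count()
```

So if the count still climbs despite f.close() being called, the likely explanation is that open() runs more often than close() does, e.g. once per loop iteration.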
annlee
Posts: 11
Joined: Wed Nov 28, 2018 4:19 pm

Re: IOError: [Errno 24] Too many open files

Post by annlee »

MichaIng wrote: Wed Jan 16, 2019 1:37 pm Interesting, so it seems that f.close() does not really close the file, or at least does not reduce the count for the limit.
EDIT: Or is it opened once per line, and that is the issue? We would need to see the whole code with the loop to check whether this is the case, as it seems to be for the guy in the linked topic.

Yep, then the solution is as I thought above: open/declare the file and writer once before the loop starts, and then just write to it within the loop.

Perhaps there is also some sort of garbage collector that only really frees the file descriptors some time after f.close(), but the above sounds way cleaner, especially when writing at second intervals.
Yep, every day you learn something new, lol.