Discussion of writing a large number of files in Python

  • 2021-01-25 07:46:14
  • OfStack

Today I wanted to merge several txt files into a single file. Writing with f.write, I found that after the program finished, the output file, which should have held around 100,000 lines, actually contained only about 4,000.

After searching online, I learned that the program finishes so quickly that the file is closed before the buffered content has been fully written out to disk.
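The buffering effect is easy to reproduce. A minimal sketch (demo.txt is a hypothetical file name): data handed to write() can sit in Python's internal buffer, so the file on disk appears empty until flush() or close() is called.

import os

f = open('demo.txt', 'w')
f.write('hello\n')
print(os.path.getsize('demo.txt'))  # typically prints 0: the data is still buffered
f.close()                           # close() flushes the buffer
print(os.path.getsize('demo.txt'))  # now prints 6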

Method 1: Flush the buffer explicitly


import os

f.flush()             # flush Python's internal buffer to the OS
os.fsync(f.fileno())  # force the OS to write its own buffers out to disk
f.close()

Call flush() after writing and os.fsync() before closing the file to ensure that the contents of the buffer actually reach the disk.
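Applied to the merge task, Method 1 might look like the following sketch; dir, all_txt_name and outputfile are the names used in the Method 2 loop below.

import os

with open(outputfile, 'a') as output:
    for name in all_txt_name:
        with open(dir + '/' + name) as f:
            for line in f:
                output.write(line)
    output.flush()             # push Python's buffer down to the OS
    os.fsync(output.fileno())  # ask the OS to commit its buffers to disk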

Method 2: Use sleep

A plain write loop doesn't work here, because iterating and writing to output right after opening the file can overrun the buffer. Adding a sleep call inside the loop ensures the contents get written on each iteration.


import time

with open(outputfile, 'a') as output:
    for i in all_txt_name:
        f = open(dir + '/' + i)
        for a in f:
            output.write(a)
            time.sleep(0.00000001)  # tiny pause so the buffer can drain
        f.close()

It was then discovered that the merged txt file was simply too large: PyCharm only displays a small part of a file that size, so the "missing" lines were a display artifact rather than a write failure.
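Since PyCharm truncates what it shows, a quick sanity check is to count the lines programmatically instead of trusting the editor. A minimal sketch, with merged.txt standing in for the actual output file:

with open('merged.txt') as f:
    print(sum(1 for _ in f))  # prints the true number of lines on disk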

