PHP notes: reading and writing large files efficiently

  • 2020-06-01 08:44:31
  • OfStack

These days I have been working on one task: reading files with a very large number of lines (on the order of millions) in PHP. With efficiency in mind, I ran a few simple experiments; the notes below summarize the results.

Point 1. The efficiency of the file() function.

The file() function is very inefficient on a regular, line-oriented file (one record per line), so avoid it where you can.

Instead, read the whole file in one go with file_get_contents() and split it with explode().

Here's an example: given a file laid out as one value per line, reading it with file($file) takes a long time.

Using explode("\n", file_get_contents($file)) instead is a lot faster.
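The comparison above can be sketched as follows. This is a minimal, self-contained example: it builds a small temporary sample file as a stand-in for the real multi-million-line file, since the original data is not available.

```php
<?php
// Build a small sample file (stand-in for the multi-million-line file):
// one value per line.
$file = tempnam(sys_get_temp_dir(), 'big');
file_put_contents($file, implode("\n", range(1, 1000)));

// Slow route on huge files: file() allocates the result array line by line.
$viaFile = file($file, FILE_IGNORE_NEW_LINES);

// Faster route: one read, one split.
$viaExplode = explode("\n", file_get_contents($file));

unlink($file);
```

Both calls yield the same 1000-element array here; the difference only becomes dramatic at millions of lines.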

Point 2. Array traversal.

Once the data has been read into an array, all I need is to check whether a given value (say, 44444) is present. The first thing that comes to mind is in_array().

In testing, however, in_array() turned out to be very slow. Borrowing an idea from other people's code, I flipped the array so that the values become keys (with every value set to 1), which lets the lookup be done with isset() instead of a scan.

If the array is very large and only part of its data is actually used, it is also worth extracting just the subset you traverse; that improves efficiency further.
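A short sketch of the flip-then-isset trick. The sample values are invented for illustration; the article only names 44444 as the needle.

```php
<?php
// Sample haystack (hypothetical data, one value per original line).
$values = ['11111', '22222', '33333', '44444'];

// Slow when repeated over a huge array: in_array() scans linearly.
$foundSlow = in_array('44444', $values);

// Flip once: values become keys, so each lookup is a hash probe via isset().
$lookup = array_flip($values);
$foundFast = isset($lookup['44444']);
```

array_flip() makes the old keys the new values; the article's variant of setting every value to 1 (e.g. with array_fill_keys()) works the same way, since only the keys matter for isset().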

Point 3. Storing arrays.

The computed data needs to be saved to a file. I considered three methods: writing it out directly as a PHP file, serializing it, and encoding it as a JSON string.

The first way: concatenate "<?php ", var_export($var, true), ";", and "?>" into a string, write it to a file, and save it as PHP.

require it in when needed.
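A runnable sketch of the first method. Using a `return` statement rather than assigning a variable is a small deviation from the article's literal concatenation, chosen so that require can hand the data straight back; the temp-file path is an assumption.

```php
<?php
// Method 1: dump the array as executable PHP with var_export(),
// then load it back with require.
$var = ['a' => 1, 'b' => 2];
$cacheFile = tempnam(sys_get_temp_dir(), 'cache');

file_put_contents($cacheFile, '<?php return ' . var_export($var, true) . ';');

// Later, when the data is needed:
$restored = require $cacheFile;
unlink($cacheFile);
```

One upside of this method in production is that opcode caches can keep the parsed file in memory; the downside, per the article's benchmark, is the cost of parsing PHP source on a cold load.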

The second way: serialize() the variable and write the result with file_put_contents(); call unserialize() when the data is needed.
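The second method in full, as a minimal sketch (the temp-file path is an assumption):

```php
<?php
// Method 2: serialize() the variable, write it out, unserialize() on read.
$var = ['a' => 1, 'b' => 2];
$serFile = tempnam(sys_get_temp_dir(), 'ser');

file_put_contents($serFile, serialize($var));
$restoredSer = unserialize(file_get_contents($serFile));
unlink($serFile);
```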

The third way is like the second, except the variable is written out as a JSON string with json_encode() and read back with json_decode().

In my tests, the second method (serialize) was the fastest; the third (JSON) came a close second, nearly matching it; the first (var_export + require) was the slowest.
