A simple example of reading large files in Java
- 2020-04-01 02:36:50
- OfStack
I want to extract useful data from a text file.
The text file is over 200 MB.
Is it possible to build a cache and extract the useful data paragraph by paragraph? How should I go about it?
---------------------------------------------------------------
Oh, 200 MB...
In Java, memory-mapped files can be used to manipulate large files.
A single mapped region can be up to 2 GB, because a MappedByteBuffer is indexed by int and its size is limited to Integer.MAX_VALUE bytes.
Here is a simple example; for more detail, see the Java API docs or related materials:
import java.io.RandomAccessFile;
import java.nio.MappedByteBuffer;
import java.nio.channels.FileChannel;

public class LargeMappedFiles {
    static int length = 0x8FFFFFF; // about 144 MB

    public static void main(String[] args) throws Exception {
        // try-with-resources closes the file when done
        try (RandomAccessFile file = new RandomAccessFile("test.dat", "rw")) {
            MappedByteBuffer out = file.getChannel()
                    .map(FileChannel.MapMode.READ_WRITE, 0, length);
            for (int i = 0; i < length; i++)
                out.put((byte) 'x');            // fill the mapped region
            System.out.println("Finished writing");
            for (int i = length / 2; i < length / 2 + 6; i++)
                System.out.print((char) out.get(i)); // read back 6 bytes from the middle
        }
    }
}
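Since the asker's file is text and only a little over 200 MB, a plain streaming read also fits the "paragraph by paragraph" requirement without mapping anything: a BufferedReader holds only one paragraph in memory at a time. Below is a minimal sketch, assuming paragraphs are separated by blank lines; the class and method names are illustrative, not from any standard API.

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.List;
import java.util.function.Consumer;

public class ParagraphReader {
    // Streams a text file paragraph by paragraph (blank-line separated),
    // holding only the current paragraph in memory.
    static void forEachParagraph(Path file, Consumer<String> action) throws IOException {
        StringBuilder paragraph = new StringBuilder();
        try (BufferedReader reader = Files.newBufferedReader(file)) {
            String line;
            while ((line = reader.readLine()) != null) {
                if (line.isEmpty()) {
                    // blank line ends the current paragraph
                    if (paragraph.length() > 0) {
                        action.accept(paragraph.toString());
                        paragraph.setLength(0);
                    }
                } else {
                    if (paragraph.length() > 0) paragraph.append('\n');
                    paragraph.append(line);
                }
            }
        }
        if (paragraph.length() > 0) action.accept(paragraph.toString()); // last paragraph
    }

    public static void main(String[] args) throws IOException {
        // Demo with a small temporary file; a 200 MB file works the same way.
        Path file = Files.createTempFile("demo", ".txt");
        Files.write(file, List.of("first line", "second line", "", "next paragraph"));
        forEachParagraph(file, p -> System.out.println("[" + p.replace("\n", " / ") + "]"));
        Files.delete(file);
    }
}
```

Passing a Consumer instead of collecting all paragraphs into a list keeps memory use flat no matter how large the file is.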