Question:
Java - Efficient way to read large number of double values from text file?
msparko
2009-01-05 00:58:34 UTC
I'm looking for a way to parse a text file efficiently in Java. The file consists of over a million double values. I tried using Double.parseDouble() and also the Scanner.nextDouble() method, but both seem to be very inefficient; the parsing takes a long time. Is there a faster way of doing this? Thanks in advance.
Five answers:
Blackcompe
2009-01-05 01:47:11 UTC
import java.io.*;
import java.util.*;

public class Parser
{
    public static void main(String[] args)
    {
        System.out.println("starting timer .....");
        long startTime = 0, endTime = 0;

        try {
            // Buffering avoids one system call per writeDouble()
            DataOutputStream dos = new DataOutputStream(
                    new BufferedOutputStream(new FileOutputStream("out.txt")));

            double b = 0.0;
            startTime = System.currentTimeMillis();
            for (int i = 0; i < 1000000; i++)
            {
                dos.writeDouble(b);
                b += 1.0;
            }
            endTime = System.currentTimeMillis();
            // Closing the DataOutputStream also closes the underlying stream
            dos.close();
        } catch (Exception e) { // Catch exception if any
            System.err.println("Error: " + e.getMessage());
        }

        System.out.println("finished writing 1 million doubles as bytes to out.txt");
        System.out.println("elapsed write time = " + (endTime - startTime) + " ms");

        List<Double> list = new ArrayList<Double>();
        try {
            DataInputStream dis = new DataInputStream(
                    new BufferedInputStream(new FileInputStream("out.txt")));

            boolean EOF = false;
            startTime = System.currentTimeMillis();
            while (!EOF)
            {
                try
                {
                    list.add(dis.readDouble());
                } catch (EOFException e) { EOF = true; }
            }
            endTime = System.currentTimeMillis();
            dis.close();
        } catch (Exception e) { // Catch exception if any
            System.err.println("Error: " + e.getMessage());
        }

        System.out.println("finished reading 1 million doubles from out.txt");
        System.out.println("array list size = " + list.size());
        System.out.println("elapsed read time = " + (endTime - startTime) + " ms");
    }
}
snabby
2009-01-05 01:22:58 UTC
If this is for repeated parsing, you may want to consider using serialization instead of a text file.

However, Scanner.nextDouble() is probably about as efficient as you're going to get for text input. Remember, it has to parse a string and convert that representation to a double, and the double representation is far more complex than the integral types.
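The serialization idea above can be sketched roughly as follows. The file name `doubles.ser` and the use of try-with-resources (a Java 7 feature that postdates this thread) are editorial assumptions, not part of the original answer:

```java
import java.io.*;

public class SerializedDoubles {
    // Write the whole array as one serialized object; no text involved
    static void write(double[] values, String path) throws IOException {
        try (ObjectOutputStream out = new ObjectOutputStream(
                new BufferedOutputStream(new FileOutputStream(path)))) {
            out.writeObject(values);
        }
    }

    // Read the array back in a single readObject() call
    static double[] read(String path) throws IOException, ClassNotFoundException {
        try (ObjectInputStream in = new ObjectInputStream(
                new BufferedInputStream(new FileInputStream(path)))) {
            return (double[]) in.readObject();
        }
    }

    public static void main(String[] args) throws Exception {
        double[] data = new double[1000000];
        for (int i = 0; i < data.length; i++) data[i] = i;
        write(data, "doubles.ser");
        double[] back = read("doubles.ser");
        System.out.println(back.length);
    }
}
```

Serializing a primitive double[] sidesteps both text parsing and per-element Double boxing, which is where much of the cost of the text approach tends to go.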
deonejuan
2009-01-05 01:46:48 UTC
I personally have done about 800,000 numbers the way you are doing it. Have you tried reading the text file into an ArrayList bigDoubles, THEN going through bigDoubles with another ArrayList myDoubles?

myDoubles.add( bigDoubles.get(i).doubleValue() );

(be sure to set bigDoubles = null; afterwards)

or

keep the ArrayList bigDoubles and use the values from RAM when you specifically need one.

I don't know for sure about any speed gain, but having the Strings already in memory should speed up execution considerably.

And I'm just guessing. Interesting problem. I'm not at my computer this week or I would experiment.
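If the data has to stay as text, most of the overhead in the original approach is typically Scanner and Double boxing rather than Double.parseDouble() itself. A minimal sketch that parses straight into a primitive array instead of an ArrayList (the file name `values.txt` and the one-number-per-line layout are assumptions for illustration):

```java
import java.io.BufferedReader;
import java.io.FileReader;
import java.io.IOException;
import java.util.Arrays;

public class TextDoubles {
    // Parse one double per line into a primitive array, growing as needed
    static double[] parse(String path) throws IOException {
        double[] out = new double[1024];
        int n = 0;
        try (BufferedReader br = new BufferedReader(new FileReader(path))) {
            String line;
            while ((line = br.readLine()) != null) {
                line = line.trim();
                if (line.isEmpty()) continue;
                if (n == out.length) out = Arrays.copyOf(out, n * 2);
                out[n++] = Double.parseDouble(line);
            }
        }
        return Arrays.copyOf(out, n); // trim to actual count
    }

    public static void main(String[] args) throws IOException {
        double[] values = parse("values.txt");
        System.out.println("parsed " + values.length + " doubles");
    }
}
```

A double[] holds the values unboxed, so there is no per-element object allocation the way there is with ArrayList of Double.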
P34C3
2009-01-05 03:47:58 UTC
You could try experimenting with Java's nio package (available since Java 1.4). Try using a DoubleBuffer over a memory-mapped file and see if your throughput gets any better.
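A rough sketch of the memory-mapping idea, assuming the doubles are stored in raw big-endian binary form (the same layout DataOutputStream produces); the class and file names are placeholders:

```java
import java.io.BufferedOutputStream;
import java.io.DataOutputStream;
import java.io.FileOutputStream;
import java.io.RandomAccessFile;
import java.nio.DoubleBuffer;
import java.nio.MappedByteBuffer;
import java.nio.channels.FileChannel;

public class MappedDoubles {
    // Write doubles in big-endian binary, matching ByteBuffer's default order
    static void write(double[] values, String path) throws Exception {
        try (DataOutputStream out = new DataOutputStream(
                new BufferedOutputStream(new FileOutputStream(path)))) {
            for (double v : values) out.writeDouble(v);
        }
    }

    // Map the file into memory and view it as a DoubleBuffer: no parsing,
    // the OS pages the bytes in on demand
    static double[] readAll(String path) throws Exception {
        try (RandomAccessFile raf = new RandomAccessFile(path, "r");
             FileChannel ch = raf.getChannel()) {
            MappedByteBuffer map = ch.map(FileChannel.MapMode.READ_ONLY, 0, ch.size());
            DoubleBuffer db = map.asDoubleBuffer();
            double[] out = new double[db.remaining()];
            db.get(out);
            return out;
        }
    }
}
```

Because asDoubleBuffer() is just a view over the mapped bytes, reading a million doubles this way is mostly a bulk memory copy rather than a million parse calls.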
mondragon
2016-10-20 16:05:20 UTC
Well, you know what the vowels are, right? So you parse through the input, testing for the occurrence of your condition (i.e., being a vowel). Don't hate because I don't post the code; you post yours first and we'll help debug. It's not my assignment.


This content was originally posted on Y! Answers, a Q&A website that shut down in 2021.