File read - takes all CPU?

Hi, I am trying to load a file at runtime, but somehow everything else stops while I do it. The file is around 500 KB, and I was hoping that someone here could assist me or point me in the right direction.

While the file is loaded line by line, I want it to report a status for each line - but that never happens until the whole file has been read?

-- create a file path for corona i/o
local path = system.pathForFile( filename, nil )

-- will hold the contents of the file
local contents = ""

-- io.open opens a file at path; returns nil if no file is found
local file = io.open( path, "r" )

if file then
    -- read line by line so that we can report progress
    for line in file:lines() do
        -- append the line to the result (lines() strips the newline, so add it back)
        contents = contents .. line .. "\n"

        -- this only shows up after all lines are done, not for every line...
        if stage == "F" then
            -- just dummy sample code
            myParallax.position.x = myParallax.position.x + xx
        end
    end

    io.close( file )  -- close the file after using it
end

return contents
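
(A possible non-blocking variant, as a sketch: Corona runs Lua on the main thread, so nothing on screen updates until the loop above returns. Reading a batch of lines per frame via the Runtime "enterFrame" event lets the display refresh between batches. loadWithProgress, onLine, and onDone are hypothetical names, not part of the original code.)

local function loadWithProgress( path, onLine, onDone )
    local file = io.open( path, "r" )
    if not file then return end
    local nextLine = file:lines()  -- iterator over the file's lines

    local function step()
        for _ = 1, 50 do  -- lines handled per frame; tune to taste
            local line = nextLine()
            if line == nil then
                io.close( file )
                Runtime:removeEventListener( "enterFrame", step )
                if onDone then onDone() end
                return
            end
            if onLine then onLine( line ) end
        end
    end

    Runtime:addEventListener( "enterFrame", step )
end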

The file I am reading from has one record per line. I know how many lines there are in the file, but there seems to be no API I could use to fetch a specific line directly.

e.g.

local myLine10 = file:lines(10);  
print("Line 10 contains: " .. myLine10)  

would have solved my issue. Any suggestions, or am I totally lost here?
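
(For reference: stock Lua io has no random access by line number; the closest is iterating until line n. A minimal sketch, with getLine being a made-up helper name:)

-- walk the file until line n is reached (io has no random line access)
local function getLine( path, n )
    local i = 0
    for line in io.lines( path ) do
        i = i + 1
        if i == n then return line end
    end
    return nil  -- the file has fewer than n lines
end

print( "Line 10 contains: " .. ( getLine( path, 10 ) or "<missing>" ) )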

Joakim

Do not read the file line by line.

Read the file as a whole and parse the resulting string (e.g. you can search for "\r\n" or "\n" line breaks).

The performance increase would be significant.
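
(A minimal sketch of that approach, assuming newline-separated records; as a bonus, the resulting table gives the indexed line access asked about above:)

local file = io.open( path, "r" )
local contents = file:read( "*a" )  -- "*a" reads the whole file at once
io.close( file )

-- split the string into a table of lines
local lines = {}
for line in contents:gmatch( "[^\r\n]+" ) do
    lines[#lines + 1] = line
end
print( "Line 10 contains: " .. ( lines[10] or "<missing>" ) )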

Well, I tried that too, but that also freezes the app while the file loads. It would be best if I could read line by line and report back to the app.

Joakim

http://www.lua.org/pil/21.2.1.html

Maybe this would help you?
Instead of reading line by line, you can read the file in chunks of a specific size…
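
(Roughly the technique from that chapter, as a sketch; the 8 KB buffer size is the book's example, not a requirement:)

local file = io.open( path, "r" )
local pieces = {}
while true do
    local block = file:read( 2^13 )  -- read 8 KB per call instead of one line
    if not block then break end
    pieces[#pieces + 1] = block
end
io.close( file )
local contents = table.concat( pieces )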

Hmmm, I just realized that the problem is not the file reading at all - it is the JSON parsing that takes too much time… I guess I have to look for another approach…

Joakim

Try to only load or read files at the beginning…

@Martin.Edmaier - I tried that, but we are working with over 100K of data, and that brought the device to its knees ;)

I solved it anyway: I got rid of the JSON format and decoding and made my own loader. Loading is now five times faster. :)
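
(The post does not show the custom format, but the idea might look like this: a hypothetical key|value record per line, parsed with string.match instead of a JSON decoder.)

local data = {}
for line in io.lines( path ) do
    local key, value = line:match( "^([^|]+)|(.*)$" )
    if key then data[key] = value end
end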

J
