Can any Corona / software developers cast their eyes over this for me please....

Hi Guys,

             Thank you for looking at my post; let me explain my situation :).

Background

The app in question is called Check Your Change, currently published in the Google Play Store.

We have about 20,000 active users and over 100k installs over roughly three years. It’s a coin-based app that lets UK coin collectors mark off the coins they find in their change. We update the app regularly, but being a one-man band with limited coding knowledge (basic Corona/Lua but proficient Swift/Xcode), we store all the data locally on the device and just manually update a pre-loaded database each time.

The Problem

The problem is that during the update process, an increasing number of users are losing their data. It’s a low number, under 2% of reported cases, but it’s big enough to cause us problems and we really want to fix it for our users.

The Update Process

Not having much experience with database migrations, and with limited info on how to persist data during updates, I ended up with the following solution:

A master database contains all of our coin data.

It is stored in the resource directory.

When a user opens the app for the first time, the database is copied to the user’s documents directory.

Users manipulate data through the app and it writes to their database in the documents directory.

During an update, I check a column in the databases called version. If the versions don’t match, an update is performed…

The user database is read and any data that needs to be persisted is written to arrays.

The user database is deleted.

The master database (with a higher version than the user database had) is copied to the user’s documents directory.

The data in the arrays is then written to the newly copied database, by looping through the arrays and updating it row by row.
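For concreteness, the process above might look roughly like this in Corona/Lua. This is only a sketch: `copyMasterDatabase` is a hypothetical helper that copies the bundled master DB into the documents directory, and the file, table, and column names are assumptions based on this thread.

```lua
local sqlite3 = require("sqlite3")

local userDbPath = system.pathForFile("coins.db", system.DocumentsDirectory)

local function runUpdate(copyMasterDatabase)
    -- 1. Read the user-editable columns into memory
    local saved = {}
    local db = sqlite3.open(userDbPath)
    for row in db:nrows(
        "SELECT id, acquired, used_quantity, new_quantity, notes FROM coins") do
        saved[#saved + 1] = row
    end
    db:close()

    -- 2. Replace the user DB with the fresh master copy
    os.remove(userDbPath)
    copyMasterDatabase()

    -- 3. Write the saved values back, inside a single transaction
    db = sqlite3.open(userDbPath)
    db:exec("BEGIN;")
    local stmt = db:prepare(
        "UPDATE coins SET acquired=?, used_quantity=?, new_quantity=?, notes=? WHERE id=?")
    for _, r in ipairs(saved) do
        stmt:bind_values(r.acquired, r.used_quantity, r.new_quantity, r.notes, r.id)
        stmt:step()
        stmt:reset()
    end
    stmt:finalize()
    db:exec("COMMIT;")
    db:close()
end
```

Note the window between steps 1 and 3 where the user data exists only in RAM: if the app is killed there, the data is gone.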

My Theories

Is this totally the wrong way to update user data?

Am I doing something incredibly stupid in my code that means a database is being closed before the loops have finished working through the user data?

Is it a limitation of Corona?

Is it something database related?

Is it something Android system related? (there appears to be no pattern to the devices involved)

Is it something to do with not being able to multi thread?

I have had problems with the activityIndicator behaving incorrectly during the update too. Originally it was timer based, but I then reworked it all to try and resolve the issue.

I have spent many sleepless nights and hours looking into this problem and I can never replicate it. We beta test each update with about 10 devices and have never reproduced it, but there is a hidden daemon in there somewhere, so I thought I would try reaching out to see if anyone has any ideas or advice?

I have attached my code with comments for the update process if anyone would be kind enough to have a look.

Kind Regards,

Mike.

hi,

if i understand this correctly, when you say the persistent data is written to arrays, does that mean it is loaded into an array in memory, held there until the database has been replaced with the new version, and then inserted back?

If so, that is risky business, and i can well imagine that 2% of your users experience issues with this.

A safer way would be to never rely on RAM alone, but to ensure critical data always remains on disk, in case of any failure or other “hiccup” a device may experience. I’d probably copy the new db to documents with a new name, then read user data from the old database and store it in the new one. I’d even run a verify to be sure, before you either delete or rename (db_v3) the old one, and rename the new updated one to db.
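A minimal sketch of that copy → migrate → verify → swap flow. All file names are assumptions, and the three helpers (`copyMasterTo`, `migrateUserData`, `verify`) are hypothetical: copy the bundled master, copy the user rows across, and compare the two databases.

```lua
local docs       = system.DocumentsDirectory
local livePath   = system.pathForFile("coins.db", docs)
local newPath    = system.pathForFile("coins_new.db", docs)
local backupPath = system.pathForFile("coins_old.db", docs)

local function safeUpdate(copyMasterTo, migrateUserData, verify)
    copyMasterTo(newPath)                -- new master lands under a temp name
    migrateUserData(livePath, newPath)   -- user rows copied old -> new
    if verify(livePath, newPath) then
        os.rename(livePath, backupPath)  -- user data stays on disk throughout
        os.rename(newPath, livePath)     -- the verified DB takes the live name
    else
        os.remove(newPath)               -- original DB was never touched
    end
end
```

The key property is that at no point is the user's data only in memory: the old file survives on disk until the new one has been written and verified.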

That said, for 5-6 USD per month and 20 USD per year (for a Lua API), you can have your own database online and read/write directly to it at all times, if that would serve your purpose.

anaqim

one more thing, if you experience delays when inserting/updating rows in sqlite3 on devices, make sure you open a db connection first, then loop through the inserts/updates, then close the db connection.

it is way faster because you only open the db once.

if you just loop a bunch of inserts/updates, every single call will first open a database connection, insert/update one row, then close the database connection.

that can very well be experienced as a “freeze” on some devices.
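On top of reusing one connection, wrapping the loop in a single transaction usually matters even more: outside a transaction, SQLite commits (and syncs to flash) after every single statement. A sketch, assuming a `coins` table and an in-memory `updates` array:

```lua
local sqlite3 = require("sqlite3")

local db = sqlite3.open(system.pathForFile("coins.db", system.DocumentsDirectory))
local stmt = db:prepare("UPDATE coins SET notes = ? WHERE id = ?")

db:exec("BEGIN;")
for _, r in ipairs(updates) do   -- updates = { {id = 1, notes = "..."}, ... }
    stmt:bind_values(r.notes, r.id)
    stmt:step()
    stmt:reset()
end
db:exec("COMMIT;")               -- one disk sync for the whole batch

stmt:finalize()
db:close()
```

A prepared statement with bound values also avoids re-parsing the SQL on every iteration and sidesteps quoting problems in the values.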

Hi Anaqim,

                  Thank you for taking the time to reply and for your input!

Yes, in answer to “does that mean it is loaded to an array in memory”: it loops through the user’s database of about 600 rows, loading the ones the user has manipulated into arrays. Once the loop has finished, it deletes the user’s database, copies a new one from resources, then loops through the arrays writing the data back into the database.

It’s almost like something is closing or finishing early, maybe due to system resources, or the next piece of code is running before the first bit has finished, but it’s so inconsistent and I’ve not been able to replicate it in a test environment.

I like your idea of keeping their database and introducing a third with a verification process. I can see that if the verification fails I can fall back to their old DB so they don’t lose data, but beyond that I can’t foresee a recovery process :(.

I think in that situation the same mysterious issue may occur, unless it’s safer to write from one DB to another? Is that possible in Corona, like having two database connections open at once, looping through one set of results and writing to the other database? That could work!?

What’s this Lua API you mentioned? I’d very much like to take a look at that :).

I just saw your second reply too - I learnt the hard way about that ;). I was very careful to make sure I wasn’t opening a DB connection in a loop!

Thank you again, I can’t tell you how good it feels to have some feedback and ideas.

Kind Regards,

Krivvenz.

You absolutely can have multiple database connections open at once. Are you transforming the database structure dramatically during updates? If it’s a case of adding/removing a few columns, you could just do that to the existing DB.

I also keep a backup before I do any serious work on the DB, set a processing flag to true in the ‘live’ version and if that flag is never set to false again, I restore from the backup.
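That flag pattern might look something like this (a sketch only; the one-row `meta` table, the `copyFile` helper, and the file names are all assumptions):

```lua
local sqlite3 = require("sqlite3")

local docs       = system.DocumentsDirectory
local livePath   = system.pathForFile("coins.db", docs)
local backupPath = system.pathForFile("coins_backup.db", docs)

local function guardedUpdate(copyFile, doUpdate)
    copyFile(livePath, backupPath)              -- hypothetical copy helper
    local db = sqlite3.open(livePath)
    db:exec("UPDATE meta SET processing = 1;")  -- mark the work as in progress
    db:close()

    doUpdate()                                  -- the risky migration work

    db = sqlite3.open(livePath)
    db:exec("UPDATE meta SET processing = 0;")  -- only reached on success
    db:close()
end

-- On every launch: if meta.processing is still 1, the last update died
-- part-way through, so delete coins.db and restore coins_backup.db.
```

If the app is killed mid-update, the flag never gets cleared, and the next launch detects this and restores from the backup.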

Hi Mike,

With only 600ish rows i’d look into doing something like this:

Why not keep the database as it is permanently, and instead add a new table to it for each version you release.

That way you can easily merge data while keeping full user retention, in case of ever needing to roll back or recover user data.

If you go to the marketplace, you can look up “coronium core”. It works wonders for me and I especially love being able to write server side code, in lua. Just make sure you implement a good level of security in your code, to make it tamper/hacker resistant.

Happy to share mate :slight_smile:

If you do decide to go for coronium core, i can give you some pointers regarding the security.

Been working a lot with this lately  :slight_smile:

Hi Nick,

            Thanks for the response!

During the update, there is no real structure change.

The newly copied master database may contain more rows if new coins have been added or slightly adjusted information from already existing fields but there are never any new columns or tables.

As an example… The master database contains about 600 rows of coin data in a coins table with about 15 columns consisting of things like date, mintage, description, obverse_image, reverse_image, acquired, used_quantity, new_quantity, notes… etc.

During an update we might add 5 new rows and update a few pieces of information for 30 or so already existing rows, like the mintage figures or a typo in a description.

There are only 4 pieces of information which the user can change… acquired, used_quantity, new_quantity and notes.

This is the data that gets stored in the arrays, then written back into the fresh master copy in the documents directory.

In the arrays I always store the coin id against the piece of information so when it comes to writing back to a new master… I just update the columns I need to using the stored id and the piece of information, if that makes sense :/.

Kind Regards,

Krivvenz.

Every time something changes in the master DB that ships with the app, you could store the insert/update command that generated the change, along with the version number. If the version is greater than the user’s, loop through the commands to update their DB.
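That could be as simple as a version-keyed table of statements. A sketch; the SQL values are made-up examples, and the one-row `meta` version table is an assumption:

```lua
local migrations = {
    [4] = {
        "UPDATE coins SET mintage = 210000 WHERE id = 17;",
        "INSERT INTO coins (id, date, description) VALUES (601, '2019', 'New coin');",
    },
    [5] = {
        "UPDATE coins SET description = 'Corrected description' WHERE id = 42;",
    },
}

-- Apply every migration between the user's version and the app's version,
-- as one all-or-nothing transaction: a crash mid-way rolls back cleanly.
local function upgrade(db, fromVersion, toVersion)
    db:exec("BEGIN;")
    for v = fromVersion + 1, toVersion do
        for _, sql in ipairs(migrations[v] or {}) do
            db:exec(sql)
        end
    end
    db:exec("UPDATE meta SET version = " .. toVersion .. ";")
    db:exec("COMMIT;")
end
```

Because it walks every version in order, a user who skipped several app updates still gets each migration applied exactly once.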

Hi Nick,

             That too sounds like a good solution. I can see how having something like a switch statement on the database version number, along with all the update statements for that particular version, would work. The only problem is that it would involve a lot of typing out statements to amend the values, which I currently do directly in DB Browser for SQLite.

It’s certainly a possibility, but in each update there could be 30–50 updates along with whole new rows added, and maybe it’s my inexperience with databases, but to me it sounds quite messy? (Never having been involved in a proper database migration, maybe this is industry-standard practice and that’s where my problem lies?!)

Someone else suggested the following but I’m not sure if its doable…

Basically it involves separating out the user data into a new table.

user_data_table

coin_id (FK), acquired, new_quantity, used_quantity and notes

coin_table

id (PK), desc, image_reverse, image_obverse, mintage, etc…

Then during an update, just perform a DROP on the coin table and COPY the new coin table from the master database. 

It sounds quite good in theory, but having a quick look I can’t actually see any commands to easily copy a whole table from one DB to another in Corona SDK… I’ll keep researching though. I think if anything happened during the DROP there would be no way to roll back either :(.
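For what it's worth, plain SQL can do this via `ATTACH DATABASE`: once a second database file is attached, `CREATE TABLE ... AS SELECT` copies a whole table across, and because SQLite's DDL is transactional, wrapping the drop and copy in one transaction means a crash rolls the whole thing back. A sketch (paths assumed; note that `CREATE TABLE AS` does not copy indexes or constraints, and on Android the bundled master would likely need copying out of the resource directory before it can be attached):

```lua
local sqlite3 = require("sqlite3")

local db = sqlite3.open(system.pathForFile("coins.db", system.DocumentsDirectory))
local masterPath = system.pathForFile("master.db", system.DocumentsDirectory)

-- Make the new master visible to the user's DB under the alias "master"
db:exec(string.format("ATTACH DATABASE '%s' AS master;", masterPath))
db:exec([[
    BEGIN;
    DROP TABLE IF EXISTS coin_table;
    CREATE TABLE coin_table AS SELECT * FROM master.coin_table;
    COMMIT;
]])
db:exec("DETACH DATABASE master;")
db:close()
```

The user_data_table in the same file is never touched, which is the appeal of splitting user data out in the first place.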

Hmmm what to try!!!

Kind Regards,

Mike.

As for typing statements, there is a program called sqldiff into which you should be able to feed the previous and current versions of your master DB, and it will spit out the statements required to bring them in line. Not tried it myself though.
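For reference, sqldiff ships with the SQLite command-line tools, and usage is along these lines (file names are examples):

```shell
# Emit the SQL statements needed to turn old_master.db into new_master.db
sqldiff old_master.db new_master.db > migration.sql
```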

If I were developing an app like this, I would perhaps consider not updating the master database at all, and store any user updates in a separate db.  Basically, use the master db as a read-only db, open the user’s db and grab the user data based on a key.

there should never be a single nanosecond where persisted (user) data isn’t somewhere on disk. rename or copy the current database to a backup for the duration of the update operation. only after the new/updated version has been written AND read-verified should you remove the backup. if any error occurs during the update process, do the opposite: remove the update and restore the backup.
