Import/Export JSON files between users

Hello there,

I’m currently struggling with file sharing between users, as I haven’t implemented any account system (setting one up seems very complex and I’m a newbie to development).

What I want to do :
Share JSON files between users.

Where I am with it:
For now I chose to share them by email (as I didn’t find any other suitable solution).
The export function should be OK quite soon, based on the “native.showPopup” email solution (which handles attachments).
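For reference, my export side looks roughly like this, following Solar2D’s `native.showPopup( "mail" )` API. The recipient address and filename are just placeholders:

```lua
-- Sketch of the email export via Solar2D's mail popup.
-- "friend@example.com" and "data.json" are placeholders.
local options = {
    to = { "friend@example.com" },
    subject = "Shared game data",
    body = "JSON file attached.",
    attachment = {
        { baseDir = system.DocumentsDirectory, filename = "data.json", type = "application/json" }
    }
}
native.showPopup( "mail", options )
```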
For the import I’m stuck…
My idea is to fetch those files from the “Download” directory and remove them right after import, but I don’t know how to do that.
Could anyone help with that?
Did I miss something obvious? If so, I’m sorry in advance :wink:

Many thanks for your help!
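For what it’s worth, the read-then-delete part of the idea can be sketched with Solar2D’s file APIs. Note this is only a sketch, and it assumes the file sits somewhere the app can actually read; on-device sandboxing usually limits you to directories like `system.DocumentsDirectory`, not the system “Download” folder:

```lua
local json = require( "json" )

-- Read a JSON file, decode it, and delete it afterwards.
-- Assumes the file is in a directory the app is allowed to access.
local function importAndRemove( filename, baseDir )
    local path = system.pathForFile( filename, baseDir )
    local file, errorString = io.open( path, "r" )
    if not file then
        print( "Could not open file: " .. tostring( errorString ) )
        return nil
    end
    local contents = file:read( "*a" )
    io.close( file )
    os.remove( path )  -- remove the file once its contents are in memory
    return json.decode( contents )
end

local data = importAndRemove( "data.json", system.DocumentsDirectory )
```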

Your best bet is some free/cheap web hosting; use that to upload to and download from.

Hmm, do you have a free web hosting solution in mind?

Done this way, I’d somehow have to go through an interface to browse the files on the hosting server…
Sounds like a lot of trouble LOL

So there’s no way to do what I planned, if I understand correctly?

edit: Thinking about it again, I could pass a link as a variable and then download the file that way :slight_smile: That could do the trick. It’s still painful if there are quite a few files to download… so it doesn’t completely solve my issue.
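That link-as-variable idea maps directly onto Solar2D’s `network.download()`. A minimal sketch (the URL is hypothetical):

```lua
-- Download a shared JSON file from a known URL into the app's sandbox.
local function networkListener( event )
    if event.isError then
        print( "Download failed" )
    elseif event.phase == "ended" then
        print( "File saved to: " .. event.response.filename )
    end
end

network.download(
    "https://example.com/shared/data.json",  -- hypothetical shared link
    "GET",
    networkListener,
    "data.json",
    system.DocumentsDirectory
)
```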

Here is my very basic lib for uploading JSON data to a database and downloading it again via an ID key.

Please note, it is very basic and the PHP side is not great; for a start, it should use stored procedures. But it should get you going.

Hello Graham,

I gave it quite a quick look, I must confess… I’m not sure how it works, nor how to use it yet…
Do I need a URL of my own to push the files to, or does it target a particular cloud by default?
By any chance, is there a sample of working code (upload and download) I can start from?

Sorry for the dumb questions :slight_smile:

Just wondered if by any chance there is an easy way to use Google Drive or Amazon Drive to do it?

I read through the forum and saw some old posts that mention plugins I can’t find… or, even worse, functions integrated into Corona that seem to have disappeared with Solar2D…
Any thoughts?

Download the code and shove it in a folder called ‘cloud’ in the root of your project, then in your code require it like this:

require( "cloud.core" )

Later on, call init, passing in the URLs of the upload.php and download.php files that you’ve put on your web host, like this:

Scrappy.Cloud:init
{
	apiUrl = { upload = "https://www.example.com/upload.php", download = "https://www.example.com/download.php" }
}

To upload stuff you’ll call this:

local onComplete = function( id )

   print( "Data saved with ID: ", id )

end

local data = 
{
   level = 10,
   score = 100
}
Scrappy.Cloud:upload( data, onComplete )

Then later on you can download it like this:

local onComplete = function( data )
	print( "Level:" , data.level, " - Score: ", data.score )		
end

Scrappy.Cloud:download( string.upper( id ), onComplete )

You’ll also need a MySQL database with a table called ‘cloud’ that has ‘id’, ‘name’, and ‘data’ fields, like so:

CREATE TABLE `cloud` (
 `id` int(11) NOT NULL AUTO_INCREMENT,
 `name` text NOT NULL,
 `data` mediumtext NOT NULL,
 PRIMARY KEY (`id`)
) ENGINE=MyISAM AUTO_INCREMENT=106 DEFAULT CHARSET=latin1

And on top of that you’ll want a config.php file sitting alongside the upload.php and download.php files that has the database credentials in it like so:

<?php
    $servername = "localhost";
    $username = "USERNAME";
    $password = "PASSWORD";
    $dbname = "DATABASENAME";
?>

Thanks a lot.

Clearer now. But I still need a PHP host to target.

My preferred option would be to use Google Drive; it seems a plugin used to exist, but I can’t find it anymore…
Any example of how to upload/download from Google Drive?

You could also use a backend service like PlayFab. Whichever method you choose, it will not be as simple as you would like.

I do exactly this in my games but I have my own dedicated backend to run this.

Indeed, it sounds more complex than expected…
I’m quite surprised there is no “easy” plugin for using Google Drive or Amazon Drive.
I have found some stuff, but it seems outdated, with nothing newer, like:


or even what is discussed there :

Building my own backend has crossed my mind at some point, but it’s not my favorite solution for now.

A quick Google search found this, but I’ve never used it myself: https://jsonbin.io/
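To give an idea of what that would look like from Solar2D, here is an untested sketch of an upload, assuming jsonbin.io’s v3 REST API (the endpoint, header names, and response layout are assumptions based on its public docs and may have changed):

```lua
local json = require( "json" )

local function onUpload( event )
    if event.isError then
        print( "Upload failed" )
    else
        local response = json.decode( event.response )
        -- the bin ID is what you would share with the other user
        if response and response.metadata then
            print( "Bin ID: " .. tostring( response.metadata.id ) )
        end
    end
end

local params = {
    headers = {
        ["Content-Type"] = "application/json",
        ["X-Master-Key"] = "YOUR_API_KEY",  -- placeholder: key from your jsonbin.io account
    },
    body = json.encode( { level = 10, score = 100 } ),
}

-- assumed v3 endpoint for creating a new bin
network.request( "https://api.jsonbin.io/v3/b", "POST", onUpload, params )
```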

Thanks Graham, I’ll give that a look then :slight_smile:

I was browsing the Solar2D Marketplace and found this: https://solar2dmarketplace.com/plugins?FirebaseStorage_tech-scotth

Sounds like something that could do the trick too, no?
I haven’t checked yet whether Firebase Storage is free or not…

Digging into Solar2D, I found this:


and mostly this:

in which there is: https://github.com/coronalabs/com.coronalabs-shared.google.play.services.drive

My problem now is finding the documentation…
I understand that Solar2D is Google Drive capable, but the question is how?

On the other hand, I found this:


Clearer, but a lot of dependencies… not an all-in-one thing then.
But it might be worth a try anyway :slight_smile:

After losing my mind in this quest, I ended up finding a solution with a personal web space from my internet provider :slight_smile:
It will be far easier than all that!!!

Just wondered if by any chance there is an easy way to use Google Drive or Amazon Drive to do it?

There is, but it’s very, very far from easy :slight_smile:
I gave up… no plugin is available, and the auth is a nightmare… better to find another way…

Maybe the Amazon one is more accessible with the S3 Lite plugin, but I didn’t dig very far in that direction as I’m not comfortable with bucket management…

Hello there,

I’ve been looking at Google Play Games Services in the console, and I’m not sure what I’m doing there…
It allows using some Google APIs (likely the cloud ones), but this seems very complex to me… anyone with experience?

Yes.

You can use Amazon AWS S3 to do this.

Please note that this code is just to help you understand S3. You should split it up: run the list first, then download the objects.

local s3 = require("plugin.s3-lite")

s3:new({
	key = ***********************,
	secret = ***********************,
	region = s3.***********************
})

local function onListObjects( evt )
	if evt.error then
		print(evt.error)
	else
		local function onGetObject( evt )
			if evt.error then
				print(evt.error)
			else
				if evt.progress then
					print("Doing")
				else
					print("Done")
				end			
			end
		end
		local objects = evt.objects
		for i=1, #objects do
			s3:getObject(
				***********************,
				objects[i].key,
				system.DocumentsDirectory,
				objects[i].key,
				onGetObject
			)
		end
	end
end

s3:listObjects(************, onListObjects)

I ended up doing it through Dropbox :slight_smile:
It’s working quite OK; now I have to figure out how to implement PKCE so my token isn’t sitting in the code in clear text :frowning:
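For anyone following along, the PKCE part (RFC 7636) boils down to generating a random code verifier and deriving a challenge from it. Here is a sketch using Solar2D’s bundled `crypto` and `mime` libraries; the character set and hashing follow the RFC, but note that `math.random` is not a cryptographically strong source, so a real implementation should seed or replace it properly:

```lua
local crypto = require( "crypto" )
local mime = require( "mime" )

-- Build a random code verifier from RFC 7636's unreserved character set.
local chars = "ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789-._~"
local function makeVerifier( length )
    local t = {}
    for i = 1, length do
        local n = math.random( #chars )
        t[i] = chars:sub( n, n )
    end
    return table.concat( t )
end

local verifier = makeVerifier( 64 )  -- RFC allows 43-128 characters

-- challenge = BASE64URL( SHA-256( verifier ) ), without padding
local digest = crypto.digest( crypto.sha256, verifier, true )  -- raw binary output
local challenge = mime.b64( digest )
challenge = challenge:gsub( "+", "-" ):gsub( "/", "_" ):gsub( "=", "" )

-- Send `challenge` in the authorize request, keep `verifier` for the token exchange.
```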