How do I make a loop run in the background of multiple scenes?

This is basically a question about how Composer works.

I am making a turn-based game where the player can run a local multiplayer server, which requires a server loop.  I want the player to be able to have multiple games going at the same time, but I only want one server loop.

What I want to do is start the server loop from the menu “scene” but access client info from inside the various instances of the game, which are a different scene.  I also need the server running and accessible while the player is in other scenes.

Normally, I would put the server loop in a module, but I’m talking about a loop driven by timer.performWithDelay(), and I don’t know what causes that to freeze (besides leaving the app).

So, should I put the server loop function in main.lua and make it global?  If I then switch scenes, can the loop run in the background?  Do I have to put it in main.lua at all?  As of now, the only thing in main.lua is a call to create the main menu scene.

Any advice is appreciated.

I did more research, and it seems that I should create a module with my startServer function in it and require it from “main.lua”, but only conditionally start the server at that point, based on a saved variable (one that tracks whether the player is currently hosting a game and, for example, took a phone call in the middle of it).

If I do that, it seems the loop would persist through all scenes and could be accessed, and possibly changed, simply by requiring the same module in another scene.  Does that sound right?
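To make sure I’m picturing it right, here’s a minimal sketch of what I mean.  The module name (server.lua), startServer, and the wasHosting flag are all just placeholder names I made up; the point is that require() caches the module, so every scene that requires it gets the same table, and the timer keeps ticking no matter which scene is on screen.

-- server.lua (placeholder name)
local M = {}
local serverTimer

function M.startServer()
    if serverTimer then return end            -- already running
    serverTimer = timer.performWithDelay( 1000, function()
        -- poll sockets, advance turns, etc.
    end, 0 )                                  -- 0 iterations = repeat forever
end

function M.stopServer()
    if serverTimer then
        timer.cancel( serverTimer )
        serverTimer = nil
    end
end

return M

-- main.lua
local server = require( "server" )
if wasHosting then                            -- hypothetical saved flag
    server.startServer()
end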

I’m going to start trying it now; I was just hoping to save myself some iterations by getting the answer up front.

It sounds to me like your server needs to be using a Runtime “enterFrame” listener.  This is a process that’s not part of any scene but is tied to the global Runtime object.  Depending on the frame rate of your app, it will trigger 30 or 60 times per second, and it fires regardless of what’s going on with the screen.  That’s a good time to check for input from the server and process it.

If you’re doing multiple games with a single server, then your data structures need to support multiple games, and each scene needs to know which game it’s getting data for.
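In rough terms, something like this (the names here are just for illustration, not tested code):

local games = {}   -- active games, keyed by a game ID

local function onEnterFrame( event )
    -- fires every frame, no matter which scene is showing
    for gameId, game in pairs( games ) do
        -- check that game's sockets / turn state here
    end
end

Runtime:addEventListener( "enterFrame", onEnterFrame )

-- each scene then looks up its own game by ID, e.g. games[ currentGameId ]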

Rob

I saw some information about that.  I don’t really need to check for new clients, or to see whether the active player is finished with his turn, 30 times per second… that seems excessive for a turn-based game.

I suppose I could set a callback for 1000 ms and then re-schedule it every time it fires.  That should leave plenty of room for multiple sessions.
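Something like this self-rescheduling timer is what I have in mind (just a sketch):

local function serverTick()
    -- check for new clients and finished turns here
    timer.performWithDelay( 1000, serverTick )   -- schedule the next check
end

serverTick()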

So, you advise against using timer.performWithDelay() in this application?

The more I read about enterFrame callbacks, the more I wonder what the difference is between timer.performWithDelay() and this method of performing a loop.  I always assumed that this was what timer.performWithDelay() was doing.

I guess what I really want to ask is: why provide timer.performWithDelay() at all if it is better to attach a callback to enterFrame?

To me, a timer is a discrete thing that fires at set intervals over a fixed period of time, though I guess by setting the iterations to 0 it will run indefinitely.  It seems to me there would be overhead with it, but under the hood it’s probably an enterFrame listener driving the timer anyway.  In my old-school way of thinking, though, enterFrame is more like a loop, as opposed to something that happens at a certain time interval.

If each timer really were its own discrete loop, that would be extremely inefficient.  It would make more sense to implement a timer by simply registering a callback for a certain point in time and having only one loop that checks whether any callbacks in the list need to fire, which is what an enterFrame loop does.  If timer.performWithDelay() is in fact creating a new loop every time it is called, then it should be changed.  You wouldn’t even need to change the syntax; from the end user’s point of view it would work exactly the same.  It already asks for an interval, a function, and the number of repeats.
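To illustrate what I mean, a timer facility like that could be little more than this; this is pure speculation about what might be under the hood, not the actual implementation:

local pending = {}   -- each entry: { fireTime = <ms>, fn = <callback> }

local function addTimer( delay, fn )
    pending[ #pending + 1 ] = { fireTime = system.getTimer() + delay, fn = fn }
end

local function checkTimers( event )
    for i = #pending, 1, -1 do              -- walk backwards so removal is safe
        if event.time >= pending[ i ].fireTime then
            local entry = table.remove( pending, i )
            entry.fn()
        end
    end
end

Runtime:addEventListener( "enterFrame", checkTimers )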

Assuming that timer.performWithDelay() is as efficient as an enterFrame loop, the benefit of timer.performWithDelay() would be that it cleans itself up (unlike enterFrame, where you have to remove the listener yourself) and that you can (at least theoretically) have an interval shorter than 1/60 of a second.

The benefit of enterFrame would be that it is synced with the screen update, which is useful when you are doing animations.

I would really be interested in knowing whether timer.performWithDelay() is in fact just registering a callback.  I would prefer not having to watch for several system events just to unregister an enterFrame loop.  If you have time, please ask one of your engineers to take a peek at the code and see how it works.
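For reference, the kind of housekeeping I’d rather avoid looks roughly like this (sketch only):

local function onEnterFrame( event )
    -- the server loop would live here
end

local function onSystemEvent( event )
    if event.type == "applicationSuspend" or event.type == "applicationExit" then
        Runtime:removeEventListener( "enterFrame", onEnterFrame )
    elseif event.type == "applicationResume" then
        Runtime:addEventListener( "enterFrame", onEnterFrame )
    end
end

Runtime:addEventListener( "enterFrame", onEnterFrame )
Runtime:addEventListener( "system", onSystemEvent )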

I have an update.  I’ve been working on this for a few weeks and can say that the loop runs just fine in a module.  It also carries over to another scene without breaking: I just load the game scene, which requires the module, before trashing the menu scene.
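In case it helps anyone, the hand-off looks roughly like this (scene and module names are just my examples):

-- game.lua also does: local server = require( "server" )  -- same cached module, loop keeps running
local composer = require( "composer" )
local scene = composer.newScene()

function scene:show( event )
    if event.phase == "did" then
        composer.removeScene( "menu" )    -- the menu can be trashed once the game scene is up
    end
end

scene:addEventListener( "show", scene )
return scene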

The real work is accommodating the moments when the client or server device gets locked and unlocked.  It isn’t impossible, but it takes some finagling.  Basically, locking the device clears the sockets, so your client and server need a system that can cope with that.

I did this by having everything except the initial pairing go into a queue instead of being sent right away.  Every message gets an index number, and when the other side receives it, it echoes the index back.  If it receives the same index again, it just ignores the message.  Either side keeps sending a message until its index gets echoed back, and then the message is removed from the queue.  Whether this is a best practice or not, I can’t say… I pretty much make this stuff up as I go.  All I can say is, “it works,” and as complicated as it might be, it is transparent to the end user.
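In rough pseudocode, the bookkeeping looks something like this (a simplified sketch of the idea, not my actual code; “send” stands in for whatever actually writes to the socket):

local outgoing = {}      -- messages waiting to be acknowledged, keyed by index
local nextIndex = 1
local seen = {}          -- indices we've already processed from the other side

local function queueMessage( msg )
    outgoing[ nextIndex ] = msg
    nextIndex = nextIndex + 1
end

-- called on every pass of the loop
local function resendPending( send )
    for index, msg in pairs( outgoing ) do
        send( { index = index, body = msg } )   -- keep sending until the index is echoed back
    end
end

local function onReceive( packet, send )
    if packet.ack then
        outgoing[ packet.ack ] = nil            -- other side echoed the index; drop it from the queue
    elseif packet.index then
        send( { ack = packet.index } )          -- always echo the index back
        if not seen[ packet.index ] then        -- same index twice? just ignore it
            seen[ packet.index ] = true
            -- process packet.body here
        end
    end
end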
