New Android Audio Lag

Hi,

We experienced the same lag issues with one of our apps on several devices (Nexus 7, Nexus 4, Galaxy Nexus) and didn’t have them in another of our apps, although both are built using the same SDK version (2013.1094). The only difference we found in the code base was that the laggy app had audioPlayFrequency in config.lua set to 11025 (the codebase is quite old, I don’t know exactly why we did this), although our audio files all use a 44100 Hz sample rate. After removing this setting, the lag dropped to an acceptable level (I didn’t measure it, but before it was >500 ms and now you don’t notice it unless you know it is there).
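
For anyone unfamiliar with the setting, this is roughly the config.lua entry we’re talking about (the width/height values below are just placeholders; the relevant line is audioPlayFrequency, which should either match your assets’ sample rate or be left out entirely):

    application =
    {
        content =
        {
            width = 320,   -- placeholder values
            height = 480,
            scale = "letterbox",

            -- either match the sample rate of your audio assets...
            audioPlayFrequency = 44100,
            -- ...or omit the key entirely and let the engine pick a default
        },
    }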

@app.enterprise, what format are your sounds in? Bit rate? Mono or stereo? Are your sounds trimmed? @wolfgang, that is a good catch. That is probably forcing the CPU to downsample the audio every time it plays.

can I ask for a synopsis of the state of affairs here?

i’m facing truly unacceptable lag on a nexus 7 (my “go to” device for most testing, as it performs well across the board, without being too far out in the stratosphere for most “real-world” consumers to have, or so goes my thinking :D)

the top of the media api says “should not be used”, while the individual methods like playEventSound word it more softly as “recommended to use the newer audio lib”.  advice here (and all around the 'net) says “audio.* has insufferable lag, for ui sounds and such still use playEventSound”.  so which is it, who’s right?  use media.* if/when required, or no way, not supported, use at your own risk, will crash-and-burn for sure?

particularly with ui sounds you can literally “feel” the lag with audio.*, whereas any latency with media.playEventSound() is essentially imperceptible.  should mention i’m using tightly trimmed .wav files, mono, linear pcm, 16-bit, 22050hz, with config.lua specifying 22050hz, so everything should be matching up and not requiring sample-rate conversion.  (the only difference with media.* is that I instead load an .ogg conversion of the same files, as .wav doesn’t seem supported, .mp3 has an “inbuilt” silence at the beginning, etc; but ogg isn’t supported with audio.*, so it’s hard to do a perfect apples-to-apples comparison with the exact same file format between the two libs)

i have a flag in my dev version i can hit and switch between the two libs/files at runtime, and the difference is glaring.  somehow the big-boy heavy-hitter AAA titles out there have solved the problem, just wish I knew how.
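
for anyone curious, the runtime switch is nothing fancy; a rough sketch of the idea (file names here are made up, and the flag is just a variable i flip from my dev menu):

    local useMediaLib = true  -- dev flag, flipped at runtime to compare the two libraries

    -- same sound in two formats: .wav for audio.*, .ogg for media.*
    local tapWav = audio.loadSound( "sfx/tap.wav" )
    local tapOgg = media.newEventSound( "sfx/tap.ogg" )

    local function playTap()
        if useMediaLib then
            media.playEventSound( tapOgg )
        else
            audio.play( tapWav )
        end
    end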

i’d just keep using media.* for such sounds (and reserving audio.* for streaming music), except my concern at this point comes from investigating a side issue of “application has stopped unexpectedly” immediately on startup, which some on the 'net seem to have attributed to media.*, tho i can’t confirm.  can anyone else?

i suppose I’ll try doing everything at 44.1khz again, maybe nexus 7 isn’t “respecting” my request for 22khz, or maybe 44.1khz is an optimized “sweet spot”?  worth trying i guess, but i doubt it – cuz that’s where i started, and latency was there back then, and i only converted down to save space, cuz any diff in quality isn’t noticeable on a tablet, but it’s possible that some recent os update might have changed things.  (?)

any advice?  sorry for the long post.  thx

Yes, you should use the media.playEventSound() function on Android for sound effects.  It calls the fastest sound playing API on Android, as I’ve posted up above.  That’s as good as it gets on Android.  However, this function should *not* be used on iOS.

I’m quite positive the big boy AAA studios that you are referring to are using the same Android API that we’re using for media.playEventSound().  There’s really nothing faster on Android.  The only other possible optimization you can make is to reduce the sample rate so that it takes less time to decode, but that should just be a one-time hit when you load the sound by calling media.newEventSound().
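
To illustrate the load-once / play-many pattern being described (the file name here is just an example):

    -- pay the decode cost once, up front
    local coinSound = media.newEventSound( "coin.wav" )

    -- then playback itself is as cheap as Android allows
    local function onCoinCollected()
        media.playEventSound( coinSound )
    end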

Now, if you still hear audio latency when using media.playEventSound(), then it’s likely an issue with the hardware/firmware.  Some Android devices just aren’t designed for gaming.  The 1st generation Kindle Fire is one of those devices.  That device has severe audio latency.  You can hear it with all apps that you download from the app store… including the AAA titles.

Hi Joshua, playing a lot of short sounds with media.playEventSound(), for example coins being collected by a player, will still produce lag on Android.

fwiw and ymmv, of course, but…

i just did an experiment where i went back to my source audio files, re-exported all as 44100hz 16-bit signed PCM mono .wav files (had been using 22050hz) and adjusted config.lua to match.

result?  audio.play() on nexus 7 has become quite respectable!  it would seem that the nexus 7, at least, does indeed have an inbuilt preference for 44.1k.

it’s still a bit more latency than media.*, but well within the realm of “ok, i can tolerate that, given that it’s a mobile device”, and accepting that tradeoff simplifies my code by letting me use a single audio library.

it isn’t a rigorous test by any means, and i wonder if doing so might negatively affect *other* devices, but it might be an experiment worth doing for others facing this problem?

@Rob

There is still a massive audio delay on Android (using an HP Slate 7 running 4.1.1).
I really don’t want to have to do device detection and use media.playSound(), but I guess I might have to.

Do you know of a way to retrieve the soundFile string required for media.playSound() from an audio object already set up with audio.loadSound()?

Basically, I have a whole system in place whereby audio objects are loaded for each scene and passed through to functions that create various UI in my game (and handle sound interaction). If I could just do the device detection in one place when playing the audio and decide whether to use audio.play() or media.playSound() that would be best. But for this I need to retrieve the soundFile string from the audio object.

Inspecting the audio object, it just says userdata. Any ideas?

Just because your device is running 4.1 doesn’t mean your customers’ devices are; unless you are planning on limiting your customers to 4.1 or higher, you need to assume that your customers could be on 2.2, 3.x, etc.  In other words, for Android, you probably should plan to use media.playEventSound() if the Android version is less than 4.2.  I wouldn’t worry about the device, but the OS.
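
A rough sketch of branching on the OS version rather than the device could look something like this (the version-string parsing below is an assumption about the format platformVersion returns on Android):

    local isAndroid = ( system.getInfo( "platformName" ) == "Android" )
    local useEventSound = false

    if isAndroid then
        -- platformVersion is expected to look like "4.1.1" on Android
        local major, minor = string.match( system.getInfo( "platformVersion" ), "(%d+)%.(%d+)" )
        major, minor = tonumber( major ) or 0, tonumber( minor ) or 0
        useEventSound = ( major < 4 ) or ( major == 4 and minor < 2 )
    end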

Rob

@Rob

Do you know of a way to get the soundFile string required for media.playSound() from an audio object already set up with audio.loadSound()?
I know it seems like working backwards in this case but I have my reasons.

As far as I know you can’t play sounds loaded by audio.loadSound() with the media.* API.

For this reason alone, I’ve written my own audio library that automatically uses the media.* API on Android < 4.2 and otherwise uses the audio.* API. 

@ingemar

Care to share?

Unfortunately I can’t share the code.

But the gist of it is to write a module with your own loadSound() and playSound() functions. 

The loadSound(id, fileName, {isShort=true | false}) function checks for Android and, if the sound is flagged as a short sound, loads the file using the media.* API and stores the returned handle in a table as appAudio[id].handle, with appAudio[id].type = “media”.

For long sounds (isShort = false or omitted), or on iOS, I use the audio.* API to load the sound into appAudio[id].handle, with appAudio[id].type = “audio”.

The playSound(id) function will check appAudio[id].type and use the appropriate API to play the handle stored in appAudio[id].handle.

You’ll also need some housekeeping functions to keep only your active handles in memory as you change scenes, etc.
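
To make that concrete, here’s a minimal sketch of that kind of module (this is not ingemar’s actual code; the names are illustrative, and for brevity the Android check below only tests the platform, not the < 4.2 version as discussed earlier in the thread):

    -- soundmanager.lua: sketch of a wrapper that routes short sounds
    -- through media.* on older Android and everything else through audio.*
    local appAudio = {}
    local M = {}

    -- simplified: a real implementation would also check the OS version here
    local isOldAndroid = ( system.getInfo( "platformName" ) == "Android" )

    function M.loadSound( id, fileName, options )
        local isShort = options and options.isShort
        if isShort and isOldAndroid then
            appAudio[id] = { handle = media.newEventSound( fileName ), type = "media" }
        else
            appAudio[id] = { handle = audio.loadSound( fileName ), type = "audio" }
        end
    end

    function M.playSound( id )
        local entry = appAudio[id]
        if entry == nil then return end
        if entry.type == "media" then
            media.playEventSound( entry.handle )
        else
            audio.play( entry.handle )
        end
    end

    -- housekeeping: drop handles you no longer need (e.g. on scene exit)
    function M.disposeSound( id )
        local entry = appAudio[id]
        if entry and entry.type == "audio" then
            audio.dispose( entry.handle )
        end
        appAudio[id] = nil
    end

    return M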

The major drawback of the media.* API is that only one sound can be played at a time. If you play a second sound before the media.* API has finished playing the current one, it will stop the first sound before playing the second. This can cause “ugly”, abrupt sound effects if you have a fast-paced action scene where you want to play short sounds in quick succession.

So sometimes you’ll have to make a decision to either have nice sound effects with a slight delay, or to have sounds played on-time, but with less-than-perfect results.

It’s a dilemma I struggle with every time on Android…
