Yet another Android audio latency thread (yes, we need a definite answer)

This seems to be the official attitude of Ansca right now:
https://developer.coronalabs.com/forum/2012/04/25/android-audio-does-it-work#comment-103095

We ask what we can do about the audio latency, and the answer is seriously “Android is bad”?
We released a game six months ago with virtually no audio latency on Android, using the exact same setup as our current one.
The changes that have been made within Corona since then must have something to do with this.

I understand that backward compatibility is a killer, and that there are many factors involved, but if you don’t have the time to wrap your heads around it, why not let us use a quick and dirty solution like media.playEventSound was? Or keep the “hidden” audio API?

Is there any method at all, clean or dirty, that can restore the functionality of, say, build 840 of Corona in terms of audio latency? Or are we forced to use older builds? [import]uid: 117153 topic_id: 33899 reply_id: 333899[/import]

Hi @kasplask,
You can, and should, use the “media” version for sfx on Android. See Joshua’s technical reason in the thread below (posted this week). I understand it’s a nuisance to use a different API for iOS and Android, but until it gets resolved (and Google might never get audio resolved properly), at least you have the option of the lower-latency media API.

http://developer.coronalabs.com/forum/2012/01/07/sample-event-sound-code-only-playing-once

Best regards,
Brent [import]uid: 200026 topic_id: 33899 reply_id: 134774[/import]

Can the documentation be updated with a little more info on this?

Here are my main questions:

  1. Why do you recommend that playEventSound only be used for 1-3 second sounds? What happens if we don’t follow this recommendation?

  2. This sentence in the documentation doesn’t make sense “The media.playEventSound() is intended for short alert sounds and therefore has way to control program the volume.” What do you mean by this?

  3. Does the newEventSound method accept the same file types as audio.loadSound? [import]uid: 135827 topic_id: 33899 reply_id: 134782[/import]

The history of playEventSound is that it was originally an iOS API meant only to play event sounds (key clicks, short tones, etc.). The volume was controlled by the device’s volume control; the media setVolume API doesn’t control event sounds – only the playSound API. Since it loads the entire audio file into the buffer before it starts playing, you shouldn’t load large sound files because of the delay before the sound plays. It does support the same formats as audio.loadSound.
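To illustrate the preload-then-play pattern described above, here is a minimal sketch. The asset filename and the listener wiring are hypothetical; the point is that media.newEventSound is called once at startup so that media.playEventSound later incurs no loading delay:

```lua
-- Preload the short event sound once, at program start.
-- "sfx/click.ogg" is a hypothetical asset path.
local clickSound = media.newEventSound( "sfx/click.ogg" )

-- Later, e.g. inside a tap listener, playback is immediate
-- because the file is already buffered:
local function onTap( event )
    media.playEventSound( clickSound )
    return true
end
Runtime:addEventListener( "tap", onTap )
```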

We have been recommending that you don’t use any of the media sound APIs and instead use the audio APIs because you have more control of the audio and can play multiple channels. Since we made some changes that add extra latency when playing short sounds on Android, we are saying that playEventSound may help because it directly loads into the hardware (instead of mixing into one large audio stream).

We are looking at improving the playEventSound API by including it in our Audio library and providing the same controls we have with the existing audio library. We hope to have this implemented in the near future. [import]uid: 7559 topic_id: 33899 reply_id: 135002[/import]

Thanks Tom! That clears up some of my questions, but I still have a couple more. I can probably figure out the answers on my own in a few days once we start polishing the Android build of our game, but it might be useful to know ahead of time (and for the benefit of others).

  1. Since the Media Sound API does not support multiple channels, does that mean we can’t have two or more sounds loaded with loadEventSound playing at once?

  2. When you say that the Media Sound API has a delay when it loads the sound to play it, do you mean that it has a delay on loadEventSound or that it actually has a delay when we call playEventSound? [import]uid: 135827 topic_id: 33899 reply_id: 135013[/import]

George,

  1. playEventSound doesn’t support the Audio “channels”, but I believe you can load and play multiple event sounds at the same time as audio played via audio.loadStream and audio.loadSound.

  2. The delay is in loadEventSound. If you load a large audio file and try to play it right afterwards, there will be a delay until it finishes loading. You generally want to preload small “event” sounds at the start of your program so they can be immediately played when needed. [import]uid: 7559 topic_id: 33899 reply_id: 135016[/import]

Perfect! Thanks.

Here’s how I’m probably going to implement it:

[code]
local loadedSounds = {}

-- Cache the platform check so we don't query system.getInfo repeatedly
local isAndroid = ( system.getInfo( "platformName" ) == "Android" )

local function loadSounds()
    -- List of filename prefixes for all the sounds we plan on using
    local files = { "explosion", "jump", "punch" }

    for i, file in ipairs( files ) do
        if isAndroid then
            -- We use ogg files on Android and m4a files on iOS
            loadedSounds[file] = media.newEventSound( "audioAssets/android/" .. file .. ".ogg" )
        else
            loadedSounds[file] = audio.loadSound( "audioAssets/iOS/" .. file .. ".m4a" )
        end
    end
end

local function playSound( name )
    if isAndroid then
        media.playEventSound( loadedSounds[name] )
    else
        audio.play( loadedSounds[name] )
    end
end
[/code] [import]uid: 135827 topic_id: 33899 reply_id: 135021[/import]

[lua]media.playEventSound()[/lua] can play more than one sound at a time on Android. That is, it supports audio mixing. The 1-3 second sound limitation applies on both iOS and Android. On Android, it will not play a sound file larger than 1 MB, although Google’s documentation does not state whether that 1 MB refers to the compressed file or the decompressed form in memory. As a general rule, only use it for short, low-latency sound effects.
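Given that 1 MB cap, one defensive approach is to check the file’s on-disk size before choosing which API loads it. This is only a sketch: [lua]getFileSize[/lua] is a hypothetical helper built on LuaFileSystem (which Corona exposes as [lua]lfs[/lua]), and since the cap may apply to the decompressed form, an on-disk check errs on the permissive side:

```lua
local lfs = require( "lfs" )

-- Hypothetical helper: on-disk size of a bundled sound file, in bytes.
local function getFileSize( filename )
    local path = system.pathForFile( filename, system.ResourceDirectory )
    return ( path and lfs.attributes( path, "size" ) ) or 0
end

-- Use the low-latency event-sound API only when the file is under
-- Android's 1 MB SoundPool limit; otherwise fall back to audio.loadSound.
-- Returns the handle plus a flag telling the caller which play API to use.
local function loadSfx( filename )
    if getFileSize( filename ) < 1024 * 1024 then
        return media.newEventSound( filename ), true   -- play via media.playEventSound
    else
        return audio.loadSound( filename ), false      -- play via audio.play
    end
end
```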

Now, the reason why [lua]media.playEventSound()[/lua] is so much faster on Android is because we are using Android’s SoundPool class in Java, which is documented as their fastest audio API. See paragraph 3 in the documentation in the link below. That Java API loads sounds directly to the hardware.
http://developer.android.com/reference/android/media/SoundPool.html

The reason our [lua]audio[/lua] API has high latency on Android is that its audio is processed by a middle software layer (OpenAL), which is CPU-bound, before it is streamed out to the hardware. That middle layer is responsible for mixing all of the audio channels into a single stream, and it then applies audio effects such as fades and volume levels. This extra processing naturally adds latency, and the more channels you have, the more latency you get. On iOS, OpenAL is supported and hardware accelerated, which is why it is so much faster there than on Android. Google didn’t provide a fast audio streaming API until Android 4.2 (via OpenSL), but based on our testing, Android’s SoundPool class is still faster because it loads the sound directly to the hardware, bypassing that whole middle audio-processing layer.

In any case, the above is the technical reason why one API is faster than the other. We do recognize that this is a pain for Corona developers to deal with and we want to look into updating our [lua]audio[/lua] API to make it easier so that you can code it the same for all platforms. Perhaps a new audio API designed for playing short low latency sound effects that bypasses audio processing, but still respects your volume settings.

In the meantime, a solution much like @george18’s up above will do the trick.
[import]uid: 32256 topic_id: 33899 reply_id: 135028[/import]
