Media API (for audio) Limitations

Hi.

It is known that when you use the Corona Media API you will have some limitations when playing audio (for example, no simultaneous playback, no control over how long a sound is played, …).

For a while I thought that was a limitation imposed by the Android system, but after reading the Android Media API documentation, I didn’t see any of these limitations there.

So, are all these Media API limitations something imposed by Corona? Why?

It depends on the platform.  The limitations come from the native APIs that the Lua media APIs are tied to.

Android can definitely play multiple audio files at the same time via the media API.  In fact, the Android Java API that media.playEventSound() is tied to has the *least* latency when it comes to audio playback, but has the limitation that it can only play short sounds (like 1-2 seconds, but I think it’s more of a buffer size limit).

If I’m remembering right, the “play one sound at a time” limitation is only on Mac, iOS, and Windows.

Our Lua audio APIs are tied to more advanced native APIs on each platform, such as CoreAudio on Mac and iOS, OpenSL on Android, and DirectX XAudio2 on WP8.  These platform-specific libraries/APIs are designed for gaming and support audio channel mixing… and the mixing is what allows multiple audio channels/files to be heard at the same time.
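
Just as a rough sketch of what that mixing looks like from the Lua side (the file names below are made-up placeholders):

   -- Load a long music track as a stream and a short effect fully into memory.
   local backgroundMusic = audio.loadStream( "music.mp3" )  -- placeholder file name
   local laserSound = audio.loadSound( "laser.wav" )        -- placeholder file name

   -- Start the music on its own channel...
   audio.play( backgroundMusic, { channel = 1, loops = -1 } )

   -- ...and the effect can play at the same time on another channel,
   -- because the native audio library mixes the channels together.
   audio.play( laserSound, { channel = 2 } )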

Anyways, that’s my brief explanation.

Geez, you and I must have the same Google search history right now. I was researching this exact issue (and Android/iOS audio latency in general) last night.

According to the last forum entry in this thread, there is no limitation on simultaneous audio. While the thread is from close to 2.5 years ago, it does describe the problems with audio latency, where it comes from, why it happens, and what can be done to circumvent the problem. In this thread, Ingemar knocks it out of the park in describing the problem with audio on Android.

I mention Android specifically, because I saw you were in one of the threads where the Android audio lag was being discussed. Maybe you’re not talking about Android latency now, but the above threads still talk a lot about the way media.* calls work.

*** Meta-Update: I was putting together the annotations for this post when Joshua came right in! Joshua, the first link above has an explanation from you regarding the media.* API calls. Can you confirm if the info still applies? Thanks!

Yeah, the Android audio latency issue is an issue for *all* Android developers.  It’s not a Corona issue.  It’s a combination of an issue with the operating system and sometimes the hardware (or perhaps the hardware’s audio drivers).  Google’s OpenSL library, which provides the advanced audio features everyone wants for gaming, is notorious for having audio latency on many Android devices.  Google’s Java SoundPool class is documented by Google to have the least audio latency in their operating system, but it comes with limitations such as it can only play short sound files.  Even then, some devices such as the 1st generation Kindle Fire *always* have audio latency, regardless of which native Android audio API you use, which proves that it’s a hardware/driver problem for some devices.

I guess at the end of the day, you have to look at Android devices kind of like PCs.  Cheap PCs won’t play games well.  Just like how cheap Android devices won’t play games as well as a more expensive Android device.  You get what you pay for.

Now, if you’re dead set on requiring the Android hardware to support low-latency audio, then you can add an AndroidManifest.xml “uses-feature” item named “android.hardware.audio.low_latency”.  Although, I’m not sure if this is a good idea or not.  It may restrict your app to a very small portion of the device market.
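
If I remember right, you don’t even have to edit the manifest by hand in Corona; a sketch of how that could look in build.settings (assuming the usesFeatures entry is supported by your Corona build):

   -- build.settings (sketch; usesFeatures support here is an assumption)
   settings =
   {
       android =
       {
           usesFeatures =
           {
               -- Ask Google Play to only offer the app to devices that
               -- report low-latency audio support.
               { name = "android.hardware.audio.low_latency", required = true },
           },
       },
   }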

That is a fantastic gem, and is great to know. Audio latency is a non-starter for anything dealing with audio playback at a rapid rate. Restricting the app to only devices that can reliably provide that low latency is just ducky.

Here is Google’s documentation on that “low_latency” feature…

   http://developer.android.com/guide/topics/manifest/uses-feature-element.html#hw-features

Yeah, I agree.  It doesn’t seem like an appropriate solution.  Maybe it’s okay for a music generating app.  That “Ocarina” app on the iOS app store comes to mind.  But other than that, using that setting will just damage the success of your app.  And probably confuse some of your potential customers as to why they can’t download it.

My thought would be that, in marketing the app, you would be sure to note that the app is only available on devices that can reliably (according to Google, at least) support it. The flipside of the “confusion” coin is that, by limiting the devices, you are ensuring that you aren’t going to get any reviews talking about the audio latency, from people that are unfamiliar with the issue.

One could make the argument that an intelligent audiophile would see that review and scoff, knowing it’s an Android/device limitation and not the original developer’s fault, but now we’ve fallen down the rabbit hole. :wink:

Hi Joshua.

Thanks for your quick reply. I always appreciate it when someone is technically sound in their answers, as you usually are.

The limitation of playing just one sound at a time I got from the docs:

Gotchas

Only one sound can be playing using this sound API. Calling this API with a different sound file will stop the existing sound and play the new sound.

But I didn’t test it to see if that actually happens on the device (I trusted the doc :slight_smile: ). I think that limitation is the worst one, so if it indeed does not happen on Android devices, it is a great thing. I will test it and let you know about it.

Oh wait.  I was talking about media.playEventSound()…

   http://docs.coronalabs.com/api/library/media/playEventSound.html

I recommend that you use that API on Android for sound effects.

I do not recall if media.playSound() can play multiple audio files at the same time on Android.  That *might* be true, but I usually recommend our audio.* API over media.playSound().  I don’t think media.playSound() provides any additional benefits other than it’s simpler to use.
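
To make that concrete, here’s a minimal sketch of using media.playEventSound() for a short sound effect (the file name is just a placeholder):

   -- Pre-load the short effect once...
   local tapSound = media.newEventSound( "tap.wav" )  -- placeholder file name

   -- ...then fire it whenever needed; on Android this goes through SoundPool,
   -- so repeated/overlapping calls play with very little latency.
   local function onTap( event )
       media.playEventSound( tapSound )
       return true
   end
   Runtime:addEventListener( "tap", onTap )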

I can confirm that on my Nexus 4 running 4.4.4 the media.playEventSound() API allows for multiple simultaneous sounds. I am not seeing any lag when using this API. 
