native.setKeyboardFocus() doesn't work properly for Win32 apps

Win32 apps that run on Windows tablets (which don't have a physical keyboard) really need a way to trigger the on-screen keyboard. According to the documentation, native.setKeyboardFocus() is supposed to do this, but it doesn't seem to work: it sets focus to the specified native text field, but the on-screen keyboard is not displayed.

I have a fullscreen Win32 app with a login prompt, but at the moment I can't find a way to get the keyboard displayed for users on Windows tablets.

Does anyone know how to do this, or is this a bug?

Have you tried building our sample apps that use the keyboard?

Yes, I tried the NativeKeyboard sample app and it doesn't bring up the on-screen keyboard on the Windows 8.1 tablet I tested with.

What tablet is it?

Is it running a desktop version of Windows (7, 8, 10)?

Is it running a version of Windows Phone 8 (8, 8.1)?

Hi Rob,

It's a Dell Venue 8 Pro running Windows 8.1 Professional. It has also been tried on other Windows tablets by some of my customers, who have reported the same thing.

As mentioned in the OP, this is for Win32 apps running on the full desktop Windows OS, not WP8 apps/OS.

As far as I know, classic Win32 desktop apps (i.e., *.exe apps) have no control over the virtual keyboard.  It is controlled by the Windows OS.  Remember that these types of apps were originally designed by Microsoft for standard desktop PCs.  Microsoft talks about this here…

   https://msdn.microsoft.com/en-us/library/windows/apps/hh972345.aspx#touch_keyboard

And here is a quick excerpt from the above link…

If your app sets focus programmatically to a text input control, the touch keyboard is not invoked. This eliminates unexpected behaviors not instigated directly by the user. However, the keyboard does automatically hide when focus is moved programmatically to a non-text input control.

I’m not on a Windows 10 machine at the moment, but I know there is an option under “Settings\Typing” that you can enable to “Automatically show the touch keyboard in windowed apps when there’s no keyboard attached”.  I don’t know if that setting applies only to WinRT universal apps, but it might invoke the behavior you are looking for.  My point is, this is under the OS’ control.

That setting you mentioned doesn’t exist on the Windows 8.1 tablet I have, so it might be Windows 10 specific or Windows RT specific.

I would imagine that the majority of people who use Corona develop for phones/tablets, so the addition of Win32 apps at least gives us the ability to install on modern Windows 8.1+ tablets, which are becoming very common. Being able to access the on-screen keyboard seems like a pretty fundamental thing for a tablet app. I understand that having it “automatic” might not be possible, but surely it can be triggered by a function call. The documentation for setKeyboardFocus() seems to imply that it sets focus to a particular text field and shows the keyboard; the same function can also be used to hide the keyboard.

I know you can bring up the Windows 8.1+ on-screen keyboard by running C:\Program Files\Common Files\Microsoft Shared\ink\TabTip.exe, so perhaps a solution is to have a way to execute another Win32 process from within Corona, or have the setKeyboardFocus() function do this.

I found someone else who referred to being able to hide the keyboard programmatically. It appears that TabTip.exe is always running on Windows 8+ tablet devices, so it should be possible to find its window handle and post a message telling it to show. Below is the code that was posted for hiding it:

#include &lt;windows.h&gt;

// Hides the Windows touch keyboard by posting a close command
// to TabTip.exe's top-level window.
void HideOSK(void)
{
    HWND KeyboardWnd = FindWindow(TEXT("IPTip_Main_Window"), NULL);
    if (KeyboardWnd != NULL)
    {
        PostMessage(KeyboardWnd, WM_SYSCOMMAND, SC_CLOSE, 0);
    }
}
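Presumably the “show” direction would work by launching TabTip.exe rather than posting a message. A minimal, untested sketch, assuming the install path mentioned above:

```c
#include <windows.h>
#include <shellapi.h>

// Sketch: show the touch keyboard by launching TabTip.exe.
// The hard-coded path is an assumption based on the location above;
// a robust version would resolve %CommonProgramFiles% instead.
void ShowOSK(void)
{
    ShellExecute(NULL, TEXT("open"),
        TEXT("C:\\Program Files\\Common Files\\Microsoft Shared\\ink\\TabTip.exe"),
        NULL, NULL, SW_SHOWNORMAL);
}
```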

My point is that this is *not* normal behavior on Windows.  Microsoft did not intend for the virtual keyboard to be brought up programmatically as can be seen in their documentation.  And from a Windows user’s standpoint, they would expect your app to behave like other Windows apps.  That is, it should adhere to Microsoft’s UI guidelines.  Especially since the vast majority of times a physical keyboard will be connected to the PC, where displaying a virtual keyboard would be unnecessary and frankly annoying.  If the user wants to enter text into the field, then they would opt-in to it by tapping on the field just like they would in other Windows apps.

The behavior you are currently seeing is correct.

It is not normal behaviour for a DESKTOP. Yes, Windows has primarily been a desktop OS for a very long time, but not anymore. Windows tablets are extremely common, and since Corona only supports Win32 and not proper Windows universal apps, the only option we have for Windows tablets is Win32. The vast majority of the time a keyboard will NOT be connected to a Windows tablet, because docks are optional. When playing touch-based games on a Windows tablet you generally wouldn't have it connected to a dock.

If it is your intention that Corona Win32 apps should be for desktop PCs only, then so be it, but I would have thought that offering Corona developers the ability to successfully target Windows tablets would warrant some consideration too, since that market is growing very quickly.

This is primarily a problem with fullscreen apps; in a windowed app you can obviously just click the keyboard icon on the tablet to bring up the on-screen keyboard, but a windowed app is not really normal for a tablet, or a game for that matter.

I’m not sure why you’re arguing with me about this.

Microsoft’s documentation states it very clearly that Windows apps should *not* do this.

That’s why Microsoft does not offer a native Win32 API to do so.

And Microsoft does manufacture “Surface” tablets and they’re clearly okay with this behavior.

If you don’t like this behavior, then I think your complaint is more towards Microsoft, not us.  I personally agree with how Microsoft made it because the user is in control.  All a user has to do is tap on the field and the virtual keyboard will be displayed.  It’s pretty simple.  This really isn’t an issue.  Just a native behavioral difference.

Really? Please tell me how, because when I tap on the text field a virtual keyboard does not appear. This is the whole point of this thread: the virtual keyboard is NOT displayed when I tap on the field, and because the app is fullscreen the user does not have access to the keyboard icon in the taskbar to bring it up manually.

If the virtual keyboard is not appearing, then that means Windows is detecting a physical keyboard.  Such as a Bluetooth keyboard.  Windows will typically only show a virtual keyboard if it doesn’t detect a physical keyboard or if put into “Tablet Mode”.  You’ll run into the same thing with other Windows apps such as Notepad or Internet Explorer.  For example, if you put Internet Explorer into fullscreen mode by pressing F11 (or tap and hold on the tab bar, display the menu bar, and tap the “View\Fullscreen” menu item), then notice that touching the URL bar won’t display the virtual keyboard either.  Again, Windows will only display the virtual keyboard if a physical keyboard is not detected.

I do not have a Bluetooth keyboard, and this particular tablet is not dockable. When I run Internet Explorer, which is fullscreen by default on Windows 8.1 in touchscreen mode, clicking the address bar brings up the touch keyboard fine; the same does not happen in my fullscreen Corona app when I click in the native text boxes.

The same issue happens with your sample app “NativeKeyboard”, which I modified to also run fullscreen.

>> When I run Internet Explorer, which is fullscreen by default on Windows 8.1 in touchscreen mode, clicking the address bar brings up the touch keyboard fine

You must be using the WinRT version of Internet Explorer (aka the Metro app version), because the Win32 version of IE does not do this for me.  My point is that Win32 desktop apps do *not* always show the virtual keyboard when tapped.  Try it with Windows “Notepad” and you’ll see what I mean.  Win32 apps play by different rules compared to WinRT apps.  That’s by Microsoft’s design.  This is the normal native Win32 behavior.

That said, there is a Windows 8 *exclusive* feature that we can use that requests the OS to show the virtual keyboard when a text field has been tapped, regardless of whether a physical keyboard is connected.  But it *only* works in Windows 8, not Windows 10.  Microsoft took that public API away from developers in Windows 10.  This is because Microsoft wants virtual keyboard handling to be OS controlled.  In Windows 10, this is a setting that can be turned on/off globally (found under “Settings\Devices\Typing”), which impacts all Win32 apps (not WinRT/Metro apps).
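For reference, the Windows 8 feature described here appears to be the IInputPanelConfiguration COM interface (an assumption on my part, based on how other Win32 apps have approached this). A minimal sketch, assuming the Windows 8 SDK headers and that COM has already been initialized:

```cpp
#include <windows.h>
#include <inputpanelconfiguration.h>

// Sketch (assumption): opt this process into touch-keyboard "focus
// tracking" so Windows 8 shows the virtual keyboard when a text field
// gains focus. Assumes CoInitializeEx() has already been called.
bool EnableTouchKeyboardFocusTracking()
{
    IInputPanelConfiguration* config = nullptr;
    HRESULT hr = CoCreateInstance(__uuidof(InputPanelConfiguration), nullptr,
        CLSCTX_INPROC_SERVER, IID_PPV_ARGS(&config));
    if (FAILED(hr))
        return false;

    hr = config->EnableFocusTracking();
    config->Release();
    return SUCCEEDED(hr);
}
```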

We’re willing to implement the above mentioned feature when running a Corona app on Windows 8, but that’s as far as we’re willing to go.  Anything more would violate Microsoft’s UI guidelines.  Keep an eye out for a daily build regarding this.

And if it makes you feel any better, know that it isn’t any better for other Win32 app developers either.  Your app will behave consistently with other Win32 apps.  Case in point: see the bug report linked below about the Mozilla Firefox browser and this same topic.  The most they can do is implement the Windows 8 feature I mentioned above, which is not available on Windows 10.  And from the looks of the bug report status, they didn’t resolve it.

   https://bugzilla.mozilla.org/show_bug.cgi?id=963157

I think part of the problem is that very few Win32 apps (other than games) ever run in fullscreen mode, so the user has always had access to the touch keyboard via the taskbar, which is probably why it hasn't been considered very well by Microsoft. An interesting point on this is that Microsoft is investing a lot of resources into Project Centennial, which will allow Win32 apps to be packaged up and sold via the Windows Store. I'm not sure if it would also force them to run fullscreen, or how it would work exactly, so this issue may become even more problematic for the general Win32 developer community once that happens (depending on how Windows 10 handles touch keyboard input by default with that setting).

If that setting in Windows 10 is on by default for pure touchscreen devices, then having the Windows 8 exclusive option and relying on Windows 10 to do the correct thing would be a workable solution.

If it's possible, perhaps time would be better spent on an API function that allows us to execute an external process. That would allow people like myself to have a keyboard icon, similar to the taskbar's, in the corner of our fullscreen app, letting the user trigger the touch keyboard popup when they want it (by calling TabTip.exe). It would also have the added advantage of being potentially very useful for a lot of other people. Windows 10 is also being aggressively promoted and pushed out (via automatically installed recommended updates), so Windows 8 devices will probably be a minority soon enough.

Thanks for looking into this. 

Daily build #2843 has that Windows 8 change I was talking about above.  It’s available now…

   https://developer.coronalabs.com/release/2016/2843

Interestingly enough, I’ve recently upgraded a Windows 8 laptop to Windows 10.  That setting I mentioned was not enabled by default.  But since all Win32 apps are affected by this, I would hope that Windows users will learn to enable it.  It’s the less PC-savvy users that it’ll likely be an issue for.

Regarding launching external applications/EXEs, the Lua os.execute() function can do this, but I don’t really recommend it on Windows since the Lua library (not written by us) routes it to a C function that executes the given string at the command line.  On Windows, this will cause a Console window (aka: DOS box) to popup, which looks ugly.  And that console window will remain onscreen until the application that was executed has been closed.  Providing a new API to launch external apps would mean coming to an agreement with the rest of the team on how to do this on other platforms such as OS X.  That’s not likely to happen any time soon.

If you’re willing to write your own DLL in native code, then you can try doing this yourself.  The Win32 C function you’ll want to call is ShellExecute().  But it’s a bit trickier than that, since you’ll need to hunt down “TabTip.exe”, which is normally installed under Program Files.  Anyways, I’ve posted instructions on how to create your own Win32 plugin DLL here…

   https://forums.coronalabs.com/topic/57623-support-for-nativenewtextfield/?p=298499
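Following those instructions, a hypothetical plugin function along these lines could launch the keyboard (the function name and hard-coded path are assumptions for illustration, not a tested implementation):

```c
#include <windows.h>
#include <shellapi.h>
#include "lua.h"

// Hypothetical Lua-callable plugin function: pop up the touch keyboard
// by launching TabTip.exe via ShellExecute(). The hard-coded path is an
// assumption; a robust version would resolve %CommonProgramFiles%.
static int showTouchKeyboard(lua_State* L)
{
    HINSTANCE result = ShellExecute(NULL, TEXT("open"),
        TEXT("C:\\Program Files\\Common Files\\Microsoft Shared\\ink\\TabTip.exe"),
        NULL, NULL, SW_SHOWNORMAL);

    // ShellExecute() returns a value greater than 32 on success.
    lua_pushboolean(L, ((INT_PTR)result > 32));
    return 1;
}
```

Registered with the Lua state as usual for a plugin, this would give Lua code a function returning true/false for success.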

Thanks Joshua
