I've embedded CEF into a Win32 C++ app that needs to work on both touchscreen and non-touchscreen machines. I'm currently having an issue where tapping a text box in a webpage does not bring up the Windows touch keyboard. I've done some Google searching to see how I can make this work, but so far I've been unsuccessful.
I found this related CefPython issue (https://github.com/cztomczak/cefpython/issues/57) and tried a couple of things:
I append the following switches in my OnBeforeCommandLineProcessing and OnBeforeChildProcessLaunch handlers:
command_line->AppendSwitchWithValue(CefString("touch-events"), CefString("enabled"));
command_line->AppendSwitchWithValue(CefString("disable-usb-keyboard-detect"), CefString("1"));
I also registered the browser window as a touch window in my OnAfterCreated handler:
RegisterTouchWindow(browser->GetHost()->GetWindowHandle(), 0);
The second comment in the CefPython issue says:

"When clicking an edit field there is no virtual keyboard appearing, but this could be implemented on your own by injecting JavaScript on web pages through LoadHandler.OnLoadStart() or OnLoadEnd(). The same can be done for other touch-related behavior; when the --touch-events=enabled flag is passed, JavaScript touch events (touchstart, touchend, touchmove) will be enabled, allowing you to override the default behavior or implement the missing one."
However, I'm not a web developer, so I'm not sure how injecting JavaScript would accomplish this. I assume they must be referring to some way of reaching a Windows API that shows the keyboard, but I haven't found anything about that in my searching.
Any help is appreciated.
Thank you.