As mentioned in yesterday’s multitouch post, there are a number of gestures that are recognized by Windows 7 out of the box:
Now, let’s look at how to code for gestures.
First, it's good to know whether the machine your application is running on supports multitouch, so that you can degrade gracefully if it doesn't.
// Test whether the digitizer supports multitouch
bool bMultiTouch = false;
int value = GetSystemMetrics(SM_DIGITIZER);
if (value & NID_MULTI_INPUT) /* 0x40 */
{
    bMultiTouch = true; /* digitizer is multitouch */
}
GetSystemMetrics is a function that retrieves system configuration settings. If you pass it SM_DIGITIZER, it returns a bit field with the following flags:
| Value | Hex value | Meaning |
| --- | --- | --- |
| TABLET_CONFIG_NONE | 0x00 | The input digitizer does not have touch capabilities. |
| NID_INTEGRATED_TOUCH | 0x01 | The device has an integrated touch digitizer. |
| NID_EXTERNAL_TOUCH | 0x02 | The device has an external touch digitizer. |
| NID_INTEGRATED_PEN | 0x04 | The device has an integrated pen digitizer. |
| NID_EXTERNAL_PEN | 0x08 | The device has an external pen digitizer. |
| NID_MULTI_INPUT | 0x40 | The device supports multiple sources of digitizer input. |
| NID_READY | 0x80 | The device is ready to receive digitizer input. |
GetSystemMetrics has some limitations. For example, it does not account for plug and play, so be careful about treating its result as permanent configuration. If the user attaches a multitouch digitizer later, you would need to call the function again to learn that multitouch is now supported.
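To make the flag meanings concrete, here is a small, self-contained sketch that decodes such a bit field. The NID_* values are copied from the table above; decodeDigitizer is a hypothetical helper for illustration, not a Windows API (in a real application you would use the constants from winuser.h):

```cpp
#include <string>
#include <vector>

// Flag values as documented for SM_DIGITIZER (see the table above).
const int NID_INTEGRATED_TOUCH = 0x01;
const int NID_EXTERNAL_TOUCH   = 0x02;
const int NID_INTEGRATED_PEN   = 0x04;
const int NID_EXTERNAL_PEN     = 0x08;
const int NID_MULTI_INPUT      = 0x40;
const int NID_READY            = 0x80;

// Hypothetical helper: turn the bit field into a list of capability names.
std::vector<std::string> decodeDigitizer(int value)
{
    std::vector<std::string> caps;
    if (value & NID_INTEGRATED_TOUCH) caps.push_back("integrated touch");
    if (value & NID_EXTERNAL_TOUCH)   caps.push_back("external touch");
    if (value & NID_INTEGRATED_PEN)   caps.push_back("integrated pen");
    if (value & NID_EXTERNAL_PEN)     caps.push_back("external pen");
    if (value & NID_MULTI_INPUT)      caps.push_back("multi-input");
    if (value & NID_READY)            caps.push_back("ready");
    return caps;
}
```

For example, a value of 0x41 decodes to "integrated touch" plus "multi-input", which is exactly the combination the earlier snippet tests for.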
By default, your window will receive notifications when gestures occur, in the form of WM_GESTURE messages.
A window can receive gestures or raw touches, but not both. If you want to work at the raw touch level as opposed to the gesture level, you can call RegisterTouchWindow. You will then stop receiving WM_GESTURE messages and instead receive WM_TOUCH messages.
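Opting a window into raw touch is a single call. A sketch, assuming `hWnd` is your window handle:

```cpp
// After this call the window receives WM_TOUCH instead of WM_GESTURE.
RegisterTouchWindow(hWnd, 0);  // 0 = default touch behavior

// Call UnregisterTouchWindow(hWnd) to return to gesture messages.
```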
By default, the application receives all gesture messages. However, you may have an application in which the user can move a marker around on a game board, but you don't want the marker to be resized. In that case, you don't care about the Zoom gesture, but you probably do want the Translate and Rotate gestures. You can configure which gestures will be sent using SetGestureConfig. It takes an array of GESTURECONFIG structures, each of which contains a gesture ID (dwID), messages to enable (dwWant), and messages to disable (dwBlock). This changes the gesture configuration for the lifetime of the window, not just for the next gesture.
Here's an example. I create a GESTURECONFIG that blocks nothing and wants all gestures, and pass it to SetGestureConfig.
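The call described above might look like this sketch, assuming a valid window handle `hWnd`:

```cpp
// Sketch: want all gestures, block none, for the lifetime of hWnd.
// A dwID of 0 with GC_ALLGESTURES applies the setting to all gestures.
GESTURECONFIG gc = { 0, GC_ALLGESTURES, 0 };  // dwID, dwWant, dwBlock
SetGestureConfig(hWnd, 0, 1, &gc, sizeof(GESTURECONFIG));
```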
You can also dynamically change your gesture configuration. The WM_GESTURENOTIFY message is sent to your window to indicate that a gesture is about to be sent, which gives you an opportunity to set your gesture configuration.
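A sketch of that dynamic configuration, using the game-board example above (blocking only the Zoom gesture); `hWnd`, `message`, `wParam`, and `lParam` are the usual window-procedure parameters:

```cpp
case WM_GESTURENOTIFY:
{
    // Block only the zoom gesture; other gestures remain enabled.
    GESTURECONFIG gc = { GID_ZOOM, 0, GC_ZOOM };  // dwID, dwWant, dwBlock
    SetGestureConfig(hWnd, 0, 1, &gc, sizeof(GESTURECONFIG));

    // Always forward WM_GESTURENOTIFY on to DefWindowProc.
    return DefWindowProc(hWnd, message, wParam, lParam);
}
```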
You will receive notifications that gestures occurred as WM_GESTURE messages. Use a switch statement to discover what gesture you received, and respond appropriately.
Information about the gesture is stored in the GESTUREINFO structure.
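Putting the pieces together, a WM_GESTURE handler in your window procedure might look like this sketch (the per-gesture responses are abbreviated to comments):

```cpp
case WM_GESTURE:
{
    GESTUREINFO gi = { 0 };
    gi.cbSize = sizeof(GESTUREINFO);
    if (GetGestureInfo((HGESTUREINFO)lParam, &gi))
    {
        switch (gi.dwID)
        {
        case GID_PAN:
            // Translate: the current touch location is in gi.ptsLocation.
            break;
        case GID_ZOOM:
            // Zoom: gi.ullArguments holds the distance between the points.
            break;
        case GID_ROTATE:
            // Rotate: extract the angle with
            // GID_ROTATE_ANGLE_FROM_ARGUMENT(LODWORD(gi.ullArguments)).
            break;
        }
        CloseGestureInfoHandle((HGESTUREINFO)lParam);
        return 0;
    }
    return DefWindowProc(hWnd, message, wParam, lParam);
}
```

Note that once you have handled the gesture, you should call CloseGestureInfoHandle to free the gesture information handle.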
In tomorrow’s post, we will talk about managed code support for multitouch.
Multi-touch is cool and useful, but unfortunately it doesn't come naturally on any version of Windows, including Windows 7. I bet if you watch how people use multi-finger gestures on a Mac, you'll wish you could do the same on your Windows 7 computer. If you have a computer equipped with a Synaptics TouchPad, such as a laptop, you are about to get something similar.
The new Synaptics Gesture Suite device driver now ships with Scrybe gesture workflow technology, meaning you can finally get scrolling, zooming, and rotating gestures on your TouchPad-equipped computer.
Looks cool, doesn't it? Let's head over to the driver's page to download it. It supports almost all Windows operating systems, including Windows 7, both 32-bit and 64-bit.
Once you have the driver installed and your computer restarted, you are ready to go. The Scrybe icon runs silently in the system notification area; you can right-click it for more options.
Now, open a web page that is several screens long, and drag two fingers across the TouchPad to scroll the page.
You can also use two fingers to zoom in and out of a web page or a photo.
Tapping with three fingers brings up its launch pad. Draw a symbol and it launches the application assigned to that symbol. For example, I drew a question mark, which launched search automatically for me.
It comes with a list of predefined gesture symbols, but you can definitely add more to the list.
Depending on what type of TouchPad your computer has, your multi-touch experience may vary. I tested it on my Lenovo ThinkPad T420S laptop, and most of the gestures worked, but not all of them. So if your laptop has a Synaptics TouchPad, you may want to give this a try.