I am trying to intercept a user's touch input on the desktop and replay the touch gesture at another position on screen. The target window is the Windows desktop or Windows Explorer.
Current solution:
I use the RegisterPointerInputTarget API to redirect all of the user's touch input to a window I created, so that I receive all of the touch information.
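The registration step can be sketched as below. Note that RegisterPointerInputTarget requires the calling process to have the UIAccess privilege (uiAccess="true" in the application manifest, with the binary signed and installed in a secure location); hMyWnd is assumed to be a window the process created elsewhere.

```cpp
#include <windows.h>

// Minimal sketch: register our window as the global redirection target
// for all touch pointers. Requires UIAccess (see lead-in above).
bool RedirectTouchToWindow(HWND hMyWnd)
{
    // PT_TOUCH redirects touch input only; mouse and pen are unaffected.
    if (!RegisterPointerInputTarget(hMyWnd, PT_TOUCH))
        return false; // GetLastError() gives the reason (often access denied)
    return true;
}

// Call UnregisterPointerInputTarget(hMyWnd, PT_TOUCH) when finished,
// otherwise touch input stays redirected to the window.
```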
Then I use the InjectTouchInput API to inject touch input at the expected position.
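The injection side might look like the following sketch, which sends a single down/up tap at screen coordinates (x, y). InitializeTouchInjection must be called once per process before the first InjectTouchInput call.

```cpp
#include <windows.h>

// Minimal sketch: inject one touch contact (down, then up) at a screen
// position. Error handling is reduced to bool returns for brevity.
bool InjectTapAt(int x, int y)
{
    static bool initialized = false;
    if (!initialized) {
        // Up to 1 simultaneous contact, with default visual feedback.
        if (!InitializeTouchInjection(1, TOUCH_FEEDBACK_DEFAULT))
            return false;
        initialized = true;
    }

    POINTER_TOUCH_INFO contact = {};
    contact.pointerInfo.pointerType = PT_TOUCH;
    contact.pointerInfo.pointerId   = 0;  // single contact -> id 0
    contact.pointerInfo.ptPixelLocation.x = x;
    contact.pointerInfo.ptPixelLocation.y = y;
    contact.touchFlags = TOUCH_FLAG_NONE;
    contact.touchMask  = TOUCH_MASK_NONE;

    // Touch down at (x, y)...
    contact.pointerInfo.pointerFlags =
        POINTER_FLAG_DOWN | POINTER_FLAG_INRANGE | POINTER_FLAG_INCONTACT;
    if (!InjectTouchInput(1, &contact))
        return false;

    // ...then lift up at the same spot.
    contact.pointerInfo.pointerFlags = POINTER_FLAG_UP;
    return InjectTouchInput(1, &contact) != FALSE;
}
```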
Problem:
The current solution works in IE and Chrome, but does not work in Windows Explorer or on the Windows desktop.
The reason is that Windows Explorer (or the desktop) never receives a WM_POINTERACTIVATE message before I call InjectTouchInput. Explorer does receive the touch messages produced by InjectTouchInput, but it ignores them.
WM_POINTERACTIVATE matters because that message is what marks my first injected touch pointer as the primary pointer. Without a primary pointer, Windows Explorer and the desktop ignore all touch messages.
I strongly suspect that the user's original touch pointer is already considered the primary pointer, so my simulated pointer never gets the chance to be marked as primary and WM_POINTERACTIVATE is never sent.
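One way to confirm this suspicion is to check the primary flag on each pointer arriving at the redirect window. The sketch below (inside a hypothetical WndProc for the window passed to RegisterPointerInputTarget) reads POINTER_FLAG_PRIMARY via GetPointerInfo; if the user's physical pointer reports primary while the injected one does not, that matches the behavior described above.

```cpp
#include <windows.h>

// Diagnostic sketch: log whether each incoming touch pointer is primary.
LRESULT CALLBACK WndProc(HWND hwnd, UINT msg, WPARAM wParam, LPARAM lParam)
{
    if (msg == WM_POINTERDOWN) {
        UINT32 pointerId = GET_POINTERID_WPARAM(wParam);
        POINTER_INFO info = {};
        if (GetPointerInfo(pointerId, &info)) {
            BOOL isPrimary = (info.pointerFlags & POINTER_FLAG_PRIMARY) != 0;
            // IS_POINTER_PRIMARY_WPARAM(wParam) reports the same thing.
            OutputDebugStringA(isPrimary ? "primary pointer\n"
                                         : "non-primary pointer\n");
        }
    }
    return DefWindowProc(hwnd, msg, wParam, lParam);
}
```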
Question: Is there any way to get my simulated pointer marked as the primary pointer? Is there an API that can remove the primary-pointer flag from an existing touch pointer, or a message with which I can mark my own touch pointer as primary?