I have a Surface Pro with half of the touchscreen broken. To keep it short: is there a way for me to spawn a virtual touchpad, like the virtual keyboard (onboard), so that I can use the tablet alone?
Executive summary
To answer the question of whether there is a way: my guess is that, yes, it is theoretically possible, but the details of how to do it are a question for users more advanced than me. I also suspect the solution wouldn't be very practical (in either implementation or day-to-day use) and that it would be more efficient to repair or replace the touchscreen.
Explanation
Having said that, here's why I'm saying it's possible.
One of the properties of the touchscreen device is the "Coordinate Transformation Matrix", which you can view and change with `xinput`. This matrix maps the physical sensors of the touchscreen to screen locations in the display system. In normal operation it is the identity matrix, so the mapping is one-to-one (i.e., where you touch the screen is where the action is registered on the display). The matrix is used, for example, when you rotate the screen: together with the display output being rotated, it changes to the appropriate rotation matrix so that the display output still matches where you physically touch the screen.

This means you should also be able to define a more general transformation matrix that maps a small rectangular area of the physical screen to the whole display.
So it should be possible to write a GUI program that pops up a small, rectangular window (always on top, so it doesn't disappear when you click somewhere else) and at the same time runs `xinput` to update the Coordinate Transformation Matrix so that it maps the touchscreen sensors inside that rectangle to the whole display output.

To be clear, what I've described so far would not work exactly like a touchpad. On a touchpad, if you put your finger at the bottom and drag it up, the cursor moves up; if you touch the bottom again and drag up, the cursor continues from where it was to go further up (i.e., it moves relative to the point of contact). On the virtual touchpad described here, touching the bottom would send the cursor to the corresponding part at the bottom of the display. To go further and make it behave like a real touchpad, the program could rewrite the matrix on the fly at each touch so that the contact point matches the current cursor position, but I'm guessing the response time would make this barely usable.
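Here is a rough sketch of that program, again assuming an X11 session and the same placeholder device name; it uses Python's tkinter purely for illustration. Whenever the window is moved or resized, it recomputes the matrix so that the window's rectangle maps onto the whole screen:

```python
#!/usr/bin/env python3
# Sketch of the "virtual touchpad" window described above (X11 only).
import subprocess
import tkinter as tk

DEVICE = "ELAN Touchscreen"  # placeholder -- check `xinput list`

def apply_matrix(win):
    """Map the window's current screen rectangle onto the whole display."""
    sw, sh = win.winfo_screenwidth(), win.winfo_screenheight()
    # Window rectangle as fractions of the full screen.
    x0, y0 = win.winfo_rootx() / sw, win.winfo_rooty() / sh
    w, h = win.winfo_width() / sw, win.winfo_height() / sh
    # Inverse of the translate-and-scale map (see the derivation below).
    matrix = [1 / w, 0, -x0 / w,
              0, 1 / h, -y0 / h,
              0, 0, 1]
    subprocess.run(["xinput", "set-prop", DEVICE, "--type=float",
                    "Coordinate Transformation Matrix",
                    *map(str, matrix)], check=True)

root = tk.Tk()
root.title("virtual touchpad")
root.geometry("300x200+50+500")    # start it over the working half
root.attributes("-topmost", True)  # keep it above other windows
# Re-apply the mapping whenever the window is moved or resized.
# (This fires often while dragging; fine for a sketch.)
root.bind("<Configure>", lambda event: apply_matrix(root))
root.mainloop()
```

One catch: you need some way (a USB mouse, say) to drag the window over the working half of the screen in the first place, since this is the absolute-mapping variant rather than the relative, touchpad-like one.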
I don't know much about GUI programming, so that part is not trivial for me, but it may be different for you. The other part, having the program calculate the correct matrix, involves only some basic linear algebra. So you can judge for yourself how easy a solution would be to implement.
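For reference, here is that linear algebra, under the same convention `xinput` uses (device and screen coordinates both normalised to [0, 1]). If the touchpad window occupies the rectangle with top-left corner $(x_0, y_0)$, width $w$ and height $h$, then stretching that rectangle onto the whole display is the inverse of the translate-and-scale map:

$$
M = \begin{pmatrix} w & 0 & x_0 \\ 0 & h & y_0 \\ 0 & 0 & 1 \end{pmatrix}^{-1}
  = \begin{pmatrix} 1/w & 0 & -x_0/w \\ 0 & 1/h & -y_0/h \\ 0 & 0 & 1 \end{pmatrix},
$$

which is exactly the matrix the sketch above computes.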
By the way, I've been talking about implementing this solution yourself because your problem is so specialised that I can't imagine there are ready-made tools out there; but I could be wrong!
To go even further: even if you did get something like what I describe above working, I can see two practical issues:
In conclusion: repair/replace your touchscreen!