How to allow touch support on multiple monitors simultaneously?


I made a ReactJS app that will run in kiosk mode in a museum, on touch-screen monitors. That is, the app runs in the Chrome browser, full screen, with right-click and other gesture events disabled (to prevent visitors from leaving the browser). To do this I terminate explorer.exe and disable right-click in Chrome with JavaScript.
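For reference, suppressing right-click and common gestures in the page can be done with a few event listeners. This is only a minimal sketch of the approach the question describes; the handler name and the 50-line specifics are assumptions, not the asker's actual code:

```javascript
// Cancel an event and stop its default browser behavior.
function suppressEvent(e) {
  e.preventDefault();
  return false;
}

// Wire up only when running in a browser environment.
if (typeof document !== 'undefined') {
  // Block the right-click / long-press context menu.
  document.addEventListener('contextmenu', suppressEvent);
  // Block text-selection drags.
  document.addEventListener('selectstart', suppressEvent);
  // Block pinch-zoom gestures (WebKit-style gesture events).
  document.addEventListener('gesturestart', suppressEvent);
}
```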

Now, the setup is the following: one computer and six touch monitors. Each monitor runs a Chrome browser window with its own application instance. These instances are of course totally independent from one another.

My issue starts here. How can I allow two people to, for instance, swipe a slider on two different monitors at the same time? By default, the first person to start swiping can do so, but the person who starts swiping second is blocked. Another issue: if two people "tap" at the same time, only one of them triggers a touch event.

Basically the issue boils down to this: how can I allow two people on Windows 10 to use touch events simultaneously on two different monitors?
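For context, within a single page the Pointer Events API already distinguishes concurrent touches by `pointerId`, so two fingers on the same window never collide in script; the blocking described above happens across windows, at the OS level. A hedged sketch of per-pointer swipe tracking (names and the 50px threshold are illustrative assumptions) might look like:

```javascript
// Track in-progress swipes, keyed by the pointerId of each touch.
const activeSwipes = new Map();

// Record where each finger came down.
function onPointerDown(e) {
  activeSwipes.set(e.pointerId, { startX: e.clientX });
}

// On lift-off, classify the horizontal movement as a swipe direction.
function onPointerUp(e) {
  const swipe = activeSwipes.get(e.pointerId);
  activeSwipes.delete(e.pointerId);
  if (!swipe) return null;
  const dx = e.clientX - swipe.startX;
  // Treat a horizontal move of more than 50px as a swipe.
  if (dx > 50) return 'right';
  if (dx < -50) return 'left';
  return null;
}

// Wire up only when running in a browser environment.
if (typeof document !== 'undefined') {
  document.addEventListener('pointerdown', onPointerDown);
  document.addEventListener('pointerup', onPointerUp);
}
```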

Antonin Cezard

Posted 2017-06-02T08:57:47.033

Reputation: 237

From what I know there is no easy way to do this. Even if it's multi-touch enabled, it would register across displays (my assumption). What you would want is one input per display that is passed to the application. What did you use to display the application on each screen? That tool might support it. – Seth – 2017-06-02T09:31:04.820

The application is launched 6 times, once on each monitor. Application is 100% the same on each screen, just launching 6 chrome instances, one for each screen. Works ok if people don't use the monitors at the same time – Antonin Cezard – 2017-06-02T09:33:39.280

With that kind of setup you're likely looking for a multiseat configuration. That requires additional software and would probably be easier to set up on Linux. – Seth – 2017-06-02T09:47:29.253

No answers