- cross-posted to:
- technology@lemmy.world
- linux@sh.itjust.works
Not sure I could ever live with that - anyone able to test whether multi-monitor setups work?
The use case I see is screens mounted on something that moves.
It’s easy with accelerometers to know the orientation, so you can display a stable image on something that moves as a whole, or that has moving parts.
Imagine a movie screening with the screen mounted on a float in the ocean.
The float moves with the waves. You can stabilize the image of the movie to be still while the screen itself tilts.
Something like this, but then with a direct screen instead of a projected one.
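The stabilization itself is just a counter-rotation: read the gravity vector from the accelerometer, work out the tilt in the screen plane, and rotate the image by the opposite angle. A minimal sketch, assuming gravity-only readings and a particular axis convention (the function name and axes are illustrative, not from any real driver API):

```python
import math

def counter_rotation_deg(ax: float, ay: float) -> float:
    """Angle (degrees) to rotate the image so it stays level.

    ax, ay: accelerometer readings along the screen's x/y axes,
    assuming the device is tilting but not otherwise accelerating,
    so the readings are dominated by gravity.
    """
    # atan2 gives the tilt of the screen relative to gravity;
    # rotating the image by the negative of that keeps it upright.
    return -math.degrees(math.atan2(ax, ay))

# Upright screen: gravity is all along y, so no rotation is needed.
print(counter_rotation_deg(0.0, 1.0))   # 0.0
# Screen tilted 90 degrees: counter-rotate the image by -90.
print(counter_rotation_deg(1.0, 0.0))   # -90.0
```

In practice you'd low-pass filter the readings (waves are noisy) before feeding the angle to whatever does the rotation.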
Another use case would be applying this to smartwatches or other displays like that.
You could make the output of the screen always be perfectly aligned with your line of sight rather than have it tilted at an angle parallel with your arm.
Knowing how big Linux is in embedded systems, I almost wonder if it was originally implemented for some kind of full-motion simulator, since that could easily call for very funky display mounting.
I think most people would just use media server software like Pixera, d3, TouchDesigner, etc. to accomplish playback of video on a moving surface with feedback sensors.
It’s established tech with plenty of integrations, and most companies able to deliver something like this aren’t a Linux-first type of company.
If it were for an installation, something bespoke might be built on Linux. But the cost of TouchDesigner and a suitable computer is tiny compared to doing this on Linux and then supporting and documenting it (especially considering how widespread TouchDesigner/Pixera/d3 skills are in the industry vs more esoteric Linux skills).