Snapdragon XR2 Chip To Enable Standalone Headsets With 3K x 3K Resolution, 7 Cameras

Passthrough AR (using cameras to capture the room around you for real-time display on the screen) needs a minimum of ~200-250 Hz, and *really* 400-1000 Hz if it extends into your peripheral vision. Otherwise, the 'slosh' (the camera feed visibly lagging and smearing as your head moves) will have you puking in no time.

Motion sickness is a problem, but SLOSH-induced VR sickness is ENORMOUSLY worse, and not something you EVER really adapt to (beyond learning to turn your head slowly, and to close your eyes before any quick head turn, to hide it from yourself).

Jitter (synthetic images not anchored properly to the world) is annoying in foveal vision, but stressful & sickness-inducing in peripheral vision. Blame evolutionary survival mechanisms that depend on noticing 'danger' out of the corner of your eye.

One stopgap idea I had: do 'bluescreen' passthrough AR. Render synthetic video to an alpha-blended framebuffer at ~100 Hz. Capture stereo camera images at 400 Hz, and combine the two for output to a 400 Hz display (repeating each frame of synthetic overlay video 4 times per camera frame).

The net effect would be similar to optical AR (e.g., Magic Leap & HoloLens) today... the synthetic image would still lag a bit, but at least the reality-anchoring passthrough video wouldn't slosh.
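The compositing loop above is simple enough to sketch. Here's a minimal illustration in Python/numpy; the rates, function names, and frame representation are all my own assumptions, not any real headset API:

```python
import numpy as np

CAMERA_HZ = 400
OVERLAY_HZ = 100
REPEAT = CAMERA_HZ // OVERLAY_HZ  # each synthetic frame is displayed 4 times

def composite(camera_rgb, overlay_rgba):
    """Alpha-blend the synthetic overlay onto a passthrough camera frame."""
    alpha = overlay_rgba[..., 3:4].astype(np.float32) / 255.0
    blended = overlay_rgba[..., :3] * alpha + camera_rgb * (1.0 - alpha)
    return blended.astype(np.uint8)

def run(camera_frames, overlay_frames):
    """Pair each 400 Hz camera frame with the most recent 100 Hz overlay."""
    out = []
    for i, cam in enumerate(camera_frames):
        overlay = overlay_frames[i // REPEAT]  # overlay advances 4x slower
        out.append(composite(cam, overlay))
    return out
```

The point of the design: the passthrough path runs at full camera rate regardless of how slowly the (expensive) synthetic renderer updates, so the reality anchor never sloshes even when the overlay lags.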

Slosh is just one problem of many... but it's a BAD one that totally kills the usability of passthrough AR @ 90 Hz.

I'd also propose a short-term display compromise: 8k displays, with the understanding that only a small part can sustain synthetic video at full framerate. Use case: developers who run their IDE directly on virtual monitors in the headset. At 8k, you can render the equivalent of three 2560x1600 monitors (one fully in view, plus ~1/2 of each adjacent one) at normal viewing distance, so you can develop VR apps without constantly putting on & taking off the headset. Yes, everyone knows 90 Hz + 8k games aren't viable on present-day GPUs... and that's OK. Make 'developer' headsets with double-resolution displays so they can render the IDE at 8k, but program games for 4k & just scale them 2x2 when running on a 'developer' headset.
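The monitor math above, spelled out (assuming a standard 8K UHD panel; the constants are illustrative):

```python
EIGHT_K = (7680, 4320)   # standard 8K UHD panel
MONITOR = (2560, 1600)   # a typical WQXGA desktop monitor
GAME_4K = (3840, 2160)   # what games would actually render at

# Three 2560-wide virtual monitors fit side by side across an 8K panel:
monitors_across = EIGHT_K[0] // MONITOR[0]

# A game rendered at 4K and scaled 2x2 exactly fills the 8K developer panel:
scaled_4k = (GAME_4K[0] * 2, GAME_4K[1] * 2)
```

So a 'developer' headset costs nothing in game-rendering terms: the 4k game sees the same pixel budget either way, and only the IDE path uses the panel natively.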

Same deal with 4k monitors. Yes, we all know interface speeds & GPUs are too slow today for 4k @ 240 Hz. So make the display 3840x2160 & capable of 60-120 fps at that resolution... but capable of 240-480 fps at 1920x1080 with 2x2 nearest-neighbor scaling, and maybe even 960-1000 fps at 960x540. I'm so sick of having to choose between a display that can do 1080p120 *or* 2160p30, instead of one that can do both 1080p120 and 2160p30 (or 1080p240 + 2160p60). Computationally, any monitor with G-Sync or FreeSync should be able to do this. Worst case, it might need another $5 worth of RAM to allow static double-buffering. For an expensive premium monitor, that's *nothing*.
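The 2x2 nearest-neighbor scaling asks almost nothing of the monitor's scaler: every incoming pixel just becomes a 2x2 block. In numpy terms (a sketch of the operation, obviously not how a display ASIC would implement it):

```python
import numpy as np

def upscale_2x2(frame):
    """Nearest-neighbor 2x2 upscale: duplicate every pixel into a 2x2 block."""
    return np.repeat(np.repeat(frame, 2, axis=0), 2, axis=1)

# A 1920x1080 input becomes 3840x2160 output with no filtering,
# no inter-pixel math, and essentially no added latency.
```

That's why the cost estimate is "maybe $5 of RAM": there's no computation to speak of, just buffering.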
