@George I wouldn't be so quick to discount the Microsoft initiative. Microsoft does have a VR specification, and there are several relatively inexpensive VR headsets available that support it.
I'd suggest you contact Bryce Johnson, Microsoft's "inclusive lead" on its product research and accessibility team. If Microsoft integrates eye-tracking support, it might allow disabled people to control Windows itself via eye control. In particular, imagine how life-changing it would be if people could control a browser with their eyes. That would open up the entire internet.
Years ago, my wife had a quadriplegic friend (who died far too young) who communicated in text via a BBS, using nothing but a blow straw. I would think something similar would be possible via eye control, along these lines: double blink to enable keyboard mode, then blink, look in a direction, and blink again to navigate a text-entry window. Software like that already exists for blow straws, similar to what Stephen Hawking used to generate speech. As a UI programmer, I'm sure something like that could be adapted for eye control.
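To make the idea concrete, here's a rough sketch of that blink-and-look scheme as a little state machine. Everything here is hypothetical (the event names, the keyboard layout, the blink-blink-to-select rule); a real system would get "blink", "double_blink", and gaze-direction events from an eye tracker's SDK and would need timing thresholds to tell a double blink from two single ones:

```python
# Hypothetical sketch of blink-and-look text entry, NOT a real eye-tracking API.
# Assumes an upstream tracker emits events: "double_blink", "blink",
# or a gaze direction ("up", "down", "left", "right").

class EyeKeyboard:
    # Toy on-screen keyboard grid (an assumption, not a standard layout).
    ROWS = ["abcdef", "ghijkl", "mnopqr", "stuvwx", "yz.,!?"]

    def __init__(self):
        self.keyboard_mode = False
        self.row = 0
        self.col = 0
        self.pending_blink = False  # saw the first blink of a blink-look-blink move
        self.pending_dir = None
        self.typed = []

    def handle(self, event):
        if event == "double_blink":
            # Double blink toggles keyboard mode, as described above.
            self.keyboard_mode = not self.keyboard_mode
            self.pending_blink = False
            self.pending_dir = None
            return
        if not self.keyboard_mode:
            return  # ignore everything while keyboard mode is off
        if event == "blink":
            if self.pending_blink and self.pending_dir:
                # blink -> look -> blink: move the selection cursor
                self._move(self.pending_dir)
                self.pending_blink = False
                self.pending_dir = None
            elif self.pending_blink:
                # blink -> blink with no look in between: select current key
                # (my own assumption for how selection might work)
                self.typed.append(self.ROWS[self.row][self.col])
                self.pending_blink = False
            else:
                self.pending_blink = True
        elif self.pending_blink:
            self.pending_dir = event  # remember the gaze direction

    def _move(self, direction):
        dr, dc = {"up": (-1, 0), "down": (1, 0),
                  "left": (0, -1), "right": (0, 1)}[direction]
        self.row = (self.row + dr) % len(self.ROWS)
        self.col = (self.col + dc) % len(self.ROWS[0])
```

For example, feeding it `double_blink, blink, right, blink, blink, blink` would enter keyboard mode, move one key right, and select "b". The point is just that the whole interaction reduces to a few states and transitions, which is why I think it's very adaptable.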
For those clients who can speak, look into VoiceAttack, a game-focused speech-to-action program, which is only $10. I use it to play Elite Dangerous, but scripts can be set up to emulate a keyboard, mouse, and/or joystick. Here's a link: https://voiceattack.com/
I suggest that you discuss all of these ideas with Bryce Johnson and anyone else you can find in the "disabled services" industry. I hope you have great success in getting something to help your clients.