Ask HN: Can you recommend an “instant-switch” Monitor? Does one exist?

source link: https://news.ycombinator.com/item?id=34048573

98 points by kappuchino 7 hours ago | 100 comments
Current situation:

- I have three sources connected to my monitor (1 HDMI, 1 DP, 1 USB-C).

- The monitor is on a switched-off source, i.e. due to power save mode.

- When I try to switch to another source, it takes 4 seconds or more to switch.

What the monitor should do: Switch instantly, no matter whether the current source is on or not.

WHY the long delay??? It is really something that angers me. Every. Damn. Time.

I have a Dell U2723QE, but older models have this problem as well, as do models from other brands.

Would love to get a recommendation for an instant switch model.

HDMI switching like that takes long because there is an EDID negotiation going on between the monitor and the input devices.

If you want to switch fast and often, your best bet is getting an HDMI switcher that is basically always connected to the monitor and both sources and then toggles between the two. Beware of the supported resolutions and framerates.

A cheap hack would also be to connect your second source to the first using an HDMI grabber (basically an HDMI-source-to-USB-webcam converter) and open a fullscreen window displaying that. Then you could e.g. switch by switching virtual desktops.
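(For illustration, a minimal version of that hack on Linux, assuming the grabber enumerates as /dev/video0; the device path and choice of player are assumptions, not from the thread:)

    # Show the capture device fullscreen, with mpv's low-latency profile
    mpv av://v4l2:/dev/video0 --profile=low-latency --fs

    # Or with ffmpeg's bundled player
    ffplay -fs -f v4l2 /dev/video0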

Another post recommended the Blackmagic ATEM; this should also work.

Even with one of the fancy KVM switches that were recommended here, my switching isn’t instant. It did go down from 10-20 seconds of flickering to +/- 2 seconds though.

May not be worth it if you are already at 4s though.

What model do you use that gave you 2 seconds switching time? One of the mentioned Rextron or something else?

My "NEWCARE HDMI 2.0b Switch 3 in 1 Out" switcher is that fast.

What if you modify the cables to remove the DDC wire?

Wouldn't it be faster with no EDID negotiation?

EDID is important because it is the only way for the host to know which resolutions, color depths, and frame rates the monitor supports.

Even if you remove the DDC wires, the OS will still try to communicate, and it might delay this even more because of the retries and delays in code.

You might get a head start if you hardcode the EDID in the OS itself. There are ways to do this on Linux [1], Intel macOS [2], and Windows [3].

[1] http://billauer.co.il/blog/2020/08/linux-override-fake-edid/

[2] https://github.com/mbruggmann/osx-edid-overrides

[3] https://learn.microsoft.com/en-us/windows-hardware/drivers/d...
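(To make the Linux route in [1] concrete, a sketch assuming you have captured a known-good EDID blob; the connector name and file name below are placeholders:)

    # Dump the EDID the kernel last read from the monitor (connector name varies)
    cat /sys/class/drm/card0-DP-1/edid > /lib/firmware/edid/my_monitor.bin

    # Then have the DRM subsystem load that blob instead of negotiating,
    # via a kernel parameter (e.g. appended to GRUB_CMDLINE_LINUX):
    #   drm.edid_firmware=DP-1:edid/my_monitor.bin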

See my answer here: https://news.ycombinator.com/item?id=34050157

TLDR: because it is done over a slow protocol (I²C) and because of artificial delays in code to account for slower devices

> Wouldn't it be faster with no EDID negotiation?

No; at this point, removing EDID will simply result in the display not being detected at all. It used to be optional in the days of VGA, when you could just drive out 640x480 "blind", but there's too much stuff in there these days.

This has been annoying me since I first tried a PC, after using Atari computers since they were a thing. No matter what monitor I use, it takes forever to switch input port or even to do a simple change of resolution. I was used to being able to switch resolution several times per frame: you could make a game with low resolution and color on the upper 3/4 of the screen and have the stats in a higher-resolution 4-color mode on the lower 1/4 of the screen, without any sort of delay in the signal. You just set an interrupt on every scanline, and if the monitor was painting the 300th scanline, you switch mode and suddenly it's doing double resolution for the rest of the frame. Then you switch back on the vertical blank interrupt at the start of the frame paint.

I was quite surprised when the brand new PC needed 2-3 seconds just to switch resolution. And we are still there after 30-odd years? A KVM switch with an analog signal was just the same: still a long time to switch between signals. I'm using remote-control software for the local computers so I don't have to wait between every switch. With the added bonus that I can copy/paste between the computers without having to think.

It was really jarring when I got my M1 Pro MacBook and switching resolutions on the internal display was literally instant. No black screen between the switch whatsoever, it was just there. Makes you wonder what else we've been missing out on all this time.

It's because Macs (almost) always output the native resolution of the display. What you are changing is the draw resolution or scaled resolution.

That may be technically true, but on the Intel MacBooks, switching resolutions was still a major ordeal with multiple black screens flickering past.

One of the reasons why you could switch modes mid-frame is because the concept of a "video mode" is fundamentally different on a PC than on an Atari or Amiga. On a PC, ever since the EGA, the adapter can output at different horizontal and vertical scan rates and the monitor can adapt to the requested mode from the video card. That's why switching video modes takes a couple seconds and may involve a "click" or two. Upon mode switch, the monitor must resync to the new signal.

On the Atari and Amiga, the horizontal/vertical scan rates are constant; the only thing that changes is how scanout turns pixels in video memory into signals for the CRT.

Recently (the last few months) I've been testing the HD Fury VRRoom, and am happy with it, as it allows switching instantly regardless of monitor, by having negotiation already set up:

https://hdfury.com/product/8k-vrroom-40gbps/

It maintains a live connection to each of 4 x HDMI inputs, 2 x HDMI outputs, and 1 x eARC audio output, and can switch between them "instantly" as you ask. You can switch with a button at your desk, or via API.

I selected this model because it supports (a) up to 8K and we use 5K monitors, both full 5K and ultrawide 5K, and (b) variable refresh rate (VRR) at 4K@120hz for gaming.

In a teleconference room with e.g. a big screen on the wall and a small screen on the table showing the same or different sources, this lets you have multiple input sources (e.g., teleconf system, laptop, game system, and OTT TV source) that 'just work' without users having to know how to make it work.

The feature list for video includes:

- Unlock true VRR/FRL on Samsung Q90/QN900/QN95 and similar models

- Up to 8K60 444/RGB 12-bit via DSC at 96Gbps

- World First 40Gbps Upscaler, Splitter, Switcher with full audio extraction for VRR and any signals

- Play Xbox 1X games at 4K120 Dolby Vision on LG C9, BX & SONY HDMI 2.1 TVs

- Work from ANY HDMI source to ANY HDMI, ARC or eARC sound system

- HDMI 2.1 Full Audio/Video passthrough up to FRL5: 40Gbps/1200MHz

- Upscale FRL5 and ANY signals 2K>4K, 4K>8K and 2K>8K up to 120Hz (no upscale for VRR/1080i/720p/480i-p)

- Downscale ANY FRL0/TMDS signals 8K>4K, 4K>2K or 8K>2K up to 120Hz (no downscale for FRL5/VRR tbd)

- Add or remove any input from the CEC network at any time via webserver, APP, IR/IP or RS232

And for audio, it solves getting eARC signal to amps or soundbars that can support Atmos when the monitor or TV doesn't support it:

- Full Audio from ANY HDMI source (including VRR signal) to ANY HDMI, ARC or eARC sound system

- Full Audio up to Atmos/TrueHD from any HDMI source to SONOS Arc/Beam2 or any eARC sound system

- Solve SONOS Arc/Beam2 + other lip sync issue when using external HDMI sources connected to eARC TV

If you get one and find yourself needing support, join their Discord server.

Closest you’ll get is something like a Blackmagic ATEM switcher, using “Program” as the output. Then you can map a software key to swap it out in a single frame. As discussed in another thread, the computer has to be sending the display signal to some device that is seen as a display, and that switcher keeps the video from both PCs buffering at 60 or 30 fps. The computer can’t know the display switched, and the display can’t handle two simultaneous streams, so you have a device in the middle that bridges that. Downside is that, depending on the quality of the switcher, you might get 1-2 frames of lag, because it’s optimizing for frame-accurate sync to ensure the least possible “flicker” when the source is switched.

Your answer won't be found in monitors. You need a media switch, a good one like those used in the entertainment industry for broadcast-quality transitions between inputs. Alternatively, look into the "media switches" used by streamers with multiple camera rigs.

https://restream.io/blog/best-video-switchers/

This is one thing I really miss about analog tv. You could switch channels as fast as you could press the button.

Everything now is so slow.

This is one of the things on my TODO-list for when I stumble upon a way to have infinite amounts of time:

Learn "FPGA 'n stuff" and build my own display controller. Like many people here, I don't understand why this takes so much bloody time. Not just the input switching, but also switching resolution or refresh rate on the same input. So either I'd have an a-ha moment and see why it's not possible (more likely outcome), or well, it'll just work and I can write a cool hackaday post.

The EDID explanation given in the comment currently at the top doesn't make too much sense: Even if the display is off and not in some super hardcore energy saving mode, EDID works, i.e. your computer can query the EDID data from the screen and know what its preferred resolution is, etc. Likewise, even if your display is switched to HDMI-1, a machine connected to HDMI-2 can already be outputting 1080p60 to it, so why does switching inputs take more than a few milliseconds?

Afaik (and I only have very rough knowledge here), there is no side-channel in HDMI/DVI that tells the display what mode it's actually receiving, so the display has to look at the signal and make sense of it. So you have to put some brains in it. But that can hardly take several seconds, i.e. hundreds of frames? Maybe two or three frames! I could see this being a cost-cutting measure: maybe doing the dumb implementation that takes seconds instead of milliseconds saves you a cent or two per device.

Or, perhaps, weird regulatory/big-customer energy-saving reasons: “TCO” stickers and whatnot imply some kind of standard measurement, and I can imagine a “wake from sleep on source change” benchmark driving odd behaviours.

My current (Samsung) display is a pain to switch off. If I forget and switch the PC off first, it goes to sleep and starts blinking its annoying bright blue LED, which I managed to disable when it's on or off, but not when it's asleep. To get it to switch off I have to wake it up, wait until it cycles through all its inputs, gives up and shows me the menu, from which I can finally turn it off.

I assume this is for a different reason: buffering and stream compression mean you have to wait for a key frame before your TV can display anything.

Similar root cause though - technology becomes available that makes the steady state performance much better in return for some setup costs, the setup costs are worth paying in almost all applications so it's an obvious choice to go with the new technology but still annoying when you encounter its drawbacks.

Yes! I remember running my finger down the column of electrical contact switches and changing the channel at 60 fps. "brrrrrrip!"

Not on a monitor (that I'm aware of), but it is a major feature of signal distribution equipment for installed AV and broadcast.

A large part of the delay that you are seeing is from EDID negotiation and HDCP. This happens over the 100 kbit/s DDC channel and requires a bit of back and forth. On the above systems this is pre-negotiated so that switching can take place across a single frame.

A lot of that equipment is likely price-prohibitive for desktop use, but you may be able to reduce the time a little by sidestepping that process, either in your machine setup or with an external bit of lower-cost hardware like an EDID emulator (e.g. https://www.extron.com/product/edid101h4kplus).

This delay also often happens when I switch between Linux virtual consoles (Control+Alt+F7/F8), despite both of them using exactly the same X display settings.

Not always, though. The switch sometimes happens instantly. (At least, it has been this way since I replaced my Nvidia card with an AMD. I don't recall whether the Nvidia ever switched instantly.)

I wish I knew what conditions make it instant, so I could try to make it happen consistently.

Maybe solving your problem in other ways:

0: I use `ddcutil` (on Linux); it takes about 2 seconds for the switch: `ddcutil -b $bus setvcp 0x60 $source` (a fuller sketch follows this list)

1: Use something like NoMachine or VLC - would probably cause more problems if you've got devices that go into power-save mode

2: Ensuring that your devices are outputting the exact same mode may decrease the time.
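(Expanding on option 0, a small wrapper sketch; the bus number and input values are assumptions. VCP feature 0x60 is the MCCS input-source control, but the value for each input varies per monitor, so check `ddcutil capabilities` on yours first.)

    #!/bin/sh
    # Switch monitor input over DDC/CI with ddcutil.
    # 0x0f (DisplayPort-1) and 0x11 (HDMI-1) are common MCCS values,
    # but they are monitor-specific.
    BUS=4    # find your I2C bus with: ddcutil detect
    case "$1" in
      dp)   ddcutil -b "$BUS" setvcp 0x60 0x0f ;;
      hdmi) ddcutil -b "$BUS" setvcp 0x60 0x11 ;;
      *)    ddcutil -b "$BUS" getvcp 0x60 ;;  # no argument: show current input
    esac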

You'll need a fancy KVM switch that does more than merely act like a physical switch would.

Something like this should probably work:

https://www.rextron.com/product-4K-2-Port-Full-Frame-PBP-KVM...

Edit: The one above is merely the first one I found, not a recommendation.

Be aware that such things may incur extra latency, since it's not just passing signals through, but rather processing them and repackaging them.

As an aside: does anyone know the deal with Rextron, Extron, and Crestron? What’s going on with the names there?

Extron and Crestron are both major vendors within the audiovisual industry. Both make (mostly) extremely solid products.

Rextron is not a company or set of devices I've ever encountered. From a quick look at their site they look to be an OEM/ODM provider, though, so I would hazard a guess they may have taken some 'inspiration' from either of the above orgs.

Unfortunately I have a similar KVM switch and it is very much not instant. In the same 4-second ballpark as OP. Annoys me too, but I just put up with it since I don't have to switch very often, and it's sure faster than switching inputs on my monitor's shitty OSD, which is what I used to do.

Huh. I don't personally have one, but it seemed more like 1s in normal operation to me and pretty much instant when switching while in PIP mode.

There is no price listed for this switch. How much does it cost?

I have a few TESmart HDMI switchers and they are fairly reasonably priced (~$100) and switch decently fast (under 1 second usually). Occasionally I have to switch the unit off/on if it gets confused, once a quarter or so. There's a switch on the front of the unit so it's not terrible.

They also support hot key switching, so my hands don't need to leave the keyboard to switch, which is nice.

> They also support hot key switching, so my hands don't need to leave the keyboard to switch, which is nice.

Note that the keyboard interception is less than stellar, so it’s equally possible your keyboard just won’t work correctly when plugged into their magic port. My copy-paste is impossible when plugged in (well, dropped 2 out of 5 times, which might as well be).

I bought one of the first GSync monitors many years ago (ASUS PG278Q ROG Swift) and it goes from sleep to displaying an image in just a fraction of a second. It's totally amazing, way faster than any other monitor I've ever seen, including newer GSync monitors. It only has one single input port though. I've never tried hooking it to a KVM, but I bet it would switch super fast.

Maybe try one of the first gen (2014) GSync monitors along with a DisplayPort KVM, if that can work for you. It's my understanding that the first gen GSync monitors all used the same monitor driver board that Nvidia made, so they probably behave similarly.

Honestly I didn't know how much I needed fast wakeup until I had it. It's a way better feature than GSync, yet monitor and TV reviewers generally ignore it completely. There are probably newer monitors with fast wakeup but it's impossible to know without testing. My TV takes like 8 seconds to even turn on its backlight and it's just ridiculous.

Ugh yes, this should be benchmarked! I’m currently on a 34” widescreen Philips monitor and it takes about 10 seconds to switch; it annoys me majorly every time I do it. Just starting the overlay UI seems to take 5 seconds.

> i.e. due to power save mode

This is your problem, isn't it? I never use power save mode and have no problems with these delays.

You might be able to do that with a master computer running OBS in studio mode, an HDMI capture card for each source you want to connect, projecting the scene to your monitor all the time, and some hardware device with buttons to switch scene.

But this is kind of a hack and you would possibly have undesired latency, especially noticeable when typing fast and moving the mouse.

There are applications that seem to switch faster than using the OSD, like [0]; however, they need a running OS to use.

0: https://codeberg.org/Okxa/qddcswitch

Tangentially related: I find many monitors that look great on paper are a real annoyance when it comes to controls, be it brightness/contrast or input switching, making those 4s irrelevant in comparison.

Between recent models of Acer/Dell/Eizo/LG, I found Eizos to pose the least friction so far, with the LG (may be model-specific) a surprising worst. Reply to this comment with your anecdotes!

EDIT: To address the OP: Switching from an offline source to an active one is def smoother than you describe on my Eizo FlexScan EV series; I estimate it at <2s. The input switching is bearable enough that using PbyP (two input sources split) is something I occasionally do, whereas with Dells it was just annoying enough to make me not bother.

Yeah, it would be nice to be able to alt-tab between monitors (or I should say: sources). I never understood why it is so slow. I have a somewhat older Samsung TV and I even have to cycle all the inputs (8?) to arrive at the correct one, and it takes 1-3 sec on every input before I can get to the next. Absolutely maddening. But I guess manufacturers were too busy making 3D a thing and then "curved". Maybe they can now decide to fix some basics?

My old LG 27UD88 switches in about 1 second when pressing the relevant hotkey in Lunar (https://lunar.fyi/)

It’s not instant but input switching through DDC is the fastest possible method.

That’s because the input that will be switched to is already connected and receiving video in the background, skipping the HDMI/DP protocol handshake and going straight into rendering the new video on the screen.

Yeah, I think it's also that with a MacBook the machine is constantly sending video to the display (as there is no way to disable external displays).

> The monitor is on a switched-off source, i.e. due to power save mode

Do you want to have your cake, or do you want to eat it too?

I was annoyed by this too. My solution was to buy a 49" Samsung monitor and have either a 50:50 or 1/3:2/3 split on the screen so I can see both at once. If I only want one active I can change the source, but this has the flicker/delay as noted. I run Barrier software on both to move one keyboard and mouse between the screens.

Can you link to the software? It’s a pretty common term but sounds interesting

I'm spending anywhere from 10 to 30 seconds every time I want to turn my monitor on. It's a game of fiddling with the buttons, trying to get it to wake up faster.

As an added bonus, the built-in firmware is terrible, and presents a "quick-switch" list of inputs which is different from the "normal" list of inputs, because… well, I don't know, just to make customers miserable, I guess.

Unrelated: OP, how do you like the U2723QE? Been trying to decide on pulling the trigger on two of them recently.

I have one; it's a really nice monitor, but sadly there is a small "but". Sometimes, it starts flickering. I'm not completely sure what causes it, but closing the nearest Electron app will normally fix it.

Yesterday my kid wanted to see a video of volcanoes on YouTube, followed by a really bright picture of My Little Ponies; the monitor managed to start flickering and retain a shadow image of the little ponies, even as I switched between laptops and power-cycled the monitor. This only happens on the USB-C input, but seeing as the monitor has an Ethernet adaptor and works as a USB hub, USB-C is pretty much the reason I got it.

Notes: It happens really rarely, and exclusively with the M1 MacBook Air. It might be a firmware update away from being fixed. Also, Dell has an excellent return policy.

I've got a U3223QE, which is the same just larger. The screen itself looks really nice, it's bright, no complaints there. The stand is pretty good, reasonably tidy. My only complaint is that when it said USB-C hub I expected it to have USB-C ports for peripherals, and it doesn't, only USB-A ports. My older LG has USB-C ports, so I know it's possible. It's annoying because as more and more stuff switches to USB-C, the only place to plug those in is the laptop itself, which means messing around with too many cables.

That's the USB-C upstream port, for when PC-2 is connected via HDMI and PC-1 is connected to the USB-C display port.

I have its older version, the U2720Q, and I am quite content with it, in a multi-monitor setup under Linux, using 30 bits per pixel, and also including a Dell loudspeaker system connected to the U2720Q, used for audio via DisplayPort or HDMI.

I assume that the upgraded version U2723QE should only be better, with the possible exception of firmware bugs like the one mentioned by the other poster.

I don't know the specifics of your monitor. But you say the source is in power-save mode? I think it has then also signaled the monitor to enter power-save mode.

So if your monitor supports it, turning off the energy-saving settings might fix the issue.

It might also be less annoying to wait 4 seconds, if you think of it as helping to save the planet ;-)

OK, remember when modems were like EEEEEEEkRrRrRrRrRrRrRbaBONGbaBONG....

That was handshaking, and it was how modems figured out which speeds and protocols each other supported so they could figure out how fast to transfer data and how to encode it.

Monitors are kind of the same. There's handshaking that goes on each time you connect a monitor to a video source.

Like (and I'm totally making this up):

- Hi, I'm an AMD video card.

- Hi, I'm an LG monitor.

- You're a monitor? Great! I have some data I'd like to display. What resolutions, refresh rates, and color depths do you support?

- Well, I support 1920x1080x24bpp@60Hz, and 1600x900x24bpp@70Hz, and...

- Cool! I'd like to display at 1920x1080x24bpp@60Hz, please.

- Sure thing, hang about while I get that set up.

[Monitor CPU sets up whatever framebuffers, scalers, etc. are necessary to display at 1920x1080x24bpp@60Hz]

- Right then. Ready for your first frame.

- All right, first frame incoming...

Only then will you get your display. This can take a couple seconds.
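(On Linux you can inspect the result of that exchange yourself. A sketch, assuming the edid-decode package is installed; the connector name varies per machine:)

    # Raw EDID blob as the kernel cached it
    xxd /sys/class/drm/card0-HDMI-A-1/edid | head

    # Decode it into supported modes, vendor, timings, etc.
    edid-decode < /sys/class/drm/card0-HDMI-A-1/edid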

> Monitors are kind of the same. There's handshaking that goes on each time you connect a monitor to a video source.

If the connector has enough bandwidth to send millions of pixels 60 times a second, how can it possibly be so slow to send a few bytes of handshake or EDID data?

Handshakes should be happening nearly instantly. You can open a TCP connection to the other side of the world in a fraction of the time it takes to switch input on the average monitor.

There's a major difference between the data-plane (the lanes where frames go from the signal source to the display) and the control-plane (the command and control channel that is bidirectional and used to discover a peer and negotiate settings).

The former is indeed very high speed, in the order of gigabits per second (up to 48gbps in later iterations), whereas the latter can seem ancient in comparison - 500bps to 115.2kbps - several orders of magnitude slower. The reason is that while the high data rate bus is handled by powerful, dedicated hardware (GPUs and ASICs/FPGAs), the control plane is usually done by micro-controllers (or otherwise lower-power devices) that are responsible for general management of the display. Sure, it would be nice if they had more powerful hardware and a faster bus, but those changes usually aren't what people are shopping for and so the standards committees aren't pushing makers to make these changes...

Example EDID: https://edid.tv/edid/752/ (one can also go through a github link at the bottom)

Most EDIDs are 256 bytes in that repo. At 500bps it may take seconds to transmit, but 115.2kbps (14kbytes/s) is enough to do it in an instant.

As usual, the issue is probably not a bus speed or a chip cost itself, but a crappy legacy timeout-based negotiation process which nobody touched or restandardized since it was slapped together. You can’t just autoupdate all compatible hardware in the world and move on.

My blind economical guess is that one couldn’t even find controllers today which are slow enough to DDC back and forth in seconds, because the package/assembly probably takes 95% of the cost anyway. Please correct me on that, I’m curious.
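(For scale, the wire-time arithmetic, assuming the 256-byte EDID mentioned above and the 100 kbit/s DDC rate cited elsewhere in the thread; I²C spends roughly 9 bits per byte once you count the ACK:)

    echo $(( 256 * 9 )) bits                  # 2304 bits on the wire
    echo $(( 256 * 9 * 1000 / 100000 )) ms    # ~23 ms at 100 kbit/s

So the raw transfer is tens of milliseconds; the seconds come from the retries and deliberate sleeps discussed elsewhere in the thread.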

But why does it take a couple of seconds? I can send a ping thousands of kilometers away and get the ICMP reply in 100 ms or something. I can ping my router in less than a millisecond and my router is further away than my computer.

Why does this whole negotiation take seconds and not milliseconds?

Or my car: it's full of electronics, chips and code everywhere. Even my pedals, which used to be cables decades ago, are now fly-by-wire.

Yet feedback is instant.

Now I understand a monitor is not the same as ethernet and is not the same as a car but... Seconds?

Ping: your device, your router, your ISP's router, etc. are already up and running and ready to participate in protocol interactions. Having done a bit of networking in the days when it was still reasonable to use analog modems as backup connectivity, I remember waiting 15-30 seconds for dial-out + negotiation to occur before ping replies began flowing again.

Car: your car is an integrated system where all the components have been selected with the intent of working together as an integrated system. A display device often supports many modes and has very few guarantees about how sources will behave, thus inter-device negotiation is needed upon connection.

As other comments have noted, the control channel is fairly low bandwidth and there is potentially a decent sized chunk of info to exchange, plus timeouts and delays to ensure the display can work with older or less featured sources.

I have an Asus monitor with DVI, HDMI, and VGA. The HDMI has a quirk, I forget what, so I use an HDMI-to-DVI cable to connect my laptop to it. And when I switch sources away from DVI, the laptop doesn't think the 2nd display disappeared. So I don't think there's a handshake being done when switching back, because they were holding hands the whole time.

I think the quirk was, if I have something plugged into the VGA input, and the HDMI input goes to powersave mode, the monitor "helpfully" says, "Oh, no HDMI signal? Let me switch sources!"

Slow screens, they annoy me so much too!

Somewhat on this topic, I wish one could compare motherboards by the time-to-boot delay.

If reviewers start benchmarking something, manufacturers will optimize for it eventually.

If I were you I'd use a hardware KVM device that goes to a single input source on the monitor and keep the monitor always-on with a black screen saver to prevent burn-in.

I was in a similar situation when I wanted an easy way to switch between speakers/headphones at any time without messing with settings or apps. I ended up using a physical switch. https://www.amazon.com/gp/product/B008BMLXAU/

I suspect there is a tiny CPU inside the monitor that looks at a variety of things before deciding how to decode the signal and present it. As @alin23 notes, the DDC protocol is also handled. Obviously the manufacturers are using the lowest-cost BOM that they can get away with, and that would probably mean a $1 microprocessor in charge of the "smarts".

Even a $1 micro should be able to handle a few bits of resolution information in less than a millisecond. This can't be it.

There's a lot going on between your OS and the monitor when you plug in a display, really. HDMI and DP are two-way protocols.

Expecting everything instant-on is naive.

Then there must be something badly wrong with the protocol and/or implementations. Negotiating a few bits over a high-quality cable link should be instantaneous.

There could be, but there are (at least) three parties involved in whatever's causing the slow monitor response. I find it irritating even without a switch; just plugging in my external monitor should not take what seems like thirty seconds to actually turn on.

But in reality, you've got whatever tiny micro is inside the monitor, talking to whatever's on the graphics card, in turn talking to display drivers and who-knows-what-else on the PC side.

Slightly related to this, in some ways, is things like wifi connection. I have an ESP8266 that I've built a clock around. That thing will connect to the wifi, negotiate security, and contact a time server, probably within one second. None of that is down to my ingenuity; it's literally how it works out of the box.

So the problem may just as much be on the OS side, rather than the monitor and the EDID protocol.

> So the problem may just as much be on the OS side, rather than the monitor and the EDID protocol.

Yeah. And the DM must adapt to the new resolution, redraw stuff, maybe adjust the refresh rate. That's why I think stating "I want it on immediately" is a bit naive and assuming.

An external KVM switch might help, but a lot of monitors suffer from this issue unfortunately, and I don’t think anyone has benchmarked it.

I'm using a pretty low-level DisplayPort switch, and I think it also takes 4s to get a picture back.

I guess the reason for this is actually the monitor and GPU. They will resync to each other after figuring out the peer capabilities and probably take some time for an actual frequency sync. Therefore I don't think any kind of switch would make it better - the switch would need to be intelligent enough not to drop the display stream but be an actual GPU on its own. Plus in this case it would need to have the exact same display signal frequencies that the attached GPUs use.

I wish this existed. Samsung Odyssey G7 has the same issue.

No, nothing like that exists, and no KVM exists that does instant switching.
There are hardware limitations that make what you're asking for impossible.

Use software solutions if you actually need instant switch. But you probably don't.

What's impossible about it? I've seen devices in the past that can combine multiple HDMI signals into a single image (like picture-in-picture, or making one high-res display out of several lower-res video connections). And if you can do that, surely you can do instant switching just by toggling between "full screen input 1" and "full screen input 2" layouts.

Were any of these devices monitors? Were they cheap? Like it or not, price is a major factor.

On top of that, just look around - there is no perfect display, no matter how much one is willing to pay. Every single one has major drawbacks in certain areas.

Also, if all your inputs are the same resolution (or even aspect ratio) and frame rate, you avoid a gazillion edge cases.

What are the hardware limitations? Surely this is possible if there was investment and interest in making this work? Or do the protocols really not allow this?

Most of the delay is I/O bound. Handshaking happens over the Display Data Channel (DDC), which is a protocol over I²C (a two-wire communication bus). HDMI/DP/DVI have 2 dedicated wires inside the cable for this; USB-C has to switch over to use differential signaling on pins A8 and B8, and then switch back after communication is done.

The bandwidth of that bus is small indeed, but most of the latency comes from artificial delays and error recovery in code: the monitor sends a message to the host then sleeps 30ms to make sure the reply is ready, the host does the same thing, and this happens multiple times to negotiate pixel clock, signal timing, colors, resolution, and additional features like suspend/reset/HDR/overscan etc.

You can see how in the MCCS specification [1] both host and display entrypoints for communication start with a delay: https://shots.panaitiu.com/e8D7tQ

You can also see how that looks in Lunar's DDC code [2] [3].

For now, it's incredible that we actually have such a standard as DDC and most monitors work with most devices. A coordinated effort between monitor vendors and laptop/PC/GPU manufacturers into creating another faster standard with modern technology seems out of the question. But maybe with the advent of Thunderbolt this may change in the future.

[1] https://files.lunar.fyi/mccs.pdf

[2] https://github.com/alin23/Lunar/blob/master/Lunar/DDC/DDC.c#... : Delay on requesting data from monitor

[3] https://github.com/alin23/Lunar/blob/master/Lunar/DDC/DDC.c#... : Delay on communication error before retry
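(You can observe how much of the wait is those deliberate sleeps: ddcutil exposes a multiplier for them. Shortening it speeds things up on tolerant monitors but can cause retries or errors on others, so it is very much a try-at-your-own-risk knob:)

    # Default timing, for comparison
    time ddcutil getvcp 0x60

    # Same query with the built-in delays cut to 40%
    time ddcutil --sleep-multiplier .4 getvcp 0x60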

In theory you could have a monitor that handles both signals all the time, buffering the output. Then it could switch immediately.

It would however be more expensive, since you would need to duplicate at least some of the electronics within the monitor, a few of which also incur licensing/certification/patent/whatever fees.

Why would you buffer? Monitors are capable of playing full-screen video, which implies the ability to update every pixel once per... 30hz? frame, doesn't it?

Because when you want instant switching you need to already have the data you want to display available the moment someone flips the switch.

You can't just tell the computer that you're now interested in the display data of the other display - that would incur latency.

You'll have to be receiving the information for the second display the entire time, storing it in what is commonly called a framebuffer, aka a bitmap in memory somewhere.

Most types of display have such a thing backing them anyways - one way or another - since they need to remember what they're supposed to display.

Okay, I'm willing to wait 1/30th of a second while the screen gets its pixels; is that not an option? Or is there no way to request all pixels and it doesn't send them all every frame?

I don't think I understand what you're saying.

I think you're concerned that because there's "buffers" there would be extra latency? There doesn't have to be.

That's what I'm asking. When the monitor is live, it can draw a complete screen ~30+ times every second, so why does it take more than 1/30th of a second to start from nothing? Then, I speculated that the only reason I could think of was if the graphics card doesn't actually send every pixel every frame, and there's no way for the monitor to request a full refresh.

Monitors don’t request anything. The monitor has to do a basic handshaking/negotiation before it even knows what sort of video stream to expect.

Ah. And we couldn’t stick... what, 4 ints of metadata in the video stream (width, height, depth, frame rate)? Not like we don’t have the bandwidth. I can believe that, but it’s disappointing that nobody bothered.

Historically speaking, bad things happen when you hand a display something it can’t handle - including not being able to see the settings dialog to fix it.

With basic analog you could set it up in a way where you would just wait for the next frame (or not even wait that long), but with the new digital stuff there's bootstrapping/handshaking - whatever you want to call it. It's not so simple anymore.

What could they possibly be doing that takes 4 seconds? That's ~4,000,000,000 instructions on a typical embedded processor.

It's much more likely that the handshaking was implemented by an incompetent organization that doesn't care about quality.

Quite a few monitors support a Picture-in-Picture mode, so they are able to receive and decode two signals at the same time. But if that mode isn't enabled, they only receive one signal.

There are multiple-input, single-output switches on the net (e.g. 5x HDMI in, 1x HDMI out) which feed one input of the monitor, and you change the source on the adapter with a round-robin button. It's almost instant. Another cool feature is that whenever a new input source is detected, it automatically feeds that to your monitor. I haven't seen the configuration you need, but if you can use adapters to convert each source to HDMI, it works like a charm. It's called an HDMI switch.

Switching between 2 inputs would take between 1x and 4x (x being the time to switch to the next input), depending on whether it is from A to B or B to A.

Even the cheap HDMI switches all come with a remote, so you can jump from input 2 back to 1.
