So, I have a Mac Mini 2011 that I rescued and Ubuntu'd (18.04).
I set it up at my father's place, and it worked wonderfully with his huge, fairly new, 1920-by-something screen (still VGA), which I hooked up through the DisplayPort-to-VGA adapter.
Anyway, I took the thing to my place and went to hook it up to one of my several LCD screens (all my monitors are VGA). The monitor seemed totally unaware that the Mac was booting up, and it wasn't like the problem was the screen itself, since I could plug it into my laptop and it worked fine. I repeated this with another LCD screen, same result; it was like plugging my VGA port into a toaster.
I was starting to worry I had broken the Mac on my way home.
Then I repeated the process, plugging the Mac into a much older LCD I had around. That LCD is broken, displaying only rainbow lines, or at times solid colors, BUT it could clearly "sense" that the Mac was emitting a signal.
Extremely tired of this, I started thinking it had something to do with resolution. "Resolution": the way that word echoed in my mind instantly made me go fetch an old CRT I have around.
The CRT works, in the sense that it displays the image being sent by my Mac. It's massively redshifted, but it works.
Then, to my despair, I found out that what's actually being projected onto the CRT is not some crazy HD resolution that the poor thing is struggling to swallow; it's just 1024x768 @ 60 Hz.
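For reference, this is roughly how I'm checking what mode the Mac is actually driving; the active mode is the one xrandr marks with an asterisk, and whatever output name it prints is the one to use later:

```bash
# List all outputs and the modes they advertise; the current mode is marked with *
xrandr --query

# Or narrow it down to just the active mode(s)
xrandr --query | grep '\*'
```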
That got me thinking "maybe it's the refresh rate?", so I created an xrandr mode at 30 Hz, activated it, and straight up lost the signal on the CRT. I know the change is temporary unless I add it to my profile, but understandably, I want to know if I'm even on the right track.
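This is roughly what I did to test the 30 Hz idea; it's only a sketch, and the output name DP-1 below is a placeholder for whatever xrandr actually reports on the Mac:

```bash
#!/bin/bash
# Sketch: build and apply a 1024x768 @ 30 Hz mode with xrandr.
# DP-1 is a placeholder output name; substitute the one `xrandr --query` shows.
OUTPUT=DP-1

# Generate a CVT modeline for 1024x768 at 30 Hz, dropping the leading
# "Modeline" keyword and the quotes so it can be fed straight to xrandr.
modeline=$(cvt 1024 768 30 | sed -n 's/^Modeline //p' | tr -d '"')

# Register the new mode, attach it to the output, and switch to it.
xrandr --newmode $modeline                      # word splitting is intentional
xrandr --addmode "$OUTPUT" "${modeline%% *}"    # first word is the mode name
xrandr --output "$OUTPUT" --mode "${modeline%% *}"
```

To make it survive reboots I'd drop the same commands into something like ~/.xprofile, but I haven't bothered until I know this is even the right direction.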
Why does my Mac Mini seem to reject SOME VGA monitors, for reasons that are absolutely beyond me? Any ideas?