VGA video amplifier board for Arcade Monitor with sync filter

Following some questions I got regarding the EDID emulator, I have added more explanation of the EDID handling done by the VideoAmp here:
https://github.com/Redemp/VideoAmp_wiki/wiki/VideoAmp's-internal-EDID-emulator-usage

The videoamp is NOT a downscaler, it is a video amplifier with frequency protection and EDID emulator.
It will not perform any video processing to change the source input signal into a different output signal (hence "zero-lag"), which also means it will not convert a 31kHz or 24kHz source to a 15kHz signal.

Regarding compatibility, we cannot confirm 100% that EDID will work on any particular GPU; so far almost all desktop GPUs we have tested work, except for picky Dell laptops on their external HDMI port.
 
1) No, because it doesn't convert anything: it forces the PC to send the correct resolution/frequency (via the EDID protocol).

To keep it short (and certainly approximate), EDID is a communication system between a (recent) display and a video device: the display declares its supported resolutions to the video device, so that the video device sends the best/compatible signal it can. Arcade CRTs don't have this EDID system. So the VideoAmp, plugged between the video device and the display, emulates the display side of the EDID communication. You choose the resolution to be declared to the PC, and the PC (or any EDID-compatible system/OS that can produce the desired resolution) sends it.
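For context, each resolution declared over EDID boils down to a modeline, and the sync frequencies the CRT sees follow directly from it. A minimal sketch of that arithmetic (the modeline values below are illustrative examples, not taken from the VideoAmp):

```python
# Rough sketch of the arithmetic behind a modeline: horizontal frequency is
# the pixel clock divided by the total line width, and vertical refresh is
# the horizontal frequency divided by the total frame height.
def modeline_freqs(dotclock_mhz, htotal, vtotal):
    hfreq_khz = dotclock_mhz * 1000.0 / htotal
    vfreq_hz = hfreq_khz * 1000.0 / vtotal
    return hfreq_khz, vfreq_hz

# Illustrative 15kHz modeline:
# Modeline "320x240" 6.70 320 336 367 426 240 244 247 262 -hsync -vsync
h_khz, v_hz = modeline_freqs(6.70, 426, 262)
print(f"{h_khz:.2f} kHz, {v_hz:.2f} Hz")  # -> 15.73 kHz, 60.03 Hz
```

This is why a standard 640x480@60 mode is unusable on a 15kHz arcade monitor: its line totals put the horizontal frequency at roughly double the safe rate.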

Interesting project. If it really works the way it's described, we have a new option over what I've been running for over 2 decades (ArcadeVGA/ATOM-15). What's lost on me is:

- How does this device handle system boot-up? Are you able to see the BIOS screen and change settings? Or does EDID only work in OSes that support it (Windows)?
- You can send any resolution you want to the graphics card, but it doesn't mean the graphics card will be happy with it, right? Going back to ArcadeVGA/ATOM-15, those custom firmware patches loaded the low resolutions into hardware, and the video card would run them. But why would a modern display adapter do this on its VGA port if its firmware doesn't support it? Just because EDID told it to do so?
 
You write the EDIDs you want to the VGA board's memory, so they are there on boot-up. If your BIOS works within those EDID modes, you should see it; I have a newer system with a 5600X on my Blast City and the BIOS shows up in 640x480.

Your GPU still has to support the resolution at the driver level to be able to use the corresponding EDID in the VGA board. I use CRTEMU, so I can go below 640x480. Super resolutions are another option: if your GPU supports 1280x240, for example, you can send that.
 
But CRTEMU is a device driver for Windows and only works once Windows is running. The video card itself has to have that resolution in its "supported resolutions" table. That's where hacked firmware (like ArcadeVGA and ATOM-15) comes in. Does the video card support those low resolutions without custom firmware, even in the BIOS?

You mentioned 640x480, but that's a 31kHz resolution (unless you mean 480i). With ATOM-15 I can see the BIOS because the video card forces 240p at boot-up.
 
I cannot speak for CRT Emudriver (I only have Nvidia GPUs), but I do see my BIOS in 15kHz when plugged into a 1050 Ti GPU (hdmi2vga) on a 4th-gen i5 Dell PC. Again, it may depend on your GPU and what your BIOS can manage.

BTW I have updated the wiki given some questions of another user.
 
It's dependent on what the GPU itself can output based on an EDID table. The VideoAmp saves all the EDID information you write to it, so as long as the GPU can output one of those resolutions from its BIOS, it should. It all depends on the hardware you are using; all the VideoAmp is doing is providing a list of user-configurable resolutions and filtering out resolutions you don't want based on the filter table.
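The filter-table idea can be pictured as keeping only the modes whose horizontal frequency lands inside the monitor's safe range. A hypothetical software sketch of that logic (the real board protects at the signal level; the timings below are illustrative):

```python
# Hedged sketch of a "filter table": keep only modes whose horizontal
# frequency lands inside a 15kHz arcade monitor's safe range.
def hfreq_khz(dotclock_mhz, htotal):
    return dotclock_mhz * 1000.0 / htotal

# (pixel clock MHz, htotal, label) -- illustrative timings
modes = [
    (6.70, 426, "320x240@60"),
    (25.175, 800, "640x480@60"),  # ~31.5 kHz: unsafe for a 15kHz CRT
]

safe = [label for clk, ht, label in modes
        if 15.0 <= hfreq_khz(clk, ht) <= 16.5]
print(safe)  # -> ['320x240@60']
```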
 
Very interesting and novel idea for expanding the number of video cards that can be used in arcade machines, as well as growing the number of OSes that can run on the machine.

So DOS/Windows tells the video card what resolution it wants to run at; the EDID table tells the video card what resolutions the display supports, and the video card decides what to do (find the closest match). What happens if there is no close match? With Windows it's easy: video drivers don't show resolutions the display doesn't support. So if you run CRTEMU, you won't see 1024x768 (for example) as an available option, even if your video card supports it. But what about good ol' DOS? Those old apps don't listen to EDID. What happens if the video card is told "go to 1024x768" and EDID tells the video card the display doesn't support that resolution? Trying to understand the logic.
 
I never tried with DOS, but I am pretty sure it will not work, as EDID and plug-and-play were introduced with Windows 95/98, around 1995.
 
The old ArcadeVGA and the later Calamity ATOM-15 modified firmware worked by having a list of supported resolutions loaded in the firmware.

With ArcadeVGA, MAME has per-game INI settings, where a Windows tool compared the game's native resolution against what the video card supports, found the closest supported resolution, and wrote that to the INI file. With GroovyMAME, I believe the CRTEMU Windows driver talks to MAME directly, bypassing any INI files.

My questions:

- I assume to use this device, one doesn't need to install any special hacked video card drivers (no need to run Windows 7 in developer mode)?
- GroovyMAME somehow knows about this device and sets resolutions per game that best fits?
- How does it work with refresh rate? (different games have different rates that are close to 60hz) Does it force 60Hz for all resolutions?
- Is there a way to force a particular resolution per game? (E.g. For vertical games like PacMan, by default GroovyMAME cuts the top/bottom of the game. I force a 480i resolution to see the full screen on a horizontal CRT)
- How does the BIOS screen work? Can you still see it / change settings in BIOS? Or is the CRT black until Windows loads and EDID information is read and enforced?
 
The old ArcadeVGA and the later Calamity ATOM-15 modified firmware worked by having a list of supported resolutions loaded in the firmware.
In the VideoAmp's case, the resolutions are loaded in the board's EDID itself, and the GPU detects and loads them from the EDID. Depending on the GPU and driver, you can store many resolutions (usually 10 or 20) to cover most needs. The resolutions are either the ones predefined in the companion software, or you can load them from VMMaker or any modeline text file (I provide some modelines with the software archive).
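For reference, a modeline text file is just lines in the usual X-style format, label first, then the pixel clock in MHz, then the horizontal and vertical timings (the particular timings below are illustrative examples, not the ones shipped with the software):

```
Modeline "320x240@60"  6.70  320  336  367  426 240 244 247 262 -hsync -vsync
Modeline "2560x240@60" 53.60 2560 2688 2936 3408 240 244 247 262 -hsync -vsync
```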
With ArcadeVGA, MAME has per-game INI settings, where a Windows tool compared the game's native resolution against what the video card supports, found the closest supported resolution, and wrote that to the INI file. With GroovyMAME, I believe the CRTEMU Windows driver talks to MAME directly, bypassing any INI files.
I tried MAME with only a few fixed superwide resolutions, and GroovyMAME with switchres enabled, more resolutions (20) and system modes unlocked, meaning switchres will use the resolutions reported by the operating system. It does work properly.
My questions:

- I assume to use this device, one doesn't need to install any special hacked video card drivers (no need to run Windows 7 in developer mode)?
No need for a specific driver, but you need at least a working driver for your GPU. The driver is in charge of "loading" the EDID when you plug in the board and reporting the supported resolutions back to the OS. Warning: some may be filtered out, like interlaced resolutions on recent NVIDIA boards or drivers.
- GroovyMAME somehow knows about this device and sets resolutions per game that best fits?
For GroovyMAME you need to enable switchres (dynamic resolution switching) and disable the system-mode lock: system modes are locked by default by GroovyMAME, so switchres will not use them even if you "see" them in Windows/Linux/macOS. GroovyMAME does not know about the board.
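As a sketch, the relevant mame.ini switches would look something like the following (option names written from memory and may differ across GroovyMAME versions; check your version's documentation):

```
# mame.ini (GroovyMAME) -- illustrative
monitor              arcade_15
modeline_generation  1
lock_system_modes    0
```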
- How does it work with refresh rate? (different games have different rates that are close to 60hz) Does it force 60Hz for all resolutions?
You need to add as many modelines as you need. For example, for 15kHz games, since many games actually used very close refresh rates, I usually set up a batch of resolutions like 2560x240@60Hz, 2560x256@59Hz, 2560x264@58Hz, 2560x264@57Hz, 2560x288@50Hz, ... and then let switchres do its "magic" and pick the closest resolution for the selected game.
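That "magic" is essentially a nearest-refresh match over the stored modes. A hedged Python sketch of the idea (not actual switchres code; the rates are taken from the list above):

```python
# Hedged sketch of the refresh-matching idea: pick the stored mode whose
# vertical refresh is nearest the game's native rate.
modes = {
    "2560x240@60": 60.00,
    "2560x256@59": 59.00,
    "2560x264@58": 58.00,
    "2560x264@57": 57.00,
    "2560x288@50": 50.00,
}

def closest_mode(game_refresh_hz):
    return min(modes, key=lambda m: abs(modes[m] - game_refresh_hz))

print(closest_mode(59.18))  # -> 2560x256@59
```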
- Is there a way to force a particular resolution per game? (E.g. For vertical games like PacMan, by default GroovyMAME cuts the top/bottom of the game. I force a 480i resolution to see the full screen on a horizontal CRT)
No, that's something the board cannot do, as the board knows nothing about the game or OS. This has to be handled by GroovyMAME or by a script that forces the resolution.
- How does the BIOS screen work? Can you still see it / change settings in BIOS? Or is the CRT black until Windows loads and EDID information is read and enforced?
That really depends on the GPU and the BIOS. Using an NVIDIA 1050 Ti or a 2060 (hdmi2vga), I do see my BIOS at 15kHz at boot time, for example. But I cannot guarantee you will see it on your setup.
 
Thanks for the reply. Looking forward to trying it out on my setup.
 
Hi,
For those interested, we now have the JammaMia extension board available as a DIY kit (you need to solder it yourself) with open-source firmware, available here:
https://github.com/njz3/jammamia

The firmware makes it appear as 2 joysticks, which should be easy to map in emulators.
The board is based on an Arduino 32U4 and has an audio amplifier with mono output on the JAMMA edge or stereo output on screw terminals, CPS1 and CPS2 connectors, and separate analog X/Y joystick inputs for 2 players; output signals will be available in the future.
Here are pictures of the board when assembled:
jammamia1.jpg

jammamia2.jpg

To get a sample of our first beta prototypes (10x available), write to Bandicoot or to me in PM.
Price of the prototypes: it should be around 35 euros as a DIY kit, including a cable to connect to the VideoAmp board.
 