
Playing vertical games horizontally - seeking your expert advice


Recommended Posts

The MAME table I'm building will have a 32 inch TV in it (using output from a RADEON PC card, not a harness).

 

Because of the size of the TV, and the viewing angles when you're sitting at a full size table, I think I'm going to need to tilt the TV forward, under the glass, towards the player, rather than have it sit flat on its back.

 

The downside of this tilting is that I won't be able to put another set of controls at the end of the table, and play vertically-oriented games along the monitor's vertical axis.

 

So, my question to you MAME experts...

 

Will I still be able to play vertical games satisfactorily on a 32 inch horizontal monitor?

 

By my measurement, there will still be 19 inches of vertical picture (albeit with black bars at the sides).


That's one big screen you've got there. What's wrong with sitting it flat?
Well, the table I have in mind will be as big as a 6-person kitchen table.

 

I want to be able to play cards and board games at it, as well as MAME.

 

I have to set it up to be sure, but I think that the table surface and viewing angles will require the monitor to be tilted in order to be easily viewable for arcade games.

 

I think...?


The MAME table I'm building will have a 32 inch TV in it (using output from a RADEON PC card, not a harness).

You realise that using TV-out supplied by a standard video card will force you to use an interlaced video mode? Quite frankly I find the flickering drives me up the wall.

 

Will I still be able to play vertical games satisfactorily on a 32 inch horizontal monitor?

What's your definition of "satisfactorily"? If you're OK with the game being letterboxed, then yes.

 

I play vertical games on my MAME cab from time to time (21" horizontal monitor), and I have no problems with it.

 

By my measurement, there will still be 19 inches of vertical picture (albeit with black bars at the sides).

Yup.

 

4:3 TVs are nice simple calculations because they make a 3:4:5 ratio right-angled triangle on the diagonal (cheers Pythagoras).

 

Horizontal = 32 / 5 * 4 = 25.6

Vertical = 32 / 5 * 3 = 19.2
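For anyone who wants to check the maths, here's a quick Python snippet that runs the same 3:4:5 calculation and also estimates how big the letterboxed vertical game ends up (the "24-inch equivalent" figure is my own extrapolation from the thread's numbers, not something stated above):

```python
import math

def tv_dimensions(diagonal):
    """Return (width, height) of a 4:3 screen from its diagonal (3:4:5 triangle)."""
    return diagonal * 4 / 5, diagonal * 3 / 5

width, height = tv_dimensions(32)     # 25.6" x 19.2"

# A vertical game is 3:4, so on its side it uses the full 19.2" height
# and a width of 3/4 of that.
game_width = height * 3 / 4           # 14.4"
game_diagonal = math.hypot(game_width, height)  # 24.0" equivalent screen
print(width, height, game_width, game_diagonal)
```

So a letterboxed vertical game on a 32" 4:3 TV fills roughly the same area as a dedicated 24" vertical screen would.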


You realise that using TV-out supplied by a standard video card will force you to use an interlaced video mode? Quite frankly I find the flickering drives me up the wall.

No I didn't realise that. I will have to test it and see how it goes. Is there any way to stop the flickering?

 

Perhaps I can set the refresh rate on the card to match the TV (would that be a multiple of 25 frames per second...?)

 

 

4:3 TVs are nice simple calculations because they make a 3:4:5 ratio right-angled triangle on the diagonal (cheers Pythagoras).

 

LOL. Pythagoras had the smarts. I had a tape measure ;)


No I didn't realise that. I will have to test it and see how it goes. Is there any way to stop the flickering?

 

Perhaps I can set the refresh rate on the card to match the TV (would that be a multiple of 25 frames per second...?)

Interlacing is interlacing. Changing the refresh won't change that. All consumer video cards will force interlaced modes at standard def due to the DAC onboard.
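A quick way to picture why the refresh rate doesn't help (PAL assumed here, since the 25fps figure above suggests an Australian set): each full frame is delivered as two half-height fields, and no refresh setting changes that split.

```python
# Sketch of interlaced delivery on a PAL TV: one 576-line frame is
# sent as two 288-line fields, the second arriving 1/50 s later.
frame = [f"line {i}" for i in range(576)]
field_1 = frame[0::2]   # even lines: 0, 2, 4, ...
field_2 = frame[1::2]   # odd lines: 1, 3, 5, ...
print(len(field_1), len(field_2))   # 288 288
# 50 fields/s = 25 full frames/s, so any single scanline of static
# detail is only refreshed 25 times a second - hence the flicker.
```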

 

You could go all out, create some custom low-res modelines via powerstrip, and then use a JROK RGB->S-Video to get a progressive scan mode. But that's a whole lot of cost and effort.
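To make the numbers concrete, here's a rough check of the kind of low-res timing Powerstrip deals in. The modeline values below are purely illustrative (not a tested timing for any particular card); the point is just that a ~6.5MHz pixel clock with these totals lands near the 15.7KHz horizontal rate an arcade-style progressive mode needs.

```python
# Illustrative low-res modeline numbers (NOT a tested timing for any
# specific card): pixel clock in MHz, plus horizontal and vertical
# totals including blanking/sync, in the style XOrg/Powerstrip use.
dotclock_mhz = 6.55
h_total = 416   # 320 active pixels + blanking/sync
v_total = 262   # 240 active lines + blanking/sync

h_freq_khz = dotclock_mhz * 1000 / h_total   # horizontal scan rate
v_freq_hz = h_freq_khz * 1000 / v_total      # vertical refresh
print(round(h_freq_khz, 2), round(v_freq_hz, 1))   # 15.75 60.1
```

That's roughly 15.75KHz / 60Hz progressive, i.e. standard-res arcade monitor territory.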

 

Have a play and see what you think. Some people don't mind interlaced mode - some people don't even notice it. I'm a bit pedantic about it and it really bugs me.

 

LOL. Pythagoras had the smarts. I had a tape measure ;)

I remember sitting in high school maths with a bunch of whingers all going on about "why do we need to learn this crap, we'll never use it in real life". How wrong they were. :)

 

I have found that connecting through S-Video (SVHS) gives satisfactory results. Does your telly have that?

S-Video gives a much clearer picture than composite due to the separate cables for the intensity/luminance (Y) and colour/chrominance (C) signals.

 

But again, all video cards still spit out interlaced modes, even with S-Video. I honestly wish someone would have a go at releasing either a hacked or open source driver that could give better control over this. Assuming it's controllable by the driver, and not something that's hard-coded in at hardware level on the TV-out DAC.


What you said...

 

All consumer video cards will force interlaced modes at standard def due to the DAC onboard. You could go all out, create some custom low-res modelines via powerstrip, and then use a JROK RGB->S-Video to get a progressive scan mode.

 

What I heard...

 

Whoooooooooooooshhhhhhhhhhh!

 

:lol

 

 

I remember sitting in high school maths with a bunch of whingers all going on about "why do we need to learn this crap, we'll never use it in real life". How wrong they were.

 

Maybe their parents had given them tape measures :)

 

 

But seriously, there's no PC video card that doesn't output an interlaced signal...? 'Cause you're right, flicker is going to drive me nuts :(


I remember sitting in high school maths with a bunch of whingers all going on about "why do we need to learn this crap, we'll never use it in real life". How wrong they were. :)

 

nah mate, they were right. lazy bastids like us just post annoying questions on forums like this and let someone else do the work for us. we still don't bother to understand the problem if someone else can solve it for us :)

 

only jokin :) best aspect of this hobby is all the learning I reckon.

 

So I'm presuming, based on this discussion, that regardless of whether I use TV-out or S-Video out on my gfx card, I would get the interlace issue? Man, glad I didn't go that path, though I have thought about it. Smaller LCD widescreen TVs are bloody cheap. :(

 

I'm surprised there aren't any TV adapters available for this problem; I would have thought that displaying computer output on TVs would be a common thing :unsure


But seriously, there's no PC video card that doesn't output an interlaced signal...? 'Cause you're right, flicker is going to drive me nuts :(

I haven't found any that offer low-res (15KHz) progressive modes. Newer ones will do progressive scan at high res modes (31KHz+/480p+) but that doesn't help you on a standard-def TV.

 

Again, try it out if you've already got the hardware lying around. All free to air standard-def TV is interlaced, and while it's high-motion data, most people don't notice it at all. Gaming is different, as it's often low-motion stuff (lots of static text and whatnot) and you sit much closer to the screen.

 

So I'm presuming, based on this discussion, that regardless of whether I use TV-out or S-Video out on my gfx card, I would get the interlace issue? Man, glad I didn't go that path, though I have thought about it. Smaller LCD widescreen TVs are bloody cheap. :(

 

I'm surprised there aren't any TV adapters available for this problem; I would have thought that displaying computer output on TVs would be a common thing :unsure

It is very common, but most people use it for watching movies. My MythTV media PC runs S-Video out to my standard-def 30" TV, and it's interlaced. But I'm watching movies and TV recordings, and doing so about 3-4 metres away from the screen, so it's fine.

 

Similarly most modern consoles are interlaced - N64, PSX and up are all interlaced on standard def TVs. Most people don't notice it because (a) they sit a decent distance away from the TV, and (b) they're playing 3D games that again are more high-motion.

 

Playing 2D arcade games sitting close to the monitor is different altogether. But again, many people don't notice, and I'm a pedantic bastard. :)


I was using an ARCMON.SYS file for my ATI card years back on my MAME cab to get 15KHz output to the monitor. Not sure if it's something worth checking out, or if it'll do what you want; just a suggestion :unsure

Great suggestion, thanks.

 

It solved the flicker problem for you...?


Great suggestion, thanks.

 

It solved the flicker problem for you...?

 

All I can say is that I didn't have a flicker problem using it, or at least never noticed one, since I used it from the start. Also, this driver is for DOS; I don't know how it would work in Windows.


I was using an ARCMON.SYS file for my ATI card years back on my MAME cab to get 15KHz output to the monitor. Not sure if it's something worth checking out, or if it'll do what you want; just a suggestion :unsure

 

It's a DOS mode driver, so it won't work in Windows.

 

Also, it looks like it's just there to provide 15KHz modes via VGA. Pretty much the same thing as Soft15KHz, Powerstrip, or custom XOrg modelines. None of these help with TV-Out, as the encoder chip overrides low res modes and scales everything to a minimum of 640x480 interlaced (ie: if you send it 320x240, it will scale it to double size and then interlace).
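A toy sketch of that scaling behaviour (this is an assumed model of a typical TV-out encoder chip as described above, not any specific chip's documented logic):

```python
# Assumed TV-out encoder behaviour: anything below 640x480 is
# integer-scaled up to at least 640x480, then output interlaced.
def encoder_output(w, h, min_w=640, min_h=480):
    """Scale (w, h) by an integer factor until it meets the encoder minimum."""
    scale = 1
    while w * scale < min_w or h * scale < min_h:
        scale += 1
    return w * scale, h * scale, "interlaced"

print(encoder_output(320, 240))   # (640, 480, 'interlaced')
```

So even a perfect 320x240 progressive mode fed into the chip comes out the other end as 640x480 interlaced.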

 

You can use custom modes in Windows via Powerstrip or Soft15KHz, but you'll need a TV that will take RGB (ie: SCART) input.

