Display units

Started by Hans Manhave, March 13, 2015, 10:59:55 AM

Hans Manhave

Having gone from WinXP to Windows7, I have two displays that are giving me a bit of trouble.

They are two different Samsung models.  One is a SyncMaster2232BWplus.  I currently don't have the model number of the other one written down.

If there are only icons displayed on the screen, the screen is dimmed.  Open any app or folder and the screen brightens right back up so one can see clearly.  The screens are only a couple years old.  Windows7 was out at the time of purchase. :)   

I updated the display drivers.  Intel HD Graphics (Dell OptiPlex 7010 mobo) and AMD Radeon HD 74740.

No matter which port I plug these monitors into, the dimming occurs.  I don't see a relevant setting in the display itself (using the hardware buttons), and Windows is set to never time out.
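One other software-side thing that might be worth ruling out is Windows 7's adaptive display brightness, which can dim the screen independently of the timeout settings. As a sketch (the `SUB_VIDEO` and `ADAPTBRIGHT` aliases are the standard powercfg names, but whether the setting shows up depends on the graphics driver), from an elevated command prompt:

```shell
:: List the active power scheme's display settings, including
:: "Enable adaptive brightness" if the driver exposes it.
powercfg /query SCHEME_CURRENT SUB_VIDEO

:: Disable adaptive brightness for the active scheme (AC and battery).
powercfg /setacvalueindex SCHEME_CURRENT SUB_VIDEO ADAPTBRIGHT 0
powercfg /setdcvalueindex SCHEME_CURRENT SUB_VIDEO ADAPTBRIGHT 0

:: Re-apply the active scheme so the change takes effect immediately.
powercfg /setactive SCHEME_CURRENT
```

That said, desktop monitors on VGA/DVI usually manage brightness entirely on their own (e.g. Samsung's MagicBright / dynamic-contrast modes in the on-screen menu), so if nothing changes here, the dimming is likely happening inside the monitor itself.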

Besides dumping the screens, what else is there to do?
Fantasy is more important than knowledge, because knowledge has its boundaries - Albert Einstein

Mark

I'm wondering if the monitors are actually going bad?  They don't light up until enough pixels are told to, or something.  If you plug them into an XP machine, do they work normally?  Or do you have any other adapters you can try?  For example, if the machines have display ports on them, you could try a display port to VGA adapter... something like that.
Mark Piontek, MBA
Director of Information Systems
BS in Information Systems Security

Gene Foraker

I think I would also swap the monitors with those of other Win 7 PCs which work with their monitors.   Make sure the "bad" monitors still don't work with other PCs and the "good" monitors work with the problem PCs.
Gene Foraker CPCU
Gates-Foraker Insurance Agency
Norton, OH


My posts are a natural hand made product. The slight variations in spelling and grammar enhance its individual character and beauty and in no way are to be considered flaws or defects.

Hans Manhave

Each of the machines has two monitors (different brands/models).  The Samsung is the one with the problem, on any display port or VGA port.  The other one, also on a VGA cable, doesn't have a problem on any display output.  I haven't tested on other computers.  I have acquired DVI cables to substitute, to see if going digital will make a difference.  A VGA-to-DVI adapter made no difference.   More to follow... :)
Fantasy is more important than knowledge, because knowledge has its boundaries - Albert Einstein

Hans Manhave

DVI cables (I tried two, each slightly different) made no difference.  The monitor doesn't want to recognize DVI; it acts the same as if no data cable were plugged in.  I have only played with one of the two units so far.  I'm not sure it's worth continuing, because when there is no display and the machine needs to be rebooted with a different hookup, it insists on a lengthy "repair", which has no effect.

This could just be a Win7 64-bit problem.  The monitor does have 64-bit drivers, but they may be for Vista, XP 64-bit or other variations.
Fantasy is more important than knowledge, because knowledge has its boundaries - Albert Einstein

Bob

If the monitor is bad, it won't matter whether you connect VGA or DVI; you'll get the same result either way.  If one input works and the other doesn't, then it's a bad port on the monitor.

If you're not getting a signal after switching to DVI, the monitor should have menu buttons that let you cycle through the inputs.   Often they toggle between VGA and Digital and/or DVI.  Higher-end monitors will detect and switch inputs automatically, but with middle-to-low-range models you have to tell the monitor which input to use, like a TV at home.

Jeff Zylstra

Don't forget that many monitors and/or display adapters are only recognized at boot time, and if you miss that hardware-recognition window, you'll probably need to reboot again.  I've had that happen a few times and it's irritating.
"We hang the petty thieves, and appoint the great ones to public office"  -  Aesop