I just received a new monitor to be used as a secondary extended display. I've done this before with an older computer and was able to drag documents, icons, whatever to the second monitor. This new monitor has a VGA-only input, which should be fine, because my computer has an HDMI output (which I'm using for my primary monitor) and a VGA output (which I was going to use for the secondary). But the new VGA monitor says it's not receiving any input from this computer, nor does the computer notice that an additional monitor has been added. I tried an older monitor with a VGA connection, and it doesn't work on this computer either.
Googling around, I see this is not an uncommon problem, but none of the solutions that have been offered seem to have worked for anyone. Some people say it's a problem with Windows 7 and beyond (I have Windows 8.1). Some think it's an Intel issue. The site for my computer (Lenovo) notes the problem but has no solution.
My computer also has a DisplayPort output. One person suggested buying a DP-to-VGA adapter and trying to run the new monitor through the DisplayPort. But I don't know whether the DP output even works. I would assume it does, but then I also assumed the VGA output was functional. The adapter is only $8, so I'm going to order it, but it will be a week before I know.
I'm assuming it's a Windows 7 through 8.1 issue because a number of Google hits repeat this, but I'm grasping at straws right now. So if anyone has run into this and can steer me in the right direction, I'd appreciate it.
Possibly a driver problem.
Could you give the link to where you found "The site for my computer (Lenovo) notes the problem, but has no solution"? Names could be useful too (of both the monitor and the computer).
My Lenovo work laptop, running Windows 7 Pro, works just fine with a second VGA monitor, so this isn't necessarily something your OS just won't do. Have you looked in your system settings for the display to see whether your computer recognises the second monitor, or whether there is a soft switch to turn it on or off?
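If you want a quick sanity check outside the dialogs, here's a minimal Python sketch (Windows only, using ctypes and the documented Win32 GetSystemMetrics call; the wording is mine, nothing official) that asks how many monitors Windows has actually activated:

```python
import ctypes

SM_CMONITORS = 80  # GetSystemMetrics index: count of display monitors on the desktop

# Counts only monitors Windows has activated; a plugged-in but
# undetected VGA monitor will not raise this number.
count = ctypes.windll.user32.GetSystemMetrics(SM_CMONITORS)
print(f"Windows currently sees {count} active monitor(s)")
```

If this prints 1 with both monitors plugged in, the OS genuinely isn't seeing the second one, which would point back at the driver or the port rather than the settings dialog.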
I solved the problem. It's simple now that I know how to do it, but it took a long time to figure this one out. Ordinarily, you change the resolution by right-clicking on the desktop and choosing "Screen resolution." You can also reach this dialog box through the Control Panel. It's the same page that lets you extend your desktop to another monitor or clone a display, and select your primary and secondary monitors. It has a drop-down menu to select the resolution you want.
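(Side note: if you get tired of right-clicking, the same page can be opened from a script. A small sketch, assuming Windows 7/8.x where this page lives in desk.cpl:)

```python
import subprocess

# "control desk.cpl" opens the classic Screen Resolution page,
# the same dialog you get by right-clicking the desktop.
subprocess.run(["control", "desk.cpl"])
```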
Simple, right? But 1920×1080 was not listed (on either computer, after I connected the second monitor). Everything worked fine with one monitor, but the second one seemed to be stealing resolution from the first. I know that what was actually happening was something else entirely, but it looked as though resolution capability was being sucked out of the first monitor.
Here's the simple solution. On the same page, select "Advanced settings." A separate dialog box appears. Select "List All Modes" (why doesn't it say "List All Resolutions"?). All available resolutions will now be shown, except now they're called modes. Why weren't they shown before when you tried the resolutions menu? Why do you need two dialog boxes when one would be enough? Well, don't think about it. Just know to look around and explore: if one box doesn't do what it says it will, look for another box close by that calls the resolutions "modes" instead. Select the resolution you want, and that resolution will now be available in the menu of the first box that didn't show it before.
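For the curious, the list behind "List All Modes" can also be pulled programmatically. Below is a minimal Python/ctypes sketch (Windows only) that enumerates the primary display's modes via the documented Win32 EnumDisplaySettingsW call; the DEVMODEW definition is trimmed to the standard fixed-size fields:

```python
import ctypes

class DEVMODEW(ctypes.Structure):
    # Display-relevant fields of the documented DEVMODEW layout,
    # padded out to the full 220-byte structure size.
    _fields_ = [
        ("dmDeviceName", ctypes.c_wchar * 32),
        ("dmSpecVersion", ctypes.c_ushort),
        ("dmDriverVersion", ctypes.c_ushort),
        ("dmSize", ctypes.c_ushort),
        ("dmDriverExtra", ctypes.c_ushort),
        ("dmFields", ctypes.c_ulong),
        ("dmPositionX", ctypes.c_long),
        ("dmPositionY", ctypes.c_long),
        ("dmDisplayOrientation", ctypes.c_ulong),
        ("dmDisplayFixedOutput", ctypes.c_ulong),
        ("dmColor", ctypes.c_short),
        ("dmDuplex", ctypes.c_short),
        ("dmYResolution", ctypes.c_short),
        ("dmTTOption", ctypes.c_short),
        ("dmCollate", ctypes.c_short),
        ("dmFormName", ctypes.c_wchar * 32),
        ("dmLogPixels", ctypes.c_ushort),
        ("dmBitsPerPel", ctypes.c_ulong),
        ("dmPelsWidth", ctypes.c_ulong),
        ("dmPelsHeight", ctypes.c_ulong),
        ("dmDisplayFlags", ctypes.c_ulong),
        ("dmDisplayFrequency", ctypes.c_ulong),
        ("dmICMMethod", ctypes.c_ulong),
        ("dmICMIntent", ctypes.c_ulong),
        ("dmMediaType", ctypes.c_ulong),
        ("dmDitherType", ctypes.c_ulong),
        ("dmReserved1", ctypes.c_ulong),
        ("dmReserved2", ctypes.c_ulong),
        ("dmPanningWidth", ctypes.c_ulong),
        ("dmPanningHeight", ctypes.c_ulong),
    ]

user32 = ctypes.windll.user32
devmode = DEVMODEW()
devmode.dmSize = ctypes.sizeof(DEVMODEW)

# Passing None as the device name enumerates the primary display's modes.
modes = set()
i = 0
while user32.EnumDisplaySettingsW(None, i, ctypes.byref(devmode)):
    modes.add((devmode.dmPelsWidth, devmode.dmPelsHeight,
               devmode.dmDisplayFrequency))
    i += 1

for width, height, hz in sorted(modes):
    print(f"{width}x{height} @ {hz} Hz")
```

Each entry is a "mode": the same width and height can appear several times at different refresh rates and color depths, which is presumably why that dialog calls them modes rather than resolutions.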
Weird, simple, and frustrating.