I think it's because there was still so much unexplored territory. "What a computer game is" was still evolving, and the focus was on gameplay, not graphics (although graphics were certainly a big part of it). Now most games are developed for consoles and ported to PC, so they have a limited scope in terms of controls and open-world play.
I actually think it's a little different from that.
There were certainly dominant genres. Adventure games, for example, were EVERYWHERE. And prior to 1992, nobody had heard of first-person shooters. And, of course, there was the FMV explosion in the mid 90s with CD-ROM drives and such. But I tend to think that there was more innovation because there were more players at the table. You didn't have a "Big 3" situation. You didn't have dominant franchises (well, I suppose you sort of did with the "Quest" games from Sierra).
I do agree that games developed natively for the PC rather than as console crossovers offered more options, purely based on the degree of input available. I'm hoping that motion-control gaming (e.g. PS Move, Kinect, Wii) changes some of that. But mostly I think it has to do with the fact that you had a ton of producers out there, all competing with each other for market share:
Access
Acclaim
Accolade
Activision
Sierra
Origin
Electronic Arts
LucasArts
Interplay
MicroProse
SSI
Apogee
id Software
the list goes on.
These were all gaming companies that developed and published their own games during that era. As a result, there was less centralization. Market trends would develop, certainly, but there was far more room for innovation. I mean, think about it: EA and Activision at that time were just two of many. Now the industry is much more comparable to the recording industry, with two or three mega-publishers and a bunch of much, much smaller indie labels. "Corporate" music is tightly controlled; innovation is stifled as much as possible, or is itself controlled and directed.