I don't think you can simplify things down to a stereotype, as it were. Remember, OS <number/character here> has been around for longer than Windows. The stereotype was that consoles were for gaming and computers were for computing. It wasn't until Microsoft started building the DirectX API into its OS that things started to change. By giving developers lower-level access to hardware, simplifying common tasks, and so on, they attracted developers to their platform. Sure, DX sucked at first, but MS kept pumping time and money into it, and look what they got out of it: a console.
It also helps developers that they can target a console with a large audience, plus roughly 90% of all computer users, at once by using one API. It's not so much that 'Windows is better'; it's that 'Windows is ubiquitous.'
Remember that OpenGL has been around longer than DirectX as well, and look where it is now compared to DX. Certain game development shops swear by it (id springs to mind), but the majority go for the easy route. Take EA: they can cover a huge swath of people by using DirectX. So, at the end of the day, it's a money-driven decision that probably won't change for a while. Either OpenGL steps up and starts covering the entire media aspect instead of just focusing on graphics, or... well, not much. There are things like SDL that cover what I'm talking about (see the sketch below), but they don't have the industry's acceptance.
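For what it's worth, here's roughly what I mean by "covering the entire media aspect." A minimal sketch using SDL2's C API, not a full program: one init call pulls in video, audio, and input together, where OpenGL by itself only gives you the graphics piece.

    #include <SDL2/SDL.h>

    int main(int argc, char *argv[]) {
        /* One call initializes video, audio, and joystick input --
           the whole "media aspect," not just graphics. */
        if (SDL_Init(SDL_INIT_VIDEO | SDL_INIT_AUDIO | SDL_INIT_JOYSTICK) != 0) {
            SDL_Log("SDL_Init failed: %s", SDL_GetError());
            return 1;
        }

        /* The same API then hands you windows, renderers, audio
           devices, and an event loop on every platform SDL supports. */
        SDL_Window *win = SDL_CreateWindow("demo", SDL_WINDOWPOS_CENTERED,
                                           SDL_WINDOWPOS_CENTERED, 640, 480, 0);
        if (!win) {
            SDL_Log("SDL_CreateWindow failed: %s", SDL_GetError());
            SDL_Quit();
            return 1;
        }

        SDL_Delay(1000);            /* keep the window up briefly */
        SDL_DestroyWindow(win);
        SDL_Quit();
        return 0;
    }

DirectX is the same kind of one-stop shop (Direct3D, DirectSound, DirectInput), which is exactly why developers flock to it.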
A unified API is pretty much the thing that drives the game development industry. (And by 'development,' I mean idiotically large companies that care more about their bottom line, their stock price, and upper management than about games.)