Your monitors may or may not come with cables - most do, but I would not worry too much about compatibility. The card's DVI connector will accept both types of DVI. Note that many users still prefer CRT monitors, which are analog, and many may still be using analog-only LCD monitors. But there is only so much real estate on the back of a graphics card for connectors. To complicate matters, many people, including yours truly, use more than one monitor. So the card's connections are often dual-purpose, with the mode determined by what is plugged into them. Typically, an inexpensive DVI-to-D-Sub adapter is used for analog monitors when no D-Sub connection is available on the card.
Note that HDMI is the latest and greatest connection standard. Also note that the video data in HDMI and DVI is exactly the same. The primary difference (besides physical size) is that HDMI also carries multi-channel (5.1) audio and HDMI control codes (not used in computers). HDMI emerged out of the home theater world, but the transition to computers has not been seamless because of how audio works in a PC - through a sound card (or integrated sound), not through a home theater receiver. Some graphics cards with HDMI now support audio "throughput" for use with a monitor that has internal speakers. But if, like most users, you have computer speakers, you don't need audio going to your monitor. And in a multi-monitor setup, using the monitor speakers is awkward at best.