TV has been a source of entertainment ever since the first picture lit up the tiny tube inside a huge cabinet sold to the public. Now it’s all about getting a big screen, whether it’s curved or flat, HDTV or 4K; it’s the picture that brings the excitement. But there’s no reason the TV should be relegated to showing only TV shows and movies. Why not use it as if it were a computer monitor, for looking at everything BIG and playing video games BIG? That’s just another reason why you need a big screen TV, not to mention that these TVs are so full of new technologies and enhanced features that they blow away the TVs of just a few years ago in terms of picture quality alone. The question remains how to get what’s on the computer over to the TV in a simple and efficient manner. It takes just a few procedures. Here’s how.
SETTING UP THE COMPUTER TO USE WITH THE TV
The video card in the computer outputs the video signal, and it can be changed from the “default” setting currently in use (or from whatever you have set it to). To output Full HD, which will match that of the TV, set it to a 1920 x 1080 image. Also be sure the aspect ratio matches that of the TV (i.e., 16:9). That should take care of the majority of what you will be taking from the computer and “sending” to the TV.
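If you want to double-check that a given resolution matches the TV’s 16:9 shape, you can reduce the width and height by their greatest common divisor. Here is a small Python sketch of that arithmetic (the function name is just illustrative; the numbers are the Full HD values mentioned above):

```python
from math import gcd

def aspect_ratio(width: int, height: int) -> str:
    """Reduce a resolution to its simplest aspect ratio, e.g. 1920x1080 -> 16:9."""
    divisor = gcd(width, height)
    return f"{width // divisor}:{height // divisor}"

# Full HD, the setting recommended above, matches a 16:9 TV:
print(aspect_ratio(1920, 1080))  # 16:9
# An older monitor resolution that would NOT match the TV's shape:
print(aspect_ratio(1280, 1024))  # 5:4
```

Any mode that reduces to 16:9 (4K’s 3840 x 2160 does as well) will fill the TV screen without black bars or stretching.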
For video games specifically, you’ll need to avoid over-taxing the computer’s CPU so it can concentrate on the game’s frame rate and all the needed graphics work. Do this by closing any other programs that could cause interference. And if the graphics card looks a bit wimpy for what it must do, consider reducing the strain on it by lowering the frame rate or switching the game’s “effects” mode to a less strenuous one. You don’t want any “lag” compared to what you would normally see on the computer monitor now that you’re looking at the TV.
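Lowering the frame rate helps because it gives the graphics card more time to render each frame. A quick Python sketch of that per-frame time budget (the function name is hypothetical, used only to show the arithmetic):

```python
def frame_budget_ms(fps: float) -> float:
    """Milliseconds the graphics card gets to render one frame at a given rate."""
    return 1000.0 / fps

# Dropping from 60 fps to 30 fps doubles the time available per frame:
print(round(frame_budget_ms(60), 2))  # 16.67 ms per frame
print(round(frame_budget_ms(30), 2))  # 33.33 ms per frame
```

That doubled budget is why a strained card that stutters at 60 fps can often run smoothly, lag-free, at 30 fps.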
CONNECTING THE COMPUTER TO THE TV
The video signal output ports will be found on the computer’s graphics card and are accessible in most cases: you just plug in the appropriate cable, run it over to the TV and plug in the other end. An HDMI cable works best because it provides the Full HD video signal, carrying both audio and video at up to 1080p resolution (of course, in the case of a 4K TV, the set will upscale that image to make it look even more awesome). If the graphics card only has a DVI output, that will work too, even though most TVs today don’t have DVI inputs (those date back years). It’s no big deal OR big expense to purchase a DVI-to-HDMI adapter: plug the DVI cable into one end and use the HDMI end in the normal fashion. But since DVI doesn’t carry an audio signal, you’ll also have to run analog stereo RCA cables from the computer to the analog RCA inputs on the TV. That will triple the number of cables. The same goes if you’re stuck using VGA, which is of a lower resolution and will also require an adapter, although some TVs do include such an input. And with DisplayPort outputs, found on Apple iMacs and many laptops today, adapters will also be needed.
GETTING THE SIGNAL FROM THE COMPUTER TO THE TV
Regardless of the kind of video cable being used, there still has to be a physical connection between the computer and the TV (FYI: a laptop counts here too). The length of the cable will depend on how far the computer is from the TV; for an HDMI cable, the cost can get a bit expensive at lengths over 10-15’. Shorter lengths will work fine if a laptop is being used, since it can “go” to the TV, unlike a desktop PC or Mac. The same applies when connecting a smartphone or tablet to the TV; these are “computers” after all, and not all mobile devices work with TVs wirelessly. For example, with Android phones and tablets, it’s easy to “send” what you have on them to the TV wirelessly using screen-mirroring conventions built into most TVs. But someone with an iPhone or iPad would be out of luck because the tech isn’t compatible, hence the need for a wired alternative for Apple devices.
Another alternative is to use a wireless video transmission system. These create their own Wi-Fi network between the sender/transmitter and the receiver, and nominally require HDMI at both ends. So instead of attaching the sender to a Blu-ray player’s HDMI output to reach the receiver plugged into one of the TV’s HDMI inputs, you plug the sender into the HDMI output (or DVI with adapter) of the computer.
WATCHING THE BEST PICTURE FROM THE COMPUTER ON THE TV
Regardless of whether it’s a desktop, laptop or mobile device, a computer’s picture needs to be “tweaked” at the TV’s end. So pull out the manual (or online manual) and look for details on which of the features might give you the best results; these might not seem obvious at first from just hitting the remote’s button and cycling through the picture modes. You’ll know what looks good because you’ll see it looking good (and in most cases you can save these settings so you can return to them later, when you’re not watching regular TV).
Turning your HDTV or 4K TV into a BIG SCREEN monitor might seem like a bit of trouble to do, but what you will see will make it worth it. So get yourself a big screen TV that can make what your computer has been showing you look really, really special.