Recently I was playing video games with my brothers and their friends when they decided to move the party to another house. We were all set to go when someone raised the question of which TV we'd be using. That turned into a long conversation about why old video games don't work well on new TVs, but work perfectly fine on old ones. Why is that?
Well, the problem I'm describing is called input lag, a loose term for any noticeable delay between the input to a hardware device and its associated output. For example: hitting a button on a game controller and waiting a moment before the TV displays the action. Many of my brothers' friends noted that this input lag was almost never seen on old cathode ray tube (CRT) TVs, while it often shows up on liquid crystal display (LCD) or plasma sets.
The reason this occurs is the difference between the analog signals of old video game consoles and the digital processing of new TVs. When you press a button on an old controller, the console reads that input and encodes the resulting picture as an analog video signal sent to the TV. An old CRT could use that signal directly: the electron beam traced the picture onto the screen in real time as the signal arrived, with no buffering at all. A modern digital TV, however, must first convert the analog signal to digital, deinterlace it, and scale it to fit the panel's fixed grid of pixels. New TVs also buffer whole frames of video for picture processing (motion smoothing, noise reduction, and so on), which CRTs never did, and every buffered frame adds another chunk of delay before the image reaches the screen.
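To get a feel for how those buffered frames add up, here's a rough back-of-the-envelope sketch. The stage names and frame counts are illustrative assumptions on my part, not measurements of any particular TV; the one solid number is that at 60 Hz, each buffered frame costs about 16.7 ms.

```python
# Back-of-the-envelope display lag estimate.
# Stage names and frame counts are illustrative assumptions,
# not measurements of a specific TV model.

FRAME_TIME_MS = 1000 / 60  # one frame at 60 Hz ~= 16.7 ms

# A CRT draws the analog signal as it arrives: zero frames buffered.
crt_stages = {}

# A digital flat panel typically holds whole frames while it
# deinterlaces, scales, and post-processes the picture.
lcd_stages = {
    "deinterlacing": 1,    # hold a frame to combine interlaced fields
    "scaling": 1,          # resample 240p/480i to the panel's pixel grid
    "post-processing": 2,  # motion smoothing, noise reduction, etc.
}

def added_lag_ms(stages):
    """Total extra latency, in milliseconds, from buffered frames."""
    return sum(stages.values()) * FRAME_TIME_MS

print(f"CRT: {added_lag_ms(crt_stages):.1f} ms extra")
print(f"LCD: {added_lag_ms(lcd_stages):.1f} ms extra")
```

With these made-up-but-plausible numbers, the flat panel adds roughly four frames (around 67 ms) of lag, which is easily enough to throw off timing in a fast game, while the CRT adds essentially none.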
As a result, your original Smash Bros. game may not play as well as you'd like unless you fish something archaic out of a trash heap. Good luck with that, buddy.