Aspect ratio and resolution aren't the same thing, though a lot of people confuse them: aspect ratio is the shape of the picture (width to height), resolution is how many pixels it actually contains.
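To make that concrete, here's a minimal Python sketch (the resolutions listed are just common examples I picked, nothing authoritative): reducing width:height to lowest terms gives the aspect ratio, while the pixel counts themselves are the resolution.

```python
from math import gcd

def aspect_ratio(width, height):
    # Reduce width:height to lowest terms to get the "shape" of the picture.
    d = gcd(width, height)
    return f"{width // d}:{height // d}"

for w, h in [(640, 480), (1280, 720), (1920, 1080), (3840, 2160)]:
    print(f"{w}x{h} -> {aspect_ratio(w, h)}")

# 640x480 comes out 4:3; the other three are all 16:9 --
# same shape, very different resolutions.
```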
The frame rate of movies on film was traditionally 24 frames per second. That caused problems when they were shown on US TV, because 24 doesn't match 30, and 30 fps was the nominal standard for analog US TV. Analog TVs are considered Standard Definition (SD).
Well, NTSC (US analog) is actually approximately 29.97 fps, and PAL (European) is 25 fps.
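The "approximately 29.97" isn't rounding sloppiness, by the way: NTSC's frame rate is exactly 30000/1001, and film transferred to NTSC is typically slowed to 24000/1001. A quick sketch of the arithmetic:

```python
from fractions import Fraction

ntsc_video = Fraction(30000, 1001)   # exact NTSC frame rate
film_on_ntsc = Fraction(24000, 1001) # film slowed slightly for NTSC transfer

print(float(ntsc_video))    # 29.97002997002997
print(float(film_on_ntsc))  # 23.976023976023978
```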
Then you get into whether the content is progressive (whole frames) or interlaced (alternating fields of odd and even lines).
https://www.manchestervideo.com/2013/10/16/quick-guide-to-video-frame-rates/
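On the progressive vs. interlaced point, here's a toy sketch (a made-up 6-line "frame", purely illustrative) of the basic idea: a progressive frame is all the lines captured at one instant, while an interlaced frame is two fields captured a field-time apart.

```python
# Toy 6-line "frame" to show the split.
frame = [f"line {i}" for i in range(6)]

progressive  = frame        # whole frame, one moment in time
top_field    = frame[0::2]  # lines 0, 2, 4 -- first instant
bottom_field = frame[1::2]  # lines 1, 3, 5 -- captured a field-time later

print(progressive)
print(top_field, bottom_field)
```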
Showing a movie on US TV required it to be scanned (telecine) so that each film frame became two video fields, with some frames repeated as extra fields so the motion stayed smooth and seamless.
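A rough sketch of the classic 2:3 ("3:2") pulldown pattern that scanning uses (toy Python, film frames shown as letters): 4 film frames become 10 fields, which pair up into 5 video frames, and that's how 24 fps fits into roughly 30 fps.

```python
def pulldown(film_frames):
    # Alternate 2 fields, 3 fields per film frame (the 2:3 cadence).
    fields = []
    for i, frame in enumerate(film_frames):
        copies = 2 if i % 2 == 0 else 3
        fields.extend([frame] * copies)
    # Pair consecutive fields into interlaced video frames.
    return [fields[i:i + 2] for i in range(0, len(fields), 2)]

print(pulldown(["A", "B", "C", "D"]))
# [['A', 'A'], ['B', 'B'], ['B', 'C'], ['C', 'D'], ['D', 'D']]
# Note two of the video frames mix fields from different film frames.
```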
A computer has software that can convert frame rates and resolutions on the fly, or the content has already been converted before playback.
Some movies have moved up to 48 fps, starting with "The Hobbit: An Unexpected Journey," which was shot at that rate. The push for 60 fps is the latest craze.
Now that TVs are digital -- Standard Def, High Def or Ultra Hi Def -- things get really screwy and confusing.
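For reference, the usual buckets look roughly like this (a quick sketch, rounded pixel counts, not an exhaustive list of formats):

```python
# Common digital TV formats and their nominal resolutions.
formats = {
    "SD (480i/480p)":   (720, 480),   # NTSC-derived; 4:3 or anamorphic 16:9
    "HD (720p)":        (1280, 720),
    "HD (1080i/1080p)": (1920, 1080),
    "UHD (2160p, 4K)":  (3840, 2160),
}
for name, (w, h) in formats.items():
    print(f"{name:18s} {w}x{h}")
```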
Video games and computer monitors are a whole different ball of wax, and have different ways of dealing with playback content.
Some people hook a computer up to a TV so they can get a really big screen. Computer monitors can be expensive buggers, and as far as I know they aren't made in TV sizes except for very specialized uses -- sports stadiums, for instance.
Bottom line: to get really good quality playback at a good resolution, whether on a TV or a monitor, it will cost money.