At the dawn of the 20th century, Thomas Edison's film company worked from the observation that the human eye needs roughly what we would today call 10 FPS before the brain accepts a rapid series of still images as motion rather than a fast slide show. 10 FPS is the baseline. Edison's studio shot a number of early films at low framerates, including a 1910 short adaptation of Mary Shelley's Frankenstein (beating Universal to the punch by two decades), though few films from that era have survived.
24 FPS was the standard of Old Hollywood, and games below this framerate seem choppy and poorly animated to pretty much everyone. This low framerate is part of why restored versions of old films presented at a higher framerate seem so odd to watch, but it is only part of the reason.
30-45 FPS is the range in which most console games fall, as developers consider it the best compromise between on-screen detail and framerate. (Hollywood, meanwhile, has largely stayed at 24 FPS to this day.)
48-60 FPS is the emerging high-frame-rate territory in Hollywood — for instance, the Hobbit films were shot with special 48 FPS cameras. Test audiences complained that the films felt "unnaturally smooth", so in most theaters the films were shown at the traditional 24 FPS, which is how you probably saw them unless you sought out a high-frame-rate screening. For PC gamers, 60 FPS is often the minimum optimum framerate.
60+ FPS is the desired framerate for PC gaming, especially for games that demand pixel-perfect reflex timing, such as serious e-sports titles. It will be a long while before Hollywood or consoles catch up to this.
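To see why each step up matters, it helps to flip FPS into frame time — the milliseconds available to produce each frame. A minimal sketch (the function name and the particular FPS values are just illustrative):

```python
def frame_budget_ms(fps: float) -> float:
    """Milliseconds available to render (or display) one frame at the given FPS."""
    return 1000.0 / fps

# The tiers discussed above, from slide-show territory to e-sports.
for fps in (10, 24, 30, 48, 60, 144):
    print(f"{fps:>4} FPS -> {frame_budget_ms(fps):6.2f} ms per frame")
```

Going from 30 to 60 FPS halves the frame budget from about 33 ms to about 17 ms, which is why that jump feels so much more responsive: every image (and every reaction to your input) arrives twice as fast.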
Here endeth the lesson.