Correct. Most cinematography is shot at 24fps (Netflix, most films), while most YouTube content is rendered at 30fps or 60fps. That's why 60/90/120Hz refresh rates are used on most displays. Beyond that, 144Hz is common because it's 24×6, and then 240Hz (240Hz is less common since there aren't many games that can sustain 240fps on current hardware).
In short, as long as the refresh rate is an integer multiple of either 24 or 30, most content plays back smoothly. A good example is my gaming monitor overclocked to 165Hz: great for gaming, but for content consumption I have to remove the OC and go back to 144Hz (24×6), or else put up with choppy video, since 165 isn't a multiple of 24 or 30. Likewise, overclocking a phone's display to 100Hz would make scrolling and gaming smoother (at the risk of worse battery life from pushing more pixels per second), but 100 isn't divisible by 24 or 30 either, so it would really hurt video playback, and content consumption is what most people spend something like half their smartphone time on.
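The arithmetic behind this can be sketched in a few lines, as a rough check only. The idea: if the refresh rate isn't an integer multiple of the content frame rate, frames can't each be held for the same number of refreshes, which shows up as judder. (The refresh rates and frame rates below are just the common ones mentioned above.)

```python
# Which common refresh rates evenly divide common content frame rates?
# A non-integer ratio (hz % fps != 0) means uneven frame pacing -> judder.
fps_list = [24, 30, 60]
hz_list = [60, 90, 100, 120, 144, 165, 240]

for hz in hz_list:
    smooth = [fps for fps in fps_list if hz % fps == 0]
    judder = [fps for fps in fps_list if hz % fps != 0]
    print(f"{hz:>3} Hz  smooth for {smooth}  judders on {judder}")
```

Running this shows 120Hz and 240Hz divide evenly by all three frame rates, 144Hz only works cleanly for 24fps film, and 100Hz and 165Hz don't divide evenly by any of them, which matches the choppiness described above. (Displays with variable refresh rate or good frame-rate matching can hide this, so treat it as the worst case.)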