If I may pop in, FoxTwo is correct: the general rule in video compression is that 'slower' means better quality, because the encoder takes more time to analyze the video and the changes between frames (you stated that concept too, I see).
The complexity arises from the various settings and the configurations they can have.
For example, in AVC (H.264, MPEG-4 Part 10; x264 is a common encoder for it), you can choose a Slow preset, but some other option may still be set to a lower-quality value, say one that 'ignores little changes' between frames. Instead of that detail being kept, those areas of the video get 'smoothed out', which reads as an apparent loss of quality to human eyes.
For example, one of your movies might have a lot of film grain in it: the dark 'static' or 'fuzz' that appears in films, which makes them look more 'film-like' (as opposed to TV) and increases apparent quality ('what seems like quality to human eyes, but isn't actually more detail in the source', is how I like to explain it).
If a setting then ignores that static, or tries to eliminate it to make the file smaller, the film-grain effect will be gone and the result will look 'mushy' or 'muddled', to use subjective terms; some people feel it looks worse than the original. This is normal in video compression, however. Many of these codec options exist because the codecs were originally designed for television and similar media, where [to the creators of the content] losing fine detail matters less than streaming synchronization and timing; in telecasts, speed and timing accuracy win out over fine detailed quality.
While normally choosing "Slower" or a similar preset, one thing that can help with fine detail like film grain is selecting the "Grain" tune in the AVC/x264 settings, or disabling the options that skip small changes (in x264 terms, turning on no-dct-decimate and no-fast-pskip). These settings make the encoder 'NOT' ignore the little changes between frames, which helps keep fine detail like film grain. Another thing to try is lowering the Deblocking strength in MPEG-4/AVC. Deblocking tries to hide the block boundaries in the video (areas where the encoder didn't have enough bitrate, leaving hard edges of change that look like little 'blocks' when things move) by blurring those edges. Higher deblocking settings blur these edges more, which results in a softer, blurrier image. Lower deblocking settings can therefore improve the apparent quality (it may 'look better' to human eyes, even though it is still quite 'lossy' in any mathematical comparison with the original). Selecting the "Grain" tune in AVC/x264 encoding automatically sets the deblocking offsets to a negative value (around -2 or -3, depending on the encoder version), for example, precisely to keep the fine detail of film grain.
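As a concrete sketch of the above (assuming an ffmpeg build with libx264; the file names are placeholders, and exact tune defaults vary between x264 versions), those settings map to command-line flags roughly like this:

```shell
# Hypothetical sketch: preserving film grain with x264 via ffmpeg.
# -tune grain already applies negative deblocking and no-dct-decimate;
# the explicit -x264-params line just spells those choices out,
# with no-fast-pskip added on top.
ffmpeg -i input.mp4 \
       -c:v libx264 -preset slower -tune grain \
       -x264-params no-dct-decimate=1:no-fast-pskip=1:deblock=-2,-2 \
       -crf 18 \
       -c:a copy \
       output.mp4
```

Treat the exact numbers (the CRF and the deblock offsets) as illustrative starting points, not gospel; the point is that grain-friendly encoding means "don't skip small changes, don't blur block edges".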
The last thing I will briefly sum up [for now; sorry, I have an illness that tires me quickly] is the bitrate. If you select Slower as a preset but set the bitrate much lower than the original (say, 2500 kbps when the original is 8000 kbps, to make it fit on some media, for example), then even with the extra analysis and processing the Slower preset does to keep fine details, the end result is still a loss of quality, because the bitrate is simply not high enough to record the fine changes between frames. Compression artifacts such as ringing (the Gibbs effect: 'static' around sharp edges, such as text) and macroblocking (those little 'squares' in low-bitrate video) will occur, which makes the video look 'worse quality' to human eyes [even if you can't name what changed, you will 'feel' it is worse]. This is one way a "Slower" setting can still 'look worse'.
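To see why the bitrate dominates, here is a quick back-of-the-envelope sketch using the example figures above (a hypothetical 2-hour movie; size is roughly bitrate times duration, divided by 8 bits per byte):

```shell
# Rough file-size estimate: size = bitrate * duration / 8.
duration_s=$((2 * 60 * 60))          # 2 hours = 7200 seconds
for kbps in 2500 8000; do
  # kilobits -> kilobytes (/8) -> megabytes (/1000, decimal)
  size_mb=$(( kbps * duration_s / 8 / 1000 ))
  echo "${kbps} kbps -> ~${size_mb} MB"
done
# prints: 2500 kbps -> ~2250 MB
#         8000 kbps -> ~7200 MB
```

The Slower preset decides how wisely the bits are spent, but the bitrate decides how many bits there are to spend: 2250 MB of data simply cannot hold everything 7200 MB could, no matter how clever the analysis.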
Here is an example I captured back around 2012 (recording the game RIFT by Trion, with Bandicam), showing what happens when the bitrate is set too low (and/or when the Deblocking setting is too low or off in AVC/MPEG-4 encoding at small bitrates). It is a good example of the "macroblocking" compression artifacts [the little 'squares' I was talking about earlier], as well as of the loss of detail and edges when too low a bitrate is set:
The above type of compression artifacting can also occur if you set Bandicam's 'Quality' setting too low: for example, using CBR with too small a number value, or setting Bandicam's VBR 'Quality' value too low (e.g. "Quality 10"). If you are seeing these kinds of compression artifacts ('glitches') in your video (even if they are not as bad as the above; I purposely made mine worse to show the differences), raising your Bandicam 'Quality' setting will help get rid of them, as Bandicam is capable of much higher quality than the above [if you have the hard-drive room to spare and the computer power to allow it; note that your video files will be larger as well, because the stream needs the data space to keep all the differences between frames and the finer details].
HTH for now, with a couple concepts anyway - keep researching it yes, and have fun learning how to make your videos look better, heh
("Hope That Helps!") ~ Troy