Theoretically, it does matter.
Suppose your eyes have an effective "refresh rate" of 70 Hz.
Suppose your game runs at 100 fps. Say an enemy shows up in frame 50; in frame 49 he is not visible yet. Because each frame takes 0.01 seconds to generate, the frame rate alone adds up to 0.01 seconds of lag before the enemy can appear on screen. So the higher the game's frame rate, the sooner "you" can spot the enemy. Even though you can only perceive about 70 frames per second, a higher frame rate means the interval your eyes need (for example 0.02 s to visually register something) starts sooner, because the monitor displays the new image earlier.
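To make that arithmetic concrete, here is a minimal Python sketch (the fps values are just the examples used above, not benchmarks) of how much lag the frame rate alone can add:

```python
# Minimal sketch: the lag added purely by the game's frame rate is
# one frame interval, i.e. 1 / fps seconds.
def frame_lag(fps: float) -> float:
    """Worst-case delay before an in-game event can be drawn on screen."""
    return 1.0 / fps

for fps in (70, 100, 200):
    print(f"{fps:>3} fps -> up to {frame_lag(fps) * 1000:.1f} ms before the enemy is drawn")
```

At 100 fps that is 10 ms per frame; at 200 fps it drops to 5 ms.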
Now you could say: "but people can only see a difference after 1/70 ≈ 0.014 seconds." This is only partly true, for two reasons:
1. Eyes do not have a fixed refresh rate. The 50-70 figure is just a rough estimate that takes into account how fast the eyes send signals to the brain, plus some other variables;
2. Suppose your eye "refreshes" 0.0001 seconds after a new frame is displayed, so it just misses the frame in which the enemy appears. At 100 fps you would then wait the 0.01 s for that frame plus another 0.01 - 0.0001 s until you actually register it, almost 0.02 s in total. If you have, for example, 200 fps, the same worst case comes out to about 0.01 s, meaning you still catch the image faster (see the sketch below).
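A rough sketch of that worst case, using the same simplified model and the same illustrative numbers as above (the 0.0001 s eye offset is purely hypothetical):

```python
# Worst case from point 2: the eye just misses the frame the enemy appears in,
# so the total delay is one frame interval (rendering) plus almost one more
# (until the new frame is actually registered).
def worst_case_delay(fps: float, eye_offset: float = 0.0001) -> float:
    frame_interval = 1.0 / fps
    return frame_interval + (frame_interval - eye_offset)

for fps in (100, 200):
    print(f"{fps} fps -> worst case ~{worst_case_delay(fps):.4f} s")
# 100 fps -> worst case ~0.0199 s (almost 0.02 s)
# 200 fps -> worst case ~0.0099 s (about 0.01 s)
```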
This difference is, for example, 0.01 s (it can be smaller or bigger) depending on the monitor's refresh rate and the game's FPS, but there is a (noticeable) difference.
A lot of "pro" gamers prefer high fps in combination with a 120 Hz or 200 Hz screen with a low response time. So I suppose for some people the difference is there, even though it's only small.
Short answer: yes, it matters; 0.01 seconds can make a difference.