First, every computer has an internal clock.
It's a small hardware circuit that keeps ticking and counting time, even while the machine is busy with other things.
The operating system stores that time as a single 32- or 64-bit integer, typically the number of seconds (or milliseconds) elapsed since a fixed epoch, and converts it to year, month, day, hour, minute, second and millisecond only when something asks for a calendar date.
That representation is why the well-known Year 2038 problem exists: a signed 32-bit count of seconds since 1970 overflows in January 2038, so 32 bits simply aren't enough to keep counting past that point.
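To make that concrete, here's a small Python sketch (Python is just a convenient choice here) showing the raw counter and the exact moment a signed 32-bit counter runs out:

```python
import time
from datetime import datetime, timezone

# The raw counter the OS keeps: seconds since 1970-01-01 00:00:00 UTC.
now = time.time()
print(int(now))

# The same number, converted to a calendar date only on demand.
print(datetime.fromtimestamp(now, tz=timezone.utc))

# A signed 32-bit counter maxes out at 2**31 - 1 seconds after the epoch,
# which is why 32-bit timestamps overflow in January 2038.
print(datetime.fromtimestamp(2**31 - 1, tz=timezone.utc))  # 2038-01-19 03:14:07+00:00
```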
Most novice programmers don't know how this mechanism works, so they write hacky busy-wait loops (a bad idea: spinning in a loop eats CPU for no benefit).
Instead, you should use whatever built-in wait/sleep calls, timers, and clock APIs your environment provides.
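Here's a minimal sketch of the difference, in Python (the same idea applies in any language):

```python
import time

# BAD: busy-waiting. The CPU spins at full speed doing nothing useful.
def wait_busy(seconds):
    end = time.monotonic() + seconds
    while time.monotonic() < end:
        pass  # burns a CPU core the whole time

# GOOD: ask the OS to put the program to sleep and wake it up later.
def wait_sleep(seconds):
    time.sleep(seconds)  # the process yields the CPU; near-zero cost

wait_sleep(5)  # sleeps for ~5 seconds without hogging the CPU
```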
A program that needs a timeout usually just records a timestamp at the moment the timeout starts.
Then it checks, cheaply and only every so often, whether the current OS time is late enough (say, 5 seconds past the recorded timestamp), and only then runs the timeout code.
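That pattern looks roughly like this; the 5-second duration and the loop body are just placeholders:

```python
import time

TIMEOUT = 5.0  # seconds (placeholder value)

# Record a timestamp when the timeout starts.
# time.monotonic() is used instead of wall-clock time so clock adjustments can't break it.
start = time.monotonic()

while True:
    # ... do other useful work here ...

    # Cheap check: is the current time late enough yet?
    if time.monotonic() - start >= TIMEOUT:
        print("timeout expired, running the timeout code")
        break

    time.sleep(0.1)  # wait a little between checks instead of busy-waiting
```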
Put simply, the only way a computer can tell time is by checking its internal clock once in a while and seeing whether it has anything to do at that particular moment (time scheduling).
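A toy version of that scheduling loop might look like this; the task list is purely illustrative:

```python
import time

# Each entry: (absolute due time, what to do when it's due). Illustrative only.
tasks = [
    (time.monotonic() + 2.0, lambda: print("2-second task fired")),
    (time.monotonic() + 5.0, lambda: print("5-second task fired")),
]

while tasks:
    now = time.monotonic()
    # Run everything whose due time has passed...
    for due_at, action in tasks:
        if due_at <= now:
            action()
    # ...keep only what hasn't run yet, then check the clock again shortly.
    tasks = [t for t in tasks if t[0] > now]
    time.sleep(0.05)
```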
If you need to go beyond the default timeouts, timers, and similar tools your environment provides, expect a lot of learning: precise timing is a genuinely hard engineering problem, which is why hand-rolled timing code is generally frowned upon.