Time handling is everywhere in software, yet many programmers approach the topic with dread. Some warn about how difficult it is to understand, citing bizarre timezone edge cases as evidence of its complexity. Others repeat advice like "just use UTC bro" as if it were an unconditional rule, but if your program needs precise timekeeping or has user-facing datetime interactions, following that advice will almost certainly cause bugs or confusing behavior. Here's a conceptual model for thinking about time in programming that accounts for the complexity so many programmers cite.