I’ve recently had a project in which an interesting challenge popped up.

`Write a function that will round a timestamp to the closest 15min value.`

Something like this:

- `May 1 2015, 18:44:20.000` rounds to `May 1 2015, 18:45:00`
- `May 1 2015, 18:07:30.000` rounds to `May 1 2015, 18:00:00`
- `May 1 2015, 23:52:30.001` rounds to `May 2 2015, 00:00:00`

For this discussion, let’s allow the mid-points to be rounded to any (of the two) neighbouring 15min timestamps. The problem seems easy. And it is. But something strange happened.

I asked several people around the workplace this question. They were able to **start with a sound idea in seconds** (well, most of them). The thing that shocked me is that no two people proposed the same solution. Every one of them had their own way.

So here are a few of the first ideas:

– Make four if-s to determine which interval the timestamp belongs to, then create a new DateTime with the appropriate values. In my opinion, this is by far the worst solution (from a computer-science viewpoint). While the algorithm actually performs very well, what happens if we need to change the rounding to the closest 1-minute-and-12-seconds interval (starting from 00:00:00 in a day)? There are 50 of those in every hour. While I doubt such a need will ever materialize, the point remains. I hope I don't find 50 if-s in the codebase any time soon.
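The post doesn't show the code for this approach, but a minimal sketch of the if-chain (in Python, with midpoints rounded up, which the problem statement allows) might look like this:

```python
from datetime import datetime, timedelta

def round_to_15min_ifs(ts: datetime) -> datetime:
    # Position within the hour, in minutes, including seconds and microseconds.
    minute = ts.minute + ts.second / 60 + ts.microsecond / 60_000_000
    # Top of the current hour.
    base = ts.replace(minute=0, second=0, microsecond=0)
    # One branch per 15-minute bucket; 7.5 minutes is the midpoint.
    if minute < 7.5:
        return base
    elif minute < 22.5:
        return base + timedelta(minutes=15)
    elif minute < 37.5:
        return base + timedelta(minutes=30)
    elif minute < 52.5:
        return base + timedelta(minutes=45)
    else:
        return base + timedelta(hours=1)  # rolls over to the next hour/day

round_to_15min_ifs(datetime(2015, 5, 1, 18, 44, 20))
# → datetime(2015, 5, 1, 18, 45)
```

Note how the interval boundaries (7.5, 22.5, …) are hard-coded; that is exactly the maintainability problem described above.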

– Another idea was to convert the timestamp into a number of seconds since the epoch (Jan 1, 1970), calculate that number modulo 900 (900 = 15 min × 60 seconds), and subtract the remainder from the original number. With this algorithm, we get to the closest 15-minute timestamp that is earlier than the original. Now we check for some edge cases (such as rounding up when the remainder is 450 seconds or more) and construct an appropriate timestamp.
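This is not the team's final solution, but a sketch of the epoch-seconds idea (in Python, using timezone-aware datetimes; adding half the interval before flooring handles the round-up case in one step) could look like this:

```python
import math
from datetime import datetime, timezone

INTERVAL = 900  # 15 minutes in seconds

def round_to_15min(ts: datetime) -> datetime:
    # Seconds since the epoch; timezone-aware input avoids local-time surprises.
    seconds = ts.timestamp()
    # Adding half the interval before flooring turns "round down"
    # into "round to nearest".
    rounded = math.floor((seconds + INTERVAL / 2) / INTERVAL) * INTERVAL
    return datetime.fromtimestamp(rounded, tz=timezone.utc)

round_to_15min(datetime(2015, 5, 1, 23, 52, 30, 1000, tzinfo=timezone.utc))
# → datetime(2015, 5, 2, 0, 0, tzinfo=timezone.utc)
```

Changing `INTERVAL` to 72 (1 minute and 12 seconds) is all it takes to support the hypothetical requirement from the first idea, which is why this approach generalizes better than the if-chain.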

After a few minutes, **every one of those methods was refined** in terms of maintainability, readability, and performance. I will not provide our final solution here because it gets complex – in reality we need to deal with UTC and some other things. It doesn't matter though, because something else is the key lesson of this post.

Let me give you a chart for it: solution quality plotted against time spent thinking.

If you take only a little time to think, you will get a very low-quality solution. Define quality as you wish, but I guess we're all talking about correctness, maintainability, readability, and performance. To clarify, I am not talking about the class of algorithms for simple and common problems (such as calculating the average of an integer array). Those should be obvious, and most of the time there is only one proven, optimal solution that usually happens to be simple. What I'm talking about here is the class of problems that are new (not encountered before) and require some thought.

Think **too little** and you get **low quality**; think **too long** and you **waste time**. The idea is to find the amount of thinking time that yields the right amount of quality. Be on the green spot. One of the most important skills of a software developer lies in mastering this – finding that optimum.

Next time a new kind of problem attacks you, resist the urge to start coding it. Take a pen and paper, and sketch your problem. Take some time to refine your solution, then code it. If you can't think of a good way to improve your solution in a reasonable amount of time, stop. You are probably on the green spot. I usually stop when, after 5–30 minutes, I haven't found any meaningful improvement – and by meaningful I mean one that would significantly improve maintainability, readability, or performance. If it takes an hour to reach a good solution and then 3 days to reach a great solution, go with good. Refine later if you find time.

Like I said, there are no cookbooks here. Each problem is its own story. Master this skill. It will take time, but please – **climb the curve**!