Psst! You should watch those request times!
Yesterday evening I tried to measure some request times in the simplest way possible. I used aria2, a download tool built to be fast at everything, and wrote a file with 40 URLs pointing at the local instance of the URL shortener I'm writing. I got request times of around 0.1-0.2 milliseconds. That is the time the software took to answer a request, transport not included.
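For the curious: the setup was roughly this. The port and URL paths below are made up for the sketch, not the ones from my actual instance:

```shell
# Build a list of 40 URLs pointing at the local shortener
# (localhost:8080 and /u/<n> are illustrative assumptions)
for i in $(seq 1 40); do
  echo "http://localhost:8080/u/$i"
done > urls.txt

# Then hand the whole list to aria2 in one go:
#   aria2c -i urls.txt
```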
Cool cool, now let's see what changes if I do some database optimization. #sqlite has some pretty impressive settings that allow a lot of tuning. I pushed them so far that the database practically always lives in RAM, and writes go through a journal that is synchronized with the actual db at regular intervals instead of on every write. That made the simplest requests 5-10x faster: a typical GET request now takes only 0.06 milliseconds. On my laptop!
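This is roughly the kind of tuning I mean, sketched with Python's sqlite3 module. The specific values are illustrative, not necessarily the ones I ended up with:

```python
import os
import sqlite3
import tempfile

# A throwaway on-disk database for the demo
path = os.path.join(tempfile.mkdtemp(), "demo.db")
conn = sqlite3.connect(path)

# WAL: writes go to a write-ahead log and are checkpointed
# back into the main db file periodically, not on every write
conn.execute("PRAGMA journal_mode=WAL")

# NORMAL: skip the fsync on every commit; durability is
# guaranteed only at checkpoints, which is fine for this use case
conn.execute("PRAGMA synchronous=NORMAL")

# Keep roughly 64 MB of pages cached in RAM (negative = KiB)
conn.execute("PRAGMA cache_size=-64000")

# Memory-map up to 256 MB of the db file for reads
conn.execute("PRAGMA mmap_size=268435456")
```

With WAL plus a warm page cache, a read-mostly workload like a URL shortener barely touches the disk at all.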
But I managed to get back to the slower numbers from before just by adding a few more columns to the table. I really wonder how that can have such an impact, so I'll investigate further.
Do you think optimization on this level is a waste of time? 😅 Tell me in the comments.