SQLite has so far been my default choice for caching in mobile applications. It is lightweight, fast, queryable with SQL syntax, and built in on both Android and iOS.
I have used SQLite on several mobile app projects and never had a complaint about it; it was simply smooth as butter. Until recently, when it gave me this lovely message: Row too big to fit into CursorWindow. Then I realized there is "lite" in SQLite for a reason!
So the story is, I was implementing a caching mechanism to speed up API calls in my Flutter-based application. The response body of one particular endpoint was indeed big. I'm not sure about its exact size as a JSON file, but imagine copying the body of an API response into a JSON file, opening it in a built-in text editor (I use gedit, by the way, on Ubuntu Linux), and having the editor crash just because you move the cursor through the file. To put things further in context, I have 16GB of RAM, a Core i7 processor with 12 cores, and around 300GB of free disk space. I had never had a text editor crash on me, especially on Linux, so it was hilarious when it happened ^^
Back in SQLite, it turns out the cursor window size is around 2MB, at least on Android; I'm not sure about iOS (but we're talking about cross-platform development here, so both platforms matter). This limitation does exist for a good reason, though: without it, performance would suffer a lot. However, the thing that bugs me is the fact that I could insert that data in the first place, and only got an error when accessing / selecting it. Yes, you read that right: we can actually store that big chunk of data, the one that crashes a native Linux text editor. But when it comes to reading it (or querying it), we get that lovely error message.
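As far as I understand, the 2MB limit lives in Android's CursorWindow bindings, not in the SQLite engine itself (whose own per-value limit defaults to far more). A quick sketch with Python's desktop `sqlite3` module illustrates why the insert succeeds: the engine itself happily stores and returns a value well past 2MB.

```python
import sqlite3

# The ~2 MB limit belongs to Android's CursorWindow, not to SQLite itself.
# On desktop, the engine stores and returns a 5 MB value without complaint.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE cache (url TEXT PRIMARY KEY, body TEXT)")

big_body = "x" * (5 * 1024 * 1024)  # 5 MB payload, well past 2 MB
conn.execute(
    "INSERT INTO cache VALUES (?, ?)",
    ("https://api.example.com/items", big_body),  # hypothetical endpoint
)

(body,) = conn.execute("SELECT body FROM cache").fetchone()
print(len(body))  # the full 5 MB comes back intact
```

On Android, the INSERT path is the same engine call, but the SELECT result has to be marshalled through a CursorWindow, which is where the row blows up.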
I wonder why we're allowed to store that huge amount of data but not allowed to query it back. What use is stored data if we can't retrieve it? It's right there in the database, but for whatever reason we can't get it out. Why not prevent it from being stored in the first place? Maybe there is an option to make the cursor skip the column where that huge data sits; who knows. In any case, increasing the cursor window size wasn't an option, so I had to look for another way.
My first and most obvious workaround was to store that huge data in a JSON file instead, and keep a reference to the file (its filename) in SQLite. But since I used the endpoint URL as the cache key, I needed to Base64-encode it before using it as the filename; using the raw URL as a filename is a bad idea (with all that slashing).
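A minimal sketch of that workaround, in Python for illustration (the table, function names, and example URL are all hypothetical): the big body goes into a file, SQLite only keeps the URL-to-filename mapping, and the URL-safe Base64 alphabet avoids `/` and `+` so the encoded key is a valid filename.

```python
import base64
import sqlite3
import tempfile
from pathlib import Path

# SQLite keeps only the small url -> filename mapping; the big payload
# lives in a file named after the Base64 (URL-safe) encoding of the URL.
cache_dir = Path(tempfile.mkdtemp())
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE cache (url TEXT PRIMARY KEY, filename TEXT)")

def put(url: str, body: str) -> None:
    # urlsafe_b64encode uses '-' and '_' instead of '+' and '/',
    # so the result is safe to use as a filename.
    filename = base64.urlsafe_b64encode(url.encode()).decode() + ".json"
    (cache_dir / filename).write_text(body)
    conn.execute("INSERT OR REPLACE INTO cache VALUES (?, ?)", (url, filename))

def get(url: str):
    row = conn.execute(
        "SELECT filename FROM cache WHERE url = ?", (url,)
    ).fetchone()
    return (cache_dir / row[0]).read_text() if row else None

put("https://api.example.com/v1/items?page=1", '{"data": "huge"}')
print(get("https://api.example.com/v1/items?page=1"))
```

The cursor now only ever carries the short filename string, so the CursorWindow limit never comes into play, no matter how big the cached body grows.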
That was it: the problem and then the solution. For some of you, maybe it is an obvious problem with an obvious solution. I just like to keep a note of it, especially about this cursor window size limitation of SQLite on smartphone devices, in case I face the same problem again in the future.