I've had issues with the SD card causing the display to either flicker or annoyingly change brightness while streaming files, and yesterday I found a solution for this.
There were several problems:
1) I called SD.begin() on every read.
Once during setup is sufficient. That alone shaved off 10 ms per read, giving an overall speed boost.
Calling SD.begin() so often also caused stability problems that could make the SD card suddenly stop working for no apparent reason.
The only reason I did this was that I believed it necessary, since I share the SPI port between the display and the SD card reader. The display needs a begin call before it shows anything at all, so I assumed the same went for the SD card reader. Shame on me, and RTFM!
2) Uneven brightness.
This was rectified by calculating the worst-case load time and simply delaying until at least that much time has passed. When nothing is loading, there is a fixed 12 ms delay; during loading, the delay is 12 000 microseconds minus the number of microseconds elapsed since loading started (which in practice left 0.9 to 5 ms of delay). This ensures a solid frame rate where each frame gets a fixed time slot to live in.
Of course, in a future update I will process other data instead of just wasting cycles during that wait time. That's the beauty of a micro processing unit: you can rely on processing times being constant, since there's no OS or other processes hogging the resources. There are of course interrupts and the like firing every now and then, but those cause microsecond differences at most.
Using these two tricks I've managed to squeeze out 40 solid frames per second that don't flicker or change intensity/brightness while streaming. Streaming becomes seamless and invisible, which is the way it should be.
40 fps is far from ideal, though, and actually makes the whole display flicker when you don't focus your eyes on it. Since most of the time you're not looking directly at the display, you get this annoying flicker in the corner of your eye while playing. I can assure you that staring at this display gives you a headache. Trust me.
By analyzing where the time goes I've found that the first opening of a file (i.e. finding the file on disk) takes a whopping 11 ms, while reading 512 bytes takes 5 ms (slow, I know). If I can somehow minimize the file-open time, I can set the max delay to around 6 ms instead of 12 ms and get 82 FPS instead (much better!).
I've come up with a bit-packing scheme that will let me send the same amount of data in 3/8 of the time by cutting away all the wasted space. Basically, with only 8 color levels in use, just the three least significant bits of each byte actually carry information: B00000111 = 7.
What I'm currently doing, which is fine during testing, is sending the whole byte, wasting 5 bits per pixel for nothing! By taking the low 3 bits of eight pixels, I end up with just three bytes to send.
So instead of sending 4096 bytes for a full frame I send 4096/8 × 3 = 1536 bytes (and since 4096 divides evenly by 8, no bits are wasted).
While not doing very much for the actual refresh rate of the screen, it nearly triples the update rate of the animations, making them smoother.