The easiest method, and I think the one most people use, is...
File > Maintain Library, then make sure Optimize database (Complete) is ticked, then OK it.
This should do a decent clean for you, but the database does get quite large depending on how many albums etc. you have.
Windows 10 Pro x64, 2TB USB3 external HDD, quad-core 3600MHz CPU, 10GB RAM
MM 4.1.21.1871 Portable on external HDD
TESTING MM 5.0.0.2113
Numerous Addons
User since 2006; Lifetime License since 2012
As suggested, do the optimization on it periodically for the "health" part.
As far as the "size" goes, no need to worry; MM can handle it.
[122 MB is fairly small compared to many MM users' databases]
dypsis wrote:My database is now 122 MB. To me, it seems big for a database.
Is there any maintenance that can or needs to be done on it to keep it small and healthy like (for example) Microsoft Access databases?
Or do I just carry on...?
Microsoft Access <G>... My db is approx 850 MB right now, and that is middle of the road with this gang. There has been no problem from the db. Just do the maintenance as mentioned and do backups.
I optimized the database. It didn't make any difference to the size though.
No surprise really. If you run PRAGMA integrity_check on an MM database after a complete optimisation you'll probably find a load of missing references (that shouldn't be there). MM's is the only SQLite database I have that fails the test.
"Unless SQLite is running in "auto_vacuum=FULL" mode, when a large amount of data is deleted from the database file it leaves behind empty space, or "free" database pages. This means the database file might be larger than strictly necessary. Running VACUUM to rebuild the database reclaims this space and reduces the size of the database file."
So are you saying that MediaMonkey's complete optimisation does not compact the database?
The general idea of optimization is to fix little inconsistencies that keep the database from running smoothly. Sure, it might also compact it and reduce the size some by reclaiming empty space. But I don't think we're talking differences of, say, 122 MB down to 112 MB, or even 122 MB down to 117 MB, for most people (you might get those numbers if you run Optimize (Complete) after a few years of "frequently" deleting stuff from the database without any maintenance).
Have you or anyone else ever noticed a "significant enough" difference in size to matter much (size-wise, not overall database health) after doing a pre- and post-optimization comparison test?
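For what it's worth, the size half of that comparison is easy to measure by running a plain SQLite VACUUM on a copy of the file. A minimal sketch (the path is a placeholder, and note this only tests SQLite's own compaction, not everything MM's Optimize database (Complete) may do):

```python
import os
import shutil
import sqlite3

# Before/after size test on a copy of the database -- the source
# path is a placeholder, adjust it to wherever your MM.DB lives.
db = shutil.copy(r"C:\path\to\MM.DB", "MM_size_test.db")

before = os.path.getsize(db)
con = sqlite3.connect(db)
con.execute("VACUUM;")  # reclaim free pages and rebuild the file
con.close()
after = os.path.getsize(db)

print(f"before: {before / 1e6:.1f} MB, after: {after / 1e6:.1f} MB, "
      f"reclaimed: {(before - after) / 1e6:.1f} MB")
```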