9.5 gigs after a couple of searches, though htop and free(1) show less being used in total; guess it either compresses well or there's a bunch of shared memory being counted once per process.
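If I wanted to verify the shared-memory theory: PSS in /proc/<pid>/smaps_rollup splits shared pages proportionally between processes, so a large Rss-vs-Pss gap for the meilisearch process would confirm it. Quick throwaway sketch (my own helper, not anything from Pleroma or Meilisearch; Linux 4.14+ only):

```elixir
defmodule MemCheck do
  # Print Rss/Pss/Shared lines from smaps_rollup. Pss charges shared pages
  # proportionally, so if Pss is much lower than Rss, tools that sum
  # per-process RSS are double-counting shared memory.
  def show(pid \\ "self") do
    "/proc/#{pid}/smaps_rollup"
    |> File.stream!()
    |> Stream.filter(&String.starts_with?(&1, ["Rss:", "Pss:", "Shared"]))
    |> Enum.each(&IO.write/1)
  end
end

# MemCheck.show("1234")  # pid of the meilisearch process
```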
24 gigs when idling after indexing, 4.4 gigs (so far) when idling after a restart. Not great, not terrible; hopefully search speed will compensate for that. I'll keep it running on the home PC for a while until I get time to move it to the server.
Yup. 1.4m posts indexed and 5.3 gigs used. Depending on how bad idle memory usage turns out to be, I could just copy the indexed DB over to the server and be done with it.
This was as easy as adding a check for nil dates; I guess a couple of malformed objects ended up in there. Indexed 600k posts so far, system monitor shows around 2.7 gigs used; it'll probably balloon later.
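For reference, the guard amounts to something like this; the function and field names are made up for illustration and the actual mix task code is shaped differently:

```elixir
defmodule IndexDoc do
  # Build a search document only when the object has a usable published
  # date; malformed objects (nil/missing date) map to nil and get
  # filtered out of the batch instead of crashing the indexer.
  def from_object(%{id: id, data: %{"published" => published} = data})
      when is_binary(published) do
    {:ok, dt, _offset} = DateTime.from_iso8601(published)
    %{id: id, content: data["content"], published: DateTime.to_unix(dt)}
  end

  def from_object(_malformed), do: nil
end

# In the indexing loop:
# objects |> Enum.map(&IndexDoc.from_object/1) |> Enum.reject(&is_nil/1)
```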
https://github.com/meilisearch/meilisearch/issues/2619
k, changing meili_post to meili_put in the mix task fixed the issue; now it appears to have started indexing something. Just testing it over an SSH tunnel to the workstation for now. I wonder if I could use the same instance for all three pleromers I host here; if not, that'd be a shame.
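The fix itself is literally a one-word change in the mix task, but the underlying difference is the HTTP verb on the documents endpoint: Meilisearch documents POST /indexes/:idx/documents as add-or-replace and PUT as add-or-update. Standalone sketch of the PUT call (module name, endpoint and key are my placeholders, not Pleroma's helper; assumes Jason for JSON):

```elixir
defmodule MeiliPut do
  @endpoint "http://127.0.0.1:7700"  # assumed local Meilisearch instance
  @api_key "CHANGE_ME"               # assumed API key

  # PUT a batch of documents (add-or-update), which is what the patched
  # mix task now does instead of POST (add-or-replace).
  def put_documents(index, docs) when is_list(docs) do
    :inets.start()

    :httpc.request(
      :put,
      {
        to_charlist("#{@endpoint}/indexes/#{index}/documents"),
        [{~c"authorization", to_charlist("Bearer #{@api_key}")}],
        ~c"application/json",
        Jason.encode!(docs)
      },
      [],
      []
    )
  end
end

# MeiliPut.put_documents("objects", [%{id: 1, content: "hello"}])
```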
https://docs.pleroma.social/backend/configuration/search/
>Note that it's quite a bit more memory hungry than PostgreSQL (around 4-5G for ~1.2 million posts while idle and up to 7G while indexing initially)
agency=# select count(1) from objects where data->>'type' = 'Note';
>18451319
That's like 15 times the dataset the docs benchmark against (18451319 / 1.2M ≈ 15.4, so ~60-75 gigs idle if it scales linearly), and I don't appear to have 64 gigs to spare except on my main workstation, which I'd like to use for other purposes.