

Yeah, I had to pay, I don’t think there’s a way around that but I didn’t mind.


Yeah, there was nothing about language in it, but I took it anyway just for the sake of it.
Wish I could take an official one, purely out of curiosity.


Just took a test out of curiosity, but the result screen is quite different.
Disclaimer: don’t put too much stock in the score; the test isn’t that comprehensive, and just by knowing basic math and intermediate logic you may reach a similar score.



Yeah, I don’t think what I suggested should do anything at all in this specific case…


Can you verify if Jellyfin is remuxing without transcoding? I.e. changing container but without touching the frame/audio data.
I believe that, during playback, the administration panel shows this in the card representing the active session you’re having the issue with.
Remux and transcode happen on disk, unless you manually set the temporary path to a decently sized tmpfs partition.
I solved a similar issue doing exactly what I just wrote: a tmpfs mount (can’t recall what it’s called under Docker) with the transcoding path set accordingly. I also had to tweak the transcode files’ lifetime.
This has done wonders for me for both on-the-fly remux and transcodes, but I had to reserve a beefy tmpfs (I think I have like 8GB set right now).
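For reference, a hedged sketch of the setup I described, assuming a plain docker run deployment (container name, host paths, port and the 8GB size are all examples, not Jellyfin defaults):

```shell
# Give the container a RAM-backed /transcode mount, then point Jellyfin's
# transcoding path at it (Dashboard -> Playback -> Transcoding).
docker run -d --name jellyfin \
  --tmpfs /transcode:rw,size=8g,mode=1777 \
  -v "$PWD/config:/config" \
  -v "$PWD/media:/media:ro" \
  -p 8096:8096 \
  jellyfin/jellyfin
```

The tmpfs contents vanish on container restart, which is fine for transcode scratch files; just make sure the size fits in RAM alongside everything else.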


Very cool critters indeed!



Oops, despite writing it in the alt text, I just realized that it’s not really clear: Pepper and Titus are two domesticated ferrets!




He truly makes you wonder what he really is

Bonus picture with his brother Titus
The source tarball is always autogenerated from the git repository state at the release point’s commit.
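As an illustration, git archive is the mechanism forges typically use for this; a throwaway sketch in a temporary repo (the project and tag names are made up):

```shell
# Build a demo repo with one tagged release commit.
cd "$(mktemp -d)"
git init -q
git -c user.name=demo -c user.email=demo@example.com \
    commit -q --allow-empty -m "release"
git tag v1.2.3
# Everything tracked at the v1.2.3 commit goes into the tarball, nothing else:
git archive --format=tar.gz --prefix=myproject-1.2.3/ \
    -o myproject-1.2.3.tar.gz v1.2.3
```

Because the tarball is derived purely from the tagged commit, regenerating it later from the same commit yields the same contents.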


Are you using the docker version or the manual installation version? (After using the manually installed version for a while, I suggest the docker one as upgrades are much less painful)
I only had the demo of it and I just kept going up the hill and building up as much speed as possible, only to then let myself go OOB - for hours and hours.
Ah… Lovely memories of a brainless kid just messing around with his computer…


Prey 2017.
Such an underrated game on its own.
The ambience is so immersive to me, both indoors and outdoors: so many details, being able to interact with so many objects in so many ways, even the Looking Glass. I just wish it lasted much longer… the ending was, however, quite disappointing in all respects, especially from the story perspective and in the “I wanted more” perspective.


It’s just the fact that, at some point, if you want a faster computer, you’re bound to have DDR5.
AMD 5000 is fast, but how does it compare to the current gen? Is there a 5000 CPU that can match the score of a high-end 9000 CPU?
What if you have a homelab server to upgrade but find out you need more PCIe lanes?
Other than that, yeah, you don’t need DDR5, but DDR4 is slowly going out of production and is also rising in price… so you’re screwed either way.
Small improvement: allow for spaces around the equals sign in the regex (i.e. Port\s*=\s*)
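A quick sketch of what the tweak buys you, using POSIX character classes ([[:space:]] is the portable stand-in for \s; the Port key and numeric value are assumptions based on the snippet above):

```shell
# A strict 'Port=' pattern would miss "Port = 2222"; with optional
# whitespace around "=" all three variants below match.
printf 'Port=22\nPort = 2222\nPort   =   8080\n' \
  | grep -cE 'Port[[:space:]]*=[[:space:]]*[0-9]+'
# prints 3 (all three lines match)
```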


If you don’t have the rights to share that ROM, then you likely don’t want to share it.
ROM sites are required not to hold a ROM for more than 24 or 48 hours, and I believe they rotate their file URLs in order to nominally abide by that rule.
Frankly, why bother at all? Why not link to a ROM website directly? Unless ONLY YOU have that ROM, then it’s more of a risk than a feature.


With this setup, I’d build an emulator box / movie player HTPC.
If you want to upgrade something, I’m fairly certain that your GPU is not your bottleneck. Intel Gen4 goes way back, and I noticed a big difference upgrading from my Gen6 to my Zen2, and back then I was running an RX 480.
That of course means changing CPU, Motherboard and RAM (goodbye DDR3).
The only issue with the 1060 is the VRAM, which pretty much only limits your resolution output - especially in games (as they may have more than one render buffer / frame buffer) - and the texture details in modern games.
Playing movies won’t be affected at all, IMO.
In an HTPC like that, I wouldn’t upgrade anything. It’s too costly to change CPU, Mobo and RAM just for an HTPC “out of curiosity”. I would most certainly at least try to settle with it and evaluate its shortcomings on the fly. You’ll always be able to upgrade it later.
Maybe just double check your storage: an HTPC should run off an SSD, as you likely don’t want to wait 2 minutes for it to boot.


That would basically give out my wanking schedule to the processor…
I used to love it, but wiki.js 2.0’s editor is very unfriendly to non-tech users. 3.0 could’ve been the solution, but after waiting and waiting for wiki.js 3.0 to release (it’s years behind schedule, and the last blog post about 3.0 is two years old!!), we chose to migrate to Bookstack.


Honestly, given that they should be purely compressing data, I would suppose that none of the formats you mentioned has ECC recovery or built-in checksums (but I might be very mistaken on this). I think I’ve only seen this in WinRAR, but also try other GUI tools like 7zip and check their features for anything that looks like what you need; if the formats support ECC, then surely 7zip will offer you the option.
I just wanted to point out that, no matter what anyone else might say, if you split your data into multiple compressed files, the chances of bit rot deleting your entire library are much lower; i.e. try to make it so that only small chunks of your data are lost if something catastrophic happens.
However, if one of your filesystem-relevant bits rots, you may be in for a much longer recovery session.
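A sketch of that splitting idea with plain coreutils (file names, sizes and chunk size are arbitrary examples, not anything your archiver does for you):

```shell
# Split data into chunks, compress each chunk independently, and record a
# per-chunk checksum, so bit rot in one chunk costs only that chunk.
cd "$(mktemp -d)"
head -c 1M /dev/urandom > library.tar        # stand-in for your real archive
split -b 256K library.tar chunk_             # chunk_aa, chunk_ab, ...
gzip chunk_*                                 # each .gz decompresses on its own
sha256sum chunk_*.gz > checksums.sha256      # per-chunk integrity record
# Later, this pinpoints exactly which chunks (if any) have rotted:
sha256sum -c checksums.sha256
```

With one big solid archive, a flipped bit can corrupt everything after it; here the damage is bounded by the chunk size you picked.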
Why does that look like a ChatGPT-written post, especially the paragraph with “From X to Y, …”?