It’s also a fscking mess to set up a Usenet downloader, especially since it’d be a bunch of buggy, weird tools with names ending in -arr, plus their web UIs.
And no, torrenting isn’t outdated and isn’t amateur. On Usenet, messages are replicated across all services that carry a given newsgroup. I hope the downsides are clear.
Some kind of Usenet with global identifiers for messages and posters, plus something like Kademlia to find sources for a specific newsgroup (to get everything the other side has in it), post (to get it specifically), or person (to get their public key), would be much better than just replicating every message everywhere under a local identifier.
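To make that concrete, here’s a minimal sketch of how the three lookups could be keyed in a Kademlia-style DHT. This is my own illustration, not an existing protocol: `global_id` and the `kind:value` scheme are hypothetical, and a real system would plug these keys into an actual DHT implementation.

```python
import hashlib

def global_id(kind: str, value: bytes) -> str:
    """Derive a global, location-independent DHT lookup key.

    kind  -- "group", "msg", or "person" (the three lookups above)
    value -- newsgroup name, canonical message bytes, or a public key
    """
    # Hashing kind + value gives every message a global identifier,
    # instead of the per-server local ones Usenet uses today.
    return hashlib.sha256(kind.encode() + b":" + value).hexdigest()

# The three lookup types:
group_key  = global_id("group",  b"alt.example")        # find peers carrying the group
msg_key    = global_id("msg",    b"canonical message")  # fetch one specific post
person_key = global_id("person", b"poster public key")  # resolve a poster's key
```

Peers would then fetch a post from whoever is near `msg_key`, rather than expecting their one provider to have replicated it.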
> It’s also a fscking mess to set up a Usenet downloader, especially since it’d be a bunch of buggy weird stuff ending with -arr in the names and web UIs.
It’s really not. You pretty much just need to put in some API keys for your indexer, downloader, and provider, and away you go.
> In Usenet messages are replicated over all services offering that newsgroup. I hope the downsides are clear.
What downsides are you talking about with regard to downloading content from Usenet?
Well, you could use the -arr stack, but you could also just set up SABnzbd, which is about the same difficulty to set up as qbit/jackett.
I haven’t touched the -arrs myself. I just go to my indexer, click download, and the file goes into a watch folder that SABnzbd automatically picks up and starts a-downloadin’ from; then it transfers the completed files to another folder.
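For what it’s worth, that watch-folder handoff is a simple mechanism. Here’s a hypothetical sketch of it (this is not SABnzbd’s code, and the folder names are made up; it just shows the poll-move-process loop that the downloader automates):

```python
import time
from pathlib import Path

WATCH = Path("watch")        # where the indexer's .nzb files land
COMPLETE = Path("complete")  # where finished downloads end up

def poll_once() -> list[str]:
    """Scan the watch folder once; hand off each job to a completed dir."""
    WATCH.mkdir(exist_ok=True)
    COMPLETE.mkdir(exist_ok=True)
    done = []
    for nzb in WATCH.glob("*.nzb"):
        # A real downloader would parse the NZB here, fetch the
        # articles from the provider, and reassemble the files.
        target = COMPLETE / nzb.stem
        target.mkdir(exist_ok=True)
        nzb.rename(target / nzb.name)  # move the job out of the watch folder
        done.append(nzb.stem)
    return done

if __name__ == "__main__":
    while True:          # the downloader runs this loop for you
        poll_once()
        time.sleep(30)
```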
But I use both, and slsk, and ytdl. Why limit myself?
> It’s also a fscking mess to set up a Usenet downloader, especially since it’d be a bunch of buggy weird stuff ending with -arr in the names and web UIs.
> And no, torrenting isn’t outdated and isn’t amateur. In Usenet messages are replicated over all services offering that newsgroup. I hope the downsides are clear.
> Some kind of Usenet with global identifiers of messages and posters, and with something like Kademlia to find sources for a specific newsgroup (to get all the other side has in it), post (to get it specifically), or person (their public key), would be much better than just replicating each message everywhere with a local identifier.
> It’s really not. You pretty much need to just put in some api keys for your indexer, downloader, and provider, and away you go.
> What downsides are you talking about in regards to downloading content from usenet?
Centralization, and the need for a big commercial provider you can pay (as someone in Russia …).
> Well you could use the -arr stack but you could also just set up SABnzbd which is the same difficulty to set up as qbit/jackett.
> I haven’t touched the -arrs myself, just go to my indexer, click download, it goes into the correct folder which sabnzbd automatically picks up and starts a-downloadin’, then it transfers the complete files to another folder.
> But I use both, and slsk, and ytdl. Why limit myself?
OK, I’ll try it.