5ive writes in with his IPFS use case
Hi, IPFS is a subject that can be just as confusing as NFTs. I tried to keep this as brief as possible. You can always reach me on Matrix (email I don't monitor well). I'd be open to a Jitsi chat, but I know everyone's time is valuable.
My use for IPFS is somewhat pedestrian. I use it for hosting static websites and an APT repo. It has the benefit of giving me a public service that doesn't rely on any risky server infrastructure. The packages in my APT repo are content-hash addressed, meaning they can't be forged to hold malicious content (in addition to being signed).
My current setup runs completely off a $26 ZeroPi https://www.friendlyelec.com/index.php?route=product/product&product_id=266
Nothing says you couldn't simply deploy your site from your desktop, but I wanted a local web server instance. This device runs NGINX, which lets me preview the site's state before deploying. Internally, my DNS points to this NGINX server for the sites; when I leave my network, DNS points to the public IPFS-hosted sites.
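For the local preview, a minimal NGINX server block could look something like this (the hostname and paths here are made-up placeholders, not my actual config):

```nginx
# Local preview of the static site before deploying it to IPFS.
# Internal DNS resolves the site's hostname to this server.
server {
    listen 80;
    server_name example.com;        # hypothetical site hostname

    root /var/www/example.com;      # working copy of the static site
    index index.html;
}
```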
When my site is in a state I like locally and I need to publicize it, I run a Python deployment script I developed to do all the heavy lifting. The process really is that simple.
The beauty of IPFS is the lack of servers involved, the low maintenance, the resilience to attacks, and the cost. The only hosting I'm paying for is DNS. Mind you, I don't have that much data published. With large amounts of data you would want to pay something for pinning, which I'll get to. The largest downside is the speed of the IPFS network. During deployment your content needs to be discovered, and this can take hours depending on the speed of the network. I'm not in a hurry to deploy my content, so this isn't an issue for me.
Besides DNS, your biggest cost is pinning your files. The act of pinning ensures that your files are always available on the network. If you have a reliable location with enough storage (a Linode, for example), you can pin your own files and run an IPFS server to maintain access to them. When a request is made via a gateway, the content might be cached on that gateway, but there is no guarantee for how long. I use Pinata's pinning service, which offers 1GB free for testing. Each time I pin my content, Pinata keeps that history. At any time I could revert to an older copy by updating my DNS to point at an older CID. Cloudflare's gateway also caches, which is a reason I'm using Cloudflare as the gateway in my DNS record. IPFS gateways request content on the IPFS network and then return it in a web response to your browser.
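The DNS side of this comes down to two records: the site's hostname pointing at a gateway, and a `_dnslink` TXT record carrying the CID. A zone-file-style sketch (the hostname and CID here are made up for illustration):

```
; hypothetical site served through Cloudflare's IPFS gateway
example.com.          CNAME  cloudflare-ipfs.com.
_dnslink.example.com. TXT    "dnslink=/ipfs/QmYourSiteRootCID"
```

The gateway sees the requested hostname, looks up its `_dnslink` TXT record, and serves the matching IPFS content.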
As time allows, I publish some of my tools to public git, but I have not with this deployment script. It is available in my public APT repo. I also built this tool for my own use case, so it's not as universal (it expects Cloudflare DNS, etc.). The script supports sending Matrix or SMTP alerts, and each site is set up in its own configuration file. I'm open to publishing it on git and making changes if there's demand.
In a nutshell:
1. Starts IPFS daemon if it is not running
2. Generates the CID(content ID) and gathers paths to distribute
3. Verifies the paths are live by continually fetching each path from a list of IPFS gateways
4. Pins the CID locally
5. Pins the CID to public IPFS pin services (Pinata)
6. Uses Cloudflare API to point the DNS at the new CID
7. Kills the IPFS daemon process
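The steps above could be sketched roughly like this. This is not 5ive's actual script, just an illustration of the flow, assuming the `ipfs` (Kubo) CLI is installed; the function names and gateway list are hypothetical:

```python
# Sketch of the deployment flow: add content, verify via gateways,
# pin locally, and build the dnslink value for the DNS update.
import subprocess
import urllib.request


def add_site(site_dir: str) -> str:
    """Step 2: add the site directory to IPFS and return the root CID."""
    out = subprocess.run(
        ["ipfs", "add", "--recursive", "--quieter", site_dir],
        capture_output=True, text=True, check=True,
    )
    return out.stdout.strip()


def verify_on_gateways(cid: str, gateways: list) -> bool:
    """Step 3: confirm the content is reachable through public gateways."""
    for gw in gateways:
        try:
            with urllib.request.urlopen(f"{gw}/ipfs/{cid}", timeout=60) as resp:
                if resp.status != 200:
                    return False
        except OSError:
            return False
    return True


def pin_local(cid: str) -> None:
    """Step 4: pin the CID on the local node so it stays available."""
    subprocess.run(["ipfs", "pin", "add", cid], check=True)


def dnslink_value(cid: str) -> str:
    """Step 6: the TXT record value to push to Cloudflare for _dnslink."""
    return f"dnslink=/ipfs/{cid}"
```

Steps 1, 5, and 7 (daemon management and the Pinata API call) are omitted here since they depend on credentials and local setup.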
After deployment is complete, the site looks like any other website with a normal DNS hostname. All that's required at this point is that the content stays pinned/available and that DNS points at an IPFS gateway, with a dnslink TXT record pointing at the IPFS CID. Of course, there are a number of other things you can do with IPFS. Simply uploading content to IPFS and referencing the CID for retrieval is pretty easy; that can be done entirely within a browser. For your encryption (or mostly obfuscation) challenge, you three could have shared IPFS links!