I don't want to spam any links here but if you are interested please do look at my last post about the dangers of doing this and lessons I learned from my mistake.
Please do not keep the files for 10 days. Even 24 hours is a deal-breaker. From what I've learned, anything more than 30 minutes can get you into trouble.
I wonder, though, if you could simply block the Google crawler and bypass the problem. Or use JavaScript to auto-POST something before the file is sent for download; as far as I know, the Google crawler doesn't issue POST requests.
1. Many people are more likely to go to a lot of effort to complain loudly and widely than to hit a simple "delete this" link.
2. Such a feature is basically a self-DoS vector. If someone takes a dislike to the app, or to one of its users, they can script up a "delete everything" request and fire it off.
If you make it super user-friendly and advertise it as the next Megaupload, sure. But if you keep to a small audience of good-faith users, it's not asking for problems.
If you can teach me to make my file upload as hacker-friendly as this service while implementing auth, I'd be glad. The entire point here is that you don't need any further configuration or credentials, for example to upload a log or config file from a server.
2. How do you know someone is part of the small audience of good-faith users?
3. What if a file has a virus and corrupts all the files on your end?
If you don't need auth, there are a few measures you can take on your end:
1. TTL - Make these files temporary - they are erased after x hours, e.g. x=1
2. Throttling - Limit the number of uploads per IP/machine, or cap uploads per second
3. Maybe add a malware scanner?
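For point 1, the TTL can be as simple as a cron-able cleanup script. This is just a sketch; the upload directory and the one-hour window are placeholders, not anything from the service above:

```shell
#!/usr/bin/env bash
# Delete uploaded files older than 60 minutes.
# UPLOAD_DIR is an assumption -- point it at your real upload directory.
UPLOAD_DIR=${UPLOAD_DIR:-/var/uploads}

# -mmin +60: last modified more than 60 minutes ago
find "$UPLOAD_DIR" -type f -mmin +60 -delete
```

Run it from cron every few minutes, e.g. `*/5 * * * * UPLOAD_DIR=/var/uploads /usr/local/bin/ttl-clean.sh`.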
You can stand up an (SCP/SFTP-subprotocol-only) SSH server, and tell the user to log in with their GitHub username + GitHub SSH key. Then configure your SSH server to call[1] a check on GitHub’s API to map the provided username to the GitHub user’s set of public SSH keys. From there, the server treats that list exactly as if it were the user’s ~/.ssh/authorized_keys file.
[1] As it happens, I wrote an OpenSSHD plugin for exactly this: https://github.com/tsutsu/github-auth3
Following that, you can configure PAM to continue the auth process however you like, policy-wise: let any GitHub user in; only let GitHub users in from a specific GitHub org; keep an LDAP directory of GitHub usernames such that you can attach metadata to them like “is banned” or “has used up their upload credits for the day” or “is on plan tier X”; etc.
Then, to actually handle the uploads, you can 1. set up automatic local user instantiation per remote user; 2. populate /etc/skel with just the right set of limited files to allow the user to upload into one “spool” directory; 3. have an inotify-like daemon that watches for files to be closed in that directory and handles them from there (e.g. uploading them to an S3 bucket, etc.)
—————
Or, alternately, you can avoid building this on top of OpenSSH, since you’re really fighting against the current by trying to virtualize everything, when OpenSSH expects to be relying on, and providing access to, a traditional POSIX environment.
Instead, you can have your own SSH server daemon that provides access to a pretend environment inside the SSH-server process, and handles SCP/SFTP upload streams through a custom in-process handler, the same way a web framework handles PUT requests.
I don’t know how common this is in other runtimes, but Erlang has an SSH server framework that you can use to implement exactly this. (As it happens, I’ve also written a high-level service that uses this SSH server framework to implement an alternative carrier for Erlang remote shell, where you can just SSH into the Erlang node to get a shell on it: https://github.com/tsutsu/exuvia. This app is also, AFAIK, the only public/FOSS demonstration of how to use Erlang’s SSH server library—which is kind of sad. People should play with things like this more! Make MUDs and such!)
I use a little python script that creates a curl command to upload to S3 for cases where I don't have the AWS toolchain on a remote box.
Not as easy as a single command, but at least I'm less likely to be sending files off to some random site for everyone to see.
curl -X POST https://content.dropboxapi.com/2/files/upload \
  --header "Authorization: Bearer ACCESSTOKEN" \
  --header "Dropbox-API-Arg: {\"path\": \"/DROPBOXFILEPATH/DROPBOXFILENAME\"}" \
  --header "Content-Type: application/octet-stream" \
  --data-binary @/LOCALFILEPATH/LOCALFILENAME
https://gist.github.com/tuxfight3r/7ccbd5abc4ded37ecdbc8fa46...
Luckily I found out before law enforcement did [2], so I proactively talked to my federal bureau for months, generating Excel sheets of IPs, access times, devices, and countries. I didn't see many of the images myself; I basically looked at one upload per IP, which was about three in total, and forwarded all uploads from those IPs to the police. But man.. what the hell is wrong with people. A four-digit number of CSAM uploads.
[1] https://github.com/HaschekSolutions/pictshare [2] https://blog.haschek.at/2018/fight-child-pornography-with-ra...
Nice job going through the reporting process, and I'm glad you blogged about it to share with others.
(*based on https://github.com/magic-wormhole/magic-wormhole)
*this one and a few others before it.
Other comments are right to point out that this site is setting itself up to be abused. My feeling is that this is intended to be a demo. I doubt the creator is trying to provide a real service here. And they might be in for a rude awakening if it gains traction.
But, it looks like they intend this to be open source. Anyone can clone the repo and run this on their own server! Unfortunately, the repo does not have a license file, which makes me a little uneasy.
Edit: I didn’t say that very well. With no license file, technically we cannot actually use this code since it defaults to ‘All rights reserved’. I think the author might not realize that though. It seems they intend it to be ‘open’ based on line 334.
Also, it is not particularly good PHP code, a little rough around the edges. But hey, it's a cool demonstration of a very straightforward way to upload & share files! Could be a good starting point to develop further.
Surely it's the author who bears the risk of getting burned by not specifying a licence.
A small script I use very regularly (note the quoted "$1", so filenames with spaces don't break the test):
#!/usr/bin/env bash
if [ ! -f "$1" ]; then echo "MISSING: $1"; exit 1; fi
torify curl -F "file=@$1" https://YOURSERVER || echo "UPLOAD FAILED (code: $?)"
Also there is a cute alias you can do to easily 'share' files:
alias share='f() { curl --progress-bar --upload-file "$1" https://share.schollz.com | tee /dev/null; echo };f'
CopyFile["/path/to/file",CloudObject["your.file"]]
I use it all of the time. https://miscdotgeek.com/curlytp-every-web-server-is-a-dead-d...
cat file.txt | curl -F 'sprunge=<-' http://sprunge.us
ix.io