I have a REST API server over HTTPS. I need a way to call the API when a file is retrieved using SFTP. The content is dynamically generated per request. Is there a simple way to do that without having to add user accounts and introduce security holes on the server? Having the key will allow anyone to get the content of the file, but nothing else.
This is what I have:
A GET request and doing this should return the same content:

```
sftp XXXX@myservice:/items    (or: sftp myservice:/items/XXXX)
```

1 Answer
I think I understand what you mean. You want to generate a file based on the current user (or requested file). You currently do that over HTTPS but you want to also make it available over SFTP... Right? Interesting problem.
You might be shocked to hear that I don't think there's anything that can do what you're asking out of the box... anything's possible, though.
My first approach involves writing your own filesystem!
I'd start with FUSE, specifically python-fuse. Here's the only tutorial I've ever used for FUSE. It'll get you to the point where you can start to edit things. I would:
- Have it list just one [fake] file in readdir()
- Work out how to find the username from the filesystem context
- Hotwire open() and read() to pull from a requests.get('your URL') using the username. open() might just need to be faked, and read() can be a handle on the requests request.
- You'll probably need to patch up and dummy out many of the other methods so they return data, but it'll obviously be faked most of the time
- Mount this FUSE filesystem somewhere in your SFTP tree that everybody can see, as read-only
This would work for either sftp approach (you could look at the full path instead of the user, for example) but you'll need a single sftp account that people can log in with.
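To make the open()/read() hotwiring concrete, here is a minimal sketch of the per-user file object that the FUSE callbacks would delegate to. `fetch_for_user()` is a hypothetical stand-in for your real `requests.get()` call, and the python-fuse mounting boilerplate is omitted; this only shows the part that fakes a static file on top of a dynamic HTTPS response:

```python
# Sketch of the read()/open() hotwiring. fetch_for_user() is a
# placeholder for the real API call, e.g.
# requests.get('https://myservice/items/' + user).content

def fetch_for_user(user):
    # Placeholder: return the dynamically generated bytes for this user.
    return ("dynamic content for %s\n" % user).encode()

class FakeFile:
    """Holds the fetched bytes and serves FUSE-style offset reads."""

    def __init__(self, user):
        # One HTTPS call per open(); read() then just slices the result.
        self.data = fetch_for_user(user)

    def read(self, size, offset):
        # FUSE hands read() a size and an offset; return that slice.
        return self.data[offset:offset + size]

    def size(self):
        # getattr() must report st_size == len(data),
        # or sftp clients will truncate the transfer.
        return len(self.data)
```

So `FakeFile('alice').read(4096, 0)` returns the whole generated payload in one go, and the important subtlety is the last comment: whatever you report from getattr() has to agree with what read() can actually deliver.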
Alternatively, if every user has an account, you could try to monitor access to a directory owned by each user. inotifywatch and inotifywait can let you watch for access to a directory.
```
inotifywait -mrq --format '%w' /home/*/magicdir/ | while read DIR; do
    user=${DIR:6:-10}  # strips /home/ (6 chars) and /magicdir/ (10 chars)
    # Fetch the dynamic content into the watched directory; the URL here
    # is illustrative -- use whatever your HTTPS API expects for this user.
    wget -qO "${DIR}items" "https://myservice/items/$user"
done
```

And leave that running somewhere in the background. It'll refresh the file from the website every time they access the directory. You might want to test it first (with an `echo blaaa $DIR` in place of the wget) to make sure it's not hammering the server. Note that it may not show the file the first time a user lists the directory.