Hi everyone.
I have a side project that could turn into a full-time project, so I wanted to ask about the best way to go about it. The application needs to do the following:
- accept files via HTTP, FTP or SFTP
- accept files via email
- run a public-facing web server
- run a database (my preferred choice is PostgreSQL) that stores data from the files, does some post-processing, and is used to serve data from the website
I'm currently running PostgreSQL, Nginx and ProFTPd on a FreeBSD virtual instance. ProFTPd saves its activity log to PostgreSQL, which has triggers on the log table to process files. I'm also running my personal email server on another FreeBSD instance with Postfix/Dovecot, and I've aliased an email address to a Python script that FTPs email attachments to the application server, so email hooks into the same FTP workflow.
My main question is: if I wanted to turn this into a high-availability, production-grade architecture, what is the best way to go about it? Is it OK to put everything on a sufficiently large instance, with each application in its own jail? Or should it be split across multiple servers, one (or several, for replication) for the DB, and one or more for Nginx/ProFTPd/Postfix?
And related to this, any recommendations on where to find FreeBSD sysadmins who could spec out a project like this?
Thanks!