I am starting a small web hosting company to teach myself about system administration, and one of the problems I am having is how to limit the bandwidth and disk space of each virtual host (Debian/nginx). Or am I going about it the wrong way and should not use virtual hosts?
I have used mod_bandwidth (an Apache module) to restrict bandwidth. It can be applied per directory, per file, and so on, so you just have to configure it per VirtualHost.
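Since the question is about nginx rather than Apache, the rough equivalent there is the built-in limit_rate directive. A minimal sketch of a per-vhost server block; the hostname, path and rate values below are just example assumptions:

    # Hypothetical customer vhost; domain, root and rates are placeholders.
    server {
        listen 80;
        server_name customer1.example.com;
        root /var/www/customer1;

        # Throttle each response to ~100 KB/s once the first 1 MB has been sent.
        limit_rate_after 1m;
        limit_rate 100k;
    }

Note that limit_rate is per connection, so a client opening several connections can still exceed it; ngx_http_limit_conn_module can cap the number of connections per client if that matters.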
For disk space, just use quotas. Create a Unix account per customer and assign each one a quota. That is quite well documented on the web.
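A minimal sketch of what that looks like on Debian with the quota package installed; the mount point, username and limits are placeholder assumptions:

    # Enable user quotas on the filesystem holding the sites (assumed here to be /home):
    # add "usrquota" to its options in /etc/fstab, then remount and build the quota files.
    mount -o remount /home
    quotacheck -cum /home
    quotaon /home

    # Give the (hypothetical) customer account roughly 500 MB soft / 550 MB hard limits
    # (setquota takes block limits in 1 KB blocks, then inode limits, then the filesystem).
    setquota -u customer1 500000 550000 0 0 /home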
Some quick Googling shows that you could use a Squid proxy to set up bandwidth queues called 'delay pools'. You can apparently also use iproute2 and tc (can't post the link, I'm not awesome enough).
From http://www.faqs.org/docs/Linux-HOWTO/Bandwidth-Limiting-HOWTO.html#AEN65
Also see the serverwatch.com article titled "Reining in Bandwidth With Squid Proxying".
Squid is probably the simplest way, and you can do a lot of other cool things with it, too.
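If you go the iproute2/tc route instead, the usual pattern from that HOWTO is an HTB qdisc with one class per customer. A rough sketch, assuming each vhost is bound to its own IP address, eth0 is the outbound interface, and the addresses and rates are placeholders:

    # Root HTB qdisc; unclassified traffic falls into class 1:30.
    tc qdisc add dev eth0 root handle 1: htb default 30

    # One class per customer, each capped at 1 Mbit/s (example values).
    tc class add dev eth0 parent 1: classid 1:10 htb rate 1mbit ceil 1mbit
    tc class add dev eth0 parent 1: classid 1:20 htb rate 1mbit ceil 1mbit

    # Classify outgoing traffic by the source IP each vhost listens on.
    tc filter add dev eth0 parent 1: protocol ip prio 1 u32 match ip src 192.0.2.10/32 flowid 1:10
    tc filter add dev eth0 parent 1: protocol ip prio 1 u32 match ip src 192.0.2.11/32 flowid 1:20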
It's not simple. I suspect it won't be possible to enforce limits using vhosts alone unless you put a scripted proxy in front of the webserver (and that only addresses the bandwidth problem, unless you also restrict file uploads to a scripted web interface).
But surely it makes more sense from a business point of view to sell a package which includes X bandwidth and Y disk usage, then bill the customer for any excess (both of which you can measure easily from the disk footprint and the access logs).
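Measuring is simple with the data you already have. A quick sketch, assuming nginx's default combined log format (where $body_bytes_sent is the tenth field) and a separate access log and document root per vhost; the paths below are placeholders:

    # Disk footprint for one customer's document root.
    du -sh /var/www/customer1

    # Total bytes served by one vhost, summed from its access log.
    awk '{ bytes += $10 } END { print bytes }' /var/log/nginx/customer1.access.log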
C.
To limit total bandwidth usage, refer to KikoV's answer; here I'm discussing how to limit disk space.
In fact, there is no straightforward way to limit disk space, but you have two options: