Is it possible to point an entire TLD (like .dev) at localhost with the hosts file, without having to set up a local DNS server? It would be awesome for local web development!
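To illustrate what I mean, here is a hypothetical hosts-file fragment; the exact-name entries are how I do it today, and the commented wildcard line is the part I'm not sure is even supported:

```
# /etc/hosts -- exact names work fine:
127.0.0.1   myapp.dev
127.0.0.1   othersite.dev
# what I'd like instead (as far as I know, hosts files don't do wildcards):
# 127.0.0.1 *.dev
```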
Kyle's questions
So I noticed a temperature monitor cleverly labeled "Ambient." Knowing that the sensor is inside my computer, I can get a decent temperature check on my overly hot dorm room if I check it as soon as I wake my computer from sleep. I was curious whether there is some way of logging temperature data for different parts of the day, ignoring the times when the processors are making the ambient temperature rise?
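A minimal sketch of the kind of logging I have in mind, assuming something like the lm-sensors `sensors` command exposes the reading; the sample line, label, and log path are all made up so the parsing step is visible:

```shell
# cron could run this every few minutes, e.g.: */5 * * * * /usr/local/bin/logtemp.sh
# In the real script the sample line would come from `sensors | grep Ambient`;
# here it is faked so the extraction is reproducible:
sample='Ambient:      +31.0 C  (crit = +95.0 C)'
# take the second field ("+31.0") and strip the leading plus sign
temp=$(printf '%s\n' "$sample" | awk '{sub(/^\+/, "", $2); print $2}')
# append a timestamped reading to a log file
printf '%s %s\n' "$(date -u +%Y-%m-%dT%H:%M:%SZ)" "$temp" >> /tmp/ambient.log
tail -n 1 /tmp/ambient.log
```

Graphing or filtering out the CPU-heavy hours could then happen offline, on the log file.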
My school uses IronPort for filtering/monitoring web traffic. I'm a bit curious why it tacks an X-Junk: header onto everything. After going through a few curl tests, I've found no real connection between what is shown on the page and the X-Junk: header.
Here's my curl request; any ideas?
Anchorage:~ khotchkiss$ curl -I google.com
HTTP/1.1 301 Moved Permanently
Location: http://www.google.com/
Content-Type: text/html; charset=UTF-8
Date: Sun, 06 Feb 2011 04:37:25 GMT
Expires: Tue, 08 Mar 2011 04:37:25 GMT
Cache-Control: public, max-age=2592000
Server: gws
X-XSS-Protection: 1; mode=block
Content-Length: 219
Age: 108
Via: 1.1 MC-IRONPORT.UNIVERSITY.LIBERTY.EDU:80 (IronPort-WSA/6.3.3-015)
Connection: keep-alive
X-Junk: xxxxxxxxxxxxxxxxxxxxxxx
Is it possible to rewrite the URL index.php?view=something back into /something?
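For context, the kind of rule I'm imagining would internally map the pretty path back onto the query string; a sketch assuming Apache mod_rewrite and index.php at the site root:

```apache
RewriteEngine On
# leave real files and directories alone
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
# /something -> index.php?view=something (internal rewrite, so the URL stays clean)
RewriteRule ^([^/]+)/?$ index.php?view=$1 [L,QSA]
```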
Hey, I want to build some web junk on OS X; I just want the software installed in a manageable package, like all the other good OS X stuff. Preferably lighttpd, but Apache works too.
I want to set up caching on all the junk my webserver serves, but I would like a good idea of what cache lifetimes production-level services use. For some reason I can't pick my own times for things like js, css, png, jpg, etc.
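To make the question concrete, I imagine the answer looks something like this Apache mod_expires sketch, where the one-month lifetime is just an illustrative guess, not a known production value:

```apache
ExpiresActive On
# lifetimes below are placeholders -- the real question is what values to use
ExpiresByType text/css               "access plus 1 month"
ExpiresByType application/javascript "access plus 1 month"
ExpiresByType image/png              "access plus 1 month"
ExpiresByType image/jpeg             "access plus 1 month"
```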
I've heard many good things about Nginx lately, and I want to put it on my Slicehost server. I am in a fix for RAM, and would like to get WordPress and WP Super Cache configured. I was just wondering what the 'recommended way' to get PHP set up is, because I see so many webpages saying their way is correct.
No compiling if possible, please; it makes updating a drag. D=
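For reference, the setup I keep seeing described is nginx handing .php requests off to a separate FastCGI daemon; a minimal sketch, assuming a PHP FastCGI process (php-fpm or spawn-fcgi) is already listening on 127.0.0.1:9000:

```nginx
location ~ \.php$ {
    include       fastcgi_params;
    # tell PHP which script to run for this request
    fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
    fastcgi_pass  127.0.0.1:9000;
}
```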
I have a DNS setup going on where I have several subdomains that CNAME out to Google Apps, but I was wondering if I could wildcard the rest of my subdomains and still have my Google Apps CNAMEs resolve correctly. In other words, would the records that aren't wildcards be matched before the wildcard record?
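In zone-file terms, what I'm asking is whether a fragment like this hypothetical one behaves the way I hope, with the explicit record winning over the wildcard (the A record's address is a placeholder):

```
; explicit records should take precedence over the wildcard
mail  IN  CNAME  ghs.google.com.
*     IN  A      203.0.113.10
```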
So I just got a nice new VPS. I was wondering what a good 'MaxClients' value would be for a server that will host a personal website, some blogs, etc., and what your experiences with MaxClients on a single server have been. The default on Ubuntu Server is 150.
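Something like this prefork sketch is what I'd be tuning; the numbers are placeholders I made up for a small VPS, which is exactly what I'm asking about:

```apache
<IfModule mpm_prefork_module>
    StartServers          2
    MinSpareServers       2
    MaxSpareServers       5
    # placeholder -- the value I'm actually asking about
    MaxClients           30
    MaxRequestsPerChild 500
</IfModule>
```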
I was wondering what a rewrite statement would look like for this situation: I want to have multiple users on my server, and each user can have VirtualDocumentRoot-like sites in their directory. For example, they just make a directory like example.com in their home directory, and it's hosted. The problem is I don't know if VirtualDocumentRoot can do this, or if it would take a rewrite rule that looks in all the users' folders for a domain. Can anybody help me?
Is it possible for mod_vhost_alias to read several directories (for example, each user's) to find the sites? Like in a hosting setup where different users can create the directories in their home directory?
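From the docs I've read, mod_vhost_alias interpolates the hostname into a single fixed pattern, so a sketch like this works for one shared tree, but I don't see how it could search several home directories (the path is illustrative):

```apache
UseCanonicalName Off
# %0 expands to the whole Host header, e.g. example.com
VirtualDocumentRoot /srv/vhosts/%0
```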
I have a Mint install (fantastic stats, if you were wondering) and I would like to move over to lighttpd for my server needs. Unfortunately, this means that php_auto_prepend is no longer easily available. I understand that PHP can do this in the php.ini file, but I have areas that can't have this value prepended, so that's out of the question. Is there a lighttpd way to prepend my script for my Mint install in some files, and have it disabled for others?
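The closest thing I can sketch is a lighttpd conditional that routes just the Mint URLs to a second PHP backend started with auto_prepend_file set; the URL pattern, paths, and socket name are all made up, and I don't know if this is the idiomatic way:

```
$HTTP["url"] =~ "^/mint/" {
    # a separate php-cgi instance with the prepend forced on the command line
    fastcgi.server = ( ".php" => ((
        "bin-path" => "/usr/bin/php-cgi -d auto_prepend_file=/srv/mint/prepend.php",
        "socket"   => "/tmp/php-mint.socket"
    )))
}
```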
My config:
AliasMatch /browzerResources "/srv/default/browzerResources"
<Directory "/srv/default/browzerResources">
Options Indexes MultiViews
AllowOverride None
Order allow,deny
Allow from all
</Directory>
</IfModule>
creates a redirect loop in the Web browser like this:
http://example.com/browzerResources/index.htm/index.htm/index.htm/index.htm/index.htm/etc...
Any idea why it does this?
I was wondering if there is a way to use the Alias directive with just one file, or if there is a hack to do this without having to go into another directory.
I know, this sounds complicated. =D I made a PHP file browser as an alternative to the Apache one; I needed it for logic purposes, it does extra things for me, etc. So instead of dropping this file in all of my directories, how could I get it to "show up" in all the directories that don't have an index (which would use the Apache dir listing by default)? Thanks for the help!
Edit
I wonder if this could be done using Alias and DirectoryIndex? Is it possible to alias to a file?
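What I'm picturing is something like this sketch: Alias pointing at the single PHP file, plus DirectoryIndex falling back to that local URL (the file names are hypothetical):

```apache
Alias /filebrowser.php /srv/shared/filebrowser.php
# DirectoryIndex accepts a local URL, so directories without their own index
# would fall back to the aliased script
DirectoryIndex index.php index.html /filebrowser.php
```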
So I have two domains; both use the same htdigest settings, password, etc. The problem is that auth from one isn't accepted on the other. My two htdigest locations are: http://svn.kylehotchkiss.com and http://apps.kylehotchkiss.com/codex. Here is the config for my svn.kylehotchkiss.com domain:
<Directory />
DAV svn
SVNParentPath /srv/svn.kylehotchkiss.com/repo
SVNListParentPath On
AuthType Digest
AuthName "KHP Code Repository"
AuthDigestDomain / http://svn.kylehotchkiss.com/ http://apps.kylehotchkiss.com/codex/
AuthDigestProvider file
AuthUserFile /srv/svn.kylehotchkiss.com/auth/passwd
Require valid-user
</Directory>
and the config for apps.kylehotchkiss.com/codex/ is in a .htaccess file there; this is how it goes:
AuthType Digest
AuthName "KHP Code Repository"
AuthDigestDomain / http://apps.kylehotchkiss.com/codex/ http://svn.kylehotchkiss.com/
AuthDigestProvider file
AuthUserFile /srv/svn.kylehotchkiss.com/auth/passwd
Require valid-user
So what exactly am I missing in the AuthDigestDomain settings that keeps the two from working together? Is the .htaccess file in the wrong spot?
I have a webapp under an alias on my server, and I want this webapp redirected to HTTPS. So here is my code:
alias.url += ( "/email" => "/srv/Applications/email/" )
$HTTP["url"] =~ "/email" {
$SERVER["socket"] == ":80" {
$HTTP["host"] =~ "(.*)" {
url.redirect = ( "^/(.*)" => "https://%1/$1" )
}
}
static-file.etags = "enable"
etag.use-mtime = "enable"
$HTTP["url"] =~ "/(plugins|skins|program)" {
setenv.add-response-header = ( "Cache-Control" => "public, max-age=2592000")
}
}
Now the issue is: if I access the email at http://site.com/email, it redirects to https://email for some reason, but if I access it at http://site.com/email/ it works fine. I was just wondering if there is a fix for this, or whether I'm stuck with that hanging /email issue. =/ Thanks for any help!
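The only workaround I can think of is an extra rule that sends the bare /email to /email/ before the generic HTTPS redirect runs; a sketch, untested:

```
$SERVER["socket"] == ":80" {
    # catch the slash-less form first so %1/$1 never sees an empty path
    url.redirect = ( "^/email$" => "https://site.com/email/" )
}
```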
Is there a way to get Apache to send a 403 instead of a 401 with some config in an .htaccess file? I'm using DreamHost, btw.
Edit:
I should better explain what I'm doing. I'm building an HTTP auth login page with jQuery, and my goal is to bypass the browser's popup login window. To do this, I want the server to give me a 403 error when I try to access a file in the protected realm, instead of the 401 I currently get by default. When I get that 403, I can run a function that tells the user their password is wrong, instead of that browser popup, which doesn't tell the user they got information wrong and just makes the site look bad in the end.
thanks =)
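For what it's worth, the nearest thing I've sketched keeps the status as 401 but serves a custom ErrorDocument body that my jQuery could detect instead of letting the browser throw its popup; the file paths are hypothetical, and I don't know what DreamHost permits beyond this:

```apache
AuthType Basic
AuthName "Protected"
AuthUserFile /home/user/.htpasswd
Require valid-user
# the response is still a 401, but the body is mine to inspect client-side
ErrorDocument 401 /auth-failed.html
```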