We are using AWS ELB to offload SSL and load balance the incoming traffic. If all instances are down/unhealthy the ELB just sends back a blank page with HTTP status 503.
Is it possible to send a static page, to indicate maintenance, for example?
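ELB itself cannot serve a custom response body when all backends are unhealthy. One common workaround (an addition of mine, not an ELB feature) is to put CloudFront in front of the load balancer and map the 503 to a static maintenance page via a custom error response. A sketch of the relevant distribution-config fragment, with `/maintenance.html` as a placeholder path served from a separate static origin such as S3:

```json
{
  "CustomErrorResponses": {
    "Quantity": 1,
    "Items": [
      {
        "ErrorCode": 503,
        "ResponsePagePath": "/maintenance.html",
        "ResponseCode": "503",
        "ErrorCachingMinTTL": 30
      }
    ]
  }
}
```

Alternatively, Route 53 DNS failover to a static site is sometimes used, at the cost of DNS TTL delays.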
I want to use an intercepting squid server to cache specific large zip files that users in my network download frequently.
I have configured squid on a gateway machine and caching is working for "static" zip files that are served from an Apache web server outside our network.
The files that I want squid to cache are zip files >100 MB that are served from a Heroku-hosted Rails application. I set an ETag header (SHA hash of the zip file on the server) and a Cache-Control: public header. However, these files are not cached by squid. This, for example, is a request that is not cached:
$ curl --no-keepalive -v -o test.zip --header "X-Access-Key: 20767ed397afdea90601fda4513ceb042fe6ab4e51578da63d3bc9b024ed538a" --header "X-Customer: 5" "http://MY_APP.herokuapp.com/api/device/v1/media/download?version=latest"
* Adding handle: conn: 0x7ffd4a804400
* Adding handle: send: 0
* Adding handle: recv: 0
...
> GET /api/device/v1/media/download?version=latest HTTP/1.1
> User-Agent: curl/7.30.0
> Host: MY_APP.herokuapp.com
> Accept: */*
> X-Access-Key: 20767ed397afdea90601fda4513ceb042fe6ab4e51578da63d3bc9b024ed538a
> X-Customer: 5
>
< HTTP/1.1 200 OK
* Server Cowboy is not blacklisted
< Server: Cowboy
< Date: Mon, 18 Aug 2014 14:13:27 GMT
< Status: 200 OK
< X-Frame-Options: SAMEORIGIN
< X-Xss-Protection: 1; mode=block
< X-Content-Type-Options: nosniff
< ETag: "95e888938c0d539b8dd74139beace67f"
< Content-Disposition: attachment; filename="e7cce850ae728b81fe3f315d21a560af.zip"
< Content-Transfer-Encoding: binary
< Content-Length: 125727431
< Content-Type: application/zip
< Cache-Control: public
< X-Request-Id: 7ce6edb0-013a-4003-a331-94d2b8fae8ad
< X-Runtime: 1.244251
< X-Cache: MISS from AAA.fritz.box
< Via: 1.1 vegur, 1.1 AAA.fritz.box (squid/3.3.11)
< Connection: keep-alive
In the logs squid is reporting a TCP_MISS.
This is the relevant excerpt from my squid configuration file:
# Squid normally listens to port 3128
http_port 3128
http_port 3129 intercept
# Uncomment and adjust the following to add a disk cache directory.
maximum_object_size 1000 MB
maximum_object_size_in_memory 1000 MB
cache_dir ufs /usr/local/var/cache/squid 10000 16 256
cache_mem 2000 MB
# Leave coredumps in the first cache dir
coredump_dir /usr/local/var/cache/squid
cache_store_log daemon:/usr/local/var/logs/cache_store.log
#refresh_pattern -i (/cgi-bin/|\?) 0 0% 0
refresh_pattern -i .(zip) 525600 100% 525600 override-expire ignore-no-cache ignore-no-store
refresh_pattern . 0 20% 4320
## DNS Configuration
dns_nameservers 8.8.8.8 8.8.4.4
After experimenting for some time I realized that squid sometimes decides that my file is cacheable and sometimes not, depending on whether and when I enable or disable the dns_nameservers directive.
What could be wrong here?
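One detail worth noting while reading the config above: `refresh_pattern` takes a regular expression, so the unescaped dot in `.(zip)` matches any character and the pattern is not anchored to the end of the URL. A hedged variant of that rule (my suggestion, not verified against this setup) would be:

```
refresh_pattern -i \.zip$ 525600 100% 525600 override-expire ignore-no-cache ignore-no-store
```

Note that this still cannot match the request shown above, whose path ends in `?version=latest` rather than `.zip`.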
Are there any malware/security scanners out there that can analyze Apache error and access logs and tell me whether I have potential risks on my server or whether there is unauthorized access?
I have several user logon scripts defined in Active Directory. These are batch scripts as well as PowerShell scripts.
When do these actually run? Does the user see "black terminal boxes" when he logs on, or do the scripts run while he sees the login screen?
Edit for clarification: I am speaking of logon scripts defined in a GPO, and I am running Windows Server 2008 R2.
I do regular backups of my Hyper-V host system, including the folders with the Hyper-V guests.
Should I also do a backup of the guests separately (e.g. from within the guest OS itself)?
Is there something like an "event" that is fired when a user is created in Active Directory? An external system automatically creates Active Directory users, and I would like to perform some PowerShell tasks directly after the creation of a new user. When the user logs on, the script should already have run.
Does anyone have an idea how this could be achieved?
I am running Windows Server 2008 R2.
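For context, Windows does not expose a scriptable "user created" callback, but when auditing of account management is enabled, domain controllers log Security event 4720 ("A user account was created"). A common approach, sketched here as an assumption about the environment, is a scheduled task on the domain controller triggered by that event, which then runs the PowerShell script. The event filter for such a task would look like:

```xml
<QueryList>
  <Query Id="0" Path="Security">
    <Select Path="Security">*[System[(EventID=4720)]]</Select>
  </Query>
</QueryList>
```

The task's action would be the PowerShell script to run; note that the script may fire before the new account has replicated to other domain controllers.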
I am having a weird problem with the PATH variable under windows:
My application is in the folder c:\app\bin and the DLLs for this application are in the c:\app\runtime folder. To run my program, I modify the PATH variable with a *.bat file, usually with the following script:
set PATH="c:\app\bin";"c:\app\runtime";%PATH%
This brings the executables and the DLLs onto the path. However, on one of my Windows Server 2008 R2 systems this does not work. That is, if I execute the above command in a command window, I can start the exe file from c:\app\bin, but the application immediately complains that it cannot find some required DLL files ("The program can't start because ....dll is missing from your computer ..."). These DLL files should be in c:\app\runtime.
I experimented a little bit and it turns out that there are three workarounds:
PATH="c:\app\bin";c:\app\runtime;%PATH%
The weird part about solution 2 is that it does not change anything if I add quotation marks to the first path, or if I change the order of the paths.
Does anyone have a clue why my original script does not work? I need to get it to run, because the bat file is generated automatically by a program and I cannot change the application that generates it.
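For reference, cmd.exe keeps quotation marks that appear inside a `set` value as part of the variable, and the DLL loader can fail to search a quoted PATH entry. The usual idiom, if the generated file could be post-processed, is to quote the whole assignment rather than the individual directories:

```bat
rem The quotes delimit the entire assignment and are not stored in
rem the variable's value, so each PATH entry stays unquoted:
set "PATH=c:\app\bin;c:\app\runtime;%PATH%"
```

This protects against spaces and special characters in any of the directories while leaving PATH itself free of embedded quotes.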
In the latest version of the Windows Firewall, included for example in Windows Server 2008 R2, you can block incoming connections and apply this rule only for a set of users (Users Tab in the rule properties).
Why is this not possible for outbound connection rules, and how can it be achieved? I need a software solution that blocks all internet access for specific users but not for others, and I hope to realize this with the Windows Firewall.
I am having problems shadowing the Remote Desktop session of a user on a Windows Server 2008 R2.
If I log on to the server as local administrator with RDP and shadow the session of a domain user, it works. However, if I log on with a domain user that is in the local administrators group and try to shadow the session of the same user as before, I get an error:
> shadow ...
Remote Control failed. error Code 5
Error [5]:Access is denied.
I explicitly enabled all permissions for the domain user who cannot shadow a session under: Remote Desktop Session Host Configuration -> RDP-Tcp connection -> Security -> checked Full Control.
I even restarted the server after these modifications, without success. I have exactly the same settings enabled on a different server, and there a domain user, also in the local administrators group, can shadow sessions of other domain users.
So, are there any other settings I must enable to allow screen sharing? Or am I doing something wrong? Is there a way to determine which permissions an existing user actually has?
I have Windows Server 2008 R2 RDP Session Hosts. They are using the RDP 6.1 protocol. How can I update to 7.1? Do I have to install software, or can it be configured?
I need to block internet access for some users on our Windows Server 2008 R2. If you google this question you will find a lot of results that propose disabling Internet Explorer and setting a proxy to 0.0.0.0. Unfortunately, this can easily be bypassed, for example by using a portable Firefox.
Is there a more restrictive solution? I need to find a way that even telnet, ftp etc. won't work.
Thanks for your help!
Update for clarification: I would like to block internet access only for some users, not for all users on this server.
Usually, if you connect to a Remote Desktop Session Host running on Windows Server 2008 R2, you are prompted for your credentials and, if they are valid, you are logged in to your session directly.
On one of our servers, everyone who logs on with RDP sees the usual Windows logon screen (green/blue background with username/password) after already having entered their credentials in the Remote Desktop client. So, users are not logged in directly but have to enter their credentials twice. This is only the case for this server. The server is also a Remote Desktop Connection Broker.
Is this behavior caused by a wrong setting or do you know how to get rid of it?
I would like to deploy an expensive proprietary software package to users in a Remote Desktop environment (users only use RDP to connect to our servers). I would like to allow them to run and use the full potential of the software, but only on our server. They should not be able to copy the program files.
One solution to achieve this is AppLocker, but is it also possible with App-V?
I am on a Windows Server 2008 R2. Is it possible to limit the number of concurrent instances of a program/an executable file for a user?
An example: I would like to prevent users from starting paint.exe if another instance of paint.exe is already running.
There are firewalls out there that analyze the traffic that goes through them and block it if it is unwanted. How well do these firewalls work with encrypted traffic, e.g. HTTPS or IMAP over SSL?
An example: can a firewall distinguish between HTTPS traffic on port 443 and, let's say, secured Remote Desktop traffic over 443?
We are running a Hyper-V server on our Windows Server 2008 R2. I have ordered a subnet because we would like to provide each of our VMs with a public IP address.
My data center provider (Hetzner AG from Germany) writes the following about it:
Problems with virtualization
With this type of IP/subnet allocation, it is not possible to use a "bridged" setup, as with such a setup several MAC addresses appear. VPS (linux virtual servers, Xen, vmware, etc) must use a so-called "Routed" setup (VMware: "host-only networking"). With an additional subnet the host system or dom0 must be configured with an IP address from the subnet which is then used as a gateway for the VPS. The (additional) address of the host system must therefore be configured in the VPS in each case as a gateway. An exception to this rule is "openvz", which does not require a gateway. On the host system or dom0 "ip_forward" must be activated for each virtualization:
Well, what does this mean for me now, how do I have to configure hyper V?
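Translated to Hyper-V terms, a "routed" setup would roughly mean: the host keeps one address of the subnet on an internal virtual switch, the guests use that address as their default gateway, and the host forwards traffic between its public NIC and that virtual switch. Enabling forwarding on Windows can be sketched like this (the interface names are placeholders for the actual host setup):

```bat
rem Enable IPv4 forwarding on both interfaces involved in routing;
rem "Local Area Connection" = public NIC, "Internal" = Hyper-V internal vSwitch:
netsh interface ipv4 set interface "Local Area Connection" forwarding=enabled
netsh interface ipv4 set interface "Internal" forwarding=enabled
```

This is only an illustration of the provider's ip_forward requirement, not a complete Hyper-V networking recipe.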
Thanks for your help!
Is it possible to upgrade Windows Server 2008 R2 Standard to Enterprise?
I am allowing some users in my organization to shadow Remote Desktop sessions of users on our Remote Desktop Services deployment. I am facing the problem that the users to whom I have granted the "Remote Control" right (via Remote Desktop Session Host Configuration) can shadow ALL sessions.
Is there a way to restrict the access to sessions of certain users/groups?
I am using Remote Desktop Services on Windows Server 2008 R2.
We are hosting some licensed software in our company on a Windows Server 2008 R2. Users access the server using Remote Desktop Services. However, we would like to prevent users from copying the licensed software (e.g. the executable and related program files). That means we want to suppress uploading to a web server, copying via RDP, etc.
How is this possible with Windows Server 2008 R2?
I am currently setting up a Windows Server 2008 R2 with Active Directory and am currently facing an issue with setting up file/folder sharing between users/groups:
My organization is divided into sections; each section has a section manager and section users. I want to achieve that each section user has a shared folder that can be accessed only by him and the section manager. The section manager can access all shared folders of his section users, but not the shared folders of any other section. Furthermore, it is important that section users cannot access any other shared folders (e.g. the shared folders of other section users).
Well, how do I set up such a structure? I basically need a concept for the needed groups, users, and rights, and where to store the actual folders in the file system.
Looking forward to your answers!
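To make the requirements concrete, one conceivable layout is a per-user folder under a per-section share, with inheritance removed and only explicit grants left. All names here (`D:\Shares`, the `EXAMPLE` domain, the `SectionA-Mgr` group, the user `jdoe`) are hypothetical placeholders:

```bat
rem Hypothetical layout: D:\Shares\<Section>\<User>
rem Strip inherited ACEs from the user folder, then grant only the
rem user, the section's manager group, SYSTEM, and Administrators:
icacls "D:\Shares\SectionA\jdoe" /inheritance:r
icacls "D:\Shares\SectionA\jdoe" /grant:r "EXAMPLE\jdoe:(OI)(CI)M" "EXAMPLE\SectionA-Mgr:(OI)(CI)M" "SYSTEM:(OI)(CI)F" "BUILTIN\Administrators:(OI)(CI)F"
```

The same pattern, with one security group per section's managers, keeps section users out of each other's folders because no ACE grants them anything there.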