I had a virtual server with 2 GB of RAM, and average usage was always under 1 GB, so I got a new one for the exact same purpose, but with only 1 GB of RAM. It seems to me that it uses more CPU now.
Does having less RAM mean it will use more CPU, even if average RAM usage is never at 100%?
// it is a web server with a Plesk panel
The answer is not straightforward, but essentially: yes.
With less RAM, you typically force the system to use more virtual memory on disk. The CPU then has to do extra work, e.g. moving pages between RAM and disk and updating page tables, particularly around page faults and context switches. That leaves less CPU time available to your applications.
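If you want to check whether your smaller server is actually swapping, here is a minimal sketch, assuming a Linux guest where the kernel exposes cumulative swap counters in /proc/vmstat:

```python
# Minimal sketch: watch the kernel's swap counters for a few seconds.
# Assumes Linux; pswpin/pswpout in /proc/vmstat are cumulative counts
# of pages swapped in from / out to disk since boot.
import time

def swap_counters(path="/proc/vmstat"):
    counters = {}
    with open(path) as f:
        for line in f:
            key, value = line.split()
            counters[key] = int(value)
    return counters.get("pswpin", 0), counters.get("pswpout", 0)

if __name__ == "__main__":
    before = swap_counters()
    time.sleep(10)
    after = swap_counters()
    print("pages swapped in: ", after[0] - before[0])
    print("pages swapped out:", after[1] - before[1])
```

If those deltas keep climbing under normal load, the box is paging, and that is where the extra CPU time is going.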
That is why, if you had the choice of spending the same money to either double your RAM or increase your CPU speed by 5-10%, it is usually better spent on doubling your RAM.
CPU usage is often a red herring, because on its own it doesn't tell you whether the work being done is useful. High CPU usage could mean the application is doing a lot of useful work (good); it could equally mean the application is being very inefficient (bad).
It's better to focus on direct measures of useful work, such as the application's throughput and latency. For a web server, these would include metrics such as the maximum number of simultaneous users and the time taken to respond to each request. You can then compare those metrics to CPU usage to get an idea of CPU efficiency.
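As a rough way to track the latency side of that, here is a sketch that samples per-request response times; the URL is a placeholder you would replace with a page your server actually serves:

```python
# Minimal sketch: sample response latency for a single URL.
import time
import urllib.request

URL = "http://localhost/"  # hypothetical; replace with your own endpoint

def sample_latencies(url, n=20):
    latencies = []
    for _ in range(n):
        start = time.perf_counter()
        with urllib.request.urlopen(url) as resp:
            resp.read()  # drain the body so we time the full response
        latencies.append(time.perf_counter() - start)
    return sorted(latencies)

if __name__ == "__main__":
    samples = sample_latencies(URL)
    print("median: %.1f ms" % (samples[len(samples) // 2] * 1000))
    print("worst:  %.1f ms" % (samples[-1] * 1000))
```

Run it before and after a RAM change and compare the medians alongside CPU usage, rather than looking at CPU usage alone.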
To answer your specific question: it depends on the performance bottlenecks in your particular environment.
In many situations, adding RAM will actually increase the CPU usage because it will allow more useful work to be done -- and this is a good thing. For example, imagine a web server under heavy load serving up files. The performance bottleneck is likely to be disk I/O; the application will spend most of its time waiting for disk, so the CPU usage will be quite low. If you added more RAM to cache the files on disk, the application would spend less time waiting, more time processing, and so CPU usage and throughput would be higher.
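As a toy illustration of that file-caching effect (real servers get it transparently from the OS page cache; the decorator and cache size here are just illustrative):

```python
# Toy sketch of spending RAM to avoid disk I/O: keep recently served
# files in memory so repeat requests never touch the disk.
from functools import lru_cache

@lru_cache(maxsize=256)  # more RAM -> bigger cache -> fewer disk reads
def read_file(path):
    with open(path, "rb") as f:
        return f.read()

# The first request for a path reads from disk; later requests are served
# from RAM, so the CPU spends its cycles on real work instead of I/O waits.
```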
The relationship can also work the other way. For example, there is often a trade-off in algorithms between space complexity and time complexity: extra RAM lets the application use less CPU, so CPU usage would be lower (or you would get more throughput per CPU cycle). @sybreon has given a good example of how adding RAM can make CPU usage more efficient, because the system doesn't have to spend as much effort managing a constrained resource.
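A textbook instance of that space-for-time trade is memoization: the cache below is the "extra RAM", and it collapses an exponential amount of CPU work into a linear one.

```python
# Space-for-time trade-off in miniature: memoized Fibonacci.
# The cache dict is the extra RAM; filling it replaces exponentially many
# repeated subcomputations with a single linear pass.

def fib(n, cache=None):
    if cache is None:
        cache = {0: 0, 1: 1}
    if n not in cache:
        cache[n] = fib(n - 1, cache) + fib(n - 2, cache)
    return cache[n]

print(fib(300))  # returns instantly; the naive recursion never would
```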