Please try to reduce the load using the steps below.
Some will make a large difference, some will not:
Optimizing databases - This speeds up database queries. Scripts spend less time asking the database for data, so their execution time is lowered, which will show up in the statistics. You can do this through phpMyAdmin. Click on the database you are working with in the left-hand frame (not a table, the database name itself). Your tables should be listed in the right-hand column. Scroll down until you see the "Check all" link, click it, make sure all the database tables are checked, and then from the dropdown next to it carefully select "Optimize table".
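If you prefer to do it with a query, phpMyAdmin's "Optimize table" runs the OPTIMIZE TABLE statement for you; here is a minimal PHP sketch that does the same for every table (the connection details are placeholders for your own):

    <?php
    // Run OPTIMIZE TABLE on every table in the database.
    // The connection details below are placeholders - use your own.
    $db = new mysqli('localhost', 'dbuser', 'dbpass', 'mydb');

    $tables = $db->query('SHOW TABLES');
    while ($row = $tables->fetch_row()) {
        $db->query('OPTIMIZE TABLE `' . $row[0] . '`');
    }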
Reducing the size of databases - A good idea anyway: queries will probably be faster and small databases are less prone to corruption. Forum scripts often have a "prune" function to remove old posts; you might want to enable it.
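If your script has no prune option, old posts can also be removed with a query. This is only a rough sketch - the table and column names are invented, real forum software will differ, and you should take a backup first:

    <?php
    // Delete posts older than roughly one year.
    // 'forum_posts' and 'post_time' are made-up names - check your forum's
    // actual schema (and back the database up) before trying anything like this.
    $db = new mysqli('localhost', 'dbuser', 'dbpass', 'mydb');
    $cutoff = time() - 365 * 24 * 3600;   // forums often store post time as a Unix timestamp
    $db->query('DELETE FROM forum_posts WHERE post_time < ' . $cutoff);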
Taking out scripts which are not needed - This makes sure CPU and memory resources aren't wasted. Some scripts are just there to add extra functionality to your website but may not be very important, so you could choose to stop using some of them. If you do, make sure that as well as removing the link to the script, you disable the script itself, so that people who have bookmarked it do not keep visiting. The simplest way is just to rename the directory the script is contained in. It really is important to rename the directory, because search engines remember links for years.
Temporarily disable scripts you suspect (similar to above) - Search scripts and forums are easily disabled. It is much easier to disable several things at once and then re-enable them slowly, one at a time, than to keep disabling one thing after another, which takes even longer.
Optimizing scripts - If you are good at programming, or know someone who is, you might find a way to cut down the resource usage of your scripts. I wouldn't recommend doing this with well-known scripts, as those are (mostly) already well optimised. Of course, it's up to you, but it would be a big job and might make future upgrades to the script harder. Generally speaking, code written by ordinary people who just want a website up and running quickly doesn't tend to be very efficient, so it is an option in some cases. If you are planning on doing this, you really need to look it up in a search engine; it's too complicated to explain here. Basically though, try to reduce the number of loops in your code and avoid looking the same thing up over and over. Instead, look it up once at the beginning of the page, store it in a variable and refer to that, as in the sketch below.
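A rough sketch of that in PHP - the connection details, table and column names are all made up - with the lookup done once, outside the loop:

    <?php
    // Made-up example: printing a site-wide setting next to each of the latest posts.
    $db = new mysqli('localhost', 'dbuser', 'dbpass', 'mydb');

    // Look the setting up ONCE and keep it in a variable...
    $siteName = $db->query("SELECT value FROM settings WHERE name = 'site_name'")
                   ->fetch_object()->value;

    // ...instead of running that same SELECT inside the loop on every pass.
    $posts = $db->query('SELECT title FROM posts ORDER BY post_time DESC LIMIT 50');
    while ($post = $posts->fetch_object()) {
        echo $siteName . ': ' . $post->title . "\n";
    }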
Disable GZip encoding - This compresses pages to cut down on bandwidth. Some forums have it as an option. You could try disabling it; it should put less strain on the CPU, as it won't have to compress each page. On a dynamic website the cost is higher still, because each page is generated and compressed fresh rather than being served from a cache.
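If the script has no admin option for it, compression is often switched on in the PHP code itself; something along these lines is what to look for (these are standard PHP calls, but where they live depends entirely on the script):

    <?php
    // Scripts usually enable GZip compression with one of these lines:
    //     ob_start('ob_gzhandler');
    //     ini_set('zlib.output_compression', 'On');
    // Commenting the line out, or forcing the setting off near the top of the
    // script, disables compression:
    ini_set('zlib.output_compression', 'Off');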
Switching scripts - Moving to newer or cut-down scripts could help. It could make it worse though, depending on which script you go for, so you need to do some research before choosing an alternative.
Remove modifications or plugins from scripts - If you have added modifications that make an existing script do even more work, removing them could make a difference.
Take off extra statistics software - Statistics scripts included on every page aren't really needed; you already have statistics in cPanel for this. Scripts which look up each IP address to see what country the visitor comes from can be particularly heavy.
Block bad "bots" and slow down, or stop, search engines - Search engines sometimes crawl a whole website in a short space of time. This can put a high load on the server, as all the pages are hit in quick succession.
http://www.robotstxt.org/ can help you with this. You could also check your Webalizer stats to see whether any IP addresses are hitting your website excessively and block them in cPanel. Some search engines allow you to specify a "crawl delay", or you can stop them from indexing certain pages and scripts using a robots.txt file (placed in your public_html directory and in the main directory of each addon or subdomain).
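As a rough example, a robots.txt along these lines asks well-behaved crawlers to wait between requests and to stay out of a heavy script (the paths are placeholders, and "Crawl-delay" is a non-standard directive that not every search engine honours):

    User-agent: *
    Crawl-delay: 10
    Disallow: /forum/search.php
    Disallow: /cgi-bin/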
Stop loading content from other websites via scripts - wget, curl, etc. can take up a lot of resources, especially if they run every single time your site is loaded. If you are going to use them, you should really run them only once or twice a day and save their output to disk, so your users get a cached copy that is only updated every few hours, for example. I don't mean loading images and content from other websites simply by linking to them; that can actually help a little with resource usage in some cases.
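A rough PHP sketch of that idea - the URL and cache file are placeholders - which only fetches the remote page when the local copy is more than six hours old:

    <?php
    // Serve a locally cached copy of a remote page, refreshing it at most every 6 hours.
    $url       = 'http://example.com/some-feed';     // placeholder
    $cacheFile = __DIR__ . '/cache/some-feed.html';  // placeholder
    $maxAge    = 6 * 3600;                           // seconds

    if (!file_exists($cacheFile) || time() - filemtime($cacheFile) > $maxAge) {
        $content = file_get_contents($url);          // or use curl
        if ($content !== false) {
            file_put_contents($cacheFile, $content);
        }
    }
    if (file_exists($cacheFile)) {
        echo file_get_contents($cacheFile);
    }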
Disable RSS feeds - There's not much I can say about this. It can help sometimes; I recommend trying it.
Split up long pages - You might have a page which is 3 or 4 screens full of information. Many visitors will read the top part and none of the rest, so it is a waste to run the processing for all the other sections on every visit, especially if half of your visitors never scroll down. This is mostly helpful if httpd (Apache) is your top process.
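One simple way to do it, as a rough sketch (the sections here are placeholders; a real site would more likely pull them from files or a database), is to show one section at a time based on a ?page= parameter:

    <?php
    // Show one section of a long page at a time, chosen with ?page=N.
    $sections = array(
        1 => 'First screenful of content...',
        2 => 'Second screenful of content...',
        3 => 'Third screenful of content...',
    );

    $page = isset($_GET['page']) ? (int) $_GET['page'] : 1;
    if (!isset($sections[$page])) {
        $page = 1;
    }

    echo $sections[$page];
    if (isset($sections[$page + 1])) {
        echo '<p><a href="?page=' . ($page + 1) . '">Next page</a></p>';
    }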
Cache content - Some blog and gallery scripts generate fresh content on every visit, which uses up the same amount of resources each time. In the case of image galleries, it is a waste of resources to generate a new thumbnail on every visit instead of caching it, especially because images can be large and expensive for the server to process. Some scripts have addons you can install to cache content. Most well designed gallery scripts will already have an option for this enabled, but it is well worth checking; it can make a very large difference.
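For anyone writing their own gallery code, the idea looks roughly like this in PHP (file names and sizes are placeholders; it assumes the GD extension and JPEG images):

    <?php
    // Create a thumbnail once and reuse it on later visits.
    $source = 'photos/holiday.jpg';   // placeholder
    $thumb  = 'thumbs/holiday.jpg';   // placeholder
    $width  = 150;

    if (!file_exists($thumb)) {
        list($w, $h) = getimagesize($source);
        $height = (int) ($h * $width / $w);

        $src = imagecreatefromjpeg($source);
        $dst = imagecreatetruecolor($width, $height);
        imagecopyresampled($dst, $src, 0, 0, 0, 0, $width, $height, $w, $h);
        imagejpeg($dst, $thumb, 85);
        imagedestroy($src);
        imagedestroy($dst);
    }

    echo '<img src="' . $thumb . '" alt="Holiday photo">';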
-----------------
If you have anything to add - please do!