


Reporting server performance

On shared hosting it is always possible that the webhost will come along and say you are using too many resources.

Shared hosts fit many sites onto the same server, and depending on the price and reputability of the webhost, you may be allowed to use more or fewer server resources.

Web hosts may work on various metrics, or may respond to issues they have manually identified.

Database size

This could be an issue if you store a very large amount of data/content on your website. It has rarely, if ever, been reported as an issue for an ocPortal site.

Bandwidth

This could be an issue if you are hosting large files, such as videos or downloads, and a lot of people are downloading them. ocPortal itself has no particularly high bandwidth requirement.

Request volume

It may be that you have very large numbers of visitors. If you have many hits each second, the web host may struggle to serve these requests. This is not specific to ocPortal.

CPU usage

On a reasonably decent server, average ocPortal requests should take well under 1 second of execution time on the server. 0.1 seconds is about right for a simple page on a well-configured server that isn't overloaded.

(Note that when we look at execution time, we're not talking about time between clicking and seeing a page – that is browser reaction time + latency + queue time + execution time + latency + render time + probably some other stuff)

So ocPortal doesn't use a lot of CPU inherently. However, CPU is probably the biggest limiting factor on a server. If you have 10 reqs/sec (requests per second), with each request averaging 0.5 seconds, then this would use 5 CPU cores constantly. For a normal ocPortal site that is a lot of visitors to get, but the webhost has to think of the server as a whole…

Let's imagine the webhost has 500 accounts on the server, and 7% of those sites receive a significant ongoing volume of hits to some kind of CMS (ocPortal, or others), averaging 1 per 10 seconds, each request taking 0.5 seconds to execute…
500*0.07*0.5/10=1.75 seconds of request per second of real time, probably enough to tax a dual-core server.
So you can see, depending on what kinds of customers the host has, what software they run, how many accounts are held on the server, and the power of the server, hosts may have to push back against what their customers do.
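
To make that arithmetic concrete, here is a minimal Python sketch of the same estimate. The account count, 7% figure, hit rate, and execution time are just the illustrative assumptions above, not measurements:

accounts_on_server = 500        # accounts hosted on the shared server (assumed)
busy_fraction = 0.07            # share of accounts running a busy CMS (assumed)
requests_per_second = 1 / 10.0  # one hit every 10 seconds per busy site (assumed)
seconds_per_request = 0.5       # average execution time per request (assumed)

# Average CPU-seconds consumed per second of real time across the whole server.
cpu_load = accounts_on_server * busy_fraction * requests_per_second * seconds_per_request
print(cpu_load)  # 1.75 - probably enough to tax a dual-core server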

If you are using the ocPortal chat room, and a few people leave it open, the reqs/sec could rise considerably.

If you are going to run a popular CMS site, you will eventually outgrow shared hosting and need a VPS, dedicated server, or cloud instances. Until then, try to see past webhost marketing and choose a host that has a good reputation and good hardware – absolutely don't cheap out and go for really cheap hosting, as you'll suffer for it. Cheap hosting is ideal only for simple static sites that get few hits.

It's really important to realise how cut-throat and difficult the hosting business is – think how much salary a host needs to pay for the support you use from them, for the hardware they provide, for the marketing/advertising/offers they run, and for maintaining their business (their own website, etc.) – you'll see how squeezed it is, and how if you go for the cheap end you can't possibly do well out of it.

Disk I/O

No ocPortal user has ever reported a webhost complaining to them about excessive disk I/O (input/output), but it may affect the performance of your site due to factors outside the scope of ocPortal. I/O can be a real performance bottleneck if the server is not well-configured, such as having a remote disk, a slow disk, or a failing disk, or if other users on the server are doing something that is really making the disk 'thrash'. ocPortal won't thrash the disk, but it would be affected by another user doing so. ocPortal therefore has some special options for limiting how often it needs to check the hard disk, discussed in the optimisation tutorial.

Memory usage

Normally ocPortal requests are explicitly limited to a maximum of 64MB, which is very small compared to the amount of RAM a server has (4GB is fairly typical nowadays). There are a very small number of one-off admin tasks where ocPortal raises or removes the limit, but it's very unlikely to be a problem.

If a webhost complains about memory usage they are probably looking at an aggregate figure.

Example: chat room

The chat room is one possible cause for an aggregate figure to rise, as it has regular hits for each person who has the chat room open. This is because the chat room has to be constantly checked for new messages (hopefully one day this won't be the case, but current PHP and web browser features prevent a smarter approach on shared hosting).

Let's say that 8MB is used by the chat message check script. This memory use is from the actual parsed PHP code that is loaded up, as PHP tends to use quite a lot of memory for it (orders of magnitude more than the disk size of PHP files).

The chat room checks for messages every 10 seconds by default (so on average a message will be received 5 seconds after it is sent).

If each request takes 0.05 seconds, and there are 20 people in the room…

20*0.05/10=0.1 seconds of request per second of real time

Assuming only a single server core, this means the server would be holding that 8MB for 10% of the time – an average of 0.8MB for this one site.

If the server has 1GB of RAM dedicated to web requests, it could sustain 500 customers using RAM at this rate (500*0.8MB=400MB).
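
Here is the same chat room estimate as a minimal Python sketch, using the assumed figures above (8MB per check, 20 users, a 10-second poll interval, 0.05 seconds per request):

users_in_room = 20
poll_interval = 10       # seconds between message checks per open chat room (default)
seconds_per_check = 0.05 # execution time of each check (assumed)
ram_per_check_mb = 8     # RAM held while the check script runs (assumed)

busy_fraction = users_in_room * seconds_per_check / poll_interval  # 0.1 (busy 10% of the time)
avg_ram_per_site_mb = busy_fraction * ram_per_check_mb             # 0.8MB average for this site
total_for_500_sites_mb = 500 * avg_ram_per_site_mb                 # 400MB, within 1GB of web RAM
print(busy_fraction, avg_ram_per_site_mb, total_for_500_sites_mb)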

So, the chat room is unlikely to be a problem unless the webhost is really cheaping out in a big way (e.g. putting thousands of users on a subleased VPS).

Example: general use

If you have 50 users browsing the site, making a click every 10 seconds, and each hit is using an average of 23MB of RAM and taking 0.5 seconds to execute…

50/10*0.5=2.5 seconds of request per second of real time
2.5*23=57.5MB ongoing average RAM usage

(we're assuming the RAM is used flatly across requests, which isn't true, but all these calculations are approximations)

If there are 500 sites on the server and 7% are using this level of resource and the other sites aren't using notable resources…

500*0.07*57.5=2012.5MB RAM required on the server
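
The same calculation as a minimal Python sketch, again using the assumed figures from this example (50 users, one click per 10 seconds, 23MB and 0.5 seconds per hit, 500 accounts with 7% of them this busy):

users_browsing = 50
seconds_between_clicks = 10
seconds_per_hit = 0.5  # average execution time per hit (assumed)
ram_per_hit_mb = 23    # average RAM held during a hit (assumed)

# Seconds of request processing per second of real time, for one such site.
concurrency = users_browsing / seconds_between_clicks * seconds_per_hit  # 2.5

# Average RAM for one site, assuming RAM is held flatly across each request.
ram_per_site_mb = concurrency * ram_per_hit_mb  # 57.5MB

# Aggregate over a shared server with 500 accounts, 7% of them this busy.
server_ram_mb = 500 * 0.07 * ram_per_site_mb    # 2012.5MB
print(concurrency, ram_per_site_mb, server_ram_mb)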

As with the last example, it should not be a problem – but if the host has a lot of users on the same server, or users are using a lot of resources (maybe there's a trend to move to CMSs universally, and also spiders are heavily hitting all the sites regularly) it could be a problem.

If a web host is complaining

If a web host says you are using too many resources, find out specifics from them. Ask what URLs are causing problems, what resources are being used, and what is valid on their AUP (Acceptable Use Policy).

If the host has reasonable limits, and you aren't serving a high ongoing or peaking volume of users, pass on those specifics to the developers in a bug report.

If the host has unreasonable limits, change webhosts and give them a poor review.

If you are using too many resources for shared hosting, you'll need something better.
