What is PS?

In a Nutshell:

PS (ProcessorSpeed) is a value that indicates how fast a machine is compared to other machines.

At the first installation, each rrClient executes a CPU benchmark.

PS used for render cost

RR uses the PS of the clients to sum up the costs of each rrJob.
Please see the section "Render Costs" below.

PS used for render time display

RR displays the render time per frame as "real" time hour:min.sec.
In addition, it displays the render time as PS*h.

Please see the section "Render times per frame" below.

PS used for Client Averaging for different projects

The Client Averaging is based on the PS of the clients, not on the number of clients.

PS used for Remaining Time Calculation

The server checks each job at intervals that depend on the remaining render time: the higher the remaining time, the longer the interval. The server then re-calculates the remaining time based on the clients that are currently rendering this job.

For the calculation the server uses PM (ProcessorMinutes = PS x minutes). E.g. if one computer with 0.5 PS renders 1 minute on a job, the job consumes 0.5 PM (0.5 PS x 1 minute).

If the remaining PM is 100 and only this 0.5 PS client is rendering the job, the remaining time will be 200 minutes. If you start 3 more clients with 0.5 PS each, the remaining time will be 50 minutes.
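The PM-based estimate above can be sketched in a few lines. This is only an illustration of the idea (the function name and structure are mine, not RR's actual implementation):

```python
# Sketch of the PM-based remaining-time idea: PM (ProcessorMinutes) = PS x
# minutes of work, so the remaining time scales with the summed PS of the
# clients currently rendering the job. Hypothetical helper, not RR code.

def remaining_minutes(remaining_pm: float, client_ps: list[float]) -> float:
    """Remaining render time in minutes for the clients currently rendering."""
    total_ps = sum(client_ps)
    if total_ps == 0:
        raise ValueError("no clients are rendering this job")
    return remaining_pm / total_ps

# One 0.5 PS client on 100 PM of remaining work:
print(remaining_minutes(100, [0.5]))                  # 200.0 minutes
# Add three more clients with 0.5 PS each:
print(remaining_minutes(100, [0.5, 0.5, 0.5, 0.5]))   # 50.0 minutes
```

Note that the estimate updates instantly when clients join or leave the job, which is exactly the advantage over the frame-counting approach described next.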

The server uses PM/PS because all other remaining-time calculations I know work like this:

time till now: 10 min.

frames done: 10

frames missing: 90

time remaining = 10/10 * 90 = 90 minutes

This works if you keep using the same number of clients, but not if you add clients, or, in the worst case, if almost all clients move to a different job.

I know programs that calculate huge remaining times if you have disabled the rendering for a while. For example, if you render for one hour and then disable the job for 5 hours, the remaining time can rise to days with this old calculation.
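The failure mode of the naive formula can be shown with the numbers from the example above (a hypothetical sketch, not code from any of the programs mentioned):

```python
# Naive frame-based estimate: remaining = elapsed / frames_done * frames_missing.
# It only looks at wall-clock time and frame counts, not at who is rendering.

def naive_remaining(elapsed_min: float, frames_done: int, frames_missing: int) -> float:
    return elapsed_min / frames_done * frames_missing

# 10 frames done in 10 minutes, 90 frames left:
print(naive_remaining(10, 10, 90))    # 90.0 minutes

# Render 1 hour (10 frames done), then pause the job for 5 hours: the
# wall-clock elapsed time is now 360 minutes, so the estimate balloons
# even though nothing about the job itself changed.
print(naive_remaining(360, 10, 90))   # 3240.0 minutes (2.25 days)
```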

PS used for Sequence Divide min max

When the server sends a job with Sequence Divide to a client, the PS affects the number of frames that are sent to the client.

The higher the PS of a client compared to other clients, the more frames it gets.
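Assuming a strictly PS-proportional split (the exact Sequence Divide distribution is internal to RR; this only illustrates "higher PS => more frames"), the idea can be sketched as:

```python
# Hypothetical sketch: each client receives frames proportional to its share
# of the summed PS of all clients on the job.

def frames_for_client(ps: float, all_ps: list[float], total_frames: int) -> int:
    """Frames a client receives, proportional to its share of the summed PS."""
    return round(total_frames * ps / sum(all_ps))

farm = [1.0, 1.0, 2.0]   # two 1 PS clients and one 2 PS client
print([frames_for_client(ps, farm, 100) for ps in farm])   # [25, 25, 50]
```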

Render times per frame

What is render time as PS*h?

In a nutshell, it is a normalized render time.

No matter which CPU you use, the PS*h should be the same for the same frame.

A frame with 1 PS*h means that a machine with 1 PS would have taken 1 hour to render the frame.

If your machine has 6 PS, then it takes 10 minutes (1/6 hour) to render the frame.
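The relation behind PS*h is a simple division, sketched here with a hypothetical helper name:

```python
# normalized time [PS*h] = real hours * machine PS, therefore:
# real hours = PS*h / machine PS.

def render_hours(ps_hours: float, machine_ps: float) -> float:
    """Real render time in hours for a frame of the given PS*h cost."""
    return ps_hours / machine_ps

print(render_hours(1.0, 1.0))        # 1.0 hour on a 1 PS machine
print(render_hours(1.0, 6.0) * 60)   # 10.0 minutes on a 6 PS machine
```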

History of [Ghz* hour] and [PS* h]

[Ghz* hour] 
Companies required a way to measure the time a frame took.

But since the 1970s, the clock rate (MHz/GHz) of a new CPU doubled every 2 years.
And twice the GHz meant twice the computation speed.
If a frame rendered 2 hours on a machine with 200 MHz (0.2 GHz), it rendered 1 hour on a machine with 400 MHz (0.4 GHz).

And if you have a mixed render farm of old and new machines, you cannot simply tell your supervisor "the frame takes 1 hour".

Therefore [Ghz* hour] was used as a unit for the render time.
Which means:
"2 hours on a machine with 200 MHz (0.2 GHz)" is 2 hours * 0.2 GHz = 0.4 GHz*hours.
"1 hour on a machine with 400 MHz (0.4 GHz)" is 1 hour * 0.4 GHz = 0.4 GHz*hours.
As you can see, the "render time" is the same.

But since the year 2000 these two laws do not apply any more:

a) Every 2 years the Ghz of a CPU doubles.

b) Twice the GHz is twice the speed.

In addition, there are speed differences between Intel and AMD CPUs at the same GHz.

And CPUs have a variable clock rate which depends on how many cores you use, the render time, and the CPU cooling.

[PS* h]

Nowadays the GHz does not really increase any more.
Instead, CPUs include more and more routines to calculate certain formulas in a fraction of the time.
So the computation speed of CPUs still increases.

There is no technical specification that can be used to determine the speed of a CPU.

It has to be tested with a benchmark.
RR has implemented the PS (Processor speed) benchmark.

Twice the PS means twice the render power.

Therefore all render times in RR are stated as [PS* h] as well.

Render Costs

How much does a render cost?

First of all: 

This is a difficult question.

And you can make it really complicated.

You have two options:

A) Simple price based on acquisition costs:

Many rental companies use 1% of the acquisition costs as a rent per day. 

But most rental companies do not lend their hardware 365 days a year. 
So perhaps 0.5% might be a better percentage.

The percentage might vary a lot depending on the degree of capacity utilization and additional non-hardware costs like power, IT staff, render wranglers, network infrastructure, fileserver, cooling.

B) Real costs:

On the other hand, you can try to calculate the real costs.

Let's try some calculation for a small farm with 20 machines. 

Note that these prices are not taken from any existing farm.


Actual cost per year

Network Switch 24x 10GBit, 1x 25GBit
2000 for 5 years => 400 per year

Fileserver 100TB Raid10 (= 200TB HDDs)
20 000 for 7 years => ~2857 per year

20x Render Computer (CPU only, 32 cores (= 64 HT cores), 64GB RAM, 500GB SSD)
6000 x 20 for 3 years => 40 000 per year

Power infrastructure (Fileserver, Network, ... = 1500W)
1500W * 7200h = 10 800 kWh, x 0.35 => 3780 per year

Software: 4x Autodesk M&E Collection (includes Maya + 5 Arnold Batch)
3000 x 4 => 12 000 per year

IT + Render Wrangler
50h each month = 1 week each month. Wage 5000/month => 1250 per week => 15 000 per year

A/C room cooling
Sorry, I do not have any information.

Room rent
Sorry, I do not have any information.

Other stuff (cables, ...)
1000 for 5 years => 200 per year

Sum per year
~74 237

Sum per year per render computer
~3712
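As a sketch, the per-year figures can be summed in a few lines (the prices are the example figures from this text, not real quotes, and the missing items such as cooling and rent are left out):

```python
# Example yearly costs for the 20-machine farm from the text, each entry
# converted to cost per year (purchase price / years of use, or a yearly fee).

costs_per_year = {
    "network switch (2000 for 5 years)":          2000 / 5,
    "fileserver (20 000 for 7 years)":            20_000 / 7,
    "20 render computers (6000 x 20 for 3 years)": 20 * 6000 / 3,
    "power infrastructure (1500 W * 7200 h at 0.35/kWh)": 1.5 * 7200 * 0.35,
    "software (4 x 3000 per year)":               4 * 3000,
    "IT + wrangler (1250/week, 1 week/month)":    1250 * 12,
    "other stuff (1000 for 5 years)":             1000 / 5,
}

total = sum(costs_per_year.values())
print(round(total))        # ~74237 per year
print(round(total / 20))   # ~3712 per render computer per year
```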


Results: Cost per machine per hour

A) Based on acquisition costs
A render computer costs 6000.
We take 0.5% of the acquisition costs as rent per day.
Rent per day: 6000 / 100 * 0.5 = 30

Rent per hour: 30 / 24 = 1.25 / hour
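The acquisition-cost rule can be written as a one-line helper (hypothetical name, just the arithmetic from above):

```python
# Rent per hour = acquisition cost * (percent per day) / 100, spread over 24h.

def rent_per_hour(acquisition_cost: float, percent_per_day: float = 0.5) -> float:
    return acquisition_cost * percent_per_day / 100 / 24

print(rent_per_hour(6000))   # 1.25 per hour
```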

B) Based on real costs
Let's say the farm is used 300 days x 24h each year.

Note that the sum of 0.625 / hour is the break-even point.
So you do not earn anything with your farm at this cost level.

And if you do not earn anything, you actually lose money, as you did not get the money you put into the hardware "for free".
And we are still missing some costs like rent, taxes, ...

So we add for example 100% (=> less than 50% profit margin).

Base cost

3712 / 300d / 24h = 0.52

0.52 / hour

Power (300W real consumption, not what the power supply is labelled)

300W x 1h = 0.3 kWh, x 0.35

0.105 / hour

Sum per hour

0.625 / hour

Sum per hour +100%

1.25 / hour
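The break-even arithmetic can be sketched directly (figures from the example above):

```python
# Real cost per machine per hour: yearly cost spread over 300 days x 24h,
# plus the power the machine itself draws while rendering.

base_per_hour  = round(3712 / 300 / 24, 2)   # 0.52 per hour
power_per_hour = 0.3 * 0.35                  # 300 W for 1 h at 0.35 per kWh
break_even     = base_per_hour + power_per_hour

print(break_even)       # 0.625 per hour (break-even)
print(break_even * 2)   # 1.25 per hour (+100% margin)
```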

rrConfig setting

Now that we have the price for a machine per hour, you have to enter the costs into RR.

RR has a global setting "Cost per PS per hour" (rrConfig, menu Server, tab Misc).

In addition, you can explicitly set the cost of a rrClient in its configuration.

First of all you have to check the PS of your rrClient in rrControl, client table.


RR states that your new 32-core machine (+ Hyperthreading = 64 cores) has about 145 GHz and 300 PS.
The costs are 1.25 / hour for these 300 PS.

The rrConfig setting is per kiloPS (= 1000 PS) per hour.

Now we convert 1.25 for 300 PS into the cost for 1000 PS:
1.25 / 300 * 1000 = 4.17.

We enter 4.17 into rrConfig.
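The conversion to the rrConfig unit is a simple rescaling, sketched here with a hypothetical helper name:

```python
# Convert a machine's cost per hour into the rrConfig unit
# "cost per kiloPS per hour" (kiloPS = 1000 PS).

def cost_per_kilops(cost_per_hour: float, machine_ps: float) -> float:
    return cost_per_hour / machine_ps * 1000

print(round(cost_per_kilops(1.25, 300), 2))   # 4.17
```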

rrConfig setting - Render App costs

In this example we have added the costs for the render application to the costs of all machines.
If you want to calculate different costs for different render applications and/or different renderers, you can set separate costs for each render application in RR (rrConfig, menu RenderApps, tab Render Licenses).

Price reduction per year

You could make it even more complicated, if you want to.
I just mention this to complete the information about costs.

The calculation does not take into account that the value of a machine decreases over time.

Your machine costs 6000.
The CPU speed doubles every 2-3 years.
Which means in 2-3 years you get a machine that is twice as fast for the same 6000.

And that means your machine is only worth about 3000 (50%) after 2-3 years.

To compensate for this, you can use different percentages of the costs per year.


Evenly priced

Reach 50% in the 3rd year

Reach 50% after the 3rd year
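As an illustration only (the yearly percentages below are example assumptions, not RR defaults), an even split can be compared with a front-loaded split that writes the machine down to 50% value sooner:

```python
# Hypothetical depreciation schedules for a 6000 acquisition cost over 3
# years: even split vs. front-loaded percentages (50/30/20, assumed values).

acquisition = 6000

even  = [acquisition // 3] * 3                          # same cost every year
front = [acquisition * p // 100 for p in (50, 30, 20)]  # more cost in year 1

print(even)    # [2000, 2000, 2000]
print(front)   # [3000, 1800, 1200]
```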