Turbos and Superchargers

This article uses terminology that is explained in our Why Diesels are Different article and discusses the application of forced induction to diesels. It's all about increasing the power and efficiency of internal combustion engines, and this is achieved using turbochargers and superchargers. This article will give you an appreciation of what these awesome power boosters are and what they can do.

Turbo & Supercharge Terminology

OK, let's get the terminology straight again before we get into the details. First of all, they're all superchargers. Yes, the word "turbocharger" is an abbreviation of the term "exhaust gas turbine-driven supercharger". So what is a supercharger? Well, let's look at the two parts of the word: "super" and "charger".

In the early twentieth century, when this stuff was first being thought about, it was common to call the air/fuel mixture ingested by a spark ignition [SI] engine (i.e. petrol or gasoline engine) the "charge". As in, an explosive charge in blasting or the propellant charge in a cannon etc. Normally the charge was inducted into the engine via atmospheric pressure. That is, as the piston descends on the induction stroke, it creates a partial vacuum above it and normal atmospheric pressure (about 1.0 bar or 14.7 psi absolute) 'pushes' the air/fuel mixture through the carburettor and into the cylinder. This is called a "normally aspirated" (NA) engine and, until about twenty-five years ago, not too many people would have driven a car with any other sort of engine.

So, we now know what the "charge" is (and it's still known by that term, even for diesel compression ignition [CI] engines, where it is purely air which is drawn into the cylinders). What's the "super" bit? Well, to understand this you need to appreciate the general meaning of the term in the early twentieth century, when it usually meant "above or beyond normal" (as in "superhuman" or "supernatural"), rather than the more contemporary meaning of "huge" or "giant" (as in "superleague" or "supersize"). What it comes down to is this: charging your engine's cylinders at above or beyond normal atmospheric pressure.

Why Supercharge?

Why supercharge? It's all to do with specific output and efficiency. Specific output means how much power (and torque, because they are intimately related) you can get from a certain size of engine. The usual way to measure this is in kW/litre. Efficiency is about how much power, torque (and ultimately kilometres travelled) we can get out of a litre of fuel. In real-world terms, we're talking about performance and economy. Both of these desirable factors can generally be increased by supercharging an engine, provided it is done properly. The gains are especially important for CI (diesel) engines.

How Super Is It?

Pressure

How much pressure are we talking about here? Well, it varies a lot. Anything from a few psi (say, 0.1 bar) to 45+ psi (3.0+ bar) in competition engines. Here I'm talking about the increase in engine induction pressure above normal atmospheric pressure. This is what's known in industry as "gauge pressure" or in automotive circles as "boost pressure".

Another way to look at it is in terms of absolute pressure, and this actually gives you a better picture of how "super" it is. Absolute pressure is the measure of pressure above absolutely no pressure, or a complete vacuum such as in outer space. As mentioned earlier, normal atmospheric pressure at sea level is about 14.7 psi or 1 bar or 1000 millibar or 100 kPa, depending on your preferred unit of measurement.

So, let's say we have a boost pressure of 7.5psi (or about 0.5 bar). This is around 22.5psi absolute or 1.5bar absolute and is therefore about 50% above normal atmospheric pressure. So we could reasonably expect our engine to be receiving almost 50% more air than a NA engine. And if the correct amount of fuel were combusted in this air we could expect the engine to produce about 50% more power, yes? Well, yes, that's true in theory but in the real world there's no such thing as a free lunch and we have to 'pay' for this extra pressure and power somehow, so the actual power gain will always be less than 50%.

Similarly, a boost pressure of about 15psi or 1.0bar (that is, about 30psi or 2.0bar absolute) will give a large power increase over the same sized NA engine but it won't be anything like a 100% increase.
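
If you like to see the numbers, here's a quick sketch of that arithmetic (assuming the usual sea-level figure of 14.7 psi for atmospheric pressure):

```python
# Rough sketch of the boost vs. absolute pressure arithmetic.
# Assumes sea-level atmospheric pressure of 14.7 psi (about 1.0 bar).
ATMOSPHERIC_PSI = 14.7

def absolute_pressure(boost_psi):
    """Gauge (boost) pressure plus atmospheric pressure."""
    return boost_psi + ATMOSPHERIC_PSI

def percent_above_atmospheric(boost_psi):
    """How far above normal atmospheric pressure the charge is, in percent."""
    return 100.0 * boost_psi / ATMOSPHERIC_PSI

for boost in (7.5, 15.0):
    print(f"{boost:>4.1f} psi boost = {absolute_pressure(boost):.1f} psi absolute "
          f"({percent_above_atmospheric(boost):.0f}% above atmospheric)")
```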

SI Engines

Spark ignition engines for normal road use generally use relatively low boost pressures, often 6psi (0.4bar) or lower. Higher boost in SI engines can be difficult to manage, due to problems controlling pre-ignition, although there would be many competition engines running far higher boost levels.

CI Engines

Compression Ignition engines do not have the same issues with pre-ignition as they ingest only air. They are therefore more naturally suited to forced induction and the amount of boost pressure is really limited only by the design and strength of construction of the engine. Many production engines are designed to run at least 15psi (1.0bar) of boost (for example, the Nissan ZD30, Land Rover Tdi and Td5 engines etc.). I've read of modified American light truck diesels (Dodge Rams, Ford F-series etc.) running in excess of 30psi (2.0bar) boost, apparently with satisfactory reliability(?)

Is Pressure all that Matters?

What really matters, even more than the increase in inlet air pressure, is the increase in air density you can achieve. Normal air is about 78.08% nitrogen, 20.95% oxygen, 0.93% argon, 0.038% carbon dioxide, and the rest is small amounts of other gases like methane, etc. Without getting too far into heavy chemistry, every gram of air contains a certain number of oxygen molecules and can therefore combust a certain mass of fuel efficiently.
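
To put a rough number on that (my own illustration, using standard molar masses rather than anything measured here), the volume fractions above work out to oxygen being about 23% of air by mass:

```python
# Illustrative only: convert the volume (mole) fractions quoted above into
# the mass fraction of oxygen in dry air, using standard molar masses (g/mol).
fractions = {"N2": 0.7808, "O2": 0.2095, "Ar": 0.0093, "CO2": 0.00038}
molar_mass = {"N2": 28.013, "O2": 31.999, "Ar": 39.948, "CO2": 44.010}

total_mass = sum(fractions[gas] * molar_mass[gas] for gas in fractions)
o2_mass_fraction = fractions["O2"] * molar_mass["O2"] / total_mass
print(f"Oxygen is about {o2_mass_fraction:.1%} of air by mass")  # ~23%
```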

It can be a difficult concept to grasp but air does have weight or, more correctly, mass. And the density of a material is the mass of a standard volume of the material, say 1 litre. For water, its usual density is 1000g/L. For 'standard' air, at sea level (a pressure of 1 Atmosphere or about 1.0bar absolute) and at 15°C, its density is 1.225g/L. But air, unlike water, is compressible. That is, you can force a greater mass of air into the same volume. So, at a pressure of 2.0bar absolute (twice normal sea level pressure) a litre of air, still at 15°C, would have a mass of 2.450g and so its density would be 2.45g/L.
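
For the numerically inclined, here's a minimal sketch of that density calculation, treating air as an ideal gas (a fair approximation at these pressures):

```python
# Minimal sketch: density of air from the ideal gas law, rho = P / (R_specific * T).
# Treats air as an ideal gas, which is a fair approximation at these pressures.
R_AIR = 287.05  # specific gas constant for dry air, J/(kg*K)

def air_density(pressure_pa, temp_c):
    """Density in kg/m^3 (numerically the same as g/L)."""
    return pressure_pa / (R_AIR * (temp_c + 273.15))

print(air_density(101_325, 15.0))      # ~1.225 g/L at sea level, 15 deg C
print(air_density(2 * 101_325, 15.0))  # ~2.45 g/L at twice that pressure, same temperature
```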

So, again, it all looks good. An engine running at 2.0bar absolute inlet pressure (1.0bar of boost) is ingesting air at twice the density and hence twice the oxygen content of its NA cousin, right? Well no, unfortunately. Did you notice the extra proviso in the paragraph above: "at 15°C"? This now makes all the difference. Temperature is intimately tied up with pressure and volume when it comes to compressing gases. After all, recall that compression ignition engines only work at all because air gets very hot if you compress it a lot.

So, in this process of increasing the pressure of the inlet charge to an engine, we invariably heat it as well. Therefore, the increase we get in air density is never as good as you might think just looking at the increase in pressure over 'normal' air pressure. Temperature effects will always come into play to reduce the potential density gain. We'll talk about this issue in more detail later.

How Pressure is Created

How do we go about charging at above normal atmospheric pressure?
Well, basically we use an air pump or compressor. In fact, the Germans use the very simple and logical term "Kompressor" in place of the English word "supercharger".

Now, there are numerous designs for air compressors. The most common are probably the simple reciprocating piston pumps familiar to handymen and tradesmen, and this was the first type used in an automotive application, by none other than Gottlieb Daimler, the inventor of the motor carriage.
Nowadays, the compressors used for automotive supercharging normally fall into one of two categories: positive displacement and centrifugal compressors.

Positive Displacement Compressor

In very general terms, a positive displacement compressor is a relatively large device that positively 'shifts' a particular volume of air with each revolution of its drive shaft. It runs at relatively low speeds, that is, speeds not too dissimilar to those of the engine with which it is used - of the order of thousands of revolutions per minute (rpm). These are the large polished alloy belt-driven cylindrical devices you may see adorning the top of the big V8 engine of a dragster or racing ski boat, between the cylinder banks.

Centrifugal Compressor

A centrifugal compressor on the other hand is a relatively small device that relies on accelerating air to very high velocities and then converting that velocity to pressure at its outlet.

To accelerate the air to very high velocities, the compressor's impeller (or fan) has to either be large in diameter or rotate at mind-boggling speeds, often in excess of 100,000rpm and perhaps as much as 300,000rpm. No, that's not a misprint, it's 300,000.
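
Just to get a feel for those speeds, here's a hypothetical calculation of impeller tip speed - the 50mm diameter is purely an illustrative figure of my own, not from any particular turbocharger:

```python
import math

# Hypothetical illustration: tip speed of a small centrifugal compressor impeller.
# The 50 mm (0.050 m) diameter is an assumed, illustrative figure.
def tip_speed_m_per_s(diameter_m, rpm):
    """Impeller tip speed = circumference * revolutions per second."""
    return math.pi * diameter_m * rpm / 60.0

for rpm in (100_000, 200_000):
    v = tip_speed_m_per_s(0.050, rpm)
    print(f"{rpm:>7,} rpm -> tip speed {v:.0f} m/s ({v * 3.6:.0f} km/h)")
```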

Superchargers and Turbochargers

Now we'll talk about the differences between what are commonly referred to as "superchargers" and "turbochargers".

Superchargers

The word supercharger is now normally taken to mean a charge compressor driven mechanically from the engine crankshaft. Often this drive is via a belt from the crankshaft pulley (usual in aftermarket conversions) but in some factory-original designs it may be driven by chains or gears. The advantage of this is that the compressor is running and boosting the intake charge whenever the engine is running. This means that the full power capability of the engine (at its current rpm) is instantly available when the throttle is opened. But it also means the compressor is being driven all the time, even when full power is not required, such as when highway cruising. And driving a compressor is not 'free' - it takes power. So, a lot of the time a supercharged engine is using power to drive a compressor that's not needed and the overall efficiency is not all it could be.

This problem can be alleviated to some extent by further complications such as clutches in the compressor drive or bypass valves. However, if you need bulk power from an engine as soon as you floor the throttle, such as in a 'top fuel' dragster or a racing ski boat, then mechanical supercharging is the way to go.

Turbochargers

So, what about these "exhaust gas turbine-driven superchargers"? Well, in 1905 a clever Swiss fellow by the name of Dr Alfred Büchi came up with the idea (and patented it) of using the energy contained in the hot exhaust gas from an internal combustion engine to drive a supercharger. An American, Dr Sanford Moss, seems to have been the first to apply the principle in practice in 1918, when his "turbosupercharger" successfully boosted the output of a V12 Liberty aircraft engine to then unheard-of power levels, even at an altitude of over 14,000 feet, at Pikes Peak.

The hot exhaust gas, at an appreciable pressure, is expanded across a small gas turbine. The exhaust gas expands as it flows through the turbine and emerges at lower pressure and a lot cooler. The reduction in pressure and temperature of the exhaust gas represents the energy converted to mechanical energy which is then used to drive the supercharging compressor.

Many comparisons between mechanical superchargers and turbochargers describe the exhaust gas energy as "free energy" and attribute all of the superior efficiency of turbocharged engines to this. This is not strictly true. Sure, in a NA engine, all the energy embodied in the exhaust gas is wasted. But when you insert a gas turbine into the exhaust flow it doesn't produce "free" power. It creates a restriction in the exhaust system which must be overcome by the pumping action of the engine. This is another of the reasons why a doubling of inlet charge pressure (that is, a boost pressure of about 1.0 bar or an absolute inlet charge pressure of about 2.0 bar) doesn't give a 100% increase in power and torque.

But a turbocharger does, to a certain extent, automatically adapt its output boost to the load on the engine. At low load, there is not much energy in the exhaust gas and the boost pressure developed is low, unlike a positive displacement supercharger. Therefore, turbocharged engines generally have a better overall efficiency than supercharged ones.

As the vast majority of forced induction diesels in automotive applications are turbocharged, we'll now concentrate on these for the remainder of this article.

Turbochargers in Practice

OK, so far turbochargers are looking like the best thing since sliced bread. Are there any problems with their practical use? Well, yes, a few actually.

The fact that turbochargers don't create much boost at low load, which helps the engine's overall efficiency, is also one of their major shortcomings. When you want a lot of power in a hurry, you can't get it as quickly as a supercharger could deliver it.

To produce more power, we need more air (and fuel) forced into the engine. To get more air into the cylinders we need a higher boost pressure in the inlet manifold. To produce more boost pressure, the compressor needs to spin faster. To spin faster, the compressor needs more power from its gas turbine. To increase the power of the turbine, it needs more exhaust gas, at higher pressure and temperature. To generate more exhaust gas, with more energy embodied in it, we need to burn more air and fuel in the engine - and this is where we came in. The expression "chicken and egg" springs to mind.

When you 'plant the foot', it takes a finite time for the exhaust gas energy to increase - to increase the boost pressure - to increase the engine's power output - to increase the exhaust gas energy, and so on. This is known as "turbo lag" and was one of the biggest problems for automotive engine designers to overcome. Indeed, they were not able to overcome it satisfactorily for some decades, leading to a loss of popularity of early turbocharged engines, especially SI engines. These days it's rarely a problem. Vastly improved materials, design and manufacture have led to very small, light and fast-responding turbochargers able to provide near instant boost from relatively low engine rpm. However, at very low rpm (just above idle), it is still difficult or impossible for a turbocharged engine to compete on equal terms with a much larger NA engine of similar maximum output.

Another issue is controlling boost pressure. With mechanically driven positive displacement superchargers it's relatively easy - you just adjust the gear ratio between the engine and the supercharger until the desired boost pressure is delivered and then it stays set forever after. Centrifugal compressors, on the other hand, are more difficult beasts to regulate. They have very complex relationships between air flow rate and pressure and both change non-linearly with rotational speed.

An engine whose turbocharger was designed to deliver 1.0bar of boost at engine revs of, say, 4500rpm, would develop very little boost at all until near maximum rpm. It would be an absolute slug at any revs below maximum, probably worse than if it were not turbocharged at all. Alternatively, a turbocharger which developed 1.0bar of boost at, say, 1500rpm would be delivering enormous pressures at 4000rpm, far more than the engine could tolerate. So what we want ideally is a design which develops useful boost at low revs, which reaches full boost at reasonable revs (say, 1500 - 2000rpm) and which does not 'over-boost' at higher revs.

A couple of practical methods have evolved to do this. The first and still most common is the use of a "wastegate". This is a simple valve which can be opened to allow some of the engine's exhaust gas to bypass the gas turbine. This valve is operated by the turbocharger compressor's outlet pressure. As the boost pressure increases, it presses on a diaphragm, against the spring pressure keeping the wastegate valve closed. When the force of the boost pressure on the diaphragm exceeds the spring force, the valve opens and allows some exhaust gas to bypass the turbine, limiting the power available to the compressor. If the boost pressure drops below the design setting, the spring begins closing the wastegate valve, putting more energy back into the turbine and compressor. Simple and reliable, as long as the few moving parts remain in good working order.
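
As a very simplified sketch of that force balance (the diaphragm area and spring preload below are made-up illustrative values, not from any real wastegate actuator):

```python
# Very simplified sketch of the wastegate force balance described above.
# Diaphragm area and spring preload are made-up illustrative values.
DIAPHRAGM_AREA_CM2 = 20.0   # effective diaphragm area, cm^2 (assumed)
SPRING_PRELOAD_N = 200.0    # spring force holding the valve closed, N (assumed)

def wastegate_open(boost_bar):
    """Valve opens when the boost pressure force on the diaphragm beats the spring."""
    boost_force_n = boost_bar * 100_000 * DIAPHRAGM_AREA_CM2 * 1e-4  # Pa * m^2
    return boost_force_n > SPRING_PRELOAD_N

# With these figures the valve starts to open just above 1.0 bar of boost.
for boost in (0.5, 1.0, 1.2):
    state = "open" if wastegate_open(boost) else "closed"
    print(f"{boost:.1f} bar boost -> wastegate {state}")
```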

However, wastegate control is not the most efficient way to control gas turbine power and speed. Turbine and overall engine efficiency can be improved if all the exhaust gas flows through the turbine and the turbine itself changes its characteristics to suit the energy and flow of exhaust gas available, delivering the optimum power to the compressor at all loads.
This can now be achieved by Variable Turbine Geometry (VTG). It is most commonly implemented by placing variable guide vanes in the turbine housing. These can alter the pressure, speed and direction of the exhaust gas impinging on the turbine blades and provide accurate control over turbine power and speed.

The final issue we'll look at this time is the effect of the heat generated in compressing the air charge. As mentioned earlier, a doubling of the absolute pressure of our air charge only gives a doubling of air mass and density if the air temperature remains constant. But it doesn't. To give you an idea of the scale of the problem, I'll give some example figures I've actually measured on my engine. It's a mid-1990s 2.5L turbo-diesel running 1.0bar of boost pressure as standard. One pleasant morning when the ambient temperature was 23°C, I was towing a caravan up a steep range. Naturally the engine was running at or very close to its full boost of 1.0bar (about 2.0bar absolute). The temperature of the air leaving the turbocharger compressor peaked at 184°C!

Now, I won't go fully into the mathematics involved but instead of the density of the air charge leaving the turbocharger being double that of the ambient air (that is, 100% higher), it was only 28% higher. So this heating of the charge air had 'robbed' me of 72% of the potential increase in air density I might have expected by doubling the absolute air pressure! Not only is this a huge disappointment in terms of the performance I might have been expecting, but it's pumping a fair quantity of heat into the engine before we even think about combusting fuel in that air. Not good.
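
For those who do want to see the mathematics, here's the ideal-gas arithmetic behind those figures - the 1.013 bar ambient pressure is my assumption; the other numbers are the ones measured above:

```python
# The ideal-gas arithmetic behind the figures above: charge density relative to
# ambient is (absolute charge pressure / ambient pressure) * (ambient K / charge K).
# Ambient pressure of 1.013 bar is an assumption; the other numbers are as measured.
AMBIENT_BAR, AMBIENT_C = 1.013, 23.0
CHARGE_BAR, CHARGE_C = 2.0, 184.0

density_gain = (CHARGE_BAR / AMBIENT_BAR) * (AMBIENT_C + 273.15) / (CHARGE_C + 273.15)
print(f"Charge air is about {100 * (density_gain - 1):.0f}% denser than ambient")  # ~28%
```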

But all is not lost! In the next article we'll see how to drastically reduce this heat burden we're imposing on the engine and get even more power and efficiency - by charge cooling! We'll see why intercoolers are really aftercoolers - and whether they really do anything useful at all.
