Dead technology

Seeing the thread asking about a 10:1 compression ratio reminded me of a technique in engine design to compensate for lost power as DA increases. The engines were built with compression ratios such that detonation was guaranteed on the fuel of the day at sea level. The advantage was that at a given altitude (I think it was 6,000 feet?), the engine would still be producing its full rated HP. This was during The War to End All Wars.

They avoided detonation in a couple of different ways (at least, I only remember two). The mechanically simple way was to instruct the pilot not to use full throttle below the specified altitude, along with how much throttle was appropriate at different altitudes. The way that was simple for the pilot was a second throttle (or throttle lever, I'm not sure which) called an altitude throttle, with incremental markings for progressively higher altitudes up to the full-throttle altitude.

Knowing that many pilots live at high altitudes and take off from even higher DAs, this seems like a reasonable solution which would avoid the added complexity of a turbo. With FADEC, the pilot might not need to (manually) compensate at all.
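
Just to make the idea concrete, here's a minimal sketch of what such an altitude-throttle schedule could look like. The 24 inHg detonation limit is an assumption picked so that full throttle opens up near the 6,000 ft figure above, and treating allowable throttle fraction as allowed MAP over ambient pressure is a crude simplification.

```python
# Rough sketch of an "altitude throttle" schedule for an over-compressed engine.
# Assumption (not a figure from the thread): detonation above ~24 inHg manifold
# pressure, so the throttle must be held back until ambient pressure itself
# falls to that value (~6,000 ft in ISA conditions).

def isa_pressure_inhg(pressure_alt_ft: float) -> float:
    """ISA ambient pressure (inHg) at a given pressure altitude (ft)."""
    return 29.92126 * (1.0 - 6.87535e-6 * pressure_alt_ft) ** 5.2561

MAP_LIMIT_INHG = 24.0  # hypothetical detonation-limited manifold pressure

def max_throttle_fraction(pressure_alt_ft: float) -> float:
    """Fraction of full throttle the placard (or altitude throttle) would allow."""
    return min(1.0, MAP_LIMIT_INHG / isa_pressure_inhg(pressure_alt_ft))

for alt in range(0, 8001, 2000):
    print(f"{alt:>5} ft: max throttle ~{max_throttle_fraction(alt):.0%}")
```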

My questions are
1: Why did this tech die out so completely?
2: What is the opinion of the PoA collective regarding this as a solution to higher DAs?
 
That’s cool, did not know that.

How would we know what the DA is at any given time at some altitude in some location? Seems like it would be kind of tough to figure out.

Look how much argument and disagreement the 3 levers/knobs already cause, lol!
 
If you have a 265 hp engine but limit power at sea level to 200 hp, and then at altitude allow full throttle so that you are getting 75% power (roughly 200 hp), isn't it really a 200 hp engine? Seriously though, it's the same problem as a turbo: easy for an operator to overboost, often with no apparent, immediate symptoms.

To answer your question, my guess is that manufacturers stopped selling them because users would not follow the rules, and they had to deal with the warranty claims with no way to prove misuse.
 
I don't know the answer to your question, but I have wondered why the performance charts/tables/nomograms weren't mechanized like analog fire control computers; crank in OAT/elevation/desired power and it gives you the throttle/mixture/prop settings. Cooler than the ersatz "single lever" on a Cirrus and has period-correct springenwerk for our postwar SCA spam cans.
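
In software instead of springenwerk, the same idea is just table lookup plus interpolation. A minimal sketch with made-up placeholder numbers (not from any POH); a real version would also crank in OAT and desired power:

```python
# A software stand-in for the analog performance computer: linear interpolation
# of a cruise-power table. The RPM/MAP numbers are illustrative placeholders,
# not from any POH; a real version would interpolate across OAT and desired
# power as well.

# pressure altitude (ft) -> (rpm, manifold pressure inHg) for ~65% power (hypothetical)
SETTINGS_65PCT = [(0, 2300, 22.9), (4000, 2300, 21.9), (8000, 2300, 20.9)]

def interp(x, x0, x1, y0, y1):
    """Straight-line interpolation between (x0, y0) and (x1, y1)."""
    return y0 + (y1 - y0) * (x - x0) / (x1 - x0)

def setting_for_altitude(pressure_alt_ft):
    """Return (rpm, MAP) for ~65% power at the given pressure altitude."""
    for (a0, rpm0, mp0), (a1, rpm1, mp1) in zip(SETTINGS_65PCT, SETTINGS_65PCT[1:]):
        if a0 <= pressure_alt_ft <= a1:
            return (interp(pressure_alt_ft, a0, a1, rpm0, rpm1),
                    interp(pressure_alt_ft, a0, a1, mp0, mp1))
    raise ValueError("altitude outside table range")

print(setting_for_altitude(6000))  # roughly (2300.0, 21.4)
```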
 
This is similar to turboprop engines.

The " thermodynamic hp" of our King Air 200 was 1700 (I think)

We were only allowed to extract 850.
 
Turboprops (and turboshafts) are a bit different, though.
The hot section might be rated for 1700hp, but the compressor will eat half of that, leaving only 850 available as shaft power.
 
My questions are
1: Why did this tech die out so completely?
Because of this:
To answer your question, my guess is that manufacturers stopped selling them because users would not follow the rules, and they had to deal with the warranty claims with no way to prove misuse.
We have an awful lot of pilots that don't even understand mixture control or carb heat. Those are simple concepts. They sure aren't going to understand detonation and the factors behind it.

Cars used to "ping" at low RPM and high throttle settings. That was detonation, and it was very hard on the engine. The driver had to back off the throttle and gear down. He/she understood that. But as automation took over control of the choke and shifting and almost everything else, engines started to suffer more as next-generation drivers understood less, until EFI and electronic ignition came along and could be programmed to avoid detonation.

In the really old cars, even the spark advance was manually set.
 

Understood. With FADEC, could this be automated as it is in cars so that planes could use over-compressed engines?
 
How would we know what the DA is at any given time at some altitude in some location? Seems like it would be kind of tough to figure out.

How do you calculate density altitude on the ground? Same method. Get pressure altitude by setting your altimeter to 29.92 and read the OAT, then plug them into your E6B (I use a phone app).
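
For anyone who wants it without the E6B, the usual rule of thumb is easy to script. A rough sketch of that approximation, not the exact atmosphere model a real computer would use:

```python
# The same arithmetic the E6B does, as a rule of thumb: ISA temperature drops
# about 2 C per 1,000 ft, and density altitude rises roughly 120 ft for every
# degree C above ISA.

def density_altitude_ft(pressure_alt_ft: float, oat_c: float) -> float:
    isa_temp_c = 15.0 - 2.0 * (pressure_alt_ft / 1000.0)
    return pressure_alt_ft + 120.0 * (oat_c - isa_temp_c)

# Example: 7,500 ft pressure altitude on a +25 C day
print(density_altitude_ft(7500, 25))  # ~10,500 ft
```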
 
You would probably want a manifold pressure gauge. If you fly a plane with a constant-speed prop, it will likely be equipped with one, and the pilot manages it, hopefully according to published charts, to keep from overboosting the engine. I sometimes rented an old airplane with an engine-upgrade STC and a constant-speed prop that had very little published performance data. I just kept it at high RPM and assumed it was approved that way by design, because it would never overboost in that configuration.
 
The concept itself is valid, and it's the sort of thing that I'd do (or at least consider) if building an experimental aircraft. In the certified world, though, pilots aren't exactly known for following the instructions called out in engine manuals very well, nor are OEMs known for putting good recommendations in their POHs.

Done already. Must be 15 years ago. Not cheap. https://www.lycoming.com/engines/ie2

[Attachment: digital rendering of the iE2 engine]

Yeah, but increasing the compression ratio on that engine was not employed as a technique to improve altitude performance. Also, the throttle on that engine, at least at the last point in time I was familiar with it (which was 2011), was not electronic, so you'd still be relying on the pilot to regulate manifold pressure.

Signed,

A guy who knows a thing or two about that engine. The real one, not the digital rendering.
 
This was during The War to End All Wars.
There is the answer to your question. Flying during wartime where the pilot and airplane life expectancy can be measured in minutes instead of years means doing anything and everything to get an edge over the enemy... at any cost. So what if they blow up an engine? They are probably going to die anyway, right?
 
Must be why that WW2 tech of strapping a turbine to a supercharger never made its way to GA.
 
Turboprops (and turboshafts) are a bit different, though.
The hot section might be rated for 1700hp, but the compressor will eat half of that, leaving only 850 available as shaft power.
We had the full 850 hp up to about 10,000 ft ... depending on temperatures.
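
That's the flat-rating picture: available shaft power is the lesser of the certified cap and the (falling) thermodynamic capability. A toy sketch using the 1,700 figure quoted above ("I think") and a made-up lapse rate chosen only so the crossover lands near 10,000 ft; this is not PT6 performance data.

```python
# Sketch of flat rating: shaft power is capped by the certified limit, and the
# falling thermodynamic capability only becomes the limit once it drops below
# that cap. The 5%-per-1,000-ft lapse is a placeholder, not real data.

FLAT_RATING_SHP = 850.0
THERMO_RATING_SL_SHP = 1700.0  # figure quoted upthread for the King Air 200

def available_shp(density_alt_ft: float) -> float:
    thermo = THERMO_RATING_SL_SHP * (1.0 - 0.05 * density_alt_ft / 1000.0)
    return min(FLAT_RATING_SHP, thermo)

for da in (0, 5000, 10000, 15000):
    print(f"{da:>6} ft DA: {available_shp(da):4.0f} shp available")
```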
 
Not sure that having restricted RPMs and/or density-altitude throttle linkages reduces the complexity of operation compared to a turbo. Turbo-normalized engines don't really need any input from the pilot (aside from fuel mixture control for non-FADEC applications). Full rated power up until the ceiling limit of the turbo.
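
For contrast with the altitude-throttle sketch earlier, here's roughly why there's nothing for the pilot to schedule: the wastegate controller holds sea-level manifold pressure until the turbo runs out of margin. The 18,000 ft critical altitude below is a placeholder, not a number from the thread.

```python
# Sketch of a turbo-normalized engine's full-throttle manifold pressure: held
# at the sea-level value up to a (hypothetical) critical altitude, then lapsing
# with ambient pressure above it.

def isa_pressure_inhg(pressure_alt_ft: float) -> float:
    """ISA ambient pressure (inHg) at a given pressure altitude (ft)."""
    return 29.92126 * (1.0 - 6.87535e-6 * pressure_alt_ft) ** 5.2561

CRITICAL_ALT_FT = 18000.0  # hypothetical critical altitude of the turbo

def full_throttle_map_inhg(pressure_alt_ft: float) -> float:
    """Manifold pressure available at full throttle."""
    if pressure_alt_ft <= CRITICAL_ALT_FT:
        return 29.92  # wastegate holds sea-level MAP, no pilot scheduling needed
    # above the critical altitude, available MAP lapses with ambient pressure
    return 29.92 * isa_pressure_inhg(pressure_alt_ft) / isa_pressure_inhg(CRITICAL_ALT_FT)

print(full_throttle_map_inhg(10000))  # 29.92 all the way up to the critical altitude
```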
 