IRA Written Test

ahmad (Midwest Aviator)

This is from Sheppard. How stupid is this question and answer? The real answer is 36 minutes and 10 gals but if you want to get credit for the answer you must choose the wrong answer. WTH
 

Attachments

  • 20230210_103220.jpg
  • 20230210_103230.jpg
  • 20230210_103237.jpg
It’s Chinatown, Jake.


Welcome to the FAA.
 
If you’re taking the IRA, then you’ve already taken the PPL written and know it is impossible to discern if a specific question was missed.

Take those explanations with a grain of salt.

The written is composed of 60 questions; last I looked at Sheppard’s IRA product, there were about 1,100 questions in the bank. So, only 5% of those questions will be on any given exam.
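(Rough math, assuming the ~1,100-question figure is still accurate: 60 / 1,100 ≈ 0.055, so roughly 5.5% of the bank, or about 1 question in 18, shows up on any single exam.)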

Don’t overthink it. If you did the calcs and got the mathematically correct answer, that’s all that really matters. If, instead, this is a brain dump exercise, well…good luck.
 
If Shepard believes this question actually exists and is erroneous, they should be contacting AFS-630 about it.
 
If Shepard believes this question actually exists and is erroneous, they should be contacting AFS-630 about it.
Chortle. It's been done. The Instrument test has always had questions that were scored wrong or were unanswerable (with the information provided), which is generally a sign that they are poorly prepared and unproofread.
 
Chortle. It's been done. The Instrument test has always had questions that were scored wrong or were unanswerable (with the information provided), which is generally a sign that they are poorly prepared and unproofread.
Why would they take writing test questions seriously when applicants don’t take the test seriously?
 
Why would they take writing test questions seriously when applicants don’t take the test seriously?

Perhaps it's the other way around. The tests are so irrelevant and awful that nobody takes them seriously; they are purely an artificial impediment, measuring neither the necessary nor the sufficient ability of the student to get the sought-after certificate/rating.

Look at other government agencies that have industry people working on the testing pools: you get the examining organizations (there really should be more than one, but don't get me started on that) and the test-prep folks like Gleim, Kings, Sheppard, etc., to collaborate on the actual tests.
 
Perhaps it's the other way around. The tests are so irrelevant and awful that nobody takes them seriously; they are purely an artificial impediment, measuring neither the necessary nor the sufficient ability of the student to get the sought-after certificate/rating.

Look at other government agencies that have industry people working on the testing pools: you get the examining organizations (there really should be more than one, but don't get me started on that) and the test-prep folks like Gleim, Kings, Sheppard, etc., to collaborate on the actual tests.
Perhaps. Maybe getting that fixed should be the emphasis. Especially if the only way people are going to actually learn the material is by preparing for a test on the material.
 
If Shepard believes this question actually exists and is erroneous, they should be contacting AFS-630 about it.

Here’s the problem. Sheppard (or their client) has no way of knowing if any given question is scored correctly or not. They rely on eyewitness feedback by someone who gets a sheet that shows an ACS knowledge area deficiency.

They also have zero ability to discern which answer is being scored correctly, for the same reason.

Unless they are buying data directly from PSI and PSI is violating their contract with the FAA.
 
I suspect that they have enough people taking the tests to know when there are bad questions. Even when the question pools were published, the answers weren't, and Gleim and the rest knew what the correct scored answers were based on their "research."
 
Here’s the problem. Sheppard (or their client) has no way of knowing if any given question is scored correctly or not. They rely on eyewitness feedback by someone who gets a sheet that shows an ACS knowledge area deficiency.

They also have zero ability to discern which answer is being scored correctly, for the same reason.

Unless they are buying data directly from PSI and PSI is violating their contract with the FAA.

Yes, they are relying on eyewitness testimony. However, based on the OP's screenshots, they clearly do believe this is happening and address it in detail. They don't need proof to address this with AFS-630. They should simply pass it on with a description of the potentially faulty math and let the FAA take it from there.
 
Which perhaps they have, but it probably takes the FAA years to fix the tests. There were known defects in the writtens back when I was doing them, long enough to get them published as known errors. Even then, what the FAA tended to do was just discard the question from the score rather than fix it.

The FAA takes forever to revise things. Don't get me started on Airport Master Records and NOTAMs.
 
Since the FAA and PSI partnered up contractually, test question updating is now happening more quickly. The plan is to do away with the Knowledge Test Supplement altogether. Test takers won't be able to know in advance what Figures may appear on any given exam. They are really starting to clamp down, trying to stifle question/answer memorization as a study strategy. Most recently, AFS-630 stopped publishing their "What's New" document.
 
Perhaps it's the other way around. The tests are so irrelevant and awful that nobody takes them seriously; they are purely an artificial impediment, measuring neither the necessary nor the sufficient ability of the student to get the sought-after certificate/rating.
Some good points here.

Curious if there is any true loss if we nixed these knowledge tests (in their current form) completely.

The important knowledge RUAC fest (rote, understanding, application, and correlation) occurs during the practical.

So does the knowledge exam, aka the written, still have a place?
 
…So does the knowledge exam, aka the written, still have a place?
As long as you have DPEs with their own ideas of what’s important and/or pencil whipping certificates, the written serves as the only standardized assessment that the 61.xxx knowledge requirements have been met.

Having said that, I look back on my AF career and every platform had a rote-memorization Master Question File exam that we were tested on every six months in a closed-book environment, followed by an open-book exam that encompassed a much deeper breadth of material across all the applicable publications for the platform and mission.

I think the question we owe ourselves is whether rote memorization is okay for some things.
 
This is from Sheppard. How stupid is this question and answer? The real answer is 36 minutes and 10 gals but if you want to get credit for the answer you must choose the wrong answer. WTH
What makes you think this question is actually in the FAA question bank? Perhaps it was 20 years ago when they started doing this, but I can assure you the test has evolved and test prep companies haven't. The FAA doesn't report when they toss out questions, so how would Shepard know?

Remember, it's in their financial best interests to scare you into thinking you need their test prep to prepare for tough, semantic, or erroneous questions. When you get a knowledge test that has none of those, you don't feel cheated, you're just happy you passed the test.

Learn the material. It's the ultimate cheat code.
 
We complain because the written test is a waste of time. Then we complain because orals take too long. Somewhere in there, the knowledge needs to be evaluated.

How about making the written cover the rote stuff…regs, AIM, etc.? Make it short answer/fill in the blank instead of multiple guess. Leave the higher-level stuff and calculation problems for the oral.
 
It's just so dumb. You'd think a test like that would have been proofread a few times before being published. I don't memorize the answers. Memorizing just to pass the test is not smart because you have to know this stuff. My goal is to score high and do well on the test because I know the stuff, not because I memorized it.
 
The FAA doesn't report when they toss out questions.
Learn the material. It's the ultimate cheat code.

Up until very recently, the FAA did report when they tossed out questions - well, not the actual questions, but classes of questions and topics. For example, they reported when they dropped all of the ADF questions. This was done in a document that the Airman Testing Branch published called "What's New". Sadly, they have stopped publishing it.

But yes, learning is the ultimate cheat code.
 
You have to understand that these are the people who wrote several pages on Wench Launching.
The written requires you to know the difference between a trebuchet, a catapult and a ballista for launches, and which is better for solid payload vs hot oil or tar.
 
Up until very recently, the FAA did report when they tossed out questions - well, not the actual questions, but classes of questions and topics. For example, they reported when they dropped all of the ADF questions. This was done in a document that the Airman Testing Branch published called "What's New". Sadly, they have stopped publishing it.

But yes, learning is the ultimate cheat code.
Some of these test prep companies include flight planning questions that I suspect were scrubbed years ago. Since flight planning is not going anywhere, there's no way for test prep providers to get feedback on when those questions have gone away. It's not as though test takers are reporting back when they've gotten a question or when they haven't.

I question how accurately they're able to capture the details of new questions when all they have is the test taker's memory of the question. I suspect they rely on testing centers not monitoring test takers who take scratch paper with them from the test center.
 
I got a 97% on my instrument written. The two questions I got "wrong" were coded as Icing. There were no questions about icing on my exam. Making sense of the exam is impossible.
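(For what it's worth, assuming the standard 60-question exam: missing two questions works out to 58/60 ≈ 96.7%, which reports as 97%, so the score is consistent with exactly two misses; it's only the knowledge-area coding that doesn't add up.)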
 
Learn the material. It's the ultimate cheat code.

Thank you for saying that. I am in total agreement and have always asserted that for the same effort it takes to memorize a bunch of stuff, one could have learned and understood the subject matter to score a decent pass on the test. And be ready for the oral. Folks who merely memorize still will have to learn everything (for real) for the oral.

But it seems most of the recent batch of instructors were brought up the memorization ladder and coach their students to do the same.
 
Thank you for saying that. I am in total agreement and have always asserted that for the same effort it takes to memorize a bunch of stuff, one could have learned and understood the subject matter to score a decent pass on the test. And be ready for the oral. Folks who merely memorize still will have to learn everything (for real) for the oral.

But it seems most of the recent batch of instructors were brought up the memorization ladder and coach their students to do the same.
There’s a pervasive belief that knowledge testing is somehow flawed and the only way to pass is to memorize answers because working the problem will result in an answer that will be scored as wrong. Whatever truth there was to that has long since passed, and the people passing on these beliefs haven’t actually taken a knowledge test in quite some time…in some cases decades.
 
I got a 97% on my instrument written. The two questions I got "wrong" were coded as Icing. There were no questions about icing on my exam. Making sense of the exam is impossible.
Um…if you didn’t recognize what the questions were about, how could you answer them correctly? ;)
 
... and the people passing on these beliefs haven’t actually taken a knowledge test in quite some time…in some cases decades.

Not so sure about that. I'm surrounded by a large community of young, newly minted flight instructors who unabashedly say out loud that the best strategy is to get into a program that helps you memorize answers.
 
Not so sure about that. I'm surrounded by a large community of young, newly minted flight instructors who unabashedly say out loud that the best strategy is to get into a program that helps you memorize answers.

The new flight instructors that you describe clearly have no interest in "book learning". They probably believe that the only useful knowledge is learned in the airplane while the Hobbs meter is turning. Anyone can become an instructor, but that doesn't mean that they are actually good at it. I suspect many of these folks subscribe to the "anything over 70% is wasted effort" philosophy. They are clearly promoting mediocrity when they do that.
 
Not so sure about that. I'm surrounded by a large community of young, newly minted flight instructors who unabashedly say out loud that the best strategy is to get into a program that helps you memorize answers.
But they learned that from their instructors, who learned that from their instructors, who learned from their instructors....

I'll be the first to admit that the old (i.e. pre 2014) ATP knowledge test was crap. For most folks, that was the last knowledge test they took. So that sentiment carried on, and got passed on to generations of pilots. I totally understood the reasoning for Shepard for the old ATP, but I never understood it for some of the other knowledge tests like instrument or commercial.
 
But they learned that from their instructors, who learned that from their instructors, who learned from their instructors....

I'll be the first to admit that the old (i.e. pre 2014) ATP knowledge test was crap. For most folks, that was the last knowledge test they took. So that sentiment carried on, and got passed on to generations of pilots. I totally understood the reasoning for Shepard for the old ATP, but I never understood it for some of the other knowledge tests like instrument or commercial.
What? You didn't care to do a W&B on a 727? And how about that critical knowledge like how many stewardesses are needed on a flight with 24 passengers?
 
And how about that critical knowledge like how many stewardesses are needed on a flight with 24 passengers?

welcome-back-kotter-kotter.gif


Answer: One
 
I know a solution....

@write-stuff brings back his game show format.

5 prospective students compete. Winner gets his "passed the test" sheet. Losers get recycled to play the game again.

Eventually introduce Squid Game elements.
 
Learning is an ability. College is frequently (at least it used to be; now there are probably too many colleges just trying to make a buck) about teaching young people how to think, how to problem solve, etc. It isn't all just rote memorization (though that helps and is useful as well). So I'm always amazed when people can't pass the PAR/PPL exam, because it's only going to get harder from there. I mean, this isn't a popular view, but don't we want pilots who can fly the plane AND be able to comprehend what's written on approach plates and process/read/interpret them? This isn't "rocket science", but it's also more than minimal GED (and I'm sure there are plenty of successful pilots who only have a GED as well). But the point is that not everything that is taught is learned, and not everything learned is taught.
 
Took the test this morning. Missed 3 questions (made dumb mistakes and didn't read the questions carefully). I figured out exactly which questions I missed. Overall, I thought the test was super easy. Most of the information was identical to the PPL written test.
 
For what it is worth, I checked, and our database of IRA questions has a very similarly worded question with the same known "correct" answer. We've seen a number (mind you, not a lot) of questions that have clearly faulty answers - not just on the IRA but in general, usually a few on each knowledge test.

I have no doubt the original question posted here has been reported numerous times to the FAA already by those authoring test prep content. From what I've seen, these dubious questions/answers can linger for a long time before they get corrected. One would think now that there is a single company administering the tests, and that it is 100% computerized, that amendments would be made quicker, but alas...
 
I have no doubt the original question posted here has been reported numerous times to the FAA already by those authoring test prep content. From what I've seen, these dubious questions/answers can linger for a long time before they get corrected. One would think now that there is a single company administering the tests, and that it is 100% computerized, that amendments would be made quicker, but alas...

Well, it still takes the FAA 112 days or more to change something that's purely published in their own electronic database, so it's not surprising when they have to go outside their organization.

The true answer is to do what the FCC does: designate multiple competing companies to do the testing, and have a committee of those organizations maintain the question pool on a scheduled four-year cycle.
 