I remember there was a time when I thought I was finally through with tests. On the road to a doctorate and ABD ("All But Dissertation") status, I had to pass a major (MIS) comprehensive exam, a minor (accounting) comprehensive exam, and an oral comprehensive exam. There weren't exam banks or "brain dumps"--I don't even think I talked to an existing ABD. I really didn't have a clue what might be on the exam; my professors weren't much help. "Anything in the literature over the past 15 years.... You'll do fine." My major exam took two solid workdays--I think I wrote out nearly 2 dozen filled pages; I had intense hand cramps, and the edge of my left hand was solid blue with ink. (About 60% of lefthanders, including myself, have a hooked writing style.) I figured that as a professor, I'd be the one giving tests instead of taking them. Well, even though I haven't taken another class or decided to pursue a second doctorate, I often find myself tested, whether it's a tech screen for my next job or position, my next IT certification, etc.
Not all exams are created equal. I remember when I qualified for PhD Program status at UWM. (At the time, one had to show evidence of scholarly productivity; I think I was the last one who had to qualify: I believe they shifted to automatic qualification at the end of a faculty member's first year.) I still remember the first student exam I participated in. The student came into my office, trying to brown-nose me, for a glimpse of what my question might be. He mentioned that he had read all my articles. I had to suppress my laughter; I'm proud of my work, but I also knew I was working in what was then a tiny niche of human factors in IS and methodology. I was well-read and had a number of interests. Unfortunately, the other professors were more predictable: DH would have his database question, KH might cover some end-user computing question, GH would have some sort of computation or simulation modeling question, etc. I came to the exam committee meeting with 3 or so questions. The others loved my questions, but not enough to give up their own exam quota; I got my one. My questions were more open-ended; there wasn't necessarily a best answer. I was more interested in seeing how he fleshed out a response. (I was not pleased with his response.)
No student has ever scored 100 on one of my quizzes or exams; it's not that I gave "impossible" exams or used tricky questions. I remember one microeconomics professor (this was a summer session course I had to drop because of my work schedule): he used to give these wicked multiple-choice questions with a variable number of right answers. In order to get full credit, you had to list all the correct choices and only the correct choices. If you listed any incorrect answer, you got double negative credit (e.g., if you got one right and one wrong, you ended up with minus points). If I'm not mistaken, there were students who ended up with negative scores on exams, and top scores were in the 20s to 40s. I never sought to intimidate or terrorize students; in fact, I often allowed them to bring in a cheat sheet (and it was amazing what some people could squeeze into a sheet of paper...) and gave exam curves of up to 20% or more. But I have a gift for writing great questions, especially ones that could discriminate among different tiers of students on a grade scale. The test patterns were so reliable that by midterm I could tell you the top 5 in rank order before grading the exam. If I was writing a multiple-choice question, I could come up with some great bluffs. I could rewrite the same question at least a half dozen different ways.
I've mentioned in the blog what some of my UH undergrad students (while I was a teaching fellow) said about me. I once met one of my earliest students as a salesman when I came in to buy a suit. Like all perfectionists, I thought of all the mistakes I made and things I would have done differently if I were to do it again; I all but apologized for his being one of the first to endure me. He waved off my attempts: "You know, you gave what I considered to be my first real college exam. Man, when I was taking you, I hated your guts. But later I realized I learned more in your course than any other I was taking. I wouldn't change a thing. In fact, I was just talking about you at lunch the other day." I was amused by other descriptions of my exams: one guy compared taking one to having a lobotomy. One group of students joked that they rated my exams by how many beers it took to forget them.
I never went into writing an exam by simply writing questions off the top of my head. (I did come up with some wonderful test questions off the top of my head, but they would become part of an item pool that I would sample from in test design.) I went in with the intent of sampling the incremental (or, for finals, cumulative) exam material as comprehensively as possible. I kept meticulous word-processed lecture notes and tracked my daily progress in covering the material. Obviously there's only so much you can do in 60-90 minutes of exam time; I wouldn't ask them to code a program--but I could ask them to identify which division of a COBOL program a line of code belongs to, to design a picture clause to accommodate a given output field format, or to identify the issue with a line of code and fix it. One of my research interests is measure development, and I have read literally thousands of articles in the refereed applied psychology and education literature. So I approached test design with a much more rigorous process and well-defined behavioral objectives.
Over 8 years of college teaching, I got almost no critical remarks about my tests, directly or through course evaluations; only a couple of occasions come to mind. One involved a question I had used in an undergrad systems analysis course from a textbook's instructor test bank (and which I had thrown out as part of the exam curve); the other was a question about a COBOL subtract statement that one coed insisted I hadn't covered (in fact, according to my lecture notes, I had covered it in the last class before the exam, but I think I ended up throwing out that question, too).
In the professional ranks, experienced candidates often go through a technical screen. Oddly enough, past employers have not used me often in that capacity. I myself have gone through a number of brutal ones (for example, you could be asked about the details of something you did once 8 years ago or some Oracle technology you've never used); I'm not a fan of people who try to quiz me about things not on my resume. In one case, I drew a particularly difficult, unusual Oracle question while interviewing for a position with a federal contractor. They ended up making me an offer, and after I came on board, I asked the DBA in question, "Where did that question come from?" He shrugged his shoulders and said, "I've been researching this problem for 2 weeks. I just wanted to see what you could do with it."
Here are a couple of examples of screens I have done. In one case, I got a polished, puffed-up (professionally written) resume. I decided to describe a key, major piece of Oracle database infrastructure from a functional standpoint and asked the candidate to identify the technology. It really wasn't designed to be a gotcha question, but most interviewers would probably have asked him to define the technology; my question inverted the usual approach. What's sad is that over the phone I could hear him flipping through the pages of a book, desperately seeking an answer. I thought to myself, "Dude, are you really cheating on a tech screen? Stop embarrassing yourself..." We went through a few more questions, and those didn't go any better. I started to wrap things up, when the guy said, "Look, I know I screwed up. But I need to know what the answer to the [infrastructure] question is or I'm not going to be able to get to sleep tonight." I was impressed that he was trying to learn from the experience.
In a second example, I had been working as a subcontractor for one of the City of Chicago entities. I was working for an outsourced staffing group; I had experience with both Oracle EBS versions 11.0.x and 11i--one of the motivations for hiring me, because the city entity was in the process of upgrading from the former to the latter. The consulting business unit of the same company that housed the outsourcing unit won the bid for an aggressive upgrade project. The first DBA was released 2 weeks into a 12-week schedule for violating department security policy. For some reason, the PM, also a PhD, did not ask me to screen a successor, and he had no Apps DBA experience. The first time I "met" the new DBA was on a speakerphone while I had lunch with my manager and the PM. I think the PM asked him something like what the first thing he was going to do on the project was, and he said he was going to use adclone/Rapid Clone to copy the production database. RED FLAG! I told my boss and the PM after the call that this guy was totally bluffing his experience. For one thing, he didn't have to do a clone--it took me about 3.5 hours to do one, which I routinely did in refreshing my test and dev environments. But more to the point, adclone/Rapid Clone was new 11i technology that did not exist for 11.0.x and earlier. We had to develop our own cloning scripts before 11i (e.g., the source infrastructure might be on a different server, use different ports, etc.). But the PM and my boss thought I was a control-freak production DBA; not really: if they left me with a botched upgrade, I could be fighting fires for months or years to come. My boss's agenda was different; he wanted to give the PM all the rope he needed to hang himself. If the consulting team failed, it didn't affect his unit's contract, but the last thing he wanted was the PM pointing to me as the reason the project failed. He started keeping a batch of DBA resumes on his desk where I could see them, and I started seeing descriptions of my (low-paying) job in Internet job board email notifications.
I've gone into this incident in more depth in other blog posts. Let me give one example. Running Oracle on a Windows platform (like we had) was problematic because of registry concerns, among other things. The RapidWiz (11i software installation) documentation specifically said, in boldface, that if you are running this on a Windows system, make sure it's one without Oracle software already installed. So one day I'm in the server room and I hear complaints from the project DBAs at their server. They had just bounced the server and couldn't bring the database up. In essence, they had ignored the warning, and the server came up pointing to other, unconfigured Oracle binaries. I told them, "Look, guys; the reason you run RapidWiz at this stage is to get access to scripts for shutting down application processes prior to upgrade activities. I can give you a fresh server and make those scripts available to you in a clone environment." The bad hire said something to the effect of, "Prove to me where Oracle says that..." How do you prove common sense? This is a guy who is probably still in a state of denial about the Oracle warning. My boss wanted me to open a problem ticket with Oracle to verify the plain reading of the warning... In the meantime, I painstakingly went through the registry and reversed enough entries to enable them to bring up the database and continue work. But the bad hire said, "This guy just hacked the registry, and I don't trust the server now."
He rejected my warnings to keep the database in ARCHIVELOG mode; he made weekly backups instead, but he didn't ensure all database services were down first (i.e., his backups were unusable). Then, about 4 weeks after he came on (6 weeks into the project overall), without ever once getting to the upgrade driver itself, he decides to add a datafile to the SYSTEM tablespace, has second thoughts because he could have simply extended any of the other 3 datafiles, and fat-fingers the new datafile. (I would terminate a DBA for cause for doing that.) Oracle crashes into an unrecoverable state. I didn't witness this. How do I know? The next morning, city personnel are doing training on our Vision Demo database, so I do a last-minute check to see that everything is up. The concurrent processes are down; I try to bring them up, but they fail. (In most cases, this is because Oracle thinks the processes are already up, which means I have to clean up dirty process statuses in the database. Why does this happen? A DBA shuts down an Apps database without first bringing down the concurrent processes.) I get Vision Demo back to a usable state. I knew that I hadn't touched Vision Demo, but I couldn't figure out why the bad hire would have shut down my Vision Demo. I didn't even know how he got access to it.
So I asked CB point blank: did you shut down Vision Demo? He said yes. Why? He explained what he had done and that he had allegedly called Oracle Tech Support. He said that the analyst asked him to check and see if there was another database that had a SYSTEM04.DBF file. It turns out Vision Demo had such a file, so he shut down the database and copied the file over to his server. "And, Ron, would you believe it didn't work?" This was a tragic situation, but it took all my self-control not to laugh openly. I guarantee that if you tell this story to any experienced Oracle DBA, it's milk-squirting-out-of-your-nose funny. In short, you can't mix and match datafiles across databases: every datafile header is stamped with its own database's identity and checkpoint information, and the control file won't accept a file lifted from a different database.
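For the non-DBAs, the sane alternatives were mundane. Here's a minimal sketch of what I mean, the kind of thing you'd run from a SQL*Plus session as SYSDBA; the file paths and sizes below are hypothetical, not the city's actual layout:

    -- Keep the database in ARCHIVELOG mode so a botched change is actually recoverable:
    SHUTDOWN IMMEDIATE;
    STARTUP MOUNT;
    ALTER DATABASE ARCHIVELOG;
    ALTER DATABASE OPEN;

    -- If SYSTEM needs room, the low-risk move is to grow an existing datafile in place...
    ALTER DATABASE DATAFILE '/u01/oradata/PROD/system03.dbf' RESIZE 4000M;

    -- ...rather than bolting on a brand-new file, the route where, in this story,
    -- a fat-fingered command left SYSTEM in an unrecoverable state:
    ALTER TABLESPACE SYSTEM ADD DATAFILE '/u01/oradata/PROD/system04.dbf' SIZE 2000M;

None of this is exotic; it's the first chapter of any DBA backup-and-recovery course.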
So that's how I transitioned to a joint role; CB was supposed to be terminated on his return home to Tampa that weekend, but the client IT manager went ballistic over the lack of progress and demanded that all DBAs work over the weekend, including CB. Anyone thinking I would trust CB after what he had just done is crazy; I worked with one of the junior DBAs for "knowledge transfer" purposes. The IT manager never met with me once over my entire stay. After I successfully launched into the Apps driver patch over the weekend--more progress in 2 days than over the previous 6 weeks--everyone was upbeat. I still pressed the PM that I wanted CB gone, but now he was trying to justify keeping him, saying that if he kept CB on, he wouldn't have to train another DBA up on the project.
This is a long lead-up to another case where I did a tech screen. By now there were up to 5 project DBAs. I could bill for 8 hours a day, including my ongoing production tasks. I wanted to schedule the other DBAs around the clock to babysit processes, but the PM refused. One of the junior DBAs had to leave the project, and the PM had CB doing other things with one or 2 other DBAs. So I was given a phone number and/or resume and asked to interview the candidate. I quickly came to the conclusion that he wasn't qualified for the position, so I told him something to the effect that I would give my feedback to the hiring manager and the agency would be back in touch, to which he responded, "I'll see you Monday." Monday? What was the point of having me qualify him after you had already decided to hire him? Well, it seems that he only had a 2-week guarantee... CB was screaming that the guy was incompetent by the end of his first day on the project. Talk about the pot calling the kettle black. The DBA was let go when his contract guarantee expired.
Then come professional certification exams. I have long supported baseline data and the use of standardized measures, especially in trying to assess junior-level talent when colleges have varying standards. Now, personally, I find them annoying; I have 3 graduate degrees, including a doctorate, and I have worked for the likes of IBM and Oracle itself. I decided to pick up my 10g OCA while I was on the bench working for CSC Consulting's National Oracle Practice. (The sales team was going through a cold spell of losing project bids.) I was willing to pick up my OCP, but Oracle required candidates to take one of 8 classes at $3,500 apiece. CSC told me they would consider funding my class if and when I got my utilization/billing up, which of course was out of my control.
However, if you are outsourcing your technical hiring process to certifications, you might want to think again. One example was when I served on UWM's MBA Admissions Committee. Usually there was an automatic admission if, say, you had a high enough upper-level GPA and a good enough GMAT score. I recall an applicant from the University of British Columbia who held a doctorate. No doubt he felt insulted that UWM was telling him he had to take the same stupid test every other applicant had to take. I don't recall his score, but it was probably the lowest I had ever seen. I don't think anybody thought blowing off the test respected the process, but we all agreed his qualifications justified a waiver and voted to admit him.
On the other hand, some certifications do not require relevant experience, and I have seen certified people who were not able to do the simplest practical tasks. I recently wrote an essay about trying to mentor a millennial. She reportedly held a well-regarded computer security certification and an Oracle OCA certification. But she needed my help in configuring networking for static IPs and server firewalls, and in locating, downloading, and unpacking zipped Oracle server software, not to mention prepping servers for Oracle and running the installation and configuration GUI utilities, much of which is clearly covered in Oracle's documentation. The certifications she's earned aren't that easy, and granted, you can't assign a 4-hour install in a 90-minute exam--and there are things beyond installs to test for on certification exams. She recently resigned from my employer and told me, in effect, that local recruiters had been coming out of the woodwork trying to hire her. (We had been warned that our project funding might dry up in a matter of weeks, with potential layoffs.) How do I explain this? In a phrase: brain dumps--banks of past certification exam questions; the same questions, or highly similar ones, often appear on the actual exams.
I described a variation of this in a past post. I had a managerial finance course during my MBA sequence where the tests were brutal. I earned my A the old-fashioned way--through hard work. Flash forward to my shared doctoral student office. Bruce, Minnie, and I occupied the far corner cubicles (the office door was at the left front corner). A few MBA students (graders or teaching assistants?) sometimes occupied a cubicle between Minnie and me over our ABD periods. I remember one young guy, taking the same course from the same instructor maybe a year or 2 after me, was having problems with the simplest net present value questions, e.g., what would you be willing to pay today for a stream of payments starting 3 years from now, given a current interest rate of 4% a year? (I'll put a quick worked example at the end of this story.) Even after I walked him through several problems, he didn't seem to be picking up on it--and quite frankly, these were the easier problems. I seriously thought he was going to fail his exam. So I warily asked him several days later how his exam went... "Oh, I got a 96. I was done in 5 minutes. I missed a couple on purpose so the professor didn't get tipped off. I waited 30 minutes before turning in the exam." How in the hell...? He explained that there was a small, select group of students who were recycling the professor's exams for a price, and the current students merely memorized the exam keys. They couldn't let most students into the group for fear of blowing the class curve. (That wouldn't have worked in my classes because I never reused the same exam questions across semesters--but then I taught a variety of courses as a professor and often changed textbooks.) This put me in a difficult position because I felt the students who worked for their grades were at a competitive disadvantage. I felt a need to tip off the lazy professor without betraying a confidence. The professor seemed totally unconcerned by my news of the marketing of his exams: "How very industrious of them!" Stunned, I snapped back, "You might have these brilliant students come and demonstrate to the rest of the class how they solved the problems..." A few weeks later, he flagged me down and tersely said, "I see what you were talking about; I'm taking care of it." I was surprised, given his initial flippant response, that he hadn't totally blown me off.
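As promised, here's the arithmetic behind that kind of present value question--a minimal sketch with hypothetical numbers (a $1,000 payment each year in years 3 through 7, discounted at 4%), and since this blog keeps coming back to Oracle, I'll write it in SQL:

    SELECT ROUND(SUM(1000 / POWER(1.04, t)), 2) AS present_value
      FROM (SELECT LEVEL + 2 AS t FROM dual CONNECT BY LEVEL <= 5);
    -- each payment is discounted back t years (1000 / 1.04^t) and the results are summed

The query returns roughly 4,115.96: you'd pay about $4,116 today for that stream. That's all the question asked for.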
Of course, cheating doesn't just occur at the college level, and it isn't always the students. There's the case of 11 Atlanta educators convicted of changing answers on the kids' standardized tests.
Finally, I would like to see some testing of candidates for public office, e.g., knowledge of legislative procedures, the government's financial statements (including unfunded liabilities), budgetary priorities, economic literacy, regulatory costs, etc.