The PhD vs. DBA Degree (10/28/16)
Note that I've been a professional Oracle database administrator since 1993, and I've usually referred to myself as a DBA, the industry acronym. That's not the sense I mean in the context of academia: I hold a PhD in MIS, not a DBA (Doctor of Business Administration).
To explain the context of this segment: on my current project, I recently met Andrew, a retired military veteran (enlisted, like my Dad was) who managed to earn a doctorate by the time he retired. From what I understand, the main thesis of his dissertation dealt with the looser coupling of military officers from their troops as they advance through the ranks (predictably, his former service disagreed with his findings). While my academic discipline was MIS (a more managerial/systems perspective of information technology than computer science), Andrew got his degree in Management (Organizational Behavior) from some university in Arizona (not a well-recognized school like the University of Arizona or Arizona State). So our colleagues and/or clients refer to us as "Dr. Ron" and "Dr. Andrew".
Actually, our degrees differ beyond our major disciplines: I hold a PhD, and Andrew has a DBA. I really wasn't that familiar with the DBA degree and did a bit of searching on the Internet. I realize that I risk oversimplifying the differences, but I think the following description is a decent introduction to the topic.
The DBA is a more practically oriented professional degree program; I've used the analogy that an MA is to a PhD as an MBA is to a DBA. I hold both an MA and an MBA; my MA (in math) required a thesis (on a topic in abstract algebra). The MBA did not require a thesis per se, but we had a capstone business strategies course (with a number of competitive group presentations on business cases), plus a final course presentation on a company of our choice. (My group chose McDonald's. My group co-lead was interested in McDonald's because he and his then girlfriend/now wife had a long-distance courtship; they lived 300 miles apart and used to reunite at a McDonald's about halfway between their cities. I grew up as a military brat, the oldest of 7; we only rarely ate out given my parents' limited income. For us, McDonald's was for a special occasion, like my First Communion and Confirmation. It wasn't just the burgers and shoestring french fries; I loved the (vanilla) milkshakes (not simply soda pop). Ray Kroc, the long-time CEO, actually got his start selling milkshake machines to the original McDonald brothers.)
The PhD is internationally recognized, theory-based, and oriented more toward original scholarship and a career in full-time academics. Many, myself included, earn their doctorates by their late 20s or early 30s with limited full-time work experience. (There are exceptions, of course.) Many of us got through the required full-time residency on a stipend, often tied to teaching or research duties. For me, I was a teaching fellow, teaching two classes a semester; I also won a competitive dissertation award and a fellowship my last year. Our research is generally done from scratch.
For example, I managed to collect nearly 500 completed questionnaires, easier said than done. I recall in one instance I went to make a sales pitch at one local company; I had to go through a committee for the protection of human subjects, which, among other things, required me to state the voluntary nature of study participation. In this case, the company additionally required me to supply stamped, self-addressed envelopes and then went out of its way (the proverbial kiss of death) to tell prospective respondents it didn't care one way or the other whether employees participated. The whole cost came out of my own pocket; I think I got one returned questionnaire from the 20 or more distributed to this group. I wasn't happy; I had invested a lot of time recruiting the company and preparing for my presentation, and I had paid more than $10 in postage stamps (when rates were significantly lower), not to mention reproduction costs and of course my time (multiple trips to the company location). It's not so much that my fellow newly minted MBA's were making multiples of my income, but every day I remained an ABD vs. professor was a personal financial sacrifice. (It's not that a professor makes a high income; many of my former students were making more than I ever did within 5 years, and I remember getting less than a 2% raise after a particularly productive year. But I knew professors teaching fewer classes than the 2 I taught, and their incomes dwarfed mine.) I lived on a limited budget (roughly $500/month), I rarely dated, and I remember celebrating one of my comprehensive exams (MIS, accounting, and orals) by going to a dollar cinema and buying a carton of popcorn. Don't get me wrong; I'm very grateful for the stipend UH paid, which enabled me, as a bachelor without dependents, to pay my living and college expenses without taking on huge loans.
Indeed, I was in a better position than many of my married colleagues with families to support during their mandatory residency period. Still, I was doctoral candidate #16 in my discipline, and I leapfrogged a dozen to earn PhD #4 (#3 defended her dissertation the week before I did). I managed to complete the dissertation from scratch within 14 months. A number of my colleagues elected to go back to work full-time to support their families after the residency requirement, but a dissertation for most is something that requires more than a part-time effort. In fact, a number of them ended up defending within a couple of years after me; the university was cracking down, reminding ABD's they were up against a 5-year limit to defend their dissertations or risk having to reestablish candidacy with another round of comprehensives, which nobody wanted to do.
The DBA candidate tends to be more of a mid-career professional, oriented more toward adjunct (part-time) professor/instructor status, if any academic role at all. They often self-finance their degree program, and their data collection may draw on the resources of their own or their employers' organizations. They focus on more pragmatic research topics than on theory-building.
Oddly enough, I campaigned for a variation of the DBA degree after attaining PhD faculty status at UWM. (At the time, they required evidence, beyond the dissertation, of an ongoing research and publication program to qualify; I think I was one of the last to earn that status. The committee eventually liberalized the program criteria to automatically allow admission after a certain time in position, like maybe a year.) While meeting with my senior area colleagues (we would meet to devise upcoming comprehensive exams), I wanted to encourage people with IT/real-world experience to apply to the program. One of the tenured/senior faculty was chairing the PhD program committee and specifically vetoed that idea. I can only hypothesize that he felt defensive, having gone straight from his undergraduate degree to his doctorate.
It turns out that Frank, one of the Wisconsin Bell executives whom our lead professor had recruited (in a consulting role), went out of his way to call me before I left UWM, saying that he had heard great things about my graduate systems analysis class and regretted not having had an opportunity to take it. What Frank probably never knew was that his mentor had personally threatened my career in my very first semester by explicitly reminding me that I had no vote in my own tenure process. The informal area chair was irate that I, a mere junior professor who didn't realize his place was to be seen and not heard, had privately critiqued a poorly written dissertation proposal by one of his students.
I had befriended the student; I remember meeting his young kids when I brought in McDonald's sundaes. But he had been oddly defensive and evasive when I asked about his proposal: where there's smoke, there's fire. One thing you can't hide: university procedure requires freezing a proposal or dissertation before its defense, and faculty can check out copies. The proposal was a disaster, with inadequately described methodology and nebulous, trite hypotheses like "executives with improved information will make better decisions". What it boiled down to was that he and his chair wanted to do a field test (an experiment with workers as research subjects), a dubious, almost impossible sell job to companies, which do not hire employees as captive research subjects but to do productive work, and he was going to use student subjects as a backup position. But there was no fleshed-out plan of how many subjects, study materials, measures to be used, etc. Even a young researcher like me could rip it to pieces, and I thought a failed proposal would be personally devastating; I told him he should withdraw his proposal because it was premature, at best. He flatly refused; he wanted to go on the academic job market, and he felt that without ABD (all but dissertation) status, he wouldn't be competitive. I see the proposal as a type of contract with the committee: execute the approved plan, and even if you don't come up with significant findings, you've advanced knowledge. When you've written a vaguely fleshed-out proposal, even if it succeeds, you've only deferred the tough questions until your defense, which opens Pandora's box and is even more devastating if your committee doesn't sign off.
I did attend the proposal defense but kept my mouth shut. The chair had quickly recruited a widely known organizational behavior researcher onto the committee and explicitly warned me that if I so much as opened my mouth, the faculty were prepared to attack me personally at the defense. It was unethical, morally bankrupt, undermining the very concept of a university. But the fact remains that the proposal was garbage: the student knew it, the dissertation chair knew it, everybody knew it; you don't resort to blackmail to suppress an honest, irrefutable critique. It shattered my ideals of an academic career. I honestly thought that when I left the private sector, I no longer had to deal with petty organizational politics, a Leibnizian notion of "Come, let us calculate our differences"; wrong: academics can just be smarter, more elitist, self-serving assholes.
I knew when I was threatened that my career at UWM was over; I had barely gotten out of the starter's box my first semester. The senior area faculty (at the time, UWM's business school did not have formal departments, chairs, etc.) basically tolerated me, but I was notably excluded from dissertation committees and the like. The four professors were allied into opposing pairs, which oddly made me the swing vote--or the one thing that united the four in a common ground of opposition.
I miss being in academia (other than the politics); I LOVED teaching, research, and attending academic conferences. I'm not generally into cocktail parties and small talk, but I would introduce myself to other researchers whose articles I had read or studied, go to lunch with European professors, etc.
I think in another way my leaving academia was a loss to the practitioner community. I felt my prior IT experience as a programmer/analyst had benefited my academic perspective, especially in my chosen research interests in documentation and human factors. Fellow APL programmers notoriously wrote complex, highly concise but uncommented code, and my employers made money when clients ran applications on costly computer time. I've always had a gift for reading and fixing other people's code. I had to laugh when a boss once presented me with a hard-to-find document for a system I was maintaining; it was literally yellow with age and no longer reflected the current design. Let's be clear: maintenance programming and inheriting the work of less competent programmers is not what any talented IT professional wants to do, and I wasn't keen on being stereotyped as a hard-to-replace maintenance programming guru.
On the subject of usability, a couple of examples come to mind. HP had introduced a 4-color plotter at the time, and my employer had a programmer develop an interface for generating plots. But it was virtually unusable; the user had to manually decide on literally dozens of parameters to generate even a simple plot (e.g., how many tick marks along an axis, the length of an axis, the type and size of tick marks, the colors of lines in the plot, etc.). There was no concept of a "quick and dirty" example plot, boilerplate plots, etc. My boss simply handed me the interface and asked me to do something with it. So I produced a design that radically simplified the interface to maybe a half dozen intuitive choices, with sensible defaults for the rest. We were soon printing profitable $20-25 plots like crazy, bringing in significant revenue for the branch.
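To make the design point concrete, here's a rough modern analogue (a sketch using gnuplot and an invented data file, not the original APL interface, which is long gone): the tool supplies sensible defaults for every parameter the user doesn't care about, so a "quick and dirty" plot takes one line instead of dozens of decisions.

```bash
#!/bin/bash
# Sketch of the "sensible defaults" idea using gnuplot (a stand-in for
# the original APL plotter interface; 'data.dat' is a hypothetical
# two-column data file). Tick marks, axis lengths, and line colors are
# all chosen automatically; any of them can still be overridden.
gnuplot <<'EOF'
set terminal png size 800,600
set output 'quick.png'
plot 'data.dat' with lines title 'example'  # everything else defaulted
EOF
```

Each of those defaults maps to one of the dozens of choices the old interface forced on the user up front.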
In another case, the client was an Exxon real estate development subsidiary, and I never even met the client manager in person. He wanted me to design an application to track computer timesharing costs. I would have liked more than a phone conversation with fuzzy specifications of what he wanted, but I ended up writing a usable app; he was happy with the results, and I moved on. Weeks later, he called to let me know that my little program had spread like wildfire; something like 16 Exxon executives in other departments had already adopted it. He encouraged me to start my own company. (Long story, maybe another post in the future.) It gratifies the ego, of course, to see others appreciate one's work; it reminds me of Sally Field's infamous "you really, really like me" Oscar acceptance speech, or Trent Reznor's reflection on writing his classic song "Hurt" and how it amazed him to see his simple song idea reach mainstream success. But I was fascinated by companies pouring millions into the development of systems or software that became failures; what was it that I was doing differently? Of course, I wasn't dealing with large-scale implementations, but my efforts had been based on limited, vaguely specified conversations, while a large project typically deploys a number of experienced, dedicated systems analysts.
Another example (although this one is from my post-academic experience): my employer had consciously left me off a project team that was designing an Oracle application to generate a mailing list of millions of customers; they tested everything--except scalability. They finally went live over a Fourth of July weekend and needed the mailing list by Monday. I got an emergency call around midday Saturday pleading for help: the application was spitting out one record every 15 minutes. It didn't take my 2 math degrees to figure out they weren't going to make the Monday deadline. I quickly designed a radically different alternative that processed whole tables instead of individual records; it ran in less than 5 minutes, and we met the Monday deadline with time to spare. Others might look at my solutions and say, "I can do that", but in fact they didn't.
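My actual code is long gone, but the principle is easy to sketch with an invented schema: instead of a client-side loop fetching and writing one record per round trip, hand Oracle a single set-based statement and let it process the whole table in one pass.

```bash
#!/bin/bash
# Hypothetical sketch of set-based (table-at-a-time) processing; the
# schema and connect string are invented, not the actual application.
# One INSERT ... SELECT replaces a loop that inserted one record at a
# time; the APPEND hint requests a direct-path (bulk) insert.
sqlplus -s "$ORACLE_CONNECT" <<'EOF'
INSERT /*+ APPEND */ INTO mailing_list (customer_id, name, address)
SELECT customer_id, name, address
FROM   customers
WHERE  opt_in = 'Y';
COMMIT;
EXIT;
EOF
```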
Another, more current example I recently discussed in my software blog: for years, I've been collecting quotes (the Quote of the Day is one of the signature regular features of my miscellany posts). One freeware/abandonware product is Qliner Quotes; one of its features let me insert a random quote from my custom quote text file. So I could generate random-quote HTML code that changed every so many minutes, stored in my Dropbox account. My Thunderbird email client allows me to reference an HTML file for my signature. I recently had to reinstall Windows 10 to get a recent major update, and the Qliner software subsequently refused to install, arguing it needed some obsolete Microsoft software. It wasn't worth my time and effort to figure out how to resolve the incompatibility issue. But one of the latest optional beta features in Windows 10 is an Ubuntu/Linux implementation, and I quickly created a simple Bash script that essentially replicates the desired functionality of Qliner. (On a side note, I'm annoyed that Gmail's signature settings don't (yet) allow references to HTML signature files like Thunderbird does.)
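The script itself is trivial. A minimal sketch (the paths are invented, and it assumes one quote per line in the text file):

```bash
#!/bin/bash
# Minimal sketch of the Qliner replacement: pick one random quote and
# write a small HTML fragment that Thunderbird can reference as its
# signature file. Paths are hypothetical; assumes one quote per line.
QUOTES="$HOME/Dropbox/quotes.txt"
SIG="$HOME/Dropbox/signature.html"

quote=$(shuf -n 1 "$QUOTES")   # shuf picks one random line

cat > "$SIG" <<EOF
<html><body>
<p><em>$quote</em></p>
</body></html>
EOF
```

Scheduled every few minutes (e.g., from cron), it rotates the quote much as Qliner did.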
In any event, when I was attending a doctoral consortium (each school was limited to one student, and I represented UH), I was surprised and gratified to hear the keynote speaker specifically single out (not by name but by topic) my dissertation as the type of practically oriented research he would like to see more of. A jealous nearby colleague whispered, "Dude! He doesn't understand what your research is all about..." However, I agreed with the speaker; improved metrics helped address a key management problem. I can't tell you how many times a writer wrongly sees himself as the target reader and indulges in highly pretentious nonsense of system acronyms and details one must master to make productive use of the system. Dude, if I'm trying to fight a fire in the trenches, I'm not going to read a 300-page document like a novel in the hopes of finally uncovering the one salient detail I need to know. I remember when I turned my dissertation in to the university, the administrative assistant said, "Finally! I can actually read the title of a dissertation." I didn't seek to impress my audience with some pretentious, cryptic title only a handful of people could understand. I don't write for reasons of vanity; I put a lot of work into rewriting and reorganizing the material to make it more accessible to a wider audience.
A final anecdote. We all know the story of the handyman who comes in, does a fairly simple repair, and then presents a hefty bill for his services. The homeowner balks at such a steep bill for a simple fix. The handyman responds, "I actually charged only a small fee for the fix. The rest of the bill is for knowing a simple solution existed, which you didn't know yourself when you called me." It reminds me of a client in the Virginia Beach, VA area; a hostile Unix system administrator watched me like a hawk as I quickly turned around a failing, stalled ERP project, and he complained to his bosses that I wasn't doing anything he couldn't do. Dude! You had caused a major problem by downloading a newer version of Oracle software incompatible with the rest of the infrastructure. (One generally has to run interoperability patches, which usually aren't available until much later.) They weren't simply paying me the big bucks for following documented procedures; they were also paying for the fact that I wouldn't make boneheaded mistakes costing the company thousands of dollars, leaving expensive functional employees and consultants unable to do their jobs, and missing critical project deadlines.
It turns out common sense is not so common.