Wednesday, July 12, 2017

In Search of Fool's Gold: The Saga of Progressive Credentialism



By Aleksey Bashtavenko
Academic Composition
From Kindergarten to High School, America’s youngsters are taught that education is the key to success in life. The underlying explanation is simple and straightforward. In order to land a high-paying job, you must be able to think critically and display a good deal of mental agility. After all, if you want to be an engineer, you must have a solid grasp of science and mathematics. If you want to be a lawyer, you must be verbally proficient, and if you intend to enter medicine, there is an altogether different body of knowledge you must master. What about all of the other, less intellectually rigorous professions?
As for that, our guidance counselors would say that a degree makes you stand out. If you want to be a bookkeeper or a financier, you’d have a much higher chance of getting hired with a degree. Today, more people hold academic credentials than did decades ago. Previously, a degree offered a way of standing out from the crowd; today, it has become the new norm. In other words, a Bachelor’s degree is the equivalent of a High School diploma in the 70s.
As appealing as this comparison may seem, it is a false equivalency. In the 70s, employers had considerable confidence in the quality of education High Schools offered. As such, they could justify their preference for applicants who finished High School over those who did not. At that point, it seemed clear that High School graduates displayed intellectual, practical and interpersonal skills superior to those of Middle School graduates. Yet can one say that today’s college graduates are superior to High School graduates in these respects?
Managers routinely complain that college graduates are deficient in basic skills required at the workplace, such as verbal communication, mathematical calculation, writing proficiency, public speaking and interpersonal ability (https://www.forbes.com/sites/karstenstrauss/2016/05/17/these-are-the-skills-bosses-say-new-college-grads-do-not-have/). In a rigorous academic environment, youngsters can cultivate all of these skills, yet it is no secret that the standards American universities employ tend to be woefully inadequate. This is evident in light of the proliferation of ill-conceived majors such as Women’s Studies, Queer Studies, Transgender Studies and so forth. Even students who pursue more respectable scholarly disciplines do not receive the quality of education that their parents and grandparents did (https://www.insidehighered.com/news/2016/03/29/survey-finds-grade-inflation-continues-rise-four-year-colleges-not-community-college).
In the late 60s, the self-esteem movement began taking root on America’s college campuses, and the seeds of this worldview were planted in the minds of educators across the country. Shortly thereafter, a significant percentage of teachers and administrators came to believe that disciplinary problems among children stemmed from low self-esteem (https://www.amazon.com/Narcissism-Epidemic-Living-Age-Entitlement/dp/1416575995/). To rectify this apparent problem, teachers lavished praise on youngsters and evaluated their students’ work in an exceptionally lenient fashion.
Since then, the trend of grade inflation has accelerated, and students now often display the kind of ignorance and intellectual incompetence that would have been unthinkable among previous generations of college graduates (https://www.youtube.com/watch?v=I-t2TwLRdgk&t=1s). Despite the staff’s efforts to accommodate the dullest and least industrious of their pupils, the student body continues to demand further concessions. Students are no longer satisfied with a curriculum where almost anyone can earn an A with a minimal expenditure of effort; they now demand to be sheltered from views they may disagree with (https://www.theatlantic.com/magazine/archive/2015/09/the-coddling-of-the-american-mind/399356/). It is now customary for professors to issue “trigger warnings” about any assigned readings that expose students to ideas that could be considered “sexist, racist, homophobic, transphobic, ableist” and so on.
Earlier generations were not only exposed to conflicting points of view; they were required to consider them with an open mind. Likewise, they were expected to display a much higher level of academic rigor and communication skill in their class discussions than today’s students do. By the same token, students from 30 years ago were given far greater opportunities to learn to think objectively and form a balanced worldview.
In 1975, liberal professors outnumbered their conservative peers by a ratio of 3 to 1 (http://www.washingtontimes.com/news/2016/oct/6/liberal-professors-outnumber-conservatives-12-1/). Today, this ratio is 12 to 1 and continues to increase. In departments where curricula are known to carry a heavy left-wing bias, the ratio is often as high as 30 to 1. While a significant percentage of instructors in departments of finance, economics and engineering identify as conservatives, such pedagogues are virtually unheard of in departments of literature and gender studies.
Despite this, left-leaning “news outlets” routinely propagate the notion that even professors of the “soft sciences” are intellectually honest enough to refrain from forcing their views on students (http://www.latimes.com/opinion/op-ed/la-oe-gross-academia-conservatives-hiring-20160520-snap-story.html). The reality is much more complicated than that. Considerable evidence suggests that human judgment can be influenced by subtle hints (https://www.amazon.com/Pre-Suasion-Revolutionary-Way-Influence-Persuade/dp/1501109790/). For example, the effectiveness of an advertisement can change significantly if only one word is altered. Likewise, individuals tend to focus on the ideas they were most recently exposed to. In surveys that ask whether one is “unhappy” with the product or experience in question, most people reflect on their negative experiences; the opposite occurs when the question is phrased in a way that emphasizes the positive elements of one’s experience. These findings prompt the question of how students’ views may change when they are constantly exposed to ideas that subtly reinforce the left-wing perspective. Even if instructors do not explicitly articulate their point of view, they often unwittingly leave subtle hints or assign readings by authors who are sympathetic to the leftist perspective. In other words, the professor does not need to openly claim that “right-leaning Americans are racist and sexist”; all he needs to do is keep bombarding students with readings that imply this premise.
At Academic Composition, we’ve served over 10,000 different students, many of whom sought our help with papers for “electives” and “general studies” courses. Instructors who teach these “disciplines” tend to show much less restraint than those in charge of more respectable scholarly métiers such as economics, history, philosophy and political science. Karl Marx infamously declared that the philosopher’s task should be not to understand the world, but to change it. Remarkably, 18% of social science professors identify as Marxists, and many others have been influenced by Marxism in subtler ways (http://econlog.econlib.org/archives/2015/03/the_prevalence_1.html). Such academics now subscribe to an ideology that can be broadly described as post-modernism (https://www.realclearpolitics.com/video/2017/06/05/jordan_peterson_why_you_have_to_fight_postmodernism.html).
Unlike the Marxists, post-modernists deny that reality is objective and can be understood in an unequivocal fashion. However, they embrace the Marxist premise that human nature is malleable and that the human condition is predominantly a consequence of environmental rather than biological influences. Building on that assumption, they maintain that the most powerful people in society have the capacity to control the destiny of all others. Just as the Marxists regarded the capitalists as the oppressors of the working class, the post-modernists level the same condemnation at “privileged white males”.
In large part because of the alliance between Marxists and post-modernists, professors of the humanities can openly support socialist policies without explicitly endorsing Marxism or communism as it was implemented throughout the 20th century. Without offering clear-cut policy prescriptions, such academics champion vague slogans such as “fighting the right”, “redistributing the wealth”, “taxing the rich” and “doing away with capitalism” (https://www.campusreform.org/?ID=9381). In light of the post-modernists’ rejection of objective truth, it is now acceptable for instructors to parade such platitudes without taking responsibility for their intellectual negligence (http://dailycaller.com/2017/06/13/marxist-wisconsin-professor-rakes-in-170000-per-year-teaching-about-inequality-and-oppression/).
The fusion of Marxism and post-modernism not only creates an environment where professors are biased enough to skew students’ ideological orientation to the left, even when they do not intend to do so, but also ensures that the collegiate milieu is unlikely to foster youngsters’ intellectual growth. In his classic on education, “The Closing of the American Mind”, Allan Bloom showed that the attitude of cultural relativism dampens students’ passion for the truth (https://www.amazon.com/Closing-American-Mind-Education-Impoverished/dp/1451683200/). The underlying rationale is simple and straightforward: if one believes that there is no objective truth, there is no reason to spend one’s time wrestling with the big questions of life. As a result, learning becomes perfunctory, routinized and aimed at the achievement of extrinsic results.
In light of these developments, it is clear that today’s generation of college students will not receive the quality of education their parents and grandparents took for granted. Although employers may harbor nominal expectations that “college educated” applicants are better workers, they are starting to question this assumption. With every passing decade, employers become more cognizant of the gap between what academic credentials putatively represent and what they empower graduates to achieve. Hence, a Bachelor’s Degree is no longer the ticket to a middle-class living that it used to be.
Despite the diminished economic value of academic credentials, the apologists for the Ivory Tower continue to insist that formal education bestows intangible value upon students that they cannot receive elsewhere (https://www.theguardian.com/commentisfree/2017/may/12/humanities-students-budget-cuts-university-suny). In other words, it is often maintained that the humanities “teach people how to think” in an environment where such lofty skills cannot be acquired anywhere else. Yet it is a well-documented fact that ideological bias is a significant impediment to intellectual growth, and multiple academic departments are immersed in it. Moreover, in the minds of many academics, the dogma of post-modernism appears to justify the rapid decline of academic standards. In light of the proliferation of ill-conceived majors and the abysmal standards that students must fulfill in order to graduate with distinction, the suggestion that the “humanities teach people to think” is risible.
It is true that numerous employers see academic credentials as an indicator that the applicant in question is capable of “finishing something”, and in this sense an impractical university degree may not be entirely worthless. However, attaining such credentials is seldom worth the cost. In the amount of time one devotes to a degree in order to show that one can “finish something”, one could have learned a trade and acquired considerable work experience. Regardless of what skill one chooses to pursue, there is a considerable advantage to postponing formal education in favor of entering the workforce. First, this is a great way for a youngster to obtain relevant work experience and determine whether a university degree serves their purpose. Second, if college education seems appropriate at that point, it will become easier to see which specific major is worth pursuing. Third, real-world experience should inoculate most people against the ideological indoctrination that takes place in general education courses. Fourth, students who have practical experience will see formal education as merely one activity they could pursue. Unlike those who enter college straight out of high school and think that earning a degree is their only viable option, these seasoned students will be in a position to evaluate all of their options and commit to a judicious course of action.




6 comments:

Anonymous said...

Good article as usual. I think you miss something important though. You often "need" a degree because the one hiring you has one. So you damn well better have one or else!

Many years ago was filling out an application for a government job. One of the questions was "Do you have a degree?" A reasonable follow up question to that would be a degree in what? Nope no follow up question. You could have a degree in underwater card shuffling and that would at least give you a better shot at an interview.

Anonymous said...

This is impressive:

https://www.youtube.com/watch?v=nyZshmWMn9M

tpkeefe said...

Nothing new here. We all know by now that the universities have failed to teach any critical thinking skills to students. There are some hold-outs there, and good on the students that have been able to keep their academic heads above water.

More important is the other comment . . . that is, that employers still, to this day, insist on you having a degree because they have them and many of their staff do as well, even though these people are incompetent and deficient in basic social skills. No degree = no interview or follow up. And, if you do have a degree, there's no guarantee that you get a follow-up from that, either.

Much of the blame still belongs to the employers for insisting on degrees as proxies for something they fail to do -- TRAIN people and INVEST in them.

kerdasi amaq said...

Dropping standards for degrees means that college professors don't have to reach the same standards of competence as they used to do.

JK Brown said...

Anon 3:45

Funny thing, if that government job was for US federal employment, it is illegal as in there's a law, not just a regulation, to require a degree unless it is specifically required for the specific duties of the job. Of course, the law is so rarely enforced that most hiring managers don't know they have to specifically justify the education requirements.

A Texan said...

I would still like one of these teachers or an administrator with their BS 'educational doctorate' tell me why after a K-12 education, no one in the top 10% or 20% of a high school graduating class can immediately apply for any elementary teaching positions without another BS 4 year credential? If K-12 is really that good, you should at least qualify for an elementary schoolteacher job.