Surf’s Up

Demographer and statistician Steve Sailer has a book review (“Surfer Privilege” at Taki’s Magazine) of war correspondent William Finnegan’s Barbarian Days: A Surfing Life.  It’s about Finnegan’s idyllic youth in Southern California and Hawaii at a time when a working-class Irish family could afford real estate in some of the United States’ most desirable zip codes, while also supporting four children (Finnegan’s father worked in television, but was a pump jockey at a gas station upon first moving to Los Angeles; he could purchase a house and support his wife and child on that salary).

I recommend reading the book review linked above, as it contains classic Sailerean demographic analysis.  The real estate opportunities accessible to working- and middle-class Americans in the 1950s and 1960s are truly astonishing, and Sailer argues that, if you were born in 1946, the world was your oyster (Sailer was born in 1952, which he argues was also a pretty good year to enter into this world).

The real estate analysis rings true.  I’m 33 and earn a modest income as a history and music teacher at a small private school in rural South Carolina, which I supplement with adjunct teaching at a local technical college and with private music lessons (as well as the occasional music gig).  I’m also an extreme budgeter and put a significant chunk of my earnings into retirement accounts (IRAs and a 403(b) through my employer), and I drive a twelve-year-old Dodge minivan.  While I live like a king compared to most people in human history, I still rent a little cottage and don’t support any dependents, much less a wife.  I’ll probably work hard for most of my life (though my long-term retirement planning should pay off over the course of decades; I’m definitely “getting rich slowly”), and I’m not counting on Social Security being around when I hit 70.
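Since I mentioned “getting rich slowly,” here’s a toy sketch of the arithmetic behind steady retirement contributions compounding over decades.  The dollar amounts and return rate below are hypothetical illustrations, not my actual figures:

```python
# Toy illustration of "getting rich slowly": a fixed monthly contribution
# compounding over decades.  Figures are hypothetical, for illustration only.

def future_value(monthly: float, annual_rate: float, years: int) -> float:
    """Future value of a fixed monthly contribution, compounded monthly
    (ordinary annuity formula)."""
    r = annual_rate / 12   # monthly rate
    n = years * 12         # number of monthly contributions
    return monthly * ((1 + r) ** n - 1) / r

# A hypothetical $500/month at a 7% nominal annual return:
for years in (10, 20, 30):
    print(f"{years} years: ${future_value(500, 0.07, years):,.0f}")
```

The point of the exercise: the last decade of compounding does far more work than the first, which is why a modest earner who starts early can still come out ahead.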

Had I been born when my parents were, I’d probably have a house, a wife, four kids, a pension, and a convertible, earning six figures in “consulting.”

I’m not complaining.  I highly value hard work, and I don’t think demography is always destiny (just look at all the miserable, divorced Boomers who are trying to figure out what went wrong).  I believe God has a purpose for us, and we live in our respective time for a reason (not that I haven’t, at times, experienced a sense of dislocation from our current era).

But Sailer’s demographic analysis of the period under consideration—a time that was so safe and prosperous, a kid could spend thousands of hours surfing and his parents didn’t much worry about him—is compelling, and points to long-term problems endemic in our culture today, such as mass immigration, an overly rosy view of diversity, and idealistic subjectivism.

The Boomer generation was blessed to ride a long wave of economic prosperity and expansion.  Though I’m a product of the Great Recession, I’m growing more optimistic that future generations will enjoy similar gains.  I’m also cautiously hopeful that economic growth can prevent the unfortunate Millennial tendency toward idealizing socialism.

Hopefully, we’ll all be able to say “surf’s up!” again soon.

TBT: Economics: A Human Science

The unofficial, unintentional theme of this week’s posts has been economics in general (other than Tuesday’s SCOTUS piece)—the power of tax cuts, the potential upsides to tariffs, etc.  In that spirit, I thought for this week’s post I’d dive back into a piece that reflects my gradually evolving thinking about economics.

The summer before my sophomore year of college, I read the second edition of Milton Friedman’s Capitalism and Freedom, a work that completely revolutionized how I thought about the world and economics.  Free-market principles became my lodestar, and colored my ideology for a decade.  Indeed, I still adhere to these principles when it comes to economic questions.

However, as I grew older and (hopefully) more experienced, I began to realize that neoliberal economic theory, while elegant, is not always hard-and-fast, and that there are many more wrinkles to economic issues than appear at first glance.  I don’t believe in overcomplicating things—again, cutting taxes tends to stimulate economic growth—but most issues contain a measure of nuance that is easy to miss.

I’d long held to the idea that free trade is a largely unalloyed good, and that the short-term costs of lost jobs or reduced wages in some industries domestically would be made up for by increased efficiency of production and the rise of new, better industries.  Sure, there’d be some friction in the interim, but people will manage, and we can always throw some funds for reeducation their way.

While I think such disruption is inevitable, I don’t think we should embrace it so blindly that we forget about the people who find themselves out of work, or in a position where they can’t adapt their skillsets to find a new job.  I live in the rural South, and there are hundreds of little towns that dried up once the mill left, the railroad shut down, or the big family farms sold off.  Part of that story is the onward march of Time and economic progress—and the drama of human history.  But part of it is the story of globalist elites selling out Middle America.

This situation is not one merely of tariffs, taxes, and the like, but also of a radical ideology that would see national borders dissolved and massive immigration—even illegal immigration—encouraged.  I am libertarian on many issues, but one pitfall of modern economic libertarianism—and there are many—is that it only conceives of issues in terms of economic efficiency (and, if you get right down to it, it’s inverted Marxism, to the extent that, for Marxists, everything is about economics—or, more properly, materialism).  And, yes, generally greater efficiency means greater quality of life, but economics is not always the clean, elegant science that its proponents claim it to be.

To that end, I argue that economics, properly understood, should be counted among the humanities, as it deals in a direct, visceral way with people’s lives.

I don’t know precisely where the balance lies, or how it should be achieved.  I highly recommend reading Patrick J. Buchanan’s The Death of the West for a more complete treatment of how to revive wages for workers while maintaining a high degree of quality and efficiency.  I don’t agree with all of Buchanan’s proposals, which are heavily influenced by Catholic social teachings, but there is an appeal to the idea that, if the government is going to interfere in the economy (and it is, and does), then it should be in favor of workers and families, not at their expense.

Finally, I wrote this essay in the context of the Brexit vote—which I intend to write an eBook on soon—and the arguments I was hearing about the economic catastrophe Brexit would be (that hasn’t been the case yet).  I argued, essentially, that liberty and national sovereignty are more important than sweet European Union bennies and transfer-of-wealth payments.  The EU is a despicable organization as it currently operates, and as a lover of liberty, I’m thrilled to see nationalist-populist movements rising in major European countries.  I don’t agree with all of these groups or their policies (many of which are socialistic in nature), but the impulse towards greater national sovereignty is, in general, a healthy one in our age of excessive globalization and unelected supranational tyrants.

With that lengthy introduction, I give you 24 June 2016’s “Economics: A Human Science”:

If you’ve read my blog the past couple of weeks, you know that I am strongly in favor of Brexit, or Great Britain voting to “Leave” the European Union.  I’ve laid out my reasons here and here.  As I write this post, results are trickling in on that historic vote, and I am intermittently checking them with great interest–and not a small bit of trepidation.  Right now (about 10:30 PM EST, i.e., GMT-5), “Leave” has a slight edge, but the outcome is too close to call.

Already, though, the British pound and the euro have taken a beating in value, as gold prices soar (this blog is conservative in viewpoint, so I probably should start urging you to buy gold, guns, and freeze-dried food reserves; source: http://www.bloomberg.com/news/articles/2016-06-23/pound-surge-builds-as-polls-show-u-k-to-remain-in-eu-yen-slips).  One of the major bogeymen of the “Remain” side in the referendum was the threat of economic downturn.  As I conceded in both of my previous posts on Brexit, there will no doubt be major economic disruption should Britain vote to “Leave.”  However, a (likely temporary) drop in the value of the pound sterling is a price well paid for restored national sovereignty.

God Save the Queen… and Great Britain from the clutches of Eurozone bureaucrats

As conservatives, we’re accustomed to viewing economics–or, at least, economic growth–as a positive good.  After all, we believe in the power of free markets to satisfy human needs and desires, and to innovate new ideas and products that alleviate human suffering, drudgery, and toil.  Conservative politicians tend to focus on job growth and prudent deregulation–often coupled with tax and spending cuts–as perennial, bread-and-butter issues that directly affect voters’ pocketbooks for the better.

 “…these [fiscal] policies are not about making gobs of cash… but about what those gobs can do to improve lives.”

But economics, like much else, is not a means unto itself.  The reason conservatives like economic growth–besides, well, making money–is that it demonstrably improves people’s lives.  Deregulation, similarly, can work beneficially (if you doubt me, just ask anyone who has ever dealt with the Affordable Care Act and the Department of Health and Human Services).  In essence, these policies are not about making gobs of cash–although that is certainly nice–but about what those gobs can do to improve lives.

Thus, we have a stark contrast between the organic, healthy, occasionally unpredictable economic growth of a free market and the regimented, inequitable, limited economic growth of progressive corporatism.  Our current economic environment, I fear, is far closer to the latter than the former.  Complex, heavy regulations benefit larger firms and discourage the formation of smaller, newer firms by raising the upfront costs of entry.  Perverse incentives raise the costs of healthcare for young, fit Americans, while making it unrealistically cheaper for older, sicker, chubbier patients.  Overly generous social safety-net benefits (some of which, like the food stamp program SNAP, the government actively advertises and encourages people to use) discourage able-bodied Americans from pursuing work.

I could go on (and on… and on).  In short, conservatives are used to being correct on principle and on economic outcomes.  Typically, conservative fiscal policies align with, rather than try to manipulate, economic realities, so the outcomes of those policies tend to be both principled and positive.

“As fiscal conservatives… let us never lose sight of the human side of economics.”

In the case of Brexit, however, the quest for restored sovereignty–a stand on an important first principle–will result in some negative economic outcomes.  A major argument of the “Remain” side is that staying in the European Union will preserve Britain’s economic stability and ensure it a place in a European common market.

Such an argument is seductive, but it leads to a gilded cage.  Nobel Prize-winning economist Milton Friedman famously said that economic freedom is a necessary precursor to, though not a guarantor of, political freedom.  With Brexit, the axiom is almost reversed–by reclaiming its political freedom, Britain will then be able to pursue renewed economic freedom.

As fiscal conservatives–those who support free markets, freer trade, and light regulations–let us never lose sight of the human side of economics.  We too often treat economics as a science.  Instead, it should find a home alongside the humanities.

Our chief aim should be to unleash human potential.  So liberated, human creativity and ingenuity can lift life to greater heights.

We already have a model:  we’ve been doing it in the United States for over 200 years.

The Evolution of Judicial Supremacy – Judicial Review

Last night, President Trump nominated Judge Brett Kavanaugh to serve on the Supreme Court to fill the vacancy left by the retirement of Justice Anthony Kennedy.  As such, I thought it would be germane to explore briefly the role of the Supreme Court.

Popular understanding of the Court today is that it is the ultimate arbiter and interpreter of the Constitution, but that’s not properly the case.  The Court has certainly assumed that position, and it’s why the Supreme Court wields such outsized influence on our political life, to the point that social justice snowflakes are now worried about Justice Ruth Bader Ginsburg’s diet and exercise regimen.

Properly understood, each branch—the President, the Congress, and the Court—plays its role in interpreting the constitutionality of laws.  Indeed, President Andrew Jackson—a controversial populist figure in his own right—argued in his vigorous veto of the Bank Bill, which would have renewed the charter of the Second Bank of the United States, that the President had a duty to veto laws that he believed to be unconstitutional.

Unfortunately, we’ve forgotten this tripartite role in defending the Constitution from scurrilous and unconstitutional acts due to a number of historical developments, which I will quickly outline here, with my primary focus being a case from the early nineteenth century.

The notion that the Supreme Court is to be the interpreter of the Constitution dates back to 1803, in the famous Marbury v. Madison case.  That case was a classic showdown between Thomas Jefferson and James Madison on one hand—representing the new Democratic-Republican Party in control of the executive branch—and Chief Justice John Marshall, a Federalist appointee, on the other.

The case centered on an undelivered “midnight appointment” of William Marbury to serve as Justice of the Peace for Washington, D.C.  The prior president, John Adams, had issued a handful of last-minute appointments before leaving office, and left them on the desk of the incoming Secretary of State, James Madison, to deliver.  Naturally, Jefferson and Madison refused to do so, not wanting to pack the judicial branch with any more Federalists, and Marbury sued for his appointment.

If Marshall ruled that Madison must deliver the appointment, there was a very real risk that the Jefferson administration would refuse.  Remember, the Supreme Court has no power to execute its rulings, as the President is the chief executive and holds that authority.  On the other hand, ruling in Madison’s favor would make the Court toothless in the face of the Jefferson administration, which was already attempting to “unpack” the federal courts through acts of Congress and the impeachment (and near removal) of Justice Samuel Chase.

In a brilliant ruling with far-reaching consequences, Marshall held that the portion of the Judiciary Act of 1789 that provided for such disputes to be heard by the Supreme Court was unconstitutional, so the Court could not render a judgment.  At the same time, Marshall argued strongly for “judicial review,” pointing out that the Court had a unique responsibility to strike down laws, or parts of laws, that were unconstitutional.

That’s all relatively non-controversial as far as it goes, but since then, the power of the federal judiciary has grown to outsize influence.  Activist judges in the twentieth century, starting with President Franklin Roosevelt’s appointees and continuing through the disastrous Warren and Burger Courts, have stretched judicial review to absurd limits, creating “penumbras of emanations” of rights, legislating from the bench, and even creating rights that are nowhere to be found in the Constitution.

Alexander Hamilton argued in Federalist No. 78 that the Court would be the weakest and most passive of the branches, but it has now become so powerful that a “swing” justice like former Justice Kennedy can become a virtual tyrant.  As such, the confirmation of any new justice has devolved into a titanic struggle of lurid accusations and litmus tests.

The shabby treatment of the late Judge Robert Bork in his own failed 1987 nomination is a mere foretaste of what awaits Judge Kavanaugh.  Hopefully Kavanaugh is well-steeped in constitutional law and history—and will steadfastly resist the siren song of personal power at the expense of the national interest.

Tax Cuts Work

Back in December, I wrote a post on the old blog begging Republicans to pass tax cuts.  When they did, I danced around my house like a silverback gorilla on Christmas.

I cannot understand objections to the Tax Cuts and Jobs Act, other than fiscal conservatives’ fear of increasing deficit spending.  By that I mean I can intellectually understand objections in an abstract, academic sense, but I’m unable to accept those arguments as valid in this case, and many of them are specious.

The historical record is clear:  tax cuts work.  Whether the cuts are to income, corporate, estate, or sales taxes, cutting taxes, in general, stimulates economic growth and usually increases government revenues.

Take the example of Calvin Coolidge, whom we might call the godfather of modern tax cuts.  As president, Coolidge used his predecessor’s Budget and Accounting Act of 1921 to carefully monitor and eliminate excess government spending.  He also signed into law the Revenue Act of 1926, reducing the top rate to 25% on incomes greater than $100,000.

By the time he left office, the government had increased revenues (due to the stimulative effect of the tax cuts on the economy—rates fell, but more people were paying greater wages into the system), federal spending had fallen, and the size and scope of the federal government had shrunk, a feat no other president has managed to accomplish.

The perennial wag will protest, “But what about the Depression?”  Certainly, a number of complicated factors fed into the Depression, but the stock market crash—really, a massive correction—did not cause it.  Had the government left well enough alone, the economy should have adjusted fairly quickly, even though modern SEC rules and regulations were not yet in place.  That’s a discussion for another post, but I suspect that Herbert Hoover’s signing of the Smoot-Hawley Tariff (1930)—a tax increase on imports—did much to exacerbate the economic situation, and a decade of FDR’s social welfare experiments injected further uncertainty into markets.

But I digress.  Subsequent presidents have championed tax cuts in the Coolidge vein, albeit without the corresponding emphasis on spending cuts.  John F. Kennedy pushed for tax cuts, which threw gasoline onto the fire of the post-war American economy.  Ronald “Ronaldus Magnus” Reagan’s tax cuts created so much prosperity, the ’80s are remembered for hair metal and cocaine; had he not had to spend the Soviets out of existence (and faced a Democratic Congress), he could have cut spending, too.

President Trump’s tax cuts have breathed new life into a sluggish, post-Great Recession recovery.  Jobs growth is increasing month after month, and wages are rising, slowly but surely.  Black unemployment is down from 7.7% in January to 5.9% as of May—the lowest rate since the government began keeping statistics in 1972.

Leftists object that the cut to the corporate tax rate benefits big fat cats instead of everyday Americans, but the statistics suggest otherwise (see the article linked in the previous paragraph for more good news).  Further, Leftists moan and groan when companies put increased earnings into dividend payments to stockholders, as if this move is detrimental.  On the contrary, as more Americans invest in mutual funds in their 401(k)s or IRAs, they stand only to gain from these investments.  Progressives only see these investments as “big company benefits,” without following through on what that money does.

Of course, that’s because the Left’s focus is emotional (not economic), and worries about all the sweet government gigs that majors in Interpretative Queer Baltic Dance Studies will lose without the federal government’s largesse.  Getting voters off the welfare rolls further inhibits the Democratic Party’s mantra of “Soak the Rich,” as upwardly-mobile workers naturally want to keep a good thing going.

Conservative concerns about deficit spending are more grounded in economic reality, and while the federal deficit seems like an abstraction to most Americans, it does present a looming crisis.  Perpetual indebtedness in a personal sense is inherently immoral if undertaken as a financial strategy unto itself (taking out a loan for a car, a house, a business, or an education is one thing; living off of borrowed money, and borrowing more with no intention of paying it back, is quite another); the government should be held to the same standard.

That said, the problem of the federal deficit is a longstanding issue that has more to do with excessive and wasteful spending than with insufficient revenue.  The stimulative effect of the tax cuts, by putting more people to work, will increase revenues.  The most pressing concern now is for Congress to make the income tax cuts permanent—another no-brainer, win-win move for all concerned.

Taxes are a necessary evil—we need the military, roads, and the like—and there comes a point of diminishing returns with cuts just as there is with increases, but allowing Americans to keep more of their money is, in almost every situation, the better choice, both economically and morally.

#MAGAWeek2018 – Limited Government

For the last day of MAGA Week 2018, I’m dedicating this post not to any specific historic individual, but instead to a facet of political (conservative) philosophy:  limited government.

It’s easy to take limited government for granted, or even to fail to recognize it:  it doesn’t seem to occur organically, although modern-day economic libertarians will claim it does (never mind that this ignores most of the last 6000 years of recorded history).  Limited government is rooted in self-government, which did flourish once given a chance, but it certainly required the fertile soil of 550 years of English political history.

Limited government is not quite the same as small government.  A government can be “small” in terms of its expenditures, but still not “limited”—imagine a lean, efficiently-run dictatorship in a very small country.

Limited government, on the other hand, suggests a “smallness” of scale, but also carefully delineates the scope of the government’s purview.  The “checks and balances” and “separation of powers” of the Constitution are key ingredients, not just between the different branches of the federal government, but between the federal and State governments as well.

The whole point of our system—beautifully laid out by the Framers, but particularly James Madison—is to diffuse and counterbalance power, and to submit all authority to the rule of law.  The Constitution, then, is the limiting of what government can do.  Everything else is left to the people.

Unfortunately, our nation has given up our old attitude—“ask forgiveness, not permission”—for “assume you’re not allowed to do something without getting it sanctioned by some authority first.”  That’s a shocking change from the Framers’ attitude, and to our century of “salutary neglect” prior to 1763.

Recall that Americans didn’t declare independence in 1776 because taxes were too high, but that they were being taxed without their consent—without representation in Parliament.  The whole theory was that government ruled with the consent of the governed, a notion dating back to the feudal privileges of the barons at the signing of the Magna Carta at Runnymede in 1215.

Similarly, the Boston Tea Party was a response to the monopolization of tea entering the colonies—a corporatist scheme cooked up by the British government to bail out the failing, government-subsidized British East India Company.  The colonists rightly reasoned that, should the importation of tea be monopolized, any product could be subject to monopolization—potentially destroying the colonial economy under the thumb of an exploitative British government.

Americans believed—correctly—that their rights as Englishmen were being trampled, and that the British government was overstepping its bounds.  In essence, Parliament and King George III failed to apply the traditional limits of English government to their colonial possessions in British North America.

As such, our Framers put together a written Constitution (unlike Britain’s unwritten constitution, which can essentially be changed at the majoritarian whim of Parliament—hence people in Britain arrested for posting controversial content on Facebook, and the persecution of Tommy Robinson), one that clearly delineates the roles of the three branches of government.  The Bill of Rights prudently added the Tenth Amendment, which devolves power down to the States and the people.

As for why limited government is great, I will close with this recommendation:  watch the video below from Prager University.  Adam Carolla lays out the best case for limited government I’ve ever heard.

Enjoy—and thanks for making America great again!

TBT: Happy Birthday, America!

Two years ago, I dedicated my Fourth of July post to analyzing Lincoln’s Gettysburg Address.  In the spirit of MAGA Week 2018—and to preserve the TPP TBT tradition—I’m re-posting that classic post today.

A major theme of the blog posts from that summer was the idea of America as a nation, an idea I still find endlessly compelling.  The election of President Trump in November 2016 has reinvigorated public debates about the nature of American nationalism, as well as revived, at least partially, a spirit of unabashed patriotism.

As a child, I took it for granted that America was a special place.  When I learned American history as a child, I learned the heroic tales of our Founders.  While revisionist historians certainly have been correct in pointing out the faults of some of these men, I believe it is entirely appropriate to teach children—who are incapable of understanding such nuance—a positive, patriotic view of American history.  We shouldn’t lie to them, but there’s nothing wrong with educating them that, despite its flaws, America is pretty great.

Today the United States of America celebrates 240 years of liberty.  240 years ago, Americans boldly banded together to create the greatest nation ever brought forth on this earth.

They did so at the height of their mother country’s dominance.  Great Britain emerged from the French and Indian War in 1763 as the preeminent global power.  Americans had fought in the war, which was international in scope but fought primarily in British North America.  After Britain’s stunning, come-from-behind victory, Americans never felt prouder to be English.

Thirteen short years later, Americans made the unprecedented move to declare their independence.  Then, only twenty years after the Treaty of Paris of 1763 that ended the French and Indian War, another Treaty of Paris (1783) officially ended the American Revolution, extending formal diplomatic recognition to the young United States.  The rapidity of this world-historic shift reflects the deep respect for liberty and the rule of law that beat in the breasts of Americans throughout the original thirteen colonies.

America is founded on ideas, spelled out in the Declaration of Independence and given institutional form and legal protection by the Constitution.  Values–not specific ethnicity–would come to form a new, distinctly American nationalism, one that has created enduring freedom.

***

Rather than rehash these ideas, however, I’d instead like to treat you to the greatest political speech ever given in the English language.  It’s all the more remarkable because it continues to inspire even when read silently.  I’m writing, of course, about Abraham Lincoln’s Gettysburg Address.  Here is the transcript (Source:  http://www.gettysburg.com/bog/address.htm):

“Four score and seven years ago, our fathers brought forth on this continent a new nation: conceived in liberty, and dedicated to the proposition that all men are created equal.

“Now we are engaged in a great civil war. . .testing whether that nation, or any nation so conceived and so dedicated. . . can long endure. We are met on a great battlefield of that war.

“We have come to dedicate a portion of that field as a final resting place for those who here gave their lives that that nation might live. It is altogether fitting and proper that we should do this.

“But, in a larger sense, we cannot dedicate. . .we cannot consecrate. . . we cannot hallow this ground. The brave men, living and dead, who struggled here have consecrated it, far above our poor power to add or detract. The world will little note, nor long remember, what we say here, but it can never forget what they did here. It is for us the living, rather, to be dedicated here to the unfinished work which they who fought here have thus far so nobly advanced.

“It is rather for us to be here dedicated to the great task remaining before us. . .that from these honored dead we take increased devotion to that cause for which they gave the last full measure of devotion. . . that we here highly resolve that these dead shall not have died in vain. . . that this nation, under God, shall have a new birth of freedom. . . and that government of the people. . .by the people. . .for the people. . . shall not perish from the earth. “

***

The Gettysburg Address is elegant in its simplicity.  At less than 300 words, it was a remarkably short speech for the time (political and commemorative speeches often ran to two or three hours).  Yet its power is undiminished all these years later.  President Lincoln was only wrong about one thing:  the claim that the “world will little note, nor long remember, what we say here” has proven untrue.

I will likely write a deeper analysis of the Address in November to commemorate its delivery; in the meantime, I ask you to read and reread the speech, and to reflect on its timeless truths.

God Bless America!

–TPP

To read different versions of the Gettysburg Address–there are several versions extant–check out this excellent page from Abraham Lincoln Online:  http://www.abrahamlincolnonline.org/lincoln/speeches/gettysburg.htm.

#MAGAWeek2018 – Thomas Jefferson & The Declaration of Independence

Happy Independence Day, America!  242 years ago, the Second Continental Congress declared independence from Great Britain, changing the course of history and spawning independence movements all over the globe.

As such, it’s only fitting that today we look at the author of the Declaration of Independence, Thomas Jefferson.

Few figures in the period of the Early Republic have inspired as much debate as Jefferson, who clashed frequently with President Washington’s Secretary of the Treasury, Alexander Hamilton, while serving as Secretary of State.  His friendship with John Adams turned into a bitter, acrimonious rivalry, as the two parted ways on the proper response to the French Revolution, then squared off against one another in the 1796 and 1800 presidential elections.  The two would make amends later in life, exchanging some of the liveliest, most insightful correspondence of the period.

After the publication of Thomas Paine’s revolutionary pamphlet “Common Sense” electrified pro-independence sentiment throughout the colonies, the Second Continental Congress put aside any hopes of reconciliation with Britain, and instead decided to declare independence.  To draft the document that would take the colonies across the Rubicon, the Congress selected Jefferson.

Jefferson wrote the Declaration with his fellow countrymen and other European nations in mind, although it was formally directed at King George III.  The Declaration is one of the most brilliant documents ever written, and its opening paragraphs are arguably more important than the specific list of grievances against the English government.

Jefferson’s claim—radical at the time—that “all men are created equal” shook the world, and its reverberations through history are well-documented.  There are, however, other key phrases.  The phrase “When in the Course of human events” seems innocuous on its face, but carries an important meaning:  the “unalienable” rights are not unique to any one people, nation, or time in history, but are universal.  All peoples enjoy natural rights that are woven into the fabric of the universe—and which were “endowed by [our] Creator.”

Jefferson was likely a Deist, believing that a God created the universe, but afterward left it to work and unfold according to physical laws of nature.  Nevertheless, Jefferson believed—as did many of the Founders, who were often products of the Scottish Enlightenment (and, fortunately, not the more destructive French Enlightenment)—that the Creator imbued the physical universe with natural rights, just as He created gravity.

Regardless, after some revisions—the congressional committee that commissioned Jefferson had him change “Life, Liberty, and Property” to “Life, Liberty, and the pursuit of Happiness”—the Declaration was adopted as both a specific list of grievances detailing America’s case to “a candid world,” and as a timeless expression of America’s belief in natural rights.  The usual disclaimers apply—women and free blacks, not to mention slaves, were left out of this consideration at the time, despite objections from Abigail Adams, wife of our second president (and mother of yesterday’s subject)—but the Declaration paved the way for all Americans to enjoy greater liberty.

When time permits, I will dive into a deeper, lengthier discussion of Jefferson’s legacy; as it is, it’s taken me several hours just to write this much, as I’m fulfilling my avuncular duties of watching my niece and nephew.  For now, I will end on one final anecdote:

On 4 July 1826, Thomas Jefferson passed away—on the fiftieth anniversary of the signing of the Declaration of Independence.  A few short hours later, in what is likely the most serendipitous event in American history, an aged John Adams slipped away, too.  Moments before his passing, Adams said, “Thomas Jefferson still survives,” although Jefferson had died just hours before.  An attendant at Adams’s side said that, at the moment of the great man’s death, a sudden thunderstorm whipped up, as if the artillery of Heaven were welcoming him home.

***

To read a full transcript of the Declaration of Independence, I recommend this version at Archives.gov:  https://www.archives.gov/founding-docs/declaration-transcript

#MAGAWeek2018 – John Quincy Adams

John Quincy Adams

If yesterday’s MAGA Week profile of George Washington was straight from “American History Greatest Hits, Volume I,” today’s selection is like a bootleg deep cut from an obscure local musician’s live show.  John Quincy Adams—an American diplomat, Secretary of State, President, and Congressman—deserves better.

For years, my US History students have recoiled at the dour daguerreotype portrait of our somewhat severe sixth President.  But behind that stern, austere visage churned the mind of a brilliant, ambitious man—and probably the greatest Secretary of State in American history.  Today’s profile focuses on Adams’s tenure in that position.

An “Era of Good Feeling”

Adams was one of several “all-star” statesmen of the second generation of great Americans.  After the careers of George Washington, Alexander Hamilton, Thomas Jefferson, and John Quincy’s father, John Adams, a new, youthful cadre of ambitious and talented national leaders took their place at the helm of a nation that was growing and expanding rapidly.  From the ill-fated War of 1812 through the Mexican War, leaders like John C. Calhoun, Daniel Webster, Henry Clay, and Andrew Jackson—the populist odd man out—forged a national identity and sought to navigate the nation through its early growing pains.

John Quincy Adams was among this group.  After the War of 1812, his father’s old Federalist Party largely died out, both due to the treasonous actions of the so-called “Blue Light” Federalists (who openly sided with the British) and to demographic changes brought about by westward expansion and the Louisiana Purchase in 1803.  More Americans were small, yeoman farmers, and the Federalists’ pro-British, pro-industry, pro-commerce platform held little appeal for feisty frontiersmen who were suspicious of a strong federal government and the hated Second Bank of the United States, chartered in 1816.

As such, the United States entered an “Era of Good Feeling” under President James Monroe, in which one party, the Democratic-Republican Party, remained.  Monroe’s cabinet was a “who’s who” of young, dynamic men, and Adams was his Secretary of State.

Secretary of State

It was in this context that Adams made his most significant contributions to American foreign policy and nationalism.  While serving as Secretary of State, he laid out a vision for America’s future that held throughout the nineteenth century.

In essence, Adams argued that the United States should pursue a realist foreign policy that avoided wars and foreign entanglements; that the nation should not seek a European-style “balance of power” with its Latin American neighbors, but should exercise hegemonic dominance in the Western Hemisphere; and that the United States should gain such territory as it could diplomatically.

In 1821, Adams famously issued his warning against involvement in foreign wars of liberation.  The context for this warning was the Greek War of Independence from the Ottoman Empire, a cause that was hugely popular in Europe, particularly in Britain.  Many Americans urged Congress to intervene in the interest of liberty, or at least to send arms to aid another fledgling nation’s war for independence.

Adams perceptively saw the dangers inherent in the United States involving itself in other nations’ wars, even on the most idealistic of grounds.  To quote Adams at length:

“Wherever the standard of freedom and Independence has been or shall be unfurled, there will her heart, her benedictions and her prayers be. But she goes not abroad, in search of monsters to destroy. She is the well-wisher to the freedom and independence of all. She is the champion and vindicator only of her own. She will commend the general cause by the countenance of her voice, and the benignant sympathy of her example. She well knows that by once enlisting under other banners than her own, were they even the banners of foreign independence, she would involve herself beyond the power of extrication, in all the wars of interest and intrigue, of individual avarice, envy, and ambition, which assume the colors and usurp the standard of freedom. The fundamental maxims of her policy would insensibly change from liberty to force…. She might become the dictatress of the world. She would be no longer the ruler of her own spirit.” (Emphasis added.  Source:  https://www.mtholyoke.edu/acad/intrel/jqadams.htm)

If America were to involve itself in open-ended wars of liberation—even once!—it would set a dangerous precedent that the United States would become constantly embroiled in the squabbles of other nations.  No matter how well-meaning, such intervention would commit the nation to a disastrously unlimited policy of nation-building and war.

The Transcontinental Treaty (1819)

Prior to rumblings for intervention in Greece, Adams brokered the purchase of Spanish Florida in a rather amusing fashion.  The hero of the Battle of New Orleans, General Andrew Jackson, pursued a group of Seminole Indians into Florida, violating orders to respect the international border.  In the process, Jackson attacked a fort manned by Seminoles and escaped slaves, killed two British spies, and burned a Spanish settlement.

Instantly, an international crisis seemed imminent.  To a man, President Monroe’s cabinet demanded disciplinary action be taken against General Jackson.  It was Adams—who, ironically, would become Jackson’s bitterest political opponent in 1824 and 1828—who argued against any such action, planning instead to use Jackson’s boldness to America’s advantage.

With apologies to Britain and Spain, Adams pointed out that, despite the government’s best efforts, Jackson was almost impossible to control, and was apt to invade the peninsula again.  Further, Spanish rule in Florida was increasingly tenuous, due to the various Latin American wars of independence flaring up at the time.  With revolts likely—and facing the prospect of another Jackson invasion—Spain relented, selling the entire territory for a song.

The Oregon Country and the Convention of 1818

Adams was also key in securing the Oregon Country for the United States, although the process was not completed in full until James K. Polk’s presidency, some thirty years later.  The Oregon Country—consisting of the modern States of Washington and Oregon—was prime land for settlement, but the United States and Great Britain both held valid claims to the territory.

Adams realized that the United States could afford to be patient—given America’s massive population growth at the time, and its citizens’ lust for new lands, Adams reasoned that, given enough time, American settlers would quickly outnumber British settlers in the territory.

Sure enough, Adams secured another territory for the United States, albeit in far less dramatic fashion than the acquisition of Florida one year later.

The Monroe Doctrine (1823)

Perhaps Adams’s greatest contribution to the United States was his work on the Monroe Doctrine in 1823.  Once again, Adams’s diplomatic brilliance came into play.

Adams sought to keep the United States out of foreign wars, but he also wanted to keep European powers out of the Western Hemisphere.  As Spain continued to lose its grip on its American colonies, the autocratic nations of Russia, Prussia, and Austria—the Holy Alliance—sought to reestablish monarchical rule in the Western Hemisphere.

President Monroe and Secretary Adams were having none of it—nor was Great Britain, which enjoyed a brisk trade with the newly-independent republics of Latin America.  To that end, Britain proposed issuing a joint statement to the world, with the effect of committing both nations to keeping the new nations of Latin America independent.

Monroe was excited at the idea, but in his ever-prescient manner, Adams argued for caution.  Were the United States to issue the declaration jointly with Britain, it would appear “as a cockboat in the wake of a British man-of-war.”  It would be better, Adams argued, to issue a statement unilaterally.

The United States had no way, in 1823, to enforce the terms of the resulting Monroe Doctrine, which pushed for three points:  Europe was to cease intervention in the affairs of the Western Hemisphere (non-intervention); Europe was to cease acquiring new colonies in the Western Hemisphere (non-colonization); and the United States would stay out of open-ended entanglements and alliances with Europe (isolation).

However, Adams knew that Britain would enforce the Monroe Doctrine with its mighty navy, even if the United States issued it unilaterally, because it would be in Britain’s national interest to do so.  Sure enough, Adams’s shrewd realism won the day, and, other than France’s brief occupation of Mexico during the American Civil War, European powers never again established colonies in the New World.

After Monroe’s Cabinet

For reasons of space, I will forgo a lengthy discussion of Adams’s presidency and his tenure in Congress.  He was an ardent nationalist in the sense that he sought an ambitious project of internal improvements—roads, canals, harbors, and lighthouses—to tie the young nation together.  In his Inaugural Address, he called for investment in a national university and a series of observatories, which he called “lighthouses of the sky,” an uncharacteristically dreamy appellation that drew the ire of an already-hostile Congress.

His presidency, too, was marred by the unusual circumstances of his election:  Adams is the only president to take office having won neither the popular nor the electoral vote, and without ascending from the vice presidency.  That’s a story worth telling in brief, particularly for political nerds.

The presidential election field of 1824 was a crowded one, and the “Era of Good Feeling” and its one-party dominance were showing signs of sectional tension (indeed, the second system of two parties, the National Republicans—or “Whigs”—and Jackson’s Democratic Party, would evolve by 1828).  There were four candidates for president that year:  Andrew Jackson, John Quincy Adams, Speaker of the House Henry Clay, and Secretary of the Treasury William Crawford of Georgia.

Jackson won a plurality of the electoral votes—99—but no candidate had a majority.  In that event, the top three candidates go to the House of Representatives, where each State’s delegation casts a single vote.  Crawford, who finished third, was deathly ill and not a suitable candidate, and Henry Clay, in fourth, was excluded by the Twelfth Amendment’s top-three rule.

That left the rabble-rousing Jackson and the austere Adams.  Clay, as Speaker of the House, held immense influence in Congress, and could not stand Jackson, so he threw his support behind Adams, who won the election in Congress.

Apparently losing all the wisdom and prudence of his days at the State Department, Adams foolishly named Clay as his new Secretary of State—an office that, in those days, was perceived as a stepping stone to the presidency.  Jackson supporters immediately cried foul, arguing that it was a “corrupt bargain” in which Clay sold the presidency in exchange for the Cabinet position.

While it appears that Adams sincerely believed Clay was simply the best man for the job, the decision cast a pall over his presidency, and Jackson supporters would gleefully send their man to the Executive Mansion in 1828.

At that point, Adams expected to settle into a quiet retirement, but he was elected to represent his congressional district in 1830.  During his time in Congress, he fought against slavery and the infamous “gag rule,” which automatically tabled anti-slavery petitions sent to the House.  He was also a vocal opponent of the Mexican War—as was a young Abraham Lincoln during his single term in Congress—and he died dramatically, collapsing on the House floor after rising to oppose a measure honoring the veterans of that war.

Regardless, Adams’s career shaped the future of the country, gaining it international prestige and setting it on track to emerge as a mighty nation, stretching from sea to shining sea.  Through his service and genius, Adams made America great—and, in the most literal, territorial sense, greater.

#MAGAWeek2018 – George Washington

It’s July Fourth week here in the glorious United States, and in the spirit of good, old-fashioned patriotism, I’m dubbing this week “MAGA Week 2018” (and adopting the hashtag in celebration—please share this post with the appropriate dose of the shameless promotion I crave).  Each day I’ll be highlighting some historic individual who, in his or her own way, made America great again in his or her own time.

George Washington

As with all American firsts, we’re starting with the first President of the United States, George Washington.  A bit cliché, perhaps, but I believe we take George Washington’s legacy for granted.  Yes, he’s ubiquitous on our currency, and he shows up around Presidents’ Day in commercials for local car dealerships (“I cannot tell a lie—the price on these 2018 Chevy Silverados is unbeatable!”), but Washington has suffered at the hands of social justice warrior academics and white-male bashers.

Of course, none of those Gender Studies majors could even bash our first President were it not for his choices in and before taking office.  Indeed, we owe an immense debt of gratitude to George Washington for the system of government we enjoy today.

Surrendering Power

Washington is not some kind of patriotic demigod—he more or less blundered the British Empire into the lengthy and expensive Seven Years’ War/French and Indian War with France—but he did something that few military men have ever done in history:  he voluntarily gave up power.

George Washington received a commission from the young Continental Congress to lead the Continental Army in 1775.  On 23 December 1783, General Washington resigned that commission, surrendering it back to the civilian authority that had originally granted it.

Historically speaking, it is hard to articulate how rare such an action is.  Washington could have gone the direction of the “George Washington of South America,” Simón Bolívar, and continued to fight for more power, or sought to expand the Revolution abroad.  Many men in his position—enjoying immense popularity and holding the keys to the nation’s fighting force—would have forced the Continental Congress at gunpoint to extend “emergency powers” or the like to them.  Washington could easily have made himself King of the United States.

Instead, Washington followed the model of the humble Roman farmer Cincinnatus, who saved the young Roman Republic and promptly returned to his plow.  Thus, Washington is remembered to this day as “The American Cincinnatus.”

The Newburgh Conspiracy

Before he surrendered his commission, Washington narrowly saved the fledgling Republic once again.  In Newburgh, New York, a group of disgruntled soldiers, upset that they had gone unpaid for so long, began to foment a conspiracy to march on Philadelphia, demand their wages at gunpoint, and, should the Continental Congress refuse, overthrow the assembly by force.  Such an uprising would have been disastrous, and could have ushered in a military junta, supplanting the Articles of Confederation.

Despite his best efforts, Washington could not win over the hardened veterans with eloquence alone.  He produced a letter to read aloud to the rowdy assembly, then fumbled about in his coat pocket for his glasses.

As he put on his glasses, Washington remarked, “Gentlemen, you must pardon me, for I have not only grown gray but almost blind in service to my country.”  Washington’s soldiers had never seen him in such a light, and began to weep openly.  Thus, General Washington prevented a military coup simply by donning his glasses.

Shays’ Rebellion

After Daniel Shays’ Rebellion in 1786-87, Washington again—reluctantly—returned to public life, and was among the handful of men who began to push for a stronger national government.

Shays’ Rebellion shook many leading Americans to the core.  The State of Massachusetts, which was diligently and vigorously paying off its debts from the American Revolution, placed heavy demands on the limited resources of farmers in the western portion of the State (a recurring theme in American history—the tension between eastern commercial elites and western farmers).  As many farmers were unable to pay their debts—much less in the hard currency specie the law required—they faced debtors’ prison or confiscation of their property.

Their backs against the wall, Daniel Shays led an ad-hoc army of about 4,000 men to occupy courthouses and prevent foreclosures and seizures of property.  Ultimately, the State of Massachusetts had to raise a private militia, as no other State would send troops to assist in what they saw as a matter exclusive to Massachusetts.

Shays’ Rebellion highlighted the need for a stronger central government than the Articles of Confederation provided.  The Articles did not give the federal government the power to tax the States or imports, and while Congress could requisition troops from the States, it had no standing army of its own and no way to compel the States to provide the troops it requested.

Reluctantly, Washington came out of retirement, and presided over the Constitutional Convention in 1787.

First President

Once the Constitution took effect in 1789, the nation faced a number of problems that had gone unresolved during the long years under the Articles of Confederation.  America in the 1780s was economically depressed and suspicious of national authority (the latter is not entirely a bad quality), and it had failed to fulfill several of its obligations to Great Britain under the Treaty of Paris of 1783, the treaty that ended the Revolutionary War.  Additionally, the French Revolution broke out in 1789, creating a sticky situation between the young Republic and its Revolutionary War ally, France.

The only man who could engender the support of all thirteen States was George Washington, who won unanimously in the Electoral College.  At his Inauguration, Washington wore a simple, brown suit—the attire of a common but respectable gentleman of the time—and eschewed any regal titles.  Vice President John Adams, knowing the difficulty of the job any President would face, proposed the unwieldy, monarchical title “His Highness, President of These United States, and Protector of Their Liberties” (in response, congressmen snickered that Adams should be called “His Rotundity”).  Washington wisely adopted the simple “Mr. President.”

The Whiskey Rebellion

In 1794, during his second term, George Washington faced the Whiskey Rebellion in Pennsylvania.  Like Shays’ Rebellion, the uprising pitted the interests of the nationalists against rural farmers.  Western farmers, lacking reliable transportation and facing long trips to market, relied on converting their corn into whiskey, which would survive the long, arduous journey.  The Federalist-dominated Congress had placed an excise tax on whiskey as a way to increase revenue, placing a heavy burden on western farmers (who, incidentally, tended to vote for Thomas Jefferson’s new Democratic-Republican Party).

Farmers began refusing to pay the tax, and assembled a militia to prevent its collection.  This time, however, things went differently—George Washington, as Commander-in-Chief, personally led the Army to face the farmers.  At the sight of the American military, the farmers threw down their weapons and dispersed.  Washington—in one of his multiple instances of magnanimity and mercy—pardoned the leaders of the rebellion, sparing them the hangman’s noose.

The lessons of the Whiskey Rebellion were clear:  good order must be maintained; armed insurrection is not tolerated; change must occur at the ballot box, not by force of arms.  Washington’s response to the Whiskey Rebellion cemented the authority of the Constitution, which had survived one of its first major tests.

End of Presidency

Washington served ably as President, keeping America out of France’s costly and radical revolution, establishing good government, and uniting a young country that was suspicious of centralized power.  And, once again, Washington yielded power, opting to serve only two terms—an important precedent that every President followed until Franklin Delano Roosevelt.

The importance of that final act cannot be overstated.  Had Washington served a third term, he would have died in office.  The precedent would have been set that any President should try to hold onto power as long as was electorally feasible—or by extra-constitutional means, if necessary.

In his Farewell Address, Washington spelled out the ingredients that make a good government work.  He wrote, “Of all the dispositions and habits which lead to political prosperity, religion and morality are indispensable supports…. It is substantially true that virtue or morality is a necessary spring of popular government.”

We would do well to remember his words—and, I fear, we have forgotten many of them.

Fortunately, God blessed the United States with George Washington, and worked through him to make ours a more perfect union.  At multiple times in our young Republic’s history, George Washington Made America Great Again!

Progressivism and Political Violence

The modern Left idealizes political violence.  That’s a bold statement, but it’s true, and the truth of that claim dates back to the French Revolution.  That revolution—so different from our own—was the root of almost all totalitarian movements in the 20th century, and of the American Left’s current mood for mob activity in the name of “progress.”

The big story in the world of the American Right this week has been Democratic Congresswoman “Auntie” Maxine Waters’s calls for the active disruption of Trump administration officials’ private lives—harassing them at restaurants, department stores, and gas stations, and even picketing their homes, as has happened twice at the home of Secretary of Homeland Security Kirstjen Nielsen.

Waters’s execrable remarks—and her blasphemous contention that “God is on our side” (if she’s referring to Baal, the ancient Canaanite fertility god whose worshipers tried to appease him with child sacrifices, I’m sure he is pleased with Democrats’ support of abortion, but THE One True God must be weeping constantly over those lost lives)—were inspired by the ouster of White House Press Secretary Sarah Huckabee Sanders from the Red Hen, a restaurant in Lexington, Virginia.  In a Fox News interview after the fact, Sanders’s father, former Arkansas Governor and bassist Mike Huckabee, alleged that the progressive owner of the restaurant followed the Sanders party down the street, heckling them.

None of these events, to my mind, is surprising; rather, they are a reminder of the progressive Left’s taste for violence—or, at the very least, for achieving its long-term political goals by “any means necessary” (a slogan of the so-called “Resistance”).

Recall the soon-forgotten shooting of congressional Republicans last year as they practiced for Congress’s annual interparty baseball game.  That attack, the fevered result of a Bernie Bro’s break with reality, nearly killed Louisiana Congressman Steve Scalise.  It’s easy to forget the anti-Trump hysteria of 2017 (because the anti-Trump hysteria of 2018—after the President’s proven himself in office—seems even more unhinged), but the Left was out for blood after the Inauguration, with pink-hatted activists shouting at the sky in protest.

The Left has turned America’s cold civil war hot because it doesn’t control any of the levers of power in government.  With the retirement of swing Justice Anthony Kennedy, progressives may see their last ace in the hole, the courts, lost for a generation (to be clear, the Left is still dominant in academia, pop culture, the arts, major non-profits, the corporate world, and pretty much everything that isn’t the federal and State governments).  The last tactic, then, is to amp up their social intimidation to borderline—and, if necessary, actual—violence.

Consider that the Left can only push forward its agenda for any length of time through means of coercive power (although maudlin emotional manipulation comes in handy, too, and works well with easily-manipulated “feel-good” types).  Traditionally, that’s been through the power of the state—the massive reach of the federal government.

It was the modern political Left, growing out of the Progressive movement of the late nineteenth and early twentieth centuries, that brought first the New Deal, and then the Fair Deal and the Great Society—programs that vastly expanded the size, scope, and reach of federal power.

While Americans were largely content with some government assistance during the throes of the Depression—and naively believed that the federal government could actively solve the nation’s problems after the Second World War, given the government’s success in fighting that global conflict—they could not stomach actual Marxism.  So it was that Democrats began gradually to lose their mid-twentieth-century vise grip on the ballot box.

With the rise of the “New Right” in the 1960s and 1970s, followed by the election of His Eminence Ronaldus Magnus in 1980, Leftists increasingly turned to the courts to fulfill by judicial fiat what could not be achieved at the ballot box.

Take, for example, the overturning of California’s ballot initiative, Proposition 8, to amend the State’s constitution to outlaw same-sex marriage.  In California—the beating heart of the modern progressive movement—a small cadre of unelected officials overturned the will of the people.

Similarly, Justice Kennedy more or less decided that federalism doesn’t matter, and we should believe that the Founding Fathers meant to support casual same-sex boning, but just forgot to put it in the Constitution (I have friends who support same-sex marriage who disagree with the Obergefell v. Hodges ruling, arguing that it oversteps the Supreme Court’s constitutional authority).

The courts were the back-up plan.  I’ve actually read (anecdotal evidence alert) some progressives posting on Facebook to the effect that, “Well, we overplayed the judicial activism thing for too long, and we relied on it at the expense of electoral victory.”  Such comments are rare—most are childish weeping and/or promises to move to Canada or to “stop joking around.”

Now that President Trump is in the White House, Republicans control Congress, and the Supreme Court is ready to tip narrowly toward constitutional originalism, Leftists are apoplectic, and are showing their true colors.  They have two choices:  make a compelling case to the American people to elect more Democrats in November, or double down on hysteria and send us hurtling closer toward the Second American Civil War.

While there’s been much talk of a “blue wave” this November, the Left’s outbursts and fascistic tactics seem to be hurting Democrats nationally.  That doesn’t mean they won’t take the House or the Senate—after all, some of these districts are so blue they keep voting in borderline illiterates like Congresswoman Sheila Jackson Lee of Texas—but their chances are narrowing.

Even if they do take control of one or both chambers, President Trump will still control the executive branch, and, as yet, has done nothing impeachable (being crude or saying awesome stuff on Twitter don’t qualify as “high crimes and misdemeanors”).  Sure, they might try, but it would be like the Radical Republicans impeaching President Andrew Johnson for ignoring an unconstitutional act of Congress—purely politically-motivated.

If there is impeachment in the House, it will fail—Trump will not be removed from office by the Senate—and the Democrats will find themselves stuck for another two years with a president they irrationally despise.  The way things are going, he’s likely to win reelection in 2020 (please, sweet Lord).

But all of this is conjecture.  There’s a good chance Republicans hold onto the House and pick up vulnerable Democratic seats in the Senate (such as Heidi Heitkamp’s seat in North Dakota).  What then?  With a new conservative Supreme Court justice, the Left is marginalized at the federal level, other than their Deep State cronies.

My guess is that we’ll see more insanity and violence before we see less.  The Left will double down on its progressive agenda for a decade, until a Bill Clinton-style moderate appears, or the economy turns sour (not likely!), or it can cobble together another Obama-style rainbow coalition.

The question is, will their propensity for political violence boil over into full-scale warfare and defiance of constitutional authority?  We’ve already seen California nullify federal immigration law by refusing to enforce it.  Distrust between people of different political backgrounds is at feverish highs.

Beyond some fringe kooks, no one on the American Right wants to see violence.  But the progressive Left’s deep-rooted love of “punching Nazis” and strangling dissent won’t brook much disagreement.

We’re living in scary times.