“I Pledge Allegiance”: Respect over Freedom

By Emma Kaden

“I pledge allegiance to the Flag of the United States of America, and to the Republic for which it stands, one Nation under God, indivisible, with liberty and justice for all.”

The Pledge of Allegiance was a noble idea—to encourage students to show their respect for the United States and the sacrifices Americans have made for their country over the centuries. And perhaps it still accomplishes that goal. However, many liberals argue that students should be allowed to abstain from saying the pledge. Some of their reasons come from the right line of thought (freedom of speech), while others are simply unfounded (the idea that children are mature enough to decide whether or not their country is worth their respect).

There is merit to both sides of this argument. On the one hand, it’s a valiant way to respect the country you live in, but on the other, how can children respect their country when they haven’t yet learned all about it? Additionally, shouldn’t showing your respect for your country be a voluntary act? Though the words imply respect, it’s the feeling behind them that matters. If students are simply forced to say the Pledge of Allegiance, they may be saying empty words—and if so, then what’s the point?

One may argue that children are children, and they don’t know whether or not they respect their country. However, why should the school they attend decide that for them, especially when so many parents cannot afford to send their children to a school that most aligns with their beliefs? Respect is a value that has been slowly disappearing over the decades, and though it is important—if not necessary—to instill this trait in the upcoming generations, perhaps the Pledge of Allegiance is not the best method.

Imagine you complimented a friend or family member every time you saw them, but always with the same compliment. No matter how well-intentioned, after four years that compliment would lose its value. Now imagine how much value the Pledge of Allegiance must hold for students after thirteen years of repetitive daily recitation.

In an article from the Foundation for Economic Education, writer Tom Mullen discusses the semantics of the Pledge of Allegiance and how some of its language undermines the sacred ideals of the Founding Fathers:

“An Atlanta, Georgia, charter school announced last week its intention to discontinue the practice of having students stand and recite the Pledge of Allegiance during its schoolwide morning meetings at the beginning of each school day, opting to allow students to recite the pledge in their classrooms instead. Predictably, conservatives were immediately triggered by this ‘anti-American’ decision, prompting the school to reverse its decision shortly after.

The uproar over periodic resistance to reciting the pledge typically originates with Constitution-waving, Tea Party conservatives. Ironically, the pledge itself is not only un-American but antithetical to the most important principle underpinning the Constitution as originally ratified.

Admittedly, the superficial criticism that no independent, free-thinking individual would pledge allegiance to a flag isn’t the strongest argument, although the precise words of the pledge are ‘and to the republic for which it stands.’ So, taking the pledge at its word, one is pledging allegiance both to the flag and the republic. And let’s face it, standing and pledging allegiance to anything is a little creepy. But, then again, it was written by a socialist.

But why nitpick?

‘One Nation’

It’s really what comes next that contradicts both of the republic’s founding documents. ‘One nation, indivisible’ is the precise opposite of the spirit of both the Declaration of Independence and the Constitution (‘under God’ wasn’t added until the 1950s).

The government in Washington, D.C., is called ‘the federal government.’ A federal government governs a federation, not a nation. And the one persistent point of contention throughout the constitutional convention of 1787 and the ratifying conventions which followed it was fear the government created by the Constitution would become a national government rather than a federal one. Both the Federalist Papers and the Bill of Rights were written primarily to address this concern of the people of New York and the states in general, respectively.

Moreover, the whole reason for delegating specific powers to the federal government and reserving the rest to the states or people was to ensure there would not be ‘one nation,’ but rather a federation of self-governing republics which delegated a few powers to the federal government and otherwise reserved the rest for themselves.

By the way, the Bill of Rights as originally written applied only to the federal government and not to the states. Sorry, liberals, but the First Amendment doesn’t guarantee a ‘separation of church and state’ within the states. It was written for the opposite reason, to protect the existing state religions of the time from the federal government establishing a national one and thereby invalidating them.

And sorry, conservatives, the Second Amendment wasn’t written to keep states from banning guns. Quite the opposite. It was written to reserve the power to ban guns to the states. That’s why most states, even those established after the Bill of Rights was ratified, have clauses in their own constitutions protecting the right to keep and bear arms. They understood the Second Amendment applied only to the federal government, not the states.

If there is one thing that is clear from all of the above, it is that the Constitution did not establish ‘one nation.’ In fact, the states only agreed to ratify it after being repeatedly promised the United States would be no such thing, allowing the states to govern themselves in radically different ways, at their discretion.

‘Indivisible’

Then, there’s ‘indivisible.’ One would think a federation born when its constituent states seceded from the nation to which they formerly belonged would make the point obvious enough. But the Declaration makes it explicit:

That whenever any Form of Government becomes destructive of these ends, it is the Right of the People to alter or to abolish it, and to institute new Government, laying its foundation on such principles and organizing its powers in such form, as to them shall seem most likely to effect their Safety and Happiness.

It would be impossible to exercise that right—that duty, as the Declaration later calls it—if the republic were indivisible. The strictest constructionists of the time didn’t consider the nation indivisible. Thomas Jefferson didn’t threaten to send troops to New England when some of its states considered seceding upon his election. Quite the opposite. And in an 1804 letter to Joseph Priestley, he deemed a potential split in the union between ‘Atlantic and Mississippi confederacies’ not only possible but ‘not very important to the happiness of either part.’

The people advocating ‘one nation, indivisible’ in those days were big-government Federalists like Hamilton, whose proposals to remake the United States into precisely that were flatly rejected in 1787.

Proponents of absolute, national rule like to quip this question was ‘settled’ by the American Civil War. That’s like saying Polish independence was ‘settled’ by Germany and the Soviet Union in 1939.

In fact, it is precisely the trend towards ‘one nation’ that has caused American politics to become so rancorous, to the point of boiling over into violence, over the course of the last several decades. This continent is inhabited by a multitude of very different cultures, which can coexist peacefully if left to govern themselves. But as the ‘federal’ government increasingly seeks to impose a one-size-fits-all legal framework over people who never agreed to give it that power, the resistance is going to get more and more strident. If there is any chance to achieve peace among America’s warring factions, a return to a more truly federal system is likely the only way.

Getting rid of the un-American pledge to the imaginary nation would be a good, symbolic start.”

No doubt, the Pledge of Allegiance was created with honorable intentions. Should it be modified to ensure the preservation of “liberty and justice for all”?

 

Tom Mullen is the author of Where Do Conservatives and Liberals Come From? And What Ever Happened to Life, Liberty and the Pursuit of Happiness? and A Return to Common Sense: Reawakening Liberty in the Inhabitants of America. For more information and more of Tom’s writing, visit www.tommullen.net.


America’s Got Talent and the American Dream

By Emma Kaden

The American Dream is an essential tenet of American culture: it’s the belief that anyone can come to America and, if they work hard enough, achieve the success they’ve always wanted. In many instances, this ideal is forgotten or cast aside. However, it still rings true for many on the TV show America’s Got Talent, of all places.

Despite its name, America’s Got Talent (informally AGT) draws in a variety of contestants from across the world. The premise of AGT is simple: a selection of talented acts competes for a shot at $1,000,000 and a show in Las Vegas. With that chance of success at stake, it’s no wonder so many non-Americans are drawn to the show. What performer doesn’t want a million dollars and a headline show in Las Vegas?

Not only do contestants travel from all around the world to compete on AGT, but they win! Japanese dancer Kenichi Ebina stole American hearts in Season 8 and triumphed over other contestants, including stand-up comedian Taylor Williamson.

The current season features a variety of performers from all over the globe. Perhaps one of the weirdest acts to grace the AGT stage is The Sacred Riana, a horror illusionist from Indonesia who was eliminated in week two of the quarterfinals. Another interesting non-American act is Junior New System, an all-male dance group from the Philippines whose major novelty is that they wear high heels during their performances.

So why is AGT different? It’s not often that a TV show so clearly represents the American Dream (especially without trying). Perhaps a part of it is that TV producers are always searching for a drama-infused backstory for people who appear on reality and game shows, and a childhood in a poor country or a similar “pity story” fits that description perfectly. However, it may just be that America’s Got Talent has set up the perfect conditions for inspiring talented performers from across the globe to pursue the American Dream.

It seems clear from these observations that America’s Got Talent isn’t exactly an apt title for the global talent sensation it represents. Although many of the acts on AGT are from America, many talented acts come from all over the Earth to compete. A more accurate title would be Earth Has Talent—or, much more fittingly, America Cultivates Talent. If we were to break the naming scheme altogether, we might find the most suitable title of all: Pursuit of the American Dream.

Oh, the Humanity! New Film Downplays Moon Landing as ‘Human Achievement’

By Emma Kaden

The Moon landing represents one of the greatest American achievements in history. It symbolizes America’s victory in the Space Race and the classic American “pioneer spirit.” Americans (not to mention people worldwide) watched spellbound as American astronaut Neil Armstrong became the first human to walk on the Moon. It was a landmark event in American history, and it paved the way for the United States to become the undisputed champion in technological and scientific innovation. (Sorry, Soviet Union.)

And that’s exactly what it was: a purely American achievement. However, the new movie “First Man,” a biopic about Armstrong, omits one of the most significant aspects of the landing: the planting of the American flag. Ryan Gosling, who stars as Armstrong, explained the decision to not prominently feature the planting of the American flag: “I think this was widely regarded in the end as a human achievement [and] that’s how we chose to view it.”

After news of the decision to overlook the flag moment went viral, many Americans were shocked, including Sen. Marco Rubio (R-FL), who tweeted that the decision was “total lunacy,” and “a disservice at a time when our people need reminders of what we can achieve when we work together. The American people paid for that mission, on rockets built by Americans, with American technology & carrying American astronauts.”

Buzz Aldrin also weighed in on the situation on Twitter, posting pictures of the American flag prominently displayed on the Moon with the hashtags #proudtobeanAmerican and #onenation.

So what’s the deal? Why in the world would Hollywood rewrite the Moon landing? It seems like Hollywood has taken “based on a true story” to a whole new level. Hollywood has a long history of being a little loose with the facts—but that was almost always to add dramatic effect. However, this instance of fudging the facts seems to promote Hollywood’s anti-American agenda. Ever since the McCarthy hearings and the Hollywood blacklist of the Cold War era, Hollywood has been increasingly fond of liberal doctrine and opposed to American exceptionalism.

However, Hollywood in and of itself thrives because of American exceptionalism. The very concept of Hollywood would not exist if not for American greatness. So why are Hollywood movie producers disparaging the country that affords them the opportunity to make movies that the entire world can’t live without?

Perhaps there are no right answers to these questions. Perhaps we will never know why, one after the other, Hollywood films are forsaking the recognition of American achievement. However, one thing we do know is that this must come to an end. “Human achievement” or not, the United States should be given credit for the Moon landing. No matter what Hollywood wants you to believe, the American flag was planted on the Moon—with pride.

Home Sweet Home: How Brazil Abandoned Educational Freedom

By Emma Kaden

Approximately 1.7 million children are homeschooled in the United States, according to federal estimates. Although this is not many in the grand scheme of things, it seems much more significant when you consider that Brazil’s Supreme Court recently ruled homeschooling unconstitutional—ensuring that the number of homeschooled children in Brazil is zero.

The homeschooling situation in Brazil highlights the exceptional freedom American parents have to enrich their children’s education. Although many Americans cannot afford to homeschool or send their children to the school that best serves their unique needs, the education system overall is comparatively very free. For example, in many countries around the world, such as Brazil, Germany, Sweden, and Turkey, homeschooling is illegal.

Additionally, many countries that allow homeschooling place restrictions on it. In Romania, children who have a physical disability or other condition keeping them from going to school may be homeschooled, but only under the supervision of an accredited teacher. In Poland and Hungary, homeschooled children must still be supervised and tested by an authorized school.

Countries with such strict laws on homeschooling make the United States seem even more open to educational freedom. Although only 1.7 million American children are homeschooled, each of those children would have to attend public or private school if not for the exceptionalism of the United States. Although Brazil’s Supreme Court may have ruled homeschooling illegal, the United States shines as a beacon of hope for educational freedom.

The following article, “Homeschool Is Unconstitutional? That’s What Brazil’s Supreme Court Ruled,” by Octávio Arruda, discusses Brazil’s Supreme Court decision in more depth, and was originally published by the Foundation for Economic Education:

“The supreme court of Brazil, the Federal Supreme Court (STF), referred to as the ‘Enlightenment vanguard’ by Minister Barroso, recently ruled that the practice of homeschooling is unconstitutional. The trial occurred on September 12, and nine of the ten ministers present for the ruling rejected parents’ right to practice homeschooling. The Brazilian Constitution of 1988 is considered by many jurists to be a great, positivist, and rights-ensuring document of the 20th century. Yet despite the popular belief that it is an inclusive guarantor and regulator, the constitution turns 30 in 2018 with yet another reinforcement that it is a great enemy of individual liberties.

All of the constitution’s guardians except the rapporteur voted to deny parents the ability to choose how their children will be raised and educated. The idea that the state should protect children from their parents should be cause for astonishment among liberty defenders.

Parents or Monsters?

Among the arguments the ministers used against homeschooling were:

‘Legitimizing this practice could stimulate child labor and conceal other serious ills affecting minors.’

‘An overprotection harmful to the child…’

‘Brazil is a very large, very diverse country. Without specific legislation establishing frequency monitoring, I am afraid that we will have major problems with school drop-out. Brazil already has one of the highest dropout rates. Without detailed congressional regulations, with pedagogical and socialization evaluations, we will have scholastic evasion of home teaching.’

Such statements prove that the court considers the parents of students who adopt homeschooling to be more negligent and abusive than the Brazilian state itself, despite the fact that the state is responsible for what is considered one of the worst public education systems in the world in almost all indexes and rankings, including PISA and The Learning Curve.

The Law Only Serves the State

Another name for the Federative Republic of Brazil could easily be ‘Bastiat’s Nightmare.’ A reading of the French economist’s The Law should amaze any Brazilian because all legal norms in Brazil protect everything except the freedom and property of its citizens, including, unfortunately, their own bodies and their children. The complete perversion of the law is clear in the Brazilian homeschooling scenario, and it is impossible not to relate this situation to the warnings given in Bastiat’s book:

‘The law has placed the collective force at the disposal of the unscrupulous who wish, without risk, to exploit the person, liberty, and property of others. It has converted plunder into a right, in order to protect plunder. And it has converted lawful defense into a crime, in order to punish lawful defense.

How has this perversion of the law been accomplished? And what have been the results?

The law has been perverted by the influence of two entirely different causes: stupid greed and false philanthropy.’

— From The Essential Frédéric Bastiat, published by FEE

The indignation from the defenders of individual freedom in Brazil was profound, and even more so because the Ministers’ decision is in accordance with the law. The Brazilian constitution mandates the attendance of children in school, and it is saddening that the problem is not in the decision of the magistrates, but in the 1988 constitution which makes Brazilian citizens nothing more than state property.”

America Rock: The American Experience Set to Music

By Emma Kaden

“And the shot heard ’round the world / Was the start of the Revolution…”

Schoolhouse Rock!, “The Shot Heard Round The World”

So many Americans grew up with the sound of Schoolhouse Rock!, the lyrics of “Conjunction Junction” or “Electricity” engraved into our minds, the tunes of “Interplanet Janet” and “Interjections!” blasting in our ears, and the characters from “I’m Just a Bill” and “The Tale of Mr. Morton” moving before our eyes. Unfortunately, the Schoolhouse Rock! videos lost popularity in classrooms and homes as the years passed, and somewhere in the early 21st century, Schoolhouse Rock! became something to look back on with nostalgia rather than something to discover, enjoy, and learn from.

However, this phenomenon caused more than just a generation of Americans who couldn’t recite the entirety of “Lolly, Lolly, Lolly, Get Your Adverbs Here”—young Americans now have little understanding of basic civics and American history. This isn’t something one could attribute entirely to the lack of Schoolhouse Rock!… but maybe if more kids today watched some “Mother Necessity,” there wouldn’t be such a necessity to reeducate every American adult on civics.

In fact, Schoolhouse Rock! has an entire section devoted to educating viewers on the basics of American politics and history, aptly titled “America Rock,” along with a few other songs that address related topics. Here’s a little overview of the selection—hopefully these songs will inspire you to “Unpack Your Adjectives.”

  1. “No More Kings”

“He even has the nerve / To tax our cup of tea. / To put it kindly, King, / We really don’t agree.”

Schoolhouse Rock!, “No More Kings”

 

The Mayflower, the colonies, the Boston Tea Party, and the start of the young nation that would eventually become the United States—“No More Kings” contains all the essentials for a basic understanding of early America. In the video, the colonists make their way on the Mayflower to what would later become the East Coast of the United States, build the colonies, and begin to consider their own independence. King George III enacts the tea tax, which we learn is taxation without representation, and the colonists take part in what is now known as the Boston Tea Party. Not only is the journey toward American independence chronicled in “No More Kings,” but it’s all set to a catchy tune.

  2. “The Shot Heard Round The World”

“Take your powder, and take your gun. / Report to General Washington. / Hurry men, there’s not an hour to lose!”

Schoolhouse Rock!, “The Shot Heard Round The World”

The American Revolution is a vital part of American history, and Schoolhouse Rock! fits its key events into a song that informs the listener, albeit maybe not on the first listen. The lyrics are jam-packed with historical events, including Paul Revere’s ride through Lexington, the fight at Bunker Hill, and Lord Cornwallis’ surrender.

  3. “Fireworks”

“The Continental Congress said that we were free / Said we had the right of life and liberty, / …And the pursuit of happiness!”

Schoolhouse Rock!, “Fireworks”

Simply put, “Fireworks” is the story of the Declaration of Independence. In the song, listeners learn about the influences behind the declaration (Thomas Paine’s Common Sense), the date of the signing of the Declaration of Independence, who wrote it, and why it was so important. “We hold these truths to be self-evident…”

  4. “Preamble”

“We the people, / In order to form a more perfect union…”

Schoolhouse Rock!, “Preamble”

One would be hard-pressed to find someone who can recite the entire U.S. Constitution word for word. However, many Americans still remember the preamble to the U.S. Constitution, thanks to Schoolhouse Rock!. If you had trouble remembering the preamble before, just give “Preamble” a couple listens and you’ll have it memorized in no time.

  5. “Three Ring Government”

“Ring one, Executive, / Two is Legislative, that’s Congress. / Ring three, Judiciary.”

Schoolhouse Rock!, “Three Ring Government”

One of the fundamentals of American government that the U.S. Constitution instituted is the system of three branches: executive (the President of the United States), judicial (the Supreme Court), and legislative (Congress). In “Three Ring Government,” Schoolhouse Rock! connects the three branches of government to a three-ring circus, not in a way that’s meant to mock legislative officials and compare them to animals, but in a way that makes it easier for everyone to understand the balance of power between the branches.

  6. “I’m Just a Bill”

“I’m just a bill. / Yes, I’m only a bill. / And I’m sitting here on Capitol Hill.”

Schoolhouse Rock!, “I’m Just a Bill”

You knew it was coming—the classic song about how a bill becomes a law. Not only is this song one of the Schoolhouse Rock! classics, but it teaches a fundamental aspect of the United States government: how legislation is passed. In truth, the legislative process isn’t exactly simple, but Schoolhouse Rock! condenses it into a catchy 3-minute song. It’s fun to listen to, and it distills a normally over-complicated process into something even little kids can remember as they mature. In short, “I’m Just a Bill” exemplifies everything Schoolhouse Rock! stands for.

However, these songs are just the beginning. There are other songs about the United States: “Elbow Room,” which discusses westward expansion; “Sufferin’ Till Suffrage,” which discusses women’s rights; “The Great American Melting Pot,” which discusses immigration; “I’m Going to Send Your Vote to College,” which explains how the electoral college works; and “Mother Necessity,” which details the history of American inventions.

Overall, it’s not hard to see why Schoolhouse Rock! was so popular when it came out, and hopefully it will continue to be a mainstay in houses across the United States for years to come.

Unfortunately, this is the end of this article. Or as the song “Interjections!” so eloquently put it, “Darn! That’s the end!”

From The Great War to ‘Make America Great Again’: 100 Years in Reflection

By Emma Kaden

On November 11, 1918, World War I—then known as the Great War—ended. Not with a bang, but with an armistice signed by Germany and the Allies that finally made the guns fall silent. Much has changed since the armistice was signed—here’s a look at how the world has evolved since the end of World War I.

As the Great War ended, American soldiers came home to seek employment and return to normalcy. Because military innovation and weaponry were no longer in such demand, scientists and inventors could turn their attention to more consumer-oriented products. In fact, in the decade after World War I, a great machine would be invented that would change the human lexicon for all time: the bread-slicing machine. And henceforth, later inventions (such as the CD, the television, and the digital camera) would all be “the greatest thing since sliced bread.”

In fact, if one were to travel through time from 1918 to 2018, one of the first things they would notice is the great advances in technology that have revolutionized society. For example, cars became more widespread and filled with creature comforts, spurring a transportation frenzy that led to commercial airplanes, cutting-edge public transportation, and so much more. Additionally, the computer has evolved from a bulky, primitive system to a device that not only fits in your pocket but can perform vastly more functions at breathtaking speeds, shaping the way the world interacts with technology.

As hardware evolved, software redefined interpersonal relations in the blink of an eye with the invention of the internet. From the basic communication system of the early internet to the advanced, widespread web universe that supports revolutionary technologies such as social media, artificial intelligence, and virtual reality, the evolution of the internet has made sweeping changes to the way people communicate—and much, much more.

Not only have communication devices changed how we interact, but new appliances such as air conditioning, the vacuum cleaner, the microwave, and laundry machines have changed how we perform household work. This has also allowed women to spend less time performing mundane household chores and opened up time to pursue academic and career endeavors.

Alongside more comfortable lifestyles, technological advances have more than doubled the lifespan of the average American. From penicillin to vaccines to MRIs to DNA mapping, modern medicine is far more advanced than that of the early 1900s.

Innovation followed the Great War in other areas of society, too. Immediately after the war, art reflected the pain and grief associated with the conflict—popular art was somber, matching the emotions most Americans were feeling. Then, as Americans started to live more carefree lifestyles and experience brighter emotions, American culture evolved with them. Modern art took hold, striking and bold, giving artists more space to experiment and create new, exciting works and giving the average American a chance to really start the conversation about what a piece of art means.

Literature, too, has changed in a big way since World War I ended. As the average American’s vocabulary and speaking style became more informal, the books they were reading followed suit. This was especially true as other forms of media began to take hold: television, online video services such as YouTube, video games, blogs, etc. As Americans sought out more and more digital content, books needed to find a digital form to keep pace. In 1971, more than 50 years after the Great War’s ceasefire, the first e-book was published, changing the way Americans would consume information. Even now, literary culture continues to evolve at a swift pace. In fact, libraries have become hubs not just for physical books, but for e-books, computers, tablets, other technology, programming, and even everyday items such as umbrellas and bike locks (a collection often referred to as a Library of Things).

Looking back, it is remarkable how the United States shifted from a place in which people were isolated to a place where everyone has a device in their pocket that can access information, contact others across the world, and do so many more things, all at a moment’s notice. The ways in which the nation has changed since World War I ended are immense and unmistakable. It’s hard to imagine what life would have been like if the Great War had ended just a few years later—or even a few years earlier. But the United States of today would not be the same without the innovations and cultural shifts that have shaped it since the armistice was signed on November 11, 1918.

Giving Thanks for Thanksgiving

By Emma Kaden

The First Thanksgiving took place in November 1621, and back then Thanksgiving was a way to celebrate the colony’s first successful harvest. The pilgrims (supposedly) feasted on waterfowl, venison, ham, lobster, clams, berries, fruit, pumpkin, and squash. You probably can’t imagine having lobster at your Thanksgiving meal, right? That’s because the United States—and Thanksgiving—has evolved a lot since the 1600s.

Most Americans know what foods belong on this year’s Thanksgiving table: turkey, cranberry sauce, stuffing, mashed potatoes and gravy, pumpkin or pecan pie, dinner rolls—the whole shebang. Yet what not every American knows is how Thanksgiving dinner has evolved since that first fateful feast in 1621.

The First Thanksgiving

When Thanksgiving was first celebrated, pumpkin pie didn’t even exist—and, years later, when the first pumpkin pie recipes were written, the dish was much more like an apple pie than the pumpkin pie modern Americans know so well. Another food that wasn’t on the table was turkey, although that might be hard to fathom for some Americans today. Instead of unwrapping a Butterball and baking it, the pilgrims enjoyed whatever food was plentiful. This meant taking advantage of natural resources so that the whole community was able to enjoy the feast.

In fact, when the first Thanksgiving rolled around, it really was the whole community—the 53 surviving pilgrims. Yet somehow, as time went on, the community-wide gathering of Thanksgiving turned into something celebrated only by families, with community gatherings being the exception rather than the rule.

Evolution of Thanksgiving

As time passed, Thanksgiving became less of a festival and more of a tradition, and in 1863, more than two centuries after the first Thanksgiving, then-President Abraham Lincoln proclaimed Thanksgiving a national holiday, to be celebrated on the last Thursday of November. Since then, not only have Americans been given time off of work and school to celebrate Thanksgiving, but many take extra time off to spend with family, however far away they might be.

Not only are Americans leaving their homes to travel to see their families for the Thanksgiving holiday, they are leaving their homes to eat Thanksgiving dinner, too. According to a Bureau of Labor Statistics survey, 43 percent of Americans have Thanksgiving dinner away from home, 7 percentage points more than in 1986. As American culture changes, so does Thanksgiving, but it’s important to remember the holiday’s origins.

Celebrating as a Community

As Americans enjoy their mashed potatoes and gravy, they ought to remember what it was like for the pilgrims. They celebrated, as one, their community and the very fact that they were alive—and that’s a lesson all Americans can learn from. This Thanksgiving, one can only hope that Americans will come together as a community and cherish the very country they live in.

Perception of a Reasonable Citizen

By Emma Kaden

“Reason” is often defined as the power to comprehend, judge, and think rationally. In a recent article, author Alexandra York describes the necessity of reason in understanding and thriving in a complex world—that is, the world we live in. In the article, York argues that reason is a fundamental capacity that all citizens of the world must have. Not only does she say that reason is necessary, she contends that without reason, the world descends into a place where people can be manipulated into believing illogical things. In short, York says that everyone should work to employ reason in order to build a more satisfactory—and healthier—lifestyle.

Protect Your Mental Immune System by Practicing Reason

By Alexandra York

Biologically, parasites are organisms that invade the skin’s surface or internal organs of another living creature in order to survive by robbing their “host’s” store of nutrients, thus diminishing or denying nourishment to the host.

Sociologically, America’s welfare system has created what regrettably can be termed human parasites by giving unnecessary “aid” — monetary payments, housing, healthcare, etc. — to (by now) generations of perfectly capable people who permanently obtain their survival needs from other humans who produce the money providing these benefits. Although many individuals actually warrant (temporary) assistance, the harsh truth is that like biological parasites these “freeloaders” leech “nutrients” from healthy, in this case wealth-producing, “hosts.” Unlike biological parasites they do not directly harm individual hosts; they drain producers through government taxation-expropriation via a socio-economic system that takes some fruits of labor from producers to give to non-laboring non-producers. But this isn’t the worst.

Psychologically, in America we now endure a swiftly proliferating kind of parasite that invades neither the human body nor the purse. This parasite takes many forms and invades the human mind via ideas lethal to the healthy cerebral functioning necessary for life-supporting behavior. America is not the only country where ideological parasites are embedded into its socio-economic/political-educational-communication systems, but it’s the latest to succumb to severe infection from deleterious ideas anathema to sound human mental health. All living creatures, including humans, rely on a biological immune system to fight off disease (including physical parasite invasion), but because of free will rather than lower-animal instinct, humans also must have a mental immune system consisting of values from which to choose in order to make judgments and decisions. Values are principles selected or accumulated to guide choices; they can be rational or irrational.

Over 2,000 years ago, Aristotle conceptually identified the faculty of reason — logical, non-contradictory thinking — as the fundamental requisite for human survival. Activating reason means that values and choices will be consonant with verifiable facts and existential reality. If reason is damaged and/or abandoned, then unfounded or even life-destroying anti-values can flourish and render the human mind incapable of making choices that support independent personal survival. Individuals bereft of reason must then survive off the production of others who do employ reason.

Socially fake-dependent human parasites are of a passive nature; they merely survive physically off the monetary wealth of producers. Now, however, our socio-economic/political-educational-communication system is creating mental breeding grounds for power-hungry parasites to propagate. Power means having the ability to get others to do (or think) what you want them to.

Public (and much private) education that imposes group-oriented, politically correct tenets on students rather than teaching independent, critical-thinking skills advances ideological and behavioral conformity, depriving young people of the opportunity to activate and practice the ability to reason — the only means to achieve a fulfilled, productive, and self-sustaining adulthood. Educators who employ such tactics do not teach; they indoctrinate by way of parasitically paralyzing rational thought and injecting venomous ideas into the value systems of the young, who are vulnerable because they are in formative stages of mental development. Humans are born with the faculty of reason, but reasoning itself is not automatic; it must be learned. Irrational ideas that damage human nature’s survival requirement (reason) flout the facts of reality; thus, once students are infected with non-reality-oriented ideas, insecurity and childish emotionalism take the place of thinking. Like undisciplined little kids, irrational individuals cry, scream, throw temper tantrums, and sometimes become violent when they don’t get their way. Witness today’s university students. What do educator-parasites get out of this? Nourishment for their own irrationality by destroying reason in their host-students and a pathological sense of power when they succeed in converting others to validate themselves.

Most of today’s media — entertainment and news — consist of agenda-driven actual or wannabe celebrities intent on invading minds of the gullible adult masses with senseless violence and senseless views on nearly every possible senseless subject. What do media-parasites get out of infesting host-audiences with unhealthy pastimes and ludicrous and/or false ideas? Nourishment to boost their own inane shallowness by influencing lifestyles and thoughts of others who (absent reasoned judgment) admire them and a pathological sense of superiority and power when they succeed.

After decades of increasing government regulations preventing free-market commerce that otherwise would respect decisions for gain or loss made independently by businesses, government now dictates not only economic activity but also hiring, firing, “diversity,” “equal pay” and other so-called “nondiscriminatory” employment requirements, internal employee behavior, and more. This massive invasion into business practices smothers the private sector into submission both financially and behaviorally to politicians’ demands, thus depriving individuals and companies of the ability to use reasoned judgment to run businesses according to their own values. What do politician-parasites get out of using force of law to control others? Position, perks, and power. More power. And more power to feed their insatiable greed for . . . raw power.

Once settled into host organisms, all parasites survive by attacking weak immune systems or actively weakening a healthy immune system until it becomes receptive to nourishing the parasite. Americans currently suffer both a longstanding but increasing physical parasitical epidemic (economic-behavioral meddling by government) and a fast-breeding mental one (irrational indoctrination and influence by educators and media persons). The only antidote for this colossal, whole-culture, parasite-induced insanity is reason. Where reason resides, parasites of the mind can neither penetrate nor proliferate. When individuals fail to utilize reason, they have no valid value system to provide immunity against thought-invading parasites; thus, as more and more Americans become underdeveloped, slothful of mind, or unresistingly submissive to tyrannical dictates, we witness the alarming deterioration of an exceptional, robust nation into an obedient collective of “hosts” to nourish brain-devouring parasites.

Exceptions exist, but parasites of the mind have colonized education, media, and government. Victims are young, dull-witted, or fearful.

Ergo: Protect your mental immune system against parasite invasion by strengthening your own faculty of reason. Listen circumspectly. Think logically. Judge critically. Decide objectively. Speak openly. Teach truthfully. Behave sensibly. Live . . . rationally.

This article was originally published by Newsmax Media. Alexandra York is an author and founding president of the American Renaissance for the Twenty-first Century (ART), a New York City-based nonprofit educational arts and culture foundation (www.art-21.org). She has written for many publications, including “Reader’s Digest” and The New York Times. Her latest book is “Adamas.” For more on Alexandra York, click here.


The Education Takeover: How Early Schooling is Ruining Lives

By Emma Kaden

Lately, the trend has been to send children to school earlier and earlier in their lives. One has to wonder what would happen if children were simply taken to school from the moment they’re born—but the trend extends in the other direction, too. It’s becoming more and more standard to attend a college or university after graduating from high school, and some experts estimate that soon master’s degrees will be the norm as well, extending the average American’s education 5 to 7 years past high school graduation.

However, education after high school isn’t nearly as much of a problem as early education. In an article recently published by the Foundation for Economic Education, education policy writer Kerry McDonald discussed a new Harvard study that revealed the dangers of early schooling.

“Harvard Study Shows the Dangers of Early School Enrollment

Every parent knows the difference a year makes in the development and maturity of a young child. A one-year-old is barely walking while a two-year-old gleefully sprints away from you. A four-year-old is always moving, always imagining, always asking why, while a five-year-old may start to sit and listen for longer stretches.

Growing Expectations vs. Human Behavior

Children haven’t changed, but our expectations of their behavior have. In just one generation, children are going to school at younger and younger ages, and are spending more time in school than ever before. They are increasingly required to learn academic content at an early age that may be well above their developmental capability.

In 1998, 31 percent of teachers expected children to learn to read in kindergarten. In 2010, 80 percent of teachers expected this. Now, children are expected to read in kindergarten and to become proficient readers soon after, despite research showing that pushing early literacy can do more harm than good.

In their report Reading in Kindergarten: Little to Gain and Much to Lose, education professor Nancy Carlsson-Paige and her colleagues warn about the hazards of early reading instruction. They write,

When children have educational experiences that are not geared to their developmental level or in tune with their learning needs and cultures, it can cause them great harm, including feelings of inadequacy, anxiety and confusion.

Hate The Player, Love the Game

Instead of recognizing that schooling is the problem, we blame the kids. Today, children who are not reading by a contrived endpoint are regularly labeled with a reading delay and prescribed various interventions to help them catch up to the pack. In school, all must be the same. If they are not listening to the teacher, and are spending too much time daydreaming or squirming in their seats, young children often earn an attention-deficit/hyperactivity disorder (ADHD) label and, with striking frequency, are administered potent psychotropic medications.

The U.S. Centers for Disease Control and Prevention (CDC) reports that approximately 11 percent of children ages four to seventeen have been diagnosed with ADHD, and that number increased 42 percent from 2003-2004 to 2011-2012, with a majority of those diagnosed placed on medication. Perhaps more troubling, one-third of these diagnoses occur in children under age six.

It should be no surprise that as we place young children in artificial learning environments, separated from their family for long lengths of time, and expect them to comply with a standardized, test-driven curriculum, it will be too much for many of them.

New findings by Harvard Medical School researchers confirm that it’s not the children who are failing, it’s the schools we place them in too early. These researchers discovered that children who start school as among the youngest in their grade have a much greater likelihood of getting an ADHD diagnosis than older children in their grade. In fact, for the U.S. states studied with a September 1st enrollment cut-off date, children born in August were 30 percent more likely to be diagnosed with ADHD than their older peers.

The study’s lead researcher at Harvard, Timothy Layton, concludes: “Our findings suggest the possibility that large numbers of kids are being overdiagnosed and overtreated for ADHD because they happen to be relatively immature compared to their older classmates in the early years of elementary school.”

This Should Come as No Surprise

Parents don’t need Harvard researchers to tell them that a child who just turned five is quite different developmentally from a child who is about to turn six. Instead, parents need to be empowered to challenge government schooling motives and mandates, and to opt-out.

As universal government preschool programs gain traction, delaying schooling or opting out entirely can be increasingly difficult for parents. Iowa, for example, recently lowered its compulsory schooling age to four for children enrolled in a government preschool program.

As New York City expands its universal pre-K program to all of the city’s three-year-olds, will compulsory schooling laws for preschoolers follow? On Monday, the New York City Department of Education issued a white paper detailing a “birth-to-five system of early care and education,” granting more power to government officials to direct early childhood learning and development.

As schooling becomes more rigid and consumes more of childhood, it is causing increasing harm to children. Many of them are unable to meet unrealistic academic and behavioral expectations at such a young age, and they are being labeled with and medicated for delays and disorders that often only exist within a schooled context. Parents should push back against this alarming trend by holding onto their kids longer or opting out of forced schooling altogether.”

From Red Trees to the Anti-Christmas Disease

By Emma Kaden

Last week, First Lady Melania Trump shared her White House Christmas decorations with the world… and some of the world wasn’t very pleased. Many people criticized her choices in décor, especially a hallway full of bright red berry trees, which were compared to the handmaids’ outfits from Margaret Atwood’s The Handmaid’s Tale, among other not-so-pleasant things. So why the backlash? Since when has it become socially acceptable to insult the holiday décor at the White House?

Well, it goes a lot deeper than a simple dislike of red trees.

The United States was founded on Judeo-Christian beliefs, and as such, Christmas has been celebrated here since the colonists first arrived. The celebration may not hold as much religious importance as it did long ago, but it has become a cultural tradition that has spread throughout the country. Accordingly, many Americans hold Christmas near and dear to their hearts—and many celebrate Christmas even if their religious beliefs do not align with Christianity.

However, Christmas has morphed into something even more than just a cultural tradition. Christmas nowadays involves a specific set of symbols, decorations, music, and so much more—and anyone who deviates from that has the mob mentality of a nation set upon them. Christmas trees must be green, they must have a star on top, there must be plenty of ornaments—and those are just the requirements for Christmas trees. Culturally, Christmas has become something where subscribing to the societal norm is the only way to escape criticism.

At the same time, society is struggling to stay inclusive—that is, you can have your tree perfectly decorated, have your Amazon Echo playing all the right Christmas songs, and still get in trouble for saying “Merry Christmas.” In fact, many big corporations have taken to putting “Happy Holidays” on their December packaging and advertising in order to be inclusive to those celebrating Hanukkah and other holidays. Yet this, too, has caused a divide. According to a poll by the Public Religion Research Institute, 67 percent of Republicans think switching “Merry Christmas” for “Happy Holidays” is unnecessary, and 30 percent of Democrats think the same.

Viewed through the lens of the cultural phenomenon of Christmas, and of how it has evolved from a Christian celebration into a set of rules about what color your tree can be (hint: green), maybe “Merry Christmas” is fairly secular after all. Or maybe it still holds enough religious value to merit switching out the phrase.

What do you think? Let us know in the comments below.




