
The Wall

  • Right to be Independent of: Personal Interest or the Common Good, Pick One!

    July 11th, 2024

Human rights: a vague, declaratory term that provides no insight as to where such rights come from. While the concept of human rights, or individual rights, originated from natural rights, both concepts are now ruled by the zealous rationale of egalitarianism and freedom. Egalitarianism asserts that individuals have rights to abortion, rights to healthcare, rights to food, rights to shelter, rights to equal pay, and so on. These alleged rights can be derived from any number of books, theories, beliefs, religious doctrines, or logical backflips. This paper explores the shift in American political society from pursuing the common good to the desire for individual rights, as moral objectivity was replaced by the pseudo-religious conception of egalitarianism under the grand umbrella of moral relativism, which has displaced the social contract with social demands.

While there are several distinctions between rights that can be made,[1] for the sake of this paper the primary focus will be the difference between natural rights, fundamental positive rights, and ordinary positive rights, once the ground upon which the concept of American rights rests has been shown.

    Preview

The essay will first define the terms freedom, equality, and right in a way that has functional use within the narrower categories of rights. Then the differences between natural and positive rights will briefly be touched upon to lay the foundation for Social Contract Theory and its conceptual background in the modern era. After that, the goal of a political society will be discussed as the common good, which raises the lack of an ethos by which a common goal can be defined. Finally, the essay will show that individual rights stand in contradiction to the common good, as egalitarianism is all about the self.

• Definitions and Terms

As with any reasonable endeavor to further one’s understanding of a concept or topic substantively, the definitions of the words and terms used in the study of that topic are of vital importance. Further, the definitions of conceptual words are of consequence in order to grasp a topic or even have a discussion about it; as the proverbial saying goes, there can be no reason when the definitions of the words being used are not agreed upon.[2] Much of the language used to convey these ideas is old and has been re-salvaged several times for different religious, cultural, political, and social ends.

Prior to getting into the differences between natural, positive, and negative rights, the terms freedom, equality, and right need to be defined.

• Freedom: The Mythical Beast of Fantasy.

The term freedom has scientific, philosophical, legal, and everyday uses. Freedom is one of the most butchered words of the 20th and 21st centuries. The way it is so loosely used to describe the willy-nilly desires of anyone and everyone who invokes its name is disgusting from a reasonable, conceptual, and linguistic perspective. The word comes from the root freo, which carries the sense “to love”; and love, beyond just having a feeling for something, involves a tradeoff. Freo can also mean to share with a friend, and as anyone who knows about friendship understands, there is a give and take necessary to keep a friendship healthy.

In more modern ages, the term free has meant not a slave, not subject to foreign domination, and not under a despotic government; all to say that one was released from the physical bounds that previously restricted them. However, the concept of being “free” to do whatever one wants subtly snuck into the common use of the term freedom in the 19th and 20th centuries.

For most of its history, the term “free” was used in contrast to being forced into a lifestyle that one did not choose and that was outside of the person’s control. Freedom historically meant that the decisions one was making with one’s life were independent of the coercive force of a master, lord, aristocrat, or monarch. The idea that freedom means one should be able to do whatever one wants, and that certain acts, behaviors, and lifestyles should therefore be acceptable, was never a fore- or afterthought when the term free(dom) was uttered prior to the 19th century, or when it was used in the Declaration of Independence or any of the United States founding documents. Freedom in the modern era is thereby a political principle, a buzzword used alongside rights to justify an individual’s actions outside the context of the common good.

    • Equality, Equity, and The Fairytale of Egalitarianism:

Equity comes from equal, whose Latin root is aequus, commonly used in the sense of being “flat,” “even,” or “level.” But the term equity in its modern definition seems closer to the word superaequum, which as a noun means “over the plain” and as an adjective can mean “razed,” as in the land was razed to level it out. Equity, then, is an attempt to “re-equalize outcomes” between separate, already existing and newly defined groups.[3] Egalitarianism therefore requires one to notice that there are differences between individuals; but instead of accepting those differences as innate or just, it insists there must be an “unfair” structural order that set the differences up. Therefore, to achieve justice, the group or individual needs rights that assist that group in getting to the same level as the majority group or its members.

    • What is a Right?

A simple look-up of the term “right” shows that it can refer to a physical direction, to correctness, or to other abstract and conceptual meanings. Rights in the political and social realm tend to refer broadly to legal duties, obligations, and entitlements. Yet the scope of the term must be limited in order to have a meaningful discussion. Looking at the etymology of the word “right”:

1. The word “right” is related to the Latin word rēctus, which can be translated as straight, proper, or righteous (moral integrity). Rectus is also the root of the term rectitude, meaning morally correct behavior.[4]

    2. “Old English riht (West Saxon, Kentish), reht (Anglian), “that which is morally right, duty, obligation,” also “rule of conduct; law of a land;” also “what someone deserves; a just claim, what is due, equitable treatment;” also “correctness, truth;” also “a legal entitlement (to possession of property, etc.), a privilege,” from Proto-Germanic *rehtan (see right (adj.1)).[5][6]”

    For the sake of argument, the term Right or Rights in this paper shall mean an entitlement, since any rights argument is about what an individual inherently merits. The breadth of an entitlement is based on logic, beliefs, principles, and ideologies.

The problem with creating a simple and easy-to-understand definition for rights is that, similar to the right of Freedom of Speech, one can go “round and round on the original meaning of the First Amendment.”[7] However, at the founding of the United States, the Founding Fathers believed in natural rights.[8][9] There is debate as to what specifically constituted natural rights during the Founding-Era, the period when the Constitution, Bill of Rights, and Declaration of Independence were drafted, amended, and ultimately ratified (typically considered 1774 to 1797).[10] Nonetheless, what is ultimately agreed upon by scholars is that natural rights are rights that exist without government. Legal scholar Jud Campbell, in his Constitutional Commentary article Republicanism and Natural Rights at the Founding, argues that the natural rights considered in the Founding-Era were not legal privileges or entitlements but a “mode of thinking” which would maintain a “good government, not necessarily less government.”[11] This idea of a good government is ultimately tied into the goal of any political society when it is founded, and is expressed via the common good.[12]

According to this perspective, natural rights were philosophical pillars that the Founding Fathers applied to a social contract to create the republican government of the United States via the Constitution.[13] However, Campbell also posits that there were positive rights, which required the government to engage in certain forms of protection in exchange for control of the political society.[14]

    • The Social Contract: Between Whom and Why?

Thomas Hobbes was the first modern philosopher to assert Social Contract Theory. In his book Leviathan, Hobbes lays out, through his version of first principles, how reason is a faculty of humans and how humans in the state of nature are inherently self-interested; yet, though self-interested, humans learn that while they cannot overcome nature, they can reason with one another toward cooperation, which is the foundation of the social contract.[15] But just as nature imposes itself on humans, there needs to be someone, a Sovereign, at the top of the social contract who sets the tone for the people in a state.[16] The Sovereign is chosen because they have the ability, or power, to enforce the rules of the social contract when there are issues between individuals, or when another state breaks the social contract maintained between one’s state and that other state.[17] However, in order for the social contract to be effective, the body of the state has to give up its individual rights to the Sovereign so that he can maintain the desired order; Hobbes reasons that this is because individual persons’ ability to reason can be overrun by their passions, so the Sovereign must have absolute control for the greater good of the body politic.[18]

John Locke was the second modern philosopher to speak about Social Contract Theory. Locke outlines the state of nature and the laws of nature, suggesting that individuals in this condition are still governed by moral principles; however, he acknowledges the potential for conflict and the need for government to protect individuals’ natural rights.[19] Consequently, for Locke, the purpose of government was the protection of property and the preservation of individuals’ natural rights against external and internal threats.[20][21] Locke emphasized that individuals enter into civil society, and thereafter form a government, to secure their natural rights, indicating the need for protection from others who would intrude on the enforcement of the right to self-preservation; a violation of an individual’s right to self-preservation delegitimizes a government’s legislature.[22]

Thomas Jefferson, the primary author of the Declaration of Independence,[23] and James Madison, the primary author of the Bill of Rights, were both profoundly influenced by John Locke.[24] So much so that the phrase “pursuit of happiness” was lifted by Jefferson from one of Locke’s writings,[25] and in a correspondence with John Trumbull, Jefferson referred to Locke as one of the greatest men who ever lived.[26] Madison, for his part, spoke of a government overreaching its breadth and domain if it attempts to decide or dictate what is “religious truth,”[27] which is nearly identical to Locke’s argument in the “Letter Concerning Toleration.”[28] Over the centuries there have been challenges that attempted to refute the influence John Locke had on the American Revolutionaries,[29] but in recent years that refutation has been completely dismantled.[30]

    The Declaration of Independence makes four direct and indirect references to Natural Law: (1) “We hold these truths to be self-evident, that all men are created equal, that they are endowed by their Creator with certain unalienable Rights”. This statement implies that these rights are inherent to all individuals by virtue of their existence, aligning with the principles of natural law.[31] This was also seen as recognition of basic human rights.[32] (2) “That to secure these rights, Governments are instituted among Men, deriving their just powers from the consent of the governed”. This suggests that the purpose of government is to protect natural rights, reflecting the concept of natural law as the foundation of legitimate governance, not that new rights are created via a new government but that the whole reason for government is to protect these naturally occurring rights;[33] (3) “That whenever any Form of Government becomes destructive of these ends, it is the Right of the People to alter or to abolish it[,]” Here, the Declaration emphasizes the right of individuals to resist oppressive governments that violate natural rights principles;[34] and (4) “Appealing to the Supreme Judge of the world for the rectitude of our intentions[,]” this reference acknowledges a higher moral authority, echoing the idea of natural law as a universal standard of justice.

The Bill of Rights refers to inalienable rights.[35] However, the founding documents of the United States were not merely a regurgitation of Lockean principles and treatises. Legal scholars have debated what the American social contract constitutes, as well as the effects of natural law on the social contract that was elected under the United States founding documents; yet there is precedent in several state courts, as well as federal courts, for the social contract applying as a legal theory.[36]

    1. The Making of the Social Contract According to Jud Campbell

Campbell lays out four proposed stages of social contracting completed in the Founding-Era:

1. The first stage was recognizing that there are rights that exist outside the realm of any government: natural rights. According to Zephaniah Swift, who published the first legal treatise in the United States,[37][38] natural law was “the enjoyment and exercise of a power to do as we think proper, without any other restraint than what results from the law of nature, or what may be denominated the moral law.”[39]
2. The second stage was the negotiation stage, where some of these natural rights were discussed and an agreement as to which rights are fundamental was worked out, as the political society figured out where this line should be.[40]
3. The third stage is where the fundamental positive rights that a would-be government must provide, in exchange for the natural rights individuals relinquish for certain protections, are established.[41]
4. The fourth stage came after the political society, or government, was formed, when ordinary positive rights are created by the legislature; these ordinary positive rights are known as the common law and statutory rights that are ratified. This is a continuing event, as the legislature enacts and abolishes laws.

These stages are relevant because in the modern day, when it comes to certain alleged rights, the question arises whether they are natural rights that are protected by the social contract. At the third stage, it is assumed that individuals relinquish their natural rights in order to maintain the peace and security provided by a social contract.[42] This also implies that every individual consents to the political society, which results in a “single entity—a body politic—composed of all members of the political society.”[43] Campbell, using the words of James Wilson,[44] argues that “the first law of every government” is “the happiness of society.”[45] The Constitution,[46] as well as the Federalist leader Theodore Sedgwick, reflects the view that individuals’ rights could be suspended whenever said rights would “endanger ‘the public welfare.’”[47]

Campbell proceeds to explain Alexander Hamilton’s view that no rights were lost, and notes that other Federalists argued the same thing.[48] However, this was seen as a minority view, contrary to what many of the founding fathers believed, articulated, and wrote.[49] Campbell then asserts that the conflict regarding retained natural rights was largely a semantic one, since it was commonly agreed that “the government could restrict natural liberty in the public interest.” Two of the common disagreements regarding natural rights concerned the scope of the public interest. The first held that natural rights were already part of social obligations, so no rights were lost by the political society controlling these rights for the social benefit of the group; the second held that natural liberty was preserved because that liberty was fairly traded for a representative political society. This latter assertion was foundational because “representative institutions could consent to restrictions of natural liberty on behalf of individuals.”[50] Thereby, “the sovereignty of the society as vested in & exercisable by the majority, may do anything that could be rightfully done by the unanimous concurrence of the members.”[51]

Campbell then explains the difference between positive rights and fundamental rights: they were not the same, but they were similar and were combined in the conception of fundamental positive rights. Some of the founding fathers believed that enumerating the fundamental rights was not necessary since they were obvious, while others thought that even with certain rights enumerated there were other fundamental rights that could be recognized.[52]

As the Founding Fathers “blamed the problems in the states under the Articles of Confederation on an excess of democracy,”[53] the idea of individual rights seems like a manipulation of the founding documents toward egalitarian aims that were not envisioned in the Founding-Era. Further, James Madison questioned whether the common person could impartially hold the interests of society above their own. As Madison wrote, “the great danger lies rather in the abuse of the community than in the legislative body.”[54] Madison appeared to be fearful of what a political society would look like under rule by popularity or even rule by the majority.[55] When discussing the issues regarding elected representatives, Madison stated that most people sought public office out of ambition and self-interest, while few sought office for the sake of the public good.[56] Based on these observations, Madison believed that the biggest threat to the American Republic was factions that would seek what benefited their own faction rather than what is good for everyone within a society, because factions will then create coalitions with one another to benefit each other at the expense of other factions, not for the sake of the public good.[57] (This is what was later captured by the phrase “the tyranny of the majority.”) To list just a few, observe the caucuses we have today: the Congressional Asian Pacific American Caucus, Pacific Islands Caucus, Public Shipyard Caucus, Problem Solvers Caucus, New Democrat Coalition, Pro-Choice Caucus, and Congressional LGBTQ+ Equality Caucus. Many of these caucuses were not created solely for the common public good but merely for the good and interests of that group at the expense of others. Some will attempt to argue that “the assumption of a fixed-pie of resources” is the reason people believe that one group’s access to benefits comes at the expense of another. Yet there are finite resources and a budget in which only a set number of allocations is possible. The larger point should not be missed: once a political society is separated by the interests of certain factions, the common interests or goods diverge from one another and are no longer set toward the good of the whole political society, just certain constituents or clienteles.

The shift from the rights that are necessary for the common good to the rights that benefit individuals appears to have happened in the early 19th century, as industry and monopolies were granted special privileges and subsidies to establish a “more robust economy.” The libertarian shift from “a commitment to the public welfare to an interest in wealth accumulation,” in which wealth accumulation ceased to be a means of achieving the general welfare and became the end goal, did not occur until the 1820s.[58]

Currently, we have advocates for all sorts of new rights that were not around when the social contract was negotiated. There is a strong opinion that the social contract should be reaffirmed, and there is also the opinion that every time a person votes they are reaffirming the social contract. The problem is that the common good, or the public interest, is now inherently layered with self-interest; further, how is a consensus to be reached when there is no common ethos in the United States? The ethos used to ratify laws or assert positive rights appears to operate via policy, principle, and the societal good.

• The Social Contract in Practice: The Battle Between Policy, Principle and the Societal Good.

Many scholars see an issue with public policy and how it has run amok, as minority groups receive benefits that help their group at the expense of the common good. This statement can apply to schools of thought that fundamentally disagree with one another, since terms like “minority” and “common good” are buzzwords that can be plugged into many arguments about the way public policy is aimed. Such examples include wealthy persons, affirmative action, government housing, gender equality, desegregation, and oil subsidies. The same person who believes that a poor person who cannot afford housing, or is homeless, should receive housing from the government will also believe that it is immoral for individuals with millions of dollars in assets to be able to use tax breaks. But both the poor and the wealthy are minorities. Within the category of those who believe the above statement to be true, both Allan Bloom and Martha Nussbaum can be included, even though they fundamentally disagree with each other as to how higher-educational institutions, governmental policy, and philosophers should conduct themselves and influence the world.[59]

Appeals to public policy appear to be appeals to the vague way a policy sounds good, appeals to the emotional senses, or appeals to the vague notion of freedom, such as providing tax breaks to blind people.[60] By contrast, principles seem to justify a political decision that regulates or secures an individual right.[61] The argument for anti-discrimination statutes, for instance, comes from the principle of equality.[62] Public policy and principles are not mutually exclusive, nor are other facial justifications, such as generosity or an ethical society, excluded.[63]

Some scholars and philosophers, such as John Rawls, who has influenced American public policy via his Original Position,[64] believed that individuals’ rights should trump societal goals and should be the focus of public policy. In “A Theory of Justice,” Rawls claims that “the rights secured by justice” are not beholden to “the calculus of social interests.”[65]

At the other end of the spectrum we have people like Ronald Dworkin and Thomas Sowell, who see an issue with “individuals rights [being] political trump cards held by individuals[,]” when the interests of society should be the ultimate goal.[66] Meanwhile, others argue that a balance should be struck between societal goals and human rights, with a hierarchy displaying which should take priority and when.[67][68]

Both positions, societal interest and individuals’ rights, rely on public policy and principles to appeal to a sense of the common good. But there is no longer a consensus as to the ethos of American morality. Some believe in objective morality, which is why they hold that their position should be upheld for the good of society; others believe in relative morality, which is why they hold that individual rights should be upheld above anything else. This produces differing visions of the common good and of its source of authority.

    • The Warring Ethos: Objective Morality vs. Subjective Morality (aka nature versus egalitarianism)

We live in a fragmented political society, where there is no consensus as to the meaning of the societal good. There are scholars who believe that the United States is a democracy, when the main drafter of the Constitution and Bill of Rights, James Madison, was vehemently against democracy and wanted a republic consisting of a small number of representatives who sought the common societal good, not the championing of the rights of a group of people or the rights of individuals.

Objective morality requires one to believe that there are actions that are good and actions that are evil. However, the way one decides which action belongs to which category depends on the value system to which one subjects oneself. In the Declaration of Independence, the phrase “their Creator” is imbued with several connotations that legal scholars and philosophers argue over, but what is clear is that it was meant as a matching claim of right by which to level Great Britain’s assertion of superiority. Most of the American colonials were at the very minimum deists, and the chief reason many decided to make the treacherous and deadly ocean voyage over the Atlantic was to get away from the religious persecution dominated by Protestants and Catholics, as these two groups were having violent battles within nations and between nations. Despite what was going on in Europe, there was peace, and space, between these minority Christian denominations in the US. Thereby, “Creator” was a neutral term, referring to God without referring to a specific entity in the trinity, or lack thereof, that was likely to cause strife among the American settlers, as they had completely contrasting views of the hierarchy within the church.

Accordingly, it follows that notions of being free rested on the general view that there is a God or being that set the order of this world. While the First Amendment makes it emphatic that there was to be no established religion, the moral objectivity agreed upon by the United States founders, their elected representatives, and appointed judges was also under the vague tutelage of an objective reality of good and bad set by a creator. There was discussion, within natural rights and even fundamental positive rights, as to the breadth of this objective reality, but there was a consensus that the newly formed political society desired to achieve the common societal good as a higher priority than the rights of individuals, as long as the persons being governed consented. Believing that morality is objective thereby means that what is correct or good for people is fixed and established, and that any attempt to deviate from what is good will cause destruction; this view tends to be more closed-minded but also higher in consciousness.

Believing that morality is relative, by contrast, requires a religious zealousness toward concepts such as freedom and equality, as “Rousseau said a civil religion is necessary to society, and the legislator has to appear draped in the colors of religion.”[69] Downstream terms like culture are then meant to preserve a religion-like structure while combining reason and religion without drawing the distinction between their polarities.[70] The fundamental issue with relative morality, then, is that it asserts that there are values or fundamental principles such as freedom and equality, but it does not search for or explain the origin of these principles, and there is no effort at determining good and evil, merely the assertion of them. Further, on this view the nature of people is not stagnant and not fixed; it is something that society can move past. However, no evidence is required to back up this assumption; the only assumption necessary is believing that you are a moral and just being,[71] even if there is no standard by which this justness can be substantiated, since “secularism is the wonderful mechanism by which religion becomes nonreligion.”[72]

On a practical level, the issue with people who push for relative morality, and the individual rights that are subsequently asserted, is that these values are not authentic. Freedom and equality are vague principles that are not based on nature, nor are they negotiated among groups of people working together because they have shared “values by which a life can be lived.”[73] Such true values must encompass lessons, stories, and teachers who live out, by example, how day-to-day life should be conducted, such as Moses, Jesus, Homer, and Buddha, “as a value is only a value if it is life-preserving and life-enhancing” for the people who live by said value.[74] Examples include respect for life, treating others as you would want to be treated, giving charity, and attending community events.

Egalitarianism is founded on rationality, but egalitarian values are not based on the nature of those whom it would force to conform.[75] Thereby, this secular idea of egalitarianism imposes positive rights that individuals supposedly have, based on the devout belief that everything being equal is inherently good, but it provides no supporting evidence from the nature of the individuals who are claimed to need the rights. This view can then dismantle the idea of the common good, since the common good does not hold that freedom or equality trump the nature of individuals. Relative morality is fluid and leads to openness, but this openness is confined to mandating the belief in egalitarianism, which promotes indifference to how others conduct themselves. Whereas, for there to be a consensus as to what constitutes the common good, there must be a shared sense of fundamental beliefs that does not hold all people equal, whether in physical ability, mental capacity, or some other criterion.

Those who are in the moral relativism camp appear to be closer to the Lockean view that by nature people are moral. But they twist Locke’s words about a social contract and individual rights into egalitarianism at the expense of the common good. The moral relativist end fulfills Madison’s deepest fears and uses its pseudo-religious principles of freedom and equality to subvert the common good by fracturing the interests of minority groups into opposition to the majority. Further, they completely misinterpret the criticism that the founders had of a majority: the criticism was of a democracy that used popularity to subvert the common good, not of a majority merely using its political power for non-egalitarian ends. They also overlook that the principles of freedom and equality require a repression of nature by reconstructing what it is to be human.[76] For a moral objectivist, there is assumed to be a nature that exists, and we freely subject ourselves to a social contract that seeks to elevate us above the negative inclinations of that nature. A moral relativist, by contrast, believes that nature has been conquered and that now we merely need to reorder our social contract to reflect the proper forms of social conditions, which are fully within our control, by placing equality and freedom as the foundations of our political society. In this endeavor, these political principles have destroyed religious faith and traditional morality by advocating for individual rights that exist without any form of responsibility.

    • Delineation: Natural Rights, Fundamental Positive Rights, and Ordinary Positive Rights.

For the Founding Fathers, rights were:

    “Divided between natural rights, which were liberties that people could exercise without governmental intervention, and positive rights, which were legal privileges or immunities defined in terms of governmental action or inaction, like the rights of due process, habeas corpus, and confrontation.” (“Yale Law Journal – Natural Rights and the First Amendment”)[77]

The fundamental positive rights appear easy to delineate, as they are the ones in the Bill of Rights. As for natural rights, being that they are not legally or definitionally recognized, they are derived from life-preserving and life-enhancing values, about which there would reasonably be disagreement because of one’s tradition, belief system, religious faith, or denomination. Examples of natural rights consisted of rights that did not depend on the government, such as speaking, eating, or walking.[78] Ordinary positive rights were meant to be laws that were believed to be in the common good of the political society.

• My Takeaways on Rights

When I started my research into the topic of rights, I began with the assumption of referring to natural rights and fundamental rights together as negative rights, and I was referring to ordinary positive rights as positive rights. However, after conducting the research that went into this essay, I have realized that there is a more fundamental issue at play: the use of egalitarian principles to subvert the United States’ political cohesion into factions that are fighting over pieces of the pie for their own group’s interests, instead of the interest of the common good.

My prior criticism of assertions regarding a right to healthcare, or a right to food, was that the political society we live in is not mandated to force the federal[79] or state[80] government to take action when a citizen’s personal liberties, or fundamental positive rights, have been infringed upon by a fellow citizen. Further, only in special situations does current law require state actors to act:

Only in certain limited circumstances does the Constitution impose affirmative duties of care on the states.[81] As originally defined by the Supreme Court, those circumstances exist where (1) the state takes a person into custody, confining the person against his or her will, and (2) the state creates the danger or renders a person more vulnerable to an existing danger.[82] However, the “state-created danger” doctrine has since been superseded by the standard employed by the Supreme Court in Collins.[83] Now, “conduct by a government actor will rise to the level of a substantive due process violation only if the act can be characterized as arbitrary or conscience shocking in a constitutional sense.”[84][85]

However, my criticism now comes from the fact that these alleged rights are purely egalitarian principles asserting an alleged right while masquerading as a public common good. This façade wears the mask of rationality and emotional manipulation to benefit certain groups at the cost of a cohesive nation that wants a unified common good. The nature of the egalitarian gamesmanship is to find minorities and persons of different identities and pit them against the majority, which is claimed to hold unequal power, rank, or influence.

These would-be rights are merely tools of a secular religious dogma that is not based on how individuals live their lives but claims to be the champion of individuals’ rights at the expense of community and the common good that individuals would normally coalesce around and work together to achieve. Instead, the concepts of freedom and equality replace the common good with the personal interest of retaining the ability to rebel against the common good for the sake of one’s own alleged good.

There are arguments that a right such as a right to healthcare is a natural right, but it is obvious that when someone says this, they are not using natural right in the same way it was articulated in the Founding-Era. Once there was a political society, or a government that agreed to come together, the common good was the main goal of that political society, not the individual.

To the egalitarian, history appears to be a source for explaining why they deserve more individual rights or why they deserve reparations. There appears to be no thought of the common good as separable from what the person wants in the moment. “To live for the moment is the prevailing passion—to live for yourself, not for your predecessors or posterity. We are fast losing the sense of historical continuity, the sense of belonging to a succession of generations originating in the past and stretching into the future.”[86]


    [1] Natural v. Legal; Claim v. Liberty; Individual v. Group; Civil, political, economic, social, religious, cultural… etc.

    [2] Hobbes, Thomas. Leviathan. Edited by Edwin Curley. Indianapolis: Hackett Publishing Company, 1994. Chapter 5.

[3] Such as institutional racism as a philosophy that hinges on whiteness, and on white people with power, as the new definition of racism: because white or straight people are the majority, by egalitarian standards other groups or minorities need to be made equal to the majority, even if it makes no sense or is judged by a moving standard.

    [4] Merriam-Webster. (n.d.). Rectitude. In Merriam-Webster.com dictionary. Retrieved from https://www.merriam-webster.com/dictionary/rectitude

    [5] Harper, D. (n.d.). Right. In Online Etymology Dictionary. Retrieved from https://www.etymonline.com/word/right

    [6] Oxford English Dictionary. (n.d.). Right, adj. In Oxford English Dictionary Online. Retrieved from https://www.oed.com/dictionary/right_adj?tab=etymology#25350159

    [7] 1 Rodney A. Smolla, Smolla and Nimmer on Freedom of Speech § 1:11 (2016).

    [8] National Constitution Center. (n.d.). The Declaration, the Constitution, and the Bill of Rights. Retrieved from https://constitutioncenter.org/the-constitution/white-papers/the-declaration-the-constitution-and-the-bill-of-rights

    [9] Campbell, J. (2017). Republicanism and Natural Rights at the Founding. Constitutional Commentary, pg. 87. University of Minnesota Law School Scholarship Repository.

    [10] Wilson, J. F. (2011). The Founding Era (1774–1797) and the Constitutional Provision for Religion. In Oxford Handbooks Online. https://doi.org/10.1093/oxfordhb/9780195326246.003.0001

    [11] Campbell, J. (2017). Republicanism and Natural Rights at the Founding. Constitutional Commentary, pg. 87. University of Minnesota Law School Scholarship Repository.

    [12] Id 88.

    [13] Id 90, 112.

    [14] Id 91.

    [15] Hobbes, Thomas. Leviathan. Edited by Edwin Curley. Indianapolis: Hackett Publishing Company, 1994. Chapter 6, 8, and 17. (Hobbes discusses the concept of the social contract and the formation of a commonwealth in Chapter XIII, titled “Of the Natural Condition of Mankind as Concerning their Felicity, and Misery,” and Chapter XVII, titled “Of the Causes, Generation, and Definition of a Commonwealth.” In Chapter VI, “Of the Interiour Beginnings of Voluntary Motions, Commonly Called the Passions; and the Speeches by which They are Expressed,” Hobbes discusses reason as a faculty guiding human actions and passions)

    [16] Id Chapter 13.

    [17] Id Chapters 17 and 18.

    [18] Id chapter 17; Internet Encyclopedia of Philosophy. (n.d.). Social Contract Theory. Retrieved from https://iep.utm.edu/soc-cont/#SH2a

    [19]Locke, J. (1690). Second Treatise of Government (Chapter 2, Sections 6-8). Retrieved from https://english.hku.hk/staff/kjohnson/PDF/LockeJohnSECONDTREATISE1690.pdf

    [20] Id Chapter 9 section 124

    [21] ID Section 87

    [22] Id Section 22, Chapter 8 section 95, chapter 4, and Chapter 13 section 149.

    [23] Library of Congress. (n.d.). Thomas Jefferson: Declaration of Independence: Right to Institute New Government. Retrieved from https://www.loc.gov/exhibits/jefferson/jeffdec.html

    [24] Southern Methodist University. (n.d.). Influences on Madison’s Memorial and Remonstrance. Retrieved from https://people.smu.edu/religionandfoundingusa/james-madisons-memorial-and-remonstrance/influences-on-madisons-memorial-and-remonstrance/

    [25] Pursuit of Happiness Foundation. (n.d.). John Locke. Retrieved from https://www.pursuit-of-happiness.org/history-of-happiness/john-locke/

    [26]Library of Congress. (n.d.). Jefferson’s Draft of the Declaration of Independence. Retrieved from https://www.loc.gov/exhibits/jefferson/18.html

    [27] Founders Online. (n.d.). James Madison to Thomas Jefferson, 24 October 1787. Retrieved from https://founders.archives.gov/documents/Madison/01-08-02-0163

    [28] Southern Methodist University. (n.d.). Influences on Madison’s Memorial and Remonstrance. Retrieved from https://people.smu.edu/religionandfoundingusa/james-madisons-memorial-and-remonstrance/influences-on-madisons-memorial-and-remonstrance/

    [29] Isaac Kramnick, Republican Revisionism Revisited, 87 AM. HIST. REV. 629, 629–35 (1982) (surveying how a generation of “republican” scholarship downplayed the influence of Lockean ideas at the Founding).

    [30] Stanford Encyclopedia of Philosophy. (n.d.). Locke’s Influence. In Stanford Encyclopedia of Philosophy. Retrieved from https://plato.stanford.edu/entries/locke/influence.html

[31] Jeffrey Sikkenga, Lest We Forget: Clarence Thomas and the Meaning of the Constitution, 6 ON PRINCIPLE (Dec. 1998), available at http://www.ashbrook.org/publicat/onprin/v6n6/sikkenga.html (last visited Oct. 8, 2009). (Fortunately, though, the natural law approach has held a high place in American jurisprudence. Thomas Jefferson and James Madison agreed, for example, that the best guide to the Constitution is the Declaration of Independence and its philosophy of natural rights. This view was common at the Founding; so common, in fact, that early Supreme Court decisions, like Calder v. Bull (1798), claimed that even laws “not expressly restrained by the Constitution” should be struck down if they violate natural rights. Nor was this view limited to the Founding era. Before and during the Civil War, for example, Abraham Lincoln repeatedly appealed to the legal authority of the Declaration in his fight against slavery). See also discussion of William R. Long, Calder v. Bull (1798): The Issue of Natural Law (2005), available at http://www.drbilllong.com/LegalEssays/CalderlI.html (last visited Mar. 29, 2008).

[32] H. Wayne House, Influence of the Natural Law Theology of the Declaration of Independence on the Establishment of Personhood in the United States Constitution.

    [33] O’Scannlain, D. F. (2011). The Natural Law in the American Tradition (pp. 1516).

[34] Balkin, J. M., & Levinson, S. (n.d.). To Alter or Abolish. USC Law Review, 416.

    [35] Philip A. Hamburger, Natural Rights, Natural Law, and American Constitutions, 102 Yale L.J. 907, 908-911 (1993)

[36] See previous 4 footnotes; Anita L. Allen, Social Contract Theory in American Case Law.

    [37] Conley, Patrick T. (1992). The Bill of Rights and the States: The Colonial and Revolutionary Origins of American Liberties. Rowman & Littlefield. p. 109. ISBN 9780945612292.

    [38] “Zephaniah Swift’s First Legal Texts in America”. Connecticut Judicial Branch Law Libraries. Retrieved December 30, 2012.

    [39] Campbell, J. (2017). Republicanism and Natural Rights at the Founding. Constitutional Commentary, pg. 91. University of Minnesota Law School Scholarship Repository.

    [40]  Id 92

    [41] Id

    [42]Campbell page 88 note 16 James Wilson, Of Municipal Law, in 1 COLLECTED WORKS OF JAMES WILSON 549, 553–54 (Kermit L. Hall & Mark David Hall eds., 2007); see also JOHN LOCKE, TWO TREATISES OF GOVERNMENT, bk. 2, chap. 9, § 130 (5th ed.; London, A. Bettesworth 1728) (individuals surrender “as much . . . natural Liberty . . . as the Good, Prosperity, and Safety of the Society shall require”).

[43] Campbell page 88 note 17 JOHN ADAMS, A DEFENCE OF THE CONSTITUTIONS OF GOVERNMENT OF THE UNITED STATES OF AMERICA 6 (Philadelphia, Hall & Sellers 1787); Alexander Hamilton, The Farmer Refuted (Feb. 23, 1775), in 1 THE PAPERS OF ALEXANDER HAMILTON 81, 88 (Harold C. Syrett ed., 1961).

    [44] A founding father and former SCOTUS justice

    [45] See, e.g., James Wilson, Considerations on the Nature, and Extent of the Legislative Authority of the British Parliament (1774), in 1 COLLECTED WORKS OF JAMES WILSON, supra note 16, at 3, 5 n.c. (“The right of sovereignty is that of commanding finally—but in order to procure real felicity; for if this end is not obtained, sovereignty ceases to be a legitimate authority.”).

    [46] Article I, Section 8, Clause 1: “The Congress shall have Power To lay and collect Taxes, Duties, Imposts and Excises, to pay the Debts and provide for the common Defence and general Welfare of the United States; but all Duties, Imposts and Excises shall be uniform throughout the United States;”

    [47] Id Campbell ANNALS OF CONG. 1169 (Feb. 10, 1790) (remarks of Rep. Theodore Sedgwick).

    [48] THE FEDERALIST NO. 84, supra note 45, at 578 (Alexander Hamilton). 54. See, e.g., Remarker, INDEP. CHRONICLE, Dec. 27, 1787, in 5 THE DOCUMENTARY HISTORY OF THE RATIFICATION OF THE CONSTITUTION 527, 529 (John P. Kaminski & Gaspare J. Saladino eds., 1998); Virginia Ratification Convention Debates (June 16, 1788) (remarks of George Nicholas), in 10 DOCUMENTARY HISTORY, supra note 32, at 1334.

[49] Id.

    [50]  Campbell, J. (2017). Republicanism and Natural Rights at the Founding. Constitutional Commentary, pg. 96. University of Minnesota Law School Scholarship Repository.

    [51] Id 97

[52] Very great historical analysis, but too esoteric for this essay.

    [53] Campbell, J. (2017). Republicanism and Natural Rights at the Founding. Constitutional Commentary, pg. 103. University of Minnesota Law School Scholarship Repository.

    [54] Id 104

    [55] Madison, J. (1787, April). Vices of the Political System of the United States, section 11. https://founders.archives.gov/documents/Madison/01-09-02-0187#:~:text=Among%20the%20vices%20of%20the,over%20commerce%3B%20and%20in%20general

    [56] Id

    [57] Id

    [58] James L. Hutson, Virtue Besieged: Virtue, Equality, and the General Welfare in the Tariff Debates of the 1820s, 14 J. EARLY REP. 523, 525 (1994).

    [59] Nussbaum, Martha C. (2012). Philosophical Interventions: Reviews 1986–2011. New York: Oxford University Press. doi:10.1093/acprof:osobl/9780199777853.001.0001. ISBN 978-0-19-977785-3. Pg 45 (Nussbaum questions if Allan Bloom is really a philosopher).

    [60] Dworkin, R. (n.d.). pg. 82 Taking Rights Seriously. Retrieved from https://www.law.nyu.edu/sites/default/files/Ronald%20Dworkin%20-%20Hard%20Cases.pdf

    [61] Id.

    [62] Id.

    [63] Id 83.

    [64] The Heritage Foundation. (n.d.). Justice as Fairness (p. 43). Retrieved from https://www.heritage.org/progressivism/report/the-hidden-influence-john-rawls-the-american-mind ((1) Everyone is entitled to the same basic liberties, and (2) Inequalities in social or economic outcomes are only justifiable if: (a) All citizens have a fair shot at attaining the offices and positions from which these inequalities result, and (b)They benefit the most disadvantaged members of society)

    [65] Rawls, J. (1971). A Theory of Justice (p. 4).

    [66] Dworkin, R. (1977). Taking Rights Seriously (p. xi); Sowell, T. (1996). The Vision of the Anointed (pp. 209-211).

    [67] Queiroz, R. (2018). Individual liberty and the importance of the concept of the people. Palgrave Communications, 4, 99. https://doi.org/10.1057/s41599-018-0151-3

    [68] Quintavalla, A., & Heine, K. (2019). Priorities and human rights. The International Journal of Human Rights, 23(4), 679–697. https://doi.org/10.1080/13642987.2018.1562917

    [69] Bloom, A. (1988). The Closing of the American Mind (p. 196). New York: Simon and Schuster.

    [70] Id 197

    [71] Id 199

    [72] Id 211

    [73] Id 201

[74] Id.

[75] Id.

    [76] Id 97

[77] https://www.yalelawjournal.org/article/natural-rights-and-the-first-amendment citing See Congressional Debates (June 8, 1789) (statement of Rep. James Madison), in 11 Documentary History of the First Federal Congress of the United States of America 811, 822 (Charlene Bangs Bickford et al. eds., 1992) [hereinafter Documentary History of the First Federal Congress]; Letter from Thomas Jefferson to Noah Webster, Jr. (Dec. 4, 1790), in 18 The Papers of Thomas Jefferson 131, 132 (Julian P. Boyd ed., 1971); An Old Whig IV, Indep. Gazetteer (Philadelphia), Oct. 27, 1787, reprinted in 13 The Documentary History of the Ratification of the Constitution 497, 501 (John P. Kaminski & Gaspare J. Saladino eds., 1981) [hereinafter Documentary History of the Ratification]. The Founders sometimes referred to positive rights as adventitious rights or social rights. See The Impartial Examiner 1, Va. Indep. Chron. (Richmond), Feb. 20, 1788, reprinted in 8 Documentary History of the Ratification, supra, at 387, 390 (1988); [George Logan], Letters Addressed to the Yeomanry of the United States . . . 39 (Philadelphia, Eleazer Oswald 1791); Philanthropos, Newport Herald, June 17, 1790, reprinted in 26 Documentary History of the Ratification, supra, at 1051, 1051 (John P. Kaminski et al. eds., 2013).

[78] Id.

[79] Lujan v. Defenders of Wildlife, 504 U.S. 555 (1992)

    [80] 42 USCS § 1983

    [81] Doe v. Braddy, 673 F.3d 1313, 1318 (11th Cir. 2012). 

    [82] DeShaney v. Winnebago Cnty. Dep’t of Soc. Servs., 489 U.S. 189, 198-201, 109 S. Ct. 998, 103 L. Ed. 2d 249 (1989). 

    [83] Waddell v. Hemerson, 329 F.3d 1300, 1305 (11th Cir. 2003). 

    [84] Id. (citing Collins, 503 U.S. at 128).

    [85] McKenzie v. Talladega City Bd. of Educ., 242 F. Supp. 3d 1244, 1255-56.

    [86] Lasch, C. (1991). The Culture of Narcissism: American Life in an Age of Diminishing Expectations (p. 5).

  • A little more on Symperifora and Semainess

    January 14th, 2024

(1) Semainess is derived from the word “meaning,” which has its roots in the Greek term semasía, signifying a message, and semainein, meaning “to signify.” The core of the word “meaning” lies in signification. The Greek use of the word virtue encapsulates this idea, illustrating how performing actions in an excellent manner is dignified. Consequently, there are goals where the mere attempt to achieve them brings contentment amid other challenges. This doesn’t necessitate belief; rather, the mere practice or habit itself adds a level of holistic connection. Semainess involves making sense of one’s actions, recognizing that engaging in certain activities contributes to a sense of wholeness. Even if those actions may appear trivial on the surface, striving for something virtuous brings numerous benefits that affect how one perceives the 7th dimension of group experiences, informs their conscience and consciousness, and influences their 5th dimension representations.

    (2) Symperifora: ΣΥΜΠΕΡΙΦΟΡΑ Root Word: “ΣΥΜ-” (sym-): Indicates association or togetherness. Affix: “-ΠΕΡΙ-” (-peri-): Denotes around or about. Root Word: “-ΦΟΡΑ” (-fora): Refers to carrying or bearing.

    a. Symperifora captures the essence of engaging with others in a meaningful and purposeful way, considering the various dimensions of self and the group experience. It emphasizes the significance of both individual conduct and collective participation in achieving a holistic understanding of one’s place in the group and the broader context.

    b. These elements converge to convey “conduct” or “behavior,” “participation” or “involvement” all into one word. Certain ways of orienting oneself provide a level of resilience by integrating with the dimensions below. Symperifora requires a belief in something larger, signifying that meaning is part of a broader scheme—one that may remain unknown or beyond complete understanding. Individuals experiencing the 7th dimension may discern distinctions in the meaning associated with the experiences brought forth by 7th-dimensional activity and understand how the overall purpose of this activity extends into other dimensions of the self.

  • Identity: Why Choosing What you Believe MATTERS

    November 13th, 2023

    No individual or living being exists in isolation. Entities are products of preceding influences. Individual persons emerge as products of various interconnected aspects of existence, including the mental, physical, spiritual, genetic, and more. These factors interact with core properties that are relatively unchangeable, such as genetic material, natural proclivities, and inherited trauma—collectively termed “Natural Identity.” On the other hand, properties subject to change, like weight, contentment, attitudes, beliefs, skills, and group affiliations, constitute what we can call “Nurtured Identity.” The former involves physical and mental traits experienced as inherent, while the latter involves those traits that individuals actively shape and adopt.

    To evolve into something, you begin as one entity and undergo a transformation into another. When someone declares, “I am now a doctor,” they transition from a non-doctor to a doctor. This transformation involves progressing through various stages, such as high school, undergrad, medical school, residency, and beyond, excelling at each stage until reaching the point of becoming what is considered a doctor.

The same principle applies to other forms of identity, but with a distinction: identity, as a sense of belonging that imparts a purposeful meaning to one’s life, is often a subject, journey, or destination guided by one’s parents, immediate surroundings and/or community. In contrast to this, being a doctor is an occupation that influences both Natural and Nurtured identity. Individuals progress from participating in specific events during childhood to gradually taking on more responsibilities and engaging with customs. This evolution continues until adulthood (puberty), where the responsibilities and challenges of the world become a force that individuals must contend with independently. While individuals have their family and community members as guides and helpers, those guides play a crucial role in assisting with the ongoing battle between Natural and Nurtured identities.

    The challenge with identity arises when individuals choose to represent themselves as a complete entity in aspects that exclusively pertain to either Natural Identity or Nurtured Identity. This involves the way people identify, connecting with concepts or expressions within familiar categories such as race, gender, illness, occupations and more.

    WHY IS THIS AN ISSUE?

    Throughout history, the way people identified themselves differed from the nuanced manner we do today. Individuals are dynamic, and identification is a contemporary expression of connecting intellectually or emotionally to something. It encompasses self-concept, perceiving oneself in relation to an idea, and possibly feeling represented by specific markers. The key concern lies not in feeling connected to a group similar to oneself, but in assigning undue weight or status to that group affiliation in how one conducts their life and strives for self-improvement.

    We all share a physiological need for belonging and a psychological desire to be part of a group. Our chemical makeup rewards connections with those who are similar or familiar to us. In the modern world, we articulate similarities and communicate the downsides, neutralities, and upsides of belonging to specific groups in new ways. We express different aspects of ourselves through various intersections of self-categorization, seen in various lights influenced by biases, stereotypes, and historical contexts related to different groups. However, within the complex landscape of identity, there’s an objective truth that has been obscured— not all ways of identifying are equal, and some can be detrimental to individuals, groups, and communities that strongly identify based on distinguishing factors that create the named groups.

    For example, when individuals prioritize their race as the highest form of identity they connect with, it inherently implies one of two perspectives: (1) considering their race as superior, or (2) regarding their race as inferior. Otherwise, the identification with a particular group wouldn’t hold any significant benefit. Things that are entirely equal don’t warrant distinctions. Therefore, as individuals, it is natural to perceive one’s own ethnic or racial group as superior, unless one is in a state of depression influenced by feelings of inadequacy.

    What I want to clarify is that perceiving oneself as superior doesn’t necessarily involve harming other groups to advance, at least not if one genuinely holds that belief. Consider basketball; if someone is the best 3-point shooter on the team, it doesn’t imply they look down on teammates with lesser shooting skills. More likely, their teammates excel in other areas on the court, like rebounds, assisting, defense, or alternative scoring methods. The underlying principle here is teamwork, as without it, the dynamic could devolve into a divisive “us versus them” mentality.

    This is why the concept of SELF-IDENTIFYING and holding one’s race in the highest regard is ridiculous and only serves to further divide society if it does not address the elephant in the room. IF IT IS OKAY TO BE PROUD OF YOUR RACE, ANY RACE SHOULD BE ABLE TO DO SO, AND IF IT IS BAD FOR ONE RACE, THE WHITE RACE (which is categorically undefined and constantly expanding), THEN IT IS BAD FOR ANY RACE TO BE SOLELY PROUD OF ITS RACE.

    The PYRAMID of IDENTITY, not a flat plane

    Similar to many aspects of life, there exists a natural hierarchy regarding which form of identity is more constructive for individuals and for the communities contributing to the broader locality, ultimately shaping society. This hierarchy can be visualized as a pyramid, recognizing that nothing in this world is inherently equal. Falling for the noble lie that everyone is equal often stems from the well-intentioned notion that “the government should treat everyone with the same level of respect, rights, and dignity.”

    However, even if it is true—and society as a whole should indeed strive for this—it doesn’t mean that everyone is exactly the same or equally useful in various situations. This is not intended to diminish individuals. Every human being possesses innate value and a unique soul, with a distinct set of capabilities, tasks, and trials that only they can overcome. Each person contributes skills or abilities that make them invaluable to their community. Yet, the notion that everyone’s value is equal, while well-intentioned, is taking the idea to an impractical extreme. It only serves to create an illusion, allowing incompetent individuals to act as if they are on par with someone else, merely based on the chance that they belong to a different demographic. This leap in logic leads to the erroneous conclusion that every demographic is equal across the board.

    The idea of universal equality leads to a belief system that is developmentally challenged. This system necessitates ignoring inherent differences in favor of forced equity. Equity, in this context, is a belief system willing to disregard facts to make outcomes appear more similar than they are. Additionally, it inherently assumes that any disparities in outcomes between distinct identity groups (easily distinguishable on the surface) are solely due to oppression. This is why the current landscape of identity politics is so absurd—it compels people to act as if every form of identity deserves the same outcome a priori, without considering the nuances involved.

    If you become a parent outside of marriage, the likelihood increases that both you and your children will experience poverty. Similarly, if you lack the skills for career advancement, your income is likely to plateau rather quickly. Identifying solely as a single mother may not work to your advantage. Even if you’re a skilled union worker—say, a TIG welder—once you’ve perfected your technique, your income potential might not increase significantly unless you venture into entrepreneurship. I’ve chosen welding as an example due to its demanding nature, the need for hands-on experience and education, and its relatively high pay. However, like many occupations, it also faces an income ceiling due to the limitations inherent in being a component within an organization.

    THE TOP IDENTITY DAWG

    If identity serves as a multi-faceted instrument guiding how one interacts with the world through Natural and Nurtured identity components, what is the most beneficial identity? A religious identity.

    The Nurtured Identity with the most profound capacity to navigate, support, and assist an individual through life’s myriad challenges is belief. What one believes holds the power to shape one’s entire perception. The very same words can evoke anger or laughter, contingent on the individual’s beliefs about themselves, their relationship to the spoken words, and the beliefs they hold about the role the speaker plays.

    There is no shortage of “isms” that can be employed as theories or themes to interpret one’s perceptions of events. However, most of these “isms” shape beliefs about the physical world and how it should be according to an ideal advocated by the respective “ism.” Beliefs based on such doctrines do not offer a normative framework for viewing the world; instead, they serve as descriptors into which one fits one’s experiences. Furthermore, these beliefs lack canonical traditions or cultures passed down from generation to generation. They function more as philosophies with which individuals can interact, or they aspire for the state or government to use them as a societal model. These “isms” and philosophies rise, gain popularity, and then fall out of style over the centuries following their conception. Each iteration believes that if the philosophy is taught correctly or adhered to appropriately this time, the world will witness its superior perspective.


    “But what sets religious identity apart from the ‘isms’ and philosophies?”


    Religious identity demands something distinct from all other forms of belief. It necessitates belief in an entity, being, collective consciousness, or aura that transcends one’s individual existence. In contrast, other ‘isms’ merely require individuals to adopt the perspectives they provide regarding the material world unfolding before them. These often lack elements of practice, and when present, they are usually limited in scope. For instance, as someone who appreciates Stoicism, I’ve found it helpful in emotional regulation, but it doesn’t offer guidance on what is inherently good or bad. It emphasizes that such evaluations depend on how one perceives harm, asserting that only through perception can something be deemed hurtful.

    In contrast, religious identities demand that individuals first scrutinize their own behavior and adhere to the moral codes governing normative conduct in their personal lives. Moreover, every religious text provides insights into emotions—what they signify, their causes, how to manage them, and the reasons behind the world’s workings. Embracing a religious identity involves a profound struggle, requiring individuals to contend with their natural inclinations and subdue the instinctive pursuit of pleasure devoid of higher purpose. It also entails a heightened awareness of the lower-level nurtured identities one may embody or associate with, whether correctly or incorrectly. A Religious Identity imparts purpose and meaning to one’s life in the face of existential uncertainties. Anyone identifying as religious or belonging to a religion understands the ongoing internal conflict between desires and the morally correct course of action.

    This concept seems challenging for many with liberal inclinations. It appears that they naturally possess a lower level of negative predispositions towards what modern society generally deems as bad or evil. As a result, the internal struggle that often propels individuals toward a religious identity is mitigated by the contemporary legal landscape. Moreover, liberalism tends to amplify emotions, transforming them into a currency that can be exchanged for any other grievances, provided they align with the prevailing dogma.

    BUT WHY SHOULD YOU CARE?

    It matters because all other forms of identity not only tend to become tribal but are inherently tribal, particularly natural identities. They function as clubs that one is either born into or must undertake specific acts to join. However, these identities fail to provide meaning regarding why each individual is important in the world, why personal struggles are surmountable, or what the overarching purpose is. While many borrow theistic principles, they often lack the codification of those principles within the identity itself. In essence, these identities are constructed and activated solely for political purposes to address the current societal “dilemma.”


    This is why identity politics is perilous. The aim of these loosely defined forms of identity is often to gather as many members and allies as possible for a cause, whatever that cause may be. Unlike traditional identities that foster communities, uphold traditions, maintain culture, and strive for a better future for the next generation, modern political identities tend to select one of these objectives and utilize emotional narratives to enlist people as proxies for the agenda of the purported leaders of the identity group. They assert that you are fighting for the just cause and, if not you, then it will all be over, as if the entire world rests solely within your control, burdening you with the overwhelming weight of the world on your shoulders.


    This is not to dismiss the noble pursuit of positive change. However, many of these identities lack an a priori perspective on what is truly noble. They often stem from reactions or power grabs founded on abstract notions of equity, which may seem superficially plausible when one unquestioningly accepts the noble lie of everyone being exactly equal. In contrast, religious identity mandates that individuals align their own values before attempting to influence the broader context.

    Moreover, religious identities necessitate genuine community. Definitions of community rooted in secular (non-religious) principles or identities serve practical purposes by delineating socially recognizable groups of individuals. However, they often lack the concrete and unchanging moral guidance required to steer society towards a virtuous course. In contrast, religious communities stand out by providing enduring, timeless principles that have withstood the test of time and continue to be steadfastly embraced by their adherents (community members). This holds significance because communities shape children into adults, instilling them with moral convictions and an identity that the rest of the world must contend with.

    The concept of secular identities can be deceptive, as it is inherent in human nature to engage in acts of reverence or worship, whether directed at tangible objects, individuals, or abstract concepts. Consequently, modern individuals often exhibit a secular orientation that lacks a well-defined, enduring framework for the transmission of their beliefs across generations. As a result, these beliefs serve not only as tools to shape all of one’s experiences but also as mechanisms for shaping the future. While we all hold beliefs about various things, it is the beliefs we identify with that propel us to action or inaction, and the identity that has endured longer than any other form of association is that of religious belief.


    Hence, if you choose to retain your lower form of identity, be prepared for the likelihood that what you presently perceive as just will probably be deemed as evil and archaic in the next two generations. However, if you believe yourself to be more morally upright than your predecessors solely due to being born in a more modern era, there may be little I can do to convince you otherwise.

    On a final note, the criticism of wars, killings, and oppression under religious kingdoms and doctrines can be disingenuous. Hundreds of millions of people were murdered in the name of communism, secularism, and the right to be a sovereign nation in the 20th century alone. That nations use ideologies to conquer, kill, and manifest their destiny seems more like an inevitability than something tied to any one specific identity. War will persist as long as misunderstandings exist between people with different identities, particularly when the conception of identity is misaligned with secular identifiers. Moreover, even more conflict and loss of life may ensue if the highest form of identity is not recognized and reintegrated into the psyche of individuals through religious communities.

  • Clientelism: the Tracing of our Vassal State

    November 3rd, 2023


    Currently, there is not a single demographic in the United States of America that does not receive some form of benefit when it exercises its right to vote. Since the presidency of Lyndon B. Johnson (LBJ), the federal government has strategically harnessed this dynamic. If you feel marginalized or overshadowed by the interests of other groups, and you believe you’re not receiving fair representation from the federal government, it indicates that the interest group you’re associated with (which could encompass a wide range, such as teachers, unions, law enforcement, corporations, pharmaceutical companies, the military-industrial complex, oil companies, farmers, women’s rights advocates, proponents of child tax benefits, religious groups, and more) is not effectively utilizing its financial resources and voting influence compared to competing groups in the same domain.

    Essentially, you’re criticizing the fairness of a game without a full understanding of its rules. Can you genuinely argue for fairness when you lack knowledge about all the factors and rules that influence the game? The answer is no, or at least not from a standpoint of good faith.

    LBJ introduced the practice of exchanging votes for subsidies, setting a new benchmark in American politics. With the implementation of the Great Society programs, the driving force behind people’s choices in local politicians shifted away from communities of individuals. Instead, the criteria for selecting one’s representative became centered on whether voters were willing to support a single or double issue that could be connected to their interests, affecting their well-being, wealth, and finances. This shift in focus devalued the communal consensus based on shared values. This transformation gradually infiltrated small communities, where hometown individuals ventured to prestigious schools and returned with a vision of attracting clientele who could support their political aspirations.

    After decades of weakened communities, the composition of congressional and senatorial representatives now consists largely of individuals who align their votes with policies benefiting a substantial portion of their active and vocal constituents. Even though taxes are collected from those who don’t vote, politicians often convey to their supporters the idea that the eventual “ballot vote” will serve the interests of the coalition capable of promising votes and funds. This represents a significant departure from the original purpose of the American government and the principles on which the nation was founded. The initial intent was for elected representatives to safeguard the rights of their constituents against encroachments by the government and foreign entities. However, the federal government was never intended to serve as a tool that offers various coalitions specific advantages in the form of government subsidies and loans in exchange for money, votes, and status. Consequently, election winners are those who pledge to provide the desired benefits to the right groups.

    When a government realizes it can sway its constituents through incentives, it loses its restraint and will exploit its citizenry. The constitution of a country is designed to shield its citizenry from actions, rules, statutes, or mandates put forth by their “Governors” for personal gain. Local laws are intended to govern what an individual citizen is allowed or prohibited from doing within a specific locality. However, when a government utilizes its legislative authority to dispense benefits to those willing to vote for them, a significant issue arises.

    This is not to suggest that governments should never provide subsidies to various enterprises at different times. However, historical subsidies were primarily intended to support infrastructure development in an area. The deviation from this original purpose eventually gave rise to the modern concept of subsidies. To comprehend how subsidies evolved into an accepted part of everyday American life, one must examine the initial private monopolies (referred to as “Old School Subsidies”) in the United States.

    The First Form of a Subsidy → Monopoly.

    “Old school subsidies” were those that conferred exclusive rights upon certain companies or individuals to exploit resources that might not have been developed due to a lack of funding necessary to initiate the enterprise without the aid of establishing a monopoly. This is particularly applicable in the context of projects such as bridge construction, railroad development, oil mining, exclusive maritime contracts, and, in the modern era, utilities like electricity.

    It seems that two primary rationales, and the conclusion drawn from them, underpin the logic for allowing these monopolies to maintain control:

    1. These structures or enterprises would not be developed without monopolization.
    2. Adequate funding for such ventures would be lacking without monopolization.
    3. Hence, the government needs to grant exclusive monopolistic rights to certain investors for projects.

    These ideas can be dissected as follows:

    (1)  

    a. Investors expect a return on their investment.

    b. Initiating these projects requires a substantial investment.

    c. Without protection, other players can easily enter the market, diminishing the returns for initial investors.

    d. Secondary development becomes more feasible after someone has pioneered the way.

    e. Reduced returns deter future investments in such projects.

    (2)

    a. The enterprise or infrastructure is of critical importance or offers substantial benefits to the area or country.

    b. Engagement in these projects would yield considerable benefits to the region.

    c. If returns for investors in the infrastructure are insufficient, the crucial enterprise may not be constructed.

    d. The absence of such development incurs an opportunity cost.

    e. The exclusive license becomes the sole means for these projects to come to fruition in the near future.

    (3) Consequently, the government needs to grant an enduring exclusive monopoly right to investors in the respective project.

    Notably, there exists a logical gap between step 2 and step 3, as well as a somewhat abrupt transition between 2(d) and 2(e). The link between 2(d) and 2(e) serves as the initial crucial step in the emergence of lobbying. Individuals interested in building bridges or securing exclusive ferry rights could navigate from 1(a) to 2(d), initiating the rudimentary form of clientelism. This clientelism then evolved and was set free into the world, a severing that can be likened to omphalotomy (the cutting of the umbilical cord). Figures such as the Rockefellers and Vanderbilts excelled in the art of lobbying, crafting a form of business royalty. They knew how to effectively influence legislation and manipulate administrative non-enforcement by skillfully engaging with the powerbrokers and purse strings.


    State-sponsored monopolies granted to private entities aren’t inherently negative. Many governments and kingdoms throughout history have employed such arrangements. The corporations of antiquity often served as extensions of feudal kingdoms, pursuing the economic objectives of the realm or religion as they expanded into new territories. Even the monopolies established in the United States have reasonable justifications. They emerged as world markets opened up, and nurturing and, at times, safeguarding industries became essential for global competition.

    However, when entities like Standard Oil were granted a 40-year exclusive right to oil, the Vanderbilts gained control over all transportation, and JP Morgan’s influence over the railroads went unchecked, issues arose. These monopolies persisted until additional anti-trust laws, such as the Sherman Act, were enacted to address the symptoms of the federal government’s willingness to either redistribute wealth from some individuals to others or grant exclusive rights to private companies and entities.

    The Shift to BAILOUTS

    The issue then transitioned into bailouts, where private companies could turn to the government to rescue them from imprudent decisions. The first recorded instance of a bailout in the United States occurred during the Panic of 1792, when the federal government intervened to stabilize the markets. At that critical juncture, Treasury Secretary Alexander Hamilton authorized purchases to prevent the securities market from collapsing. There are limited examples of government bailouts in the 19th century, with any potential instances likely related to the aftermath of the Civil War. However, bailouts primarily became a prominent issue once banking institutions grew “too big to fail.”

    In the early 1900s, much as in the early 2000s, the financial landscape witnessed a series of bank runs and panics driven by extensive speculative investing—a situation that may sound all too familiar. In response to these challenges, the federal government entered the realm of financial control by establishing the Federal Reserve. (As a side note, I once aspired to be the Chair of the Federal Reserve, diligently reading every economics book I could find during my younger years.)

    However, the formation of the Federal Reserve did not prompt banks to adopt wiser or more secure banking practices. Subsequently, both the United States and the global economy took a perilous plunge, as people made investments and borrowed against assets with little to no tangible value, or assets that existed merely in theory. It’s almost reminiscent of the financial crisis of 2008, albeit happening some 80-odd years earlier.

    The Beginning of American Feudalism: The New Deal and Old Costs


    In the years that followed, FDR laid the foundation for a resurgence of a quasi-feudal system. In fairness, it was a challenging period in history, marked by a staggering 25% unemployment rate and genuine hunger among the population. This was not like the statistical games played by modern economists, which indicate that 30% of Americans experience food insecurity. As I write this, I’m indeed feeling hungry, and I haven’t yet planned my dinner. However, this doesn’t equate to imminent starvation.

    FDR’s approach during a time of crisis has been a subject of debate, but he was a bully and a borderline tyrant. His leadership during World War II and the nearly two decades of economic prosperity that followed helped overshadow certain issues. FDR’s threats to pack SCOTUS pressured the Justices who opposed his policies into retirement, ultimately allowing the New Deal to stand.

    However, it’s worth noting that the New Deal wasn’t a cohesive, systematically designed solution to address the various aspects of the Great Depression. Instead, it took a multifaceted approach, dealing with different elements of the crisis in ways that sometimes appeared haphazard and occasionally led to contradictions. To delve deeper into the subject, one might consider reading the case of Wickard v. Filburn, which set a precedent in 1942 and granted the federal government a broader scope of authority to intervene in matters within the privacy of one’s home or property, a decision that I find repugnant.

    FDR’s New Deal established the Works Progress Administration (WPA), which employed various individuals, including artists, actors, and authors, to contribute to the construction of new schools, bridges, and other infrastructure projects across the country.

    “The National Recovery Administration attempted to check unbridled competition which was driving prices down and contributing to a deflationary spiral. It tried to stabilize wages, prices, and working hours through detailed codes of fair competition. Meanwhile, the Agricultural Adjustment Administration sought to stabilize prices in the farm sector by paying farmers to produce less. Finally, over the course of the New Deal, the administration addressed questions of structural reform. The Wagner Act, which created the National Labor Relations Board in 1935, was a monumental step forward in giving workers the right to bargain collectively and to arrange for fair and open elections to determine a bargain agent, if laborers so chose. The Social Security Act the same year was in many ways one of the most important New Deal measures, in providing security for those reaching old age with a self-supporting plan for retirement pensions. But there were other reform measures as well. The Securities and Exchange Commission and Federal Deposit Insurance Corporation were new.”[1]

    Rather than allowing the country to self-reflect on its challenges and encouraging local communities to collaborate for stability, FDR introduced government programs that fostered dependency on government intervention to rectify issues when they arose. This inadvertently incentivized risk-taking in situations where caution might have been the norm to avoid catastrophic consequences. It’s worth noting that government-funded programs, in essence, involve taking money from some individuals to provide for others. In a society, if individuals wish to offer charity, they should do so based on their own initiative. Historically, if a community believed that a neighbor wasn’t contributing adequately, they could address the issue through rebuke, discussion, or even exclusion.

    However, with the government’s involvement, communities were left less equipped to deal with adversities such as famine, plagues, or economic downturns as they traditionally had. The federal government began mandating quotas and litigating against farmers for producing more than their allotments, notably in the case of Wickard. Government interference with businesses and their ability to compete was then set in motion.

    The concept of unions having the ability to influence their employers through collective bargaining altered the dynamics of work, discouraging the need for individuals to improve their skills in order to earn higher wages. This shift undermined meritocracy. Furthermore, it facilitated the formation of coalitions that, while not legally considered mafias, were sanctioned by law and could leverage their collective interests to influence various policies. It’s important to note that many people within specific professions often share similar worldviews and temperaments, which not only make them suitable for the job but also contribute to their job satisfaction.

    Social Security was destined to favor the older generation, but it went beyond that, fostering a sense of financial irresponsibility and diminishing the commitment to raising children. Both parents and children began to believe that the government would ensure financial support in old age, leading to a reduced emphasis on nurturing the intergenerational relationships that form the cornerstone of communities and societies.

    The SEC uses its authority to categorize certain matters within its jurisdiction and potentially take action against those it disapproves of. Similarly, the FDIC has sometimes been viewed as a tool that banks exploit to take more risks with their clients’ funds, particularly since most clients’ deposits fall below the insurance threshold at which they would face significant losses. This can create a false sense of security: even though there is insurance, the money is still at risk and subject to speculation.

    The New Deal marked a transformative shift in the role of the Federal Government, which previously had minimal involvement in people’s lives beyond the collection of income taxes, mainly once a year. However, with the New Deal, everyday activities such as working, banking, and saving became subject to government regulation and intervention. Moreover, a growing number of individuals, including those in specific occupations like the military or farming, as well as the unemployed, could claim government benefits.


    However, this was merely the mechanism through which clientelism became a pervasive issue, as policies and representatives were no longer elected solely based on their qualifications as representatives of the United States. Over the ensuing decades, the focus shifted towards leveraging agency to secure rights and benefits from the government, as these benefits were increasingly being distributed by the government itself.

    “Ask not what your country can do for you, but what you can do for your country…” These words, articulated by JFK, were not spoken in isolation from the civil rights movements. Instead, they aimed to reinforce the existing sentiment that the government provides citizens with opportunities to assist one another, if taken with a benevolent interpretation.

    All that remained for LBJ was to secure unwavering support from the Black vote in the United States, enabling the Democratic Party to advance its agenda of exchanging benefits for status. Consequently, all the Democrats needed to do was pledge positive outcomes and formulate policies that appeared to favor the Black community, regardless of the results. Following suit, Republicans did the same thing with rural and religious coalitions. These effects continue to resonate to this day.


    [1] https://www.banking.senate.gov/imo/media/doc/WinklerTestimony33109TheNewDealSenateTestimony.pdf

  • The Mandela Effect of the Civil Rights Movement

    August 20th, 2023

    Origins of the Term

    The Mandela Effect, coined in 2009 by paranormal researcher Fiona Broome, refers to a phenomenon where large groups of people remember an event or fact differently from the way it occurred. Broome’s realization stemmed from her incorrect belief that Nelson Mandela died in prison in the 1980s, even though he was released in 1990, went on to become president of South Africa, and passed away in 2013. She then speculated on the possibility of collective misremembering: a convergent, yet false, consensus within the social consciousness. It became clear to her that this wasn’t an isolated case; many individuals recalled historical events differently from actual records.

    Moreover, if a distorted memory can remain consistent across a large enough set, can that distortion simply command the dominant opinion?

    The Mandela Effect does not attempt to find answers; it simply points out a phenomenon. While the Mandela Effect is intriguing and sometimes unsettling, most psychologists and memory researchers believe that it can be explained by the quirks and imperfections of human memory rather than more fantastical theories like parallel universes. However, regardless of its origin, the Mandela Effect offers a fascinating insight into the collective human psyche and the nature of memory.

    Often, the memories influenced by the Mandela Effect aren’t pivotal enough to reshape human history, even if they were accurate. Yet they’re consequential enough to unsettle those grappling with the reality of their misrecollections. However, if a misremembered zeitgeist possesses enough collective significance, it stands to reason that its residual fingerprints will also influence personal behavior, in varying degrees of importance. The most captivating aspect of the Mandela Effect is that it commonly pertains to details that, while peripheral in our daily lives, hold a certain abstract significance. These memories might be cherished for storytelling, entertainment, or simply to feel included in a shared cultural experience. The Mandela Effect thus serves as a fascinating exploration into the malleability of human memory and the ways in which our perceptions of reality can be subtly, yet profoundly, shifted. It’s a testament to the complexities of the human mind, showing how even seemingly insignificant details can become central in discussions about reality, perception, and truth.

    Attribution plays a pivotal role in the Mandela Effect, extending beyond the theory itself. Our minds naturally strive for coherence, and in doing so, sometimes twist or simplify information to fit into a familiar or easily understandable narrative. These narratives may be informed by cultural references, repeated misquotations, or simply the way human memory tends to streamline information. One explanation lies in the narratives we craft to give meaning and context to our memories, both for our own understanding and when sharing with others. Take, for example, the frequently misquoted line from Snow White; while many recall it as “Mirror, mirror on the wall,” it’s actually “Magic mirror on the wall.” Similarly, in the Star Wars saga, the commonly remembered phrase “Luke, I am your father” is, in fact, “No, I am your father.” These discrepancies highlight the interplay between collective memory and the narratives we construct. Further, these misquotes have been repeated so often in pop culture references, parodies, and casual conversations that they’ve overshadowed the original lines. The familiar becomes the “correct” in the collective memory, even if it’s not accurate.

    Attribution, in this context, goes beyond just pinpointing the source of a memory. It’s about understanding the narratives and stories that shape our collective understanding of events or phrases. By recognizing the role of narratives in the Mandela Effect, we gain insights into the malleability of human memory and the influence of cultural storytelling.

    Memory isn’t a static repository of facts but a dynamic, reconstructive process. Each time we recall an event, we don’t access an exact imprint or “video recording” of it; instead, we’re reconstructing that event based on the traces it left in our brain. This reconstruction is influenced by myriad factors: Current Emotions and Beliefs, Subsequent Experiences, Social Influences, Cognitive Biases, Aging and Maturation. Each time we revisit a memory, we’re not just passively observing it but actively reconstructing it. This reconstruction, influenced by our current self, beliefs, and context, can subtly alter the memory. Over time, these alterations can accumulate, leading to significant distortions. This understanding of memory has profound implications for many areas, from everyday misunderstandings to the legal system’s reliance on eyewitness testimony. It underscores the importance of recognizing the fallibility of our memories and the factors that can shape and reshape them.

    Memory, and the context in which it’s framed, is crucial. However, it’s often influenced by our present perspective. Rather than being a direct recollection of an event, memory is more about recalling the last time we pondered that particular sequence of events, leaving an imprint on our mind. This means we do recall certain elements of an event, but every time we revisit that memory, our current perspective can reshape it. Consequently, the nature of a memory can shift and evolve over time. As individuals undergo change, growth, and gain new experiences, the way they interpret a past memory is likely to change, highlighting the fluidity and malleability of our recollections.

    Consequently, the Mandela Effect becomes an interesting idea when applied to collective memories. With many such events, the warped version can merely be an innocent retelling, or a poorly worded version rewritten by the collective consciousness to add a little oomph. Let’s start with Nelson Mandela because, well, he is the namesake of the effect.

    To reiterate, the idea of the Mandela Effect arose because its coiner distinctly remembered believing that Nelson Mandela died in prison in the 1980s. However, if you look at Mandela from a purely political standpoint, he was a violent ethno-nationalist. He believed in a homogeneous ethnic state, he was willing to use any means necessary to achieve that end, he sought for his ethnic demographic to be the sole controller of the state and administration, and he supported redistributing resources through the national government by leveraging its power. Violence was merely an instrument to achieve the ends he believed in at his core. In many ways, Nelson Mandela held many of the same principles that a National Socialist held in Nazi Germany.

    “One of the great mistakes is to judge policies and programs by their intentions rather than their results.” -Milton Friedman

    Delving deeper into Nelson Mandela’s[1] history reveals his leadership role in the African National Congress (ANC). The ANC, while advocating for an end to apartheid, did not shy away from employing violent tactics, including bombings, to achieve its aims. Given the backdrop of South Africa’s complex political landscape, Mandela’s imprisonment can be seen as a result of his association with such activities. In many countries during that time, the accusations leveled against Mandela might have led to even harsher consequences. Still, public perception often simplifies his story. Many recognize Mandela as the stalwart who ended apartheid but may not fully understand the circumstances of his incarceration. Notably, his time in prison was influenced by a deeply divisive political regime with racial undertones.

    However, the widespread failure to recall that he was held responsible for the deaths of dozens is itself a common mis-recollection, possibly verging on intentional distortion.

    When someone misremembers a line from a movie, I see it as either a simple lapse in memory or a subconscious yet sharper callback with added flair. It could be an innocent mistake or, as Bob Ross might say, a “happy accident.” However, when we shift to significant cultural moments like the civil rights movement, or to universally acknowledged narratives concerning Mandela, understanding the reasons behind these misconceptions becomes more complex. These distortions, taking root over generations, are difficult to untangle. Still, I can at least highlight factual inaccuracies, even when many adamantly believe otherwise.

    Desegregation was a Purely Religious Movement.

    Check how many times MLK Jr. said “Christian,” “Faith,” and “Church” in the famous Letter from Birmingham Jail.[2] Then read his “I Have a Dream” speech, and see how he talks about faith after his five declared dreams, or how he refers to “God’s Children.”[3]

    Here are some excerpts from other speeches where he talks about the philosophy:

    Conquering Segregation: “Racial segregation is a blatant denial of the unity which we have in Christ. Segregation is a tragic evil which is utterly un-Christian.” . . . “The Philosophy of Christianity is strongly opposed to the underlying philosophy of segregation. Therefore, every Christian is confronted with the basic responsibility of working courageously for a non-segregated society. The task of conquering segregation is an inescapable must confronting the Christian Churches.” (The Role of the Church in Facing the Nation’s Chief Moral Dilemma, April 25, 1957)[4]

    Church and State Relations: “The church must be reminded that it is not the master or the servant of the state, but rather the conscience of the state. It must be the guide and the critic of the state, and never its tool. If the church does not recapture its prophetic zeal, it will become an irrelevant social club without moral or spiritual authority.” (A Knock at Midnight, June 11, 1967).[5]

    One of the most notable misconceptions, reminiscent of the Mandela Effect in recent history, is the view of the civil rights movement chiefly as a secular initiative, overlooking its profound religious underpinnings. The forthcoming release of the FBI files might shed more light on this matter.[6]

    MLK Jr., a Baptist minister, viewed segregation as morally wrong based on his religious beliefs. He shared the concern of many religious leaders and followers of his time that the nation’s once-prevalent religious ethos[7] was fading into secularism.[8]

    People often point out, “Christians in the South supported Jim Crow and slavery.” Indeed, many did, but it’s essential to recognize that the driving force behind the call for the rapid abolition of slavery stemmed from that very same Christian doctrine. The compelling moral foundation of the abolitionists’ appeal to Christianity ultimately overshadowed the Southern Baptists’ arguments in support of slavery. Should we offer them forgiveness for previously upholding beliefs and legislating in ways they thought crucial to their way of life, or should we harshly condemn them for it?

    Discussing broad and vaguely defined beliefs often obscures the crux of the matter; the devil is in the details. Trying to equate today’s “liberal” with the definitions from 20, 40, or 60 years ago is like comparing two entirely distinct species: though they might attempt to intermingle, ultimately, they’re incompatible. The same challenge arises when reconciling various religions or even different sects within a religion, like Christianity with its plethora of denominations and traditions. If one can’t specify the time, place, or the central figure behind a movement, their argument is as uninformed as saying that slavery no longer exists globally, or that the USA was the last to abolish it. If anyone doubts the accuracy of these statements, they might not be prepared for deeper discussions on the topic.

    Slavery

    If someone wishes to discuss slavery in the context of Christianity, it’s essential to specify the denomination and estimate the number of adherents who truly held that belief. I argue that it was a dominant minority who utilized religious justifications to defend their “rights” in the South. However, considering that most individuals did not own slaves (primarily due to economic constraints) and were more concerned with federal interference (prioritizing state rights over federal authority), they rallied against perceived threats, leading to the Civil War. With this in mind, how do you then view Islam, given that a significant portion of modern-day slavery, which now exceeds numbers from before 1870, is predominantly in countries with Muslim majorities? [9] [10]

    It’s evident why the Southern Baptists’ defense of slavery using scripture paled in comparison to the Abolitionists’ arguments. Beyond the negative reputation of slavery, the practice of American chattel slavery bore little resemblance to the concept of slavery mentioned in the Bible. American slavery was fundamentally free trade at its darkest, devoid of any moral framework. In contrast, biblical standards for slavery implied a mutual responsibility: acquiring a slave meant also acquiring a duty to them. The Bible mandated fair treatment of slaves, including fair wages, rest during the Sabbath, and a prohibition against harsh or severe treatment.[11][12] An “owner” had a moral duty to recognize the humanity of their slave. If a slave was unjustly beaten or harmed, they were granted their freedom. In essence, the Bible emphasized treating slaves as human beings, acknowledging that their servitude was a product of circumstance, not a reflection of their inherent worth.

    While comparing historical standards of living to modern conditions might seem inappropriate, it’s worth noting that many contemporary practices, like eugenics, raise ethical concerns. Consider the discussions around the disproportionate representation of minorities in prisons; a parallel can be drawn to the demographics most affected by abortions. It’s a misconception to claim that southern states wish to restrict abortion with the intent of harming black communities. [13] In reality, fewer restrictions on abortion could result in more minority births, seemingly counterintuitive for states often labeled as “racist.” [14] Meanwhile, states often regarded as “tolerant” and “open-minded” have higher abortion rates among minorities, both in numbers and proportions.

    Segregation

    If you believe in the giant melting pot, you are a crackpot. The melting pot provides no substance. The concept of the “melting pot” is often romanticized, but such homogenization can lead to cultural erasure. While the imagery of a pot suggests richness and substance, the reality of assimilation often strips away cultural depth. Many communities have sacrificed their customs, traditions, and practices for the sake of conforming, and this conformity has cost more lives than have been lost in wars. The term “genocide” directly translates to “the destruction of nations.” [15] Its first syllable even calls to mind genuflection, the act of kneeling down. It’s a somber reflection: assimilation can be akin to a community bowing down, surrendering its identity to a dominant culture or tradition.

    Dreams come with their own set of trade-offs. If the aim is universal harmony, one might need to let go of deeply-rooted traditions or delineate between in-groups and out-groups. In reality, these classifications help individuals discern between values that hold merit, those that can be detrimental, and those unduly prioritized by others. Such distinctions have historically been the compass guiding personal and societal alignments.

    Indeed, people instinctively lean towards “in-groups” and “out-groups.” This raises the question: which practices or values from the in-group should I respect and which habits or lifestyles from the out-group should I steer clear of, especially when striving for “morality”? The term “morality” is so wide-ranging that it can often seem void of clear meaning unless it’s based on specific sentiments or values. While there’s nothing inherently wrong with intuitive feelings, it’s more constructive to align oneself with individuals who resonate with our aspirations, rather than relying solely on what “just feels right.”

    So, as traditional religion and communities weakened due to desegregation and busing, a decline further accelerated by initiatives like LBJ’s “Great Society,” people started to align themselves with broader, often superficial markers like class, region, sports, politics, race, and, more recently, “gender identity” and orientation. However, these identifiers are as shallow as bonding over a shared eye color when it comes to navigating deep value-based disagreements. While there’s a plethora of literature, theories, and shared experiences within these categories, they don’t provide clear guidelines to distinguish the “in-group” from the “out-group” or explain why certain practices are deemed virtuous and worthy of emulation. Most of the terms we use, such as “good,” “bad,” and “evil,” originate from distinct philosophies, religious beliefs, or codes of conduct. Yet these terms are frequently employed without a full grasp of their profound implications. This generalization is why definitions of concepts like “evil” vary widely, and those less articulate might simply lean on popular opinion without understanding or conveying the term’s deep-rooted essence.

    The Slippery slope of liberalism

    Liberalism, with each evolution, redefines what’s considered mainstream or acceptable. However, this evolution faces two primary challenges:

    • What was pivotal in a movement two decades ago can now be seen as extreme or outdated. A once-central idea can drift to the margins, dismissed as passé or as an “ist”. This fluidity works if societal perspectives remain youthful and adaptable. But while change and openness to new concepts are encouraged, the broader aim seems to be ever-increasing inclusivity.
    • What many overlook is that tolerance has its limits. Think of it as a bowl: it can only hold so many ideas or values. As you determine which concepts deserve space in this ‘bowl of acceptance’, inevitably, other ideas or practices get pushed out or are deemed less valuable. They aren’t prioritized or taught, and thus, over time, become viewed as less important. And just like a bowl can only contain a certain volume, our capacity for tolerance has boundaries. Anything that doesn’t fit into our defined scope of acceptance eventually gets perceived as lesser in our evolutionary mindset, lying outside our “bowl of tolerance”.

    Passive Assimilation

    Most current residents of the United States descend from immigrants. Historically, the decision to journey thousands of miles to settle in a new land was typically driven by the lack of promising prospects in their homeland. However, if someone migrated before the 20th century, assimilation wasn’t typically on the agenda. These individuals, along with their families, sought to maintain and live by the customs, values, and traditions inherent to their lineage. It was only in the early 20th century, when economic motivations began to dominate immigration reasons, that this mindset began to shift. This evolution in perspective is reminiscent of the Mandela Effect in contemporary times.

    As the “melting pot” concept grew more prominent in contemporary culture, a shift in mindset occurred. Individuals began to prioritize ensuring a better life for their children, wanting to shield them from the challenges and adversities that prompted them, their parents, or even grandparents to emigrate. The idea was to integrate their offspring seamlessly into this melting pot. After all, the melting pot is supposed to be an equalizer, treating all its components fairly and without prejudice, right?

    Many of us inherited the resilience and adaptability genes, traits honed for thriving in evolving circumstances. Our parents, in their aspiration for a better world, instilled in us the belief that being kind and treating everyone equally was of paramount importance. They envisioned this as the foundation for a harmonious society. In retrospect, this perspective was overly optimistic.

    Such a perspective is naive because it overlooks the importance of community and a sense of belonging. Without these, it’s challenging to find a purpose or anchor during trying times. Being part of a group that shares traditions, culture, and background, or aligning with a religion that offers meaning in the face of adversity, can act as a bulwark against the bleak void of nihilism.

    Without the distinction that separates one group from another, the melting pot becomes a jumble of diverse ingredients. However, individual preferences, sensitivities, and aspirations act as filters in this pot of varied elements. People possess unique tastes, dietary restrictions, or visions for the desired outcome. This inevitably leads to the pot fragmenting. The dichotomy between liberals and conservatives emerges as essential seasoning for the soup, each offering distinct flavors and nourishment. However, both pots often become cluttered with ingredients not for enhancing the overall flavor, but for the sake of aligning with specific groups. The pots lose their original purpose as they’re influenced by the desire to appease diverse associations, a departure from their intended essence.

    Groups responsible for the “pot” should clearly define what ingredients they’re adding and why. Historically, these choices were guided by tradition or a higher moral imperative rooted in religious beliefs. Such guidance influenced leaders to advocate for protections of specific behaviors, impose sanctions against others, or deem certain actions as matters of personal discretion.

    The Mandela effect has, in some ways, reshaped the collective memory of Americans regarding MLK’s actual mission. He advocated for a religious ethos to be central in American discourse, practices, and the rule of law. Today, many often cite his reference to judging by “the content of someone’s character,” yet there’s little reflection on the standards by which this content should be evaluated. MLK wasn’t just alluding to basic civility or pleasantness. He was unequivocally pointing towards living a life aligned with Christian virtues or sincerely following one’s faith.

    This reminds me of a personal observation: the idealized concept of the melting pot has, in some instances, devolved into the transmission of degenerate values.

    Both of my grandmothers prioritize two primary characteristics when it comes to potential partners for me: attractiveness and agreeability. Reflect on that for a moment. In their eyes, these traits supersede all other attributes in a woman I might introduce to them or even consider for marriage and parenthood. To me, this prioritization is astounding. While these traits may be significant, they shouldn’t overshadow the core attributes that genuinely matter in a lifelong partner. The critical questions should be: Do their values align with yours? And could you accept the possibility of your children inheriting their most challenging character flaws?

    This underscores a pivotal issue stemming from the civil rights movement. It’s fairly reasonable to argue against categorizing people as “the other” based solely on superficial differences, asserting that distinctions should be rooted more in values (or beliefs). Society has oriented itself around the concept that all persons are created equal, a phrase that referred to everyone having the same civil liberties. However, when we oversimplify the concept to assert that everyone is inherently equal in all aspects, problems arise. For those who deviate significantly from the average — be it above or below — this blanket notion of equality can seem nonsensical. Those on the lower end of the spectrum might fixate on visible differences, while those on the higher end tend to abstractly differentiate based on nuanced variations.

    Segregation was indisputably wrong, but not only for the reasons often touted from a secular perspective. The issue wasn’t merely the separation based on visible differences. It was fundamentally flawed because it was a government-enforced division based on race, which lacked a genuine understanding of human value or the innate human desire for connection. This made it arbitrary and inherently unstable. Labeling the civil rights movement as purely secular is an oversimplification, particularly when considering the push for immediate “anti-racist” laws. Secular ideologies about life’s proper course can shift over time, lacking enduring anchors. In contrast, regardless of one’s personal stance on religion, religious values typically advocate for timeless principles, intending to guide not just the present but also future generations, ensuring their continuity and growth.

    The most accurate portrayal of the civil rights movement is as a religious endeavor. While it parallels many secular narratives, it’s deeply rooted in spiritual beliefs and perspectives. The core essence of this movement, now somewhat distorted by Mandela effect-like social memory, emphasizes that what’s truly invaluable is often oversimplified in popular recounting. At its heart, the message is clear: GOD DOESN’T CARE ABOUT YOUR IDENTITY AS LONG AS YOU WALK THE RIGHTEOUS PATH. Conversely, if society becomes hyper-focused on racial lines, which inherently emphasizes personal experiences and subjective perspectives, we venture into treacherous territory. This perspective only holds when individuals perceive the world similarly, within a close range of experiences or beliefs. It circles back to the inherent danger of assuming everyone attributes the same importance to an event, ideology, or interpretation as one personally does.

    This sentiment reminds me of the notion, “I might be mistaken, but until someone challenges my beliefs and proves otherwise, I’ll stand by them.” This underlines the duality of my call to action: (1) I strongly discourage any physical harm or injury based on my statements, and (2) if you genuinely disagree, articulate your counterargument precisely. Merely labeling my viewpoint as “offensive” or “potentially misunderstood” isn’t a substantive rebuttal.

    Regarding point (1), I genuinely hope no physical harm comes to anyone based on these statements. However, if emotional or psychological certainties — or any other deeply held perceptions of truth — are challenged in a purely conceptual manner, that’s a consequence I can accept. Often, groups decrying “hate speech” paradoxically do so from a stance of power, masking a genuine fragility and employing manipulative tactics. If disagreements are rooted purely in beliefs, they should be expressible. It’s an alarming conceit to consider discourse beneath oneself. Using broad labels like “anti,” “phobic,” “ist,” “ism,” or any other sweeping categorizations to dismiss dissenting views is, in itself, a direct ideological assault.

    Such terms have their place, particularly as summative descriptors after a comprehensive explanation of why a particular belief or action might be objectionable to a group. However, three main issues render these catch-all labels less credible: (1) they often bypass standalone, substantive arguments; (2) instead of focusing on the nuanced reasons behind a disagreement, they rely on broad categories of approval or disapproval, only occasionally referencing actual historical conflicts between groups; and (3) their primary aim is to provoke an emotional reaction. If these terms were genuine calls to action, the general populace would either rally behind them or vehemently oppose them, making them integral to our shared discourse.

    Perhaps that’s the pitfall many men in my lineage succumbed to: they became too engrossed in another’s narrative and medium, neglecting to forge their own based on their life experiences. The challenge, it seems, lies in introspection.


    [1] While Nelson Mandela did not personally kill anyone, he did advocate violent resistance against the South African government by creating a militant wing of the African National Congress. This organization set fires and bombed cars, killing several dozen people over a number of decades. Some then attribute these deaths to Mandela’s leadership while others focus on his nonviolent work and writing. Given the political environment and the anti-apartheid cause, it is difficult to make a moral assessment of these parts of Mandela’s life. https://www.heraldsun.com.au/news/opinion/the-dark-side-of-nelson-mandela/news-story/68f4acdbf2b0b4e6c799e458a55e6cb2

    [2] https://www.africa.upenn.edu/Articles_Gen/Letter_Birmingham.html

    [3] https://www.npr.org/2010/01/18/122701268/i-have-a-dream-speech-in-its-entirety

    [4] http://okra.stanford.edu/transcription/document_images/Vol04Scans/184_1957_The%20Role%20of%20the%20Church.pdf

    [5] https://kinginstitute.stanford.edu/king-papers/documents/knock-midnight-0

    [6] https://nymag.com/intelligencer/2019/06/martin-luther-king-fbi-files.html


    [7] https://mixedadvocate.com/2023/05/28/weapons-of-mass-distractions-the-technologies-that-distracted-communication-skills/

    [8] https://mixedadvocate.com/2023/05/14/secularism-the-tradition-ender/

    [9] https://www.arisefdn.org/slavery-today?gclid=CjwKCAjwkeqkBhAnEiwA5U-uM5Kdr0ukWSSMAw7qbfSb3n3Vn0HyZg0Ky4y6wWAgDhLQPVROoSODvxoCNCgQAvD_BwE

    [10] https://worldpopulationreview.com/country-rankings/countries-that-still-have-slavery

    [11] Ex 20:10; Job 31:13-15; Deut 24:14-15; Lev 22:11; Mal 3:5; Lev 19:20-22; Ex 21:20-21, 26-32.

    [12] https://www.dbu.edu/mitchell/early-modern-resources/biblesla.html#:~:text=Likewise%2C%20slaves%20were%20to%20be,26%2D32%2C%20also%20cf.

    [13] https://www.pewresearch.org/short-reads/2023/01/11/what-the-data-says-about-abortion-in-the-u-s-2/

    [14] https://www.kff.org/womens-health-policy/state-indicator/abortions-by-race/?currentTimeframe=0&sortModel=%7B%22colId%22:%22Location%22,%22sort%22:%22asc%22%7D

    [15] It is important to distinguish casualties of war from people who died of disease during war, and further to separate the number of people who died in battle from what actually kills a culture, which happens once the literal bloodbath is over. The actions that kill cultures are what happens immediately after a war, via deportation or the intentional destruction of a culture’s or nation’s primary city; such actions are not really war, but they are certainly not peace. The Assyrians would often deport and scatter the survivors of a city to different corners of their empire after taking it, following a general massacre; these actions were taken once formal hostilities had ceased but were clearly more than passive assimilation. Passive assimilation is more like what the Romans did to the Gauls, and is what is currently taking place in the Western world.

  • The Collapsing Betrothal: Part 2, Big Brother Replacing Weak Communities

    June 26th, 2023

    Aid to Families with Dependent Children (AFDC) was initially established as part of the programs introduced under President Franklin D. Roosevelt’s administration, aiming to expand the social safety net. By the early 1960s, the program underwent desegregation to ensure that black households had access to these benefits. However, a lesser-known aspect of AFDC was the implementation of a “man in house rule.” The original intent behind this rule was to prevent able-bodied men who were capable of working from exploiting the system by solely relying on government assistance. In 1968, the case of King v. Smith (392 U.S. 309) was brought before the court to address the legality and implications of the “man in house rule.”

    In summary, the case involved Mrs. Smith, a mother of four children, three of whom had lost their father. She was in a relationship with a man who had nine children of his own. The AFDC sought to terminate her benefits due to the presence of this man in her household. The court ruled that since the man had no legal obligations towards Mrs. Smith and her children, they should not be penalized by losing their benefits. Consequently, while the “man in the house rule” was deemed unconstitutional, it inadvertently created a disincentive for individuals to remain in relationships with the fathers of their children, as government support would only be provided if the father or husband was absent from the household.

    The AFDC program provided a crucial support system for widows and for women leaving abusive relationships. However, the scope of the aid extended well beyond those recipients, since anyone who met certain age requirements could access the program. As a result, the program’s original intention, rooted in the biological logic of securing stability before engaging in risky behaviors like sexual relationships, was weakened: women now had the government as a surrogate provider, albeit one offering only a minimum level of financial support, which diminished the incentive for personal financial responsibility.

    The BIG Three

    Welfare and the war on poverty had three significant effects on women. Firstly, it provided a means of escape for women trapped in abusive marriages or abandoned due to circumstances like the death of a spouse or irresponsible behavior by men. Secondly, it inadvertently created a disincentive for marriage among those living in poverty. And thirdly, it facilitated a safety net that made it easier for women to recover from difficult situations and regain stability in their lives.

    • The Wager: Most single women who received benefits from government programs like food stamps, housing, etc., were not in this category.

    While I don’t have specific data on the number of women who took advantage of welfare programs while in abusive relationships or as widows, I believe that the responsibility for addressing these issues lies primarily with collapsing communities that failed to self-correct internally. As discussed in Part 1, the breakdown of marriages due to abuse, infidelity, and other marital issues can be effectively addressed when individuals share the same value system and submit their marriages to a higher authority. Similarly, a well-connected and proactive community tends to offer support, whether through providing food, assisting with household tasks, or local fathers mentoring and guiding young orphans. Unfortunately, communities that lack even the most basic structure, as described in Part 1, are unlikely to have other systems or procedures in place to help individuals during unfortunate events that may occur with some frequency, such as war, famine, or plague.

    As communities became less involved and engaged, the decline may be attributed to the shift towards the nuclear family model, in which new families often settled in places without established community structures. Alternatively, it could be that older communities gradually secularized and prioritized abstracted desires over traditional values. In either case, the weakening of communities created a void in which the need for assistance and support arose. Programs like AFDC, alongside those created under the Economic Opportunity Act, expanded welfare to encompass areas such as education, health, housing, and employment, as well as standard cash assistance. Consequently, the government’s role grew, assuming the position of a surrogate community. During the 1950s and 1960s, however, this approach remained on the fringes and did not fundamentally alter how entire communities perceived the roles and dynamics between genders. Raising the costs associated with marriage, on the other hand, can have a far more significant impact on how different groups within a community view the opposite gender.

    • Easiest Route to a Reward wins

    In “The Ethos of the Great Society,” I discussed a concept called “The Government Handout Gap,” which outlines how the government can inadvertently create a cycle of dependence for its recipients. When individuals receive government assistance, they are often required to adhere to certain rules and restrictions. For example, if someone receives government-subsidized housing in a specific area, their monthly income cannot exceed $1,800 in order to qualify for an apartment that, without government subsidies, would typically cost around $1,500 per month.

    However, in the broader rental market, most landlords require tenants to demonstrate an income of 2.5 to 3 times the monthly rent. Therefore, in order to move out of government housing and afford a similar apartment, the individual would need to earn a minimum of roughly $2,750 per month. As their income surpasses the $1,800 threshold, they start losing other benefits such as food stamps, government phone assistance, subsidized utilities, and training credits, and they become subject to paying taxes on a portion of their income. It is challenging for most people to increase their income by more than 50% within a single month. Even if they could, individuals who rely on low-income housing would need to work significantly harder just to maintain the level of comfort they had with government benefits. This creates a difficult predicament in which moving off government assistance demands greater effort and can result in a lower overall standard of living.
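    To make the cliff concrete, here is a minimal sketch in Python. The income cap and market rent are taken from the example above; the subsidy share, the bundled value of other benefits, and the tax rate are made-up placeholders, since real eligibility rules vary by program and jurisdiction.

```python
# A minimal sketch of the "Government Handout Gap" described above.
# The $1,800 income cap and $1,500 market rent come from the essay's example;
# the subsidy, benefit, and tax figures are hypothetical placeholders.

def monthly_resources(gross_income, market_rent=1500, income_cap=1800,
                      rent_subsidy=0.8, other_benefits=400, tax_rate=0.15):
    """Rough net position for one month: income plus benefits, minus rent and tax."""
    eligible = gross_income <= income_cap
    rent_paid = market_rent * (1 - rent_subsidy) if eligible else market_rent
    benefits = other_benefits if eligible else 0        # food stamps, phone, utilities, etc.
    taxes = 0 if eligible else gross_income * tax_rate  # simplified: low earners owe little
    return gross_income + benefits - rent_paid - taxes

# Just under the cap versus a modest raise that pushes the household over it:
below = monthly_resources(1800)   # keeps subsidized rent and other benefits
above = monthly_resources(2100)   # loses them and starts paying tax
print(f"Earning $1,800/mo leaves about ${below:,.0f} after rent")
print(f"Earning $2,100/mo leaves about ${above:,.0f} after rent")
# Under these assumptions the raise leaves the household worse off,
# which is the cliff the essay describes.
```

    The exact magnitudes depend entirely on the assumed subsidy and tax figures, but the shape of the result, where a modest raise triggers a large net loss, is the point the paragraph above is making.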

    This critique raises an important point about the potential consequences of raising benefits to narrow the gap. One concern is that individuals who are hovering near the poverty line may be more inclined to remain in poverty if it offers greater benefits than striving for self-sufficiency. The underlying challenge that welfare programs have struggled to address is the issue of freeriding. Regardless of how society categorizes its population, there will always be a certain percentage of individuals who are content with minimal necessities such as food, shelter, and entertainment, especially when these needs are relatively affordable and subsidized by the government. However, rather than delving further into this particular issue, let’s return to the impact on marriage, as previously mentioned.

    The early drive for men to work hard and establish their livelihoods at a young age stemmed largely from necessity. While it is true that there is a valid critique regarding the relative ease with which men could find employment outside the home, there is more to the story than this narrow perspective suggests. Men felt the need to prove themselves as dependable providers from an early stage in life if they wanted any chance of being considered worthy suitors, whether for marriage or even just to secure a date without being ridiculed by both women and their families.

    Conversely, young women faced significant challenges: limited job opportunities and one potentially life-altering consequence, pregnancy. Contraception only became available in the 1960s, and at first it was largely restricted to married women; it took several more decades before young, unmarried women had widespread access to it. To illustrate, one example from my own family history is that my grandparents’ marriage in the 1960s could be described as a “shotgun wedding” due to an unexpected pregnancy.

    Even with limited access to contraception, many women became aware of programs that provided assistance if they fell below a certain income threshold. Scholars have debated whether the decision to have children out of wedlock was a conscious choice made with the intention of obtaining government benefits or if there were other underlying factors at play. Nevertheless, an undeniable reality is that the path that leads most individuals out of poverty is often referred to as the “Success Sequence.”

    The Success Sequence is often described as a pathway to upward mobility, involving three key steps: (1) completing high school, (2) securing a full-time job after graduation, and (3) getting married before starting a family. Depending on one’s perspective, some may view this sequence as a straightforward and achievable formula, while others recognize that it can be challenging without the necessary support structures in place. This is precisely why the concept of COMMUNITY plays a central role in every aspect of this narrative.

    These programs targeted communities with existing vulnerabilities and attempted to compensate for those weaknesses by offering government handouts, masquerading as a form of community support. However, it is important to recognize that the primary motive behind these government handouts is to secure votes and maintain political power, rather than genuinely uplift individuals and communities. (Clientelism)[1]

    These welfare programs did not prioritize or provide sufficient incentives for individuals to attain basic milestones such as completing high school, securing full-time employment, or fostering stable marriages. Instead of encouraging individuals to meet a standard of self-sufficiency, the programs perpetuated a cycle of dependency, gradually increasing reliance on government assistance. Consequently, individuals often found themselves trapped, needing extraordinary circumstances or luck to break free from the entanglement of perpetual reliance on the “free” resources they were receiving.

    Consider this scenario: You’re a young woman who has recently graduated from high school and currently holds a part-time job. While you receive $1,200 in benefits, you’re also dating a man who shares a similar background and values, hailing from the same area. However, he earns $2,500 per month, which would place you just beyond the threshold to continue receiving your benefits if you were to get married. In such a situation, you might question the incentive to marry him.

    In today’s society, the traditional community pressure to marry has significantly diminished, and many people you know may have experienced divorce or other challenges in their relationships. Given these circumstances, you might question the purpose of marriage. Additionally, if you were to have a child with your partner without him moving in, you could potentially receive additional income or benefits. This highlights the complex dynamics that individuals face when making life choices. Factors such as financial considerations, the changing nature of community support, and the incentives created by welfare programs can all influence decision-making processes regarding marriage and family formation.
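    The trade-off in this scenario can be sketched in a few lines of Python. The $1,200 benefit and the partner’s $2,500 monthly income come from the scenario above; her part-time wage and the combined-income cutoff are hypothetical placeholders, since the essay does not specify them.

```python
# A rough sketch of the marriage penalty in the scenario above. The $1,200
# benefit and the partner's $2,500/month income are taken from the essay;
# her part-time wage and the eligibility cutoff are hypothetical placeholders.

INCOME_CUTOFF = 2200   # assumed combined-income threshold for keeping benefits

def household_monthly(her_income, his_income, benefits=1200, married=False):
    """Total monthly resources, assuming marriage pools income for eligibility."""
    countable = her_income + his_income if married else her_income
    kept = benefits if countable <= INCOME_CUTOFF else 0
    return her_income + his_income + kept

her, him = 800, 2500
unmarried = household_monthly(her, him, married=False)  # only her income is counted
married = household_monthly(her, him, married=True)     # combined income ends eligibility
print(f"Unmarried (benefits kept): ${unmarried:,}/mo")
print(f"Married (benefits lost):   ${married:,}/mo")
# With these assumptions, marrying costs the couple $1,200 a month,
# which is the disincentive the welfare rules are said to create.
```

    Real programs typically phase benefits out gradually rather than cutting them off at a single threshold, so the penalty is usually smaller than this binary sketch suggests, but the direction of the incentive is the same.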

    Whether intentional or unintentional, welfare programs often had detrimental effects on individuals living in poverty due to poorly designed incentive structures. For instance, if you are a strong and independent woman who can support yourself, why would you leave the security provided by government assistance for a man who can barely make ends meet above the poverty line?

    In an ideal scenario, a strong community support system would offer alternative avenues and resources. However, when the only support available is through government programs, it becomes understandable why individuals may hesitate to enter into relationships that could potentially compromise their financial stability. The reliance on these programs can create a sense of dependence and reinforce the belief that self-sufficiency is best achieved through maintaining a relationship with “big brother.”

    It is important to recognize the complexities and challenges faced by individuals in poverty, as well as the limitations of welfare programs in fostering empowerment and self-reliance.

    MARRIAGE PUNISHMENT

    Marriage plays a critical role in our economy, serving as a foundational pillar. [2]  However, programs like the AFDC (Aid to Families with Dependent Children) inadvertently discouraged impoverished couples from getting married. In 1975, the EITC (Earned Income Tax Credit) program was introduced as an attempt to address this issue. However, one crucial oversight is that receiving a larger tax refund at the end of the year does not address the immediate financial needs of individuals accustomed to a regular monthly income. [3][4]

    The reality is that individuals in poverty require immediate financial support, and relying on an annual tax refund does not provide the same level of stability as a consistent monthly check or government subsidy. Even recent reports from the Congressional Research Service in 2022 highlight this fact. [5] [6][7][8] Furthermore, certain aspects of programs like the EITC may inadvertently disincentivize married women from seeking employment. [9] It is crucial to consider the immediate financial needs of individuals in poverty and ensure that welfare programs and policies are designed to provide timely support and incentives that align with their circumstances and aspirations.

    Penalties, penalties, penalties. [10][11] I want to acknowledge that summarizing other people’s work and cherry-picking statistics can invite skepticism or a tendency to discount the information, and you may prefer to conduct your own research and form your own conclusions. Within the realm of welfare programs and policies, there has been extensive discussion of penalties and unintended consequences; if you are interested, I encourage you to delve into the studies on the subject, many of which have gone unrefuted. Engaging with reliable research can help you gain a more comprehensive understanding of the complexities and potential drawbacks associated with welfare programs.

    However, it raises significant concerns when we recognize that marriage is widely acknowledged as the cornerstone of society, communities, population stability, wealth accumulation, and economic growth. In light of this, it is disheartening to acknowledge that government programs have been aware for decades that these initiatives adversely impact the incentives to marry, particularly for those in poverty who tend to have a higher number of children per capita. This suggests that either some of the aforementioned benefits associated with marriage were intentionally targeted, or, at the very least, there was a troubling level of negligence in addressing the consequences of these programs on the institution of marriage.

    • “ESCAPE NOW” *BuTTon*

    It is noteworthy that the majority of divorces are initiated by women, and many of these divorces stem from a perceived lack of commitment rather than infidelity. [12][13][14] In light of this observation, I propose that welfare programs, coupled with a decline in community support, have made it easier for women to seek an exit when they no longer wish to remain in their marriages. Contrary to the feminist narrative that emphasizes independence for single women as a means to escape poverty, statistics indicate that marriage is the most effective path for women to elevate themselves out of economic hardship. [15] It is important to note that this observation applies not only to women who are already in the middle class or above, but also to women experiencing poverty. Various metrics consistently demonstrate that marriage has a greater positive impact on women’s economic status than any other choice they could make.

                Based on the 2021 billionaire census, women comprise only 11.9% of the billionaire population. Slightly over half of the female billionaires are heiresses, while roughly another 30% have a mix of inherited and self-created wealth; an earlier breakdown from 2017 classified 16.9% of female billionaires as self-made and 53.5% as having accumulated their wealth through a combination of inheritance and self-generated means. [16] These statistics provide insight into the composition and sources of wealth among female billionaires, and they suggest that the true “Boss Babes” are, or were, married.

    Extensive research has consistently demonstrated that the presence of fathers in a neighborhood plays a crucial role in predicting upward income mobility for the children growing up in that community. Even when accounting for various factors such as the quality of schools, racial composition, or ethnicity, the influence of fathers remains significant. This well-established fact highlights the importance of intact families and strong parental involvement in promoting economic advancement.

    However, it is unfortunate that there is often a lack of financial incentive to foster family unity. Fragmented families tend to spend more on individualistic pursuits since the absence of togetherness necessitates individual expenditures. Drawing upon the concept discussed in “Degenerate Ad Men: The Mythos of the Nuclear Family,” it becomes evident that consumerism thrives on the fragmentation of families. In such circumstances, individuals are compelled to acquire their own possessions for consumption, as opposed to sharing or borrowing within a unified family unit. This consumer-driven dynamic capitalizes on the disconnection and absence of familial cohesion. By prioritizing family unity, not only can individuals enjoy the advantages of shared resources and reduced expenses, but it also nurtures stronger communities and facilitates upward mobility for future generations.

    In contrast to the previous section, this aspect revolves more around convenience or even a dislike for one’s partner. The availability of an exit strategy means that individuals may consider walking out the door. In the past, there was a higher threshold for justifying such a decision, as it required explaining oneself to friends, family, and community members. Breaking up a family demanded a stronger justification or excuse. Furthermore, as mentioned in Part 1, the involvement of a “higher authority” in the form of a marriage minister could provide guidance and boundaries.

    However, in the absence of a cohesive extended family, a dwindling sense of community, and friends who can easily be dismissed if they question one’s unwise or foolish decisions, there is no longer a system in place to hold individuals accountable for their behavior or newly adopted mindset. The witnesses who once acted as guardians have lost their influence, and now the government provides guaranteed funds once a husband or partner is pushed out. This situation creates an environment where individuals can benefit from a husband who pays the bills without having to deal with his emotions, a scenario that some may consider a feminist ideal of independence. Yet, it is a dream that society is now paying the price for. It is important to reflect on the unintended consequences of such shifts in societal dynamics and consider the broader implications for family structures, community support, and personal responsibility.

    I want to clarify that I am not endorsing the idea of trapping individuals in undesirable situations. However, it is worth noting that having a backup plan often undermines the success of the primary plan. In the context of relationships, when there is an easy way out or an alternative option readily available, the commitment made in Plan A tends to lose its significance. In today’s world, where instant gratification is prevalent, commitment is often undervalued because we have become accustomed to immediate satisfaction from our smartphones, food, and various sources of instant gratification.

    Communities are built upon shared values and principles that guide the behavior and interactions of its members. When communities fail to transmit and uphold these values, they can experience a decline in cohesion and effectiveness.

    Marriages, as a fundamental institution within communities, reflect and embody these shared values. They serve as a microcosm of the community, representing the commitment, trust, and mutual support that are essential for its overall well-being.

    Just as weddings have witnesses who attest to the commitment made by the couple, marriages themselves require the ongoing support and involvement of the community. These witnesses symbolize the role that the community plays in upholding the values and principles that underpin successful marriages.

    When marriages face challenges or break down, it can have far-reaching effects on the community. Divorce or marital discord can erode trust, weaken social connections, and disrupt the stability of the community. Conversely, strong and healthy marriages contribute to the vitality of the community by fostering positive family dynamics, providing role models for future generations, and promoting a sense of unity and purpose.

    Marriage, therefore, serves as a foundational institution within communities. It provides a structure for building and maintaining strong relationships, transmitting values across generations, and nurturing the social fabric of the community.

    To ensure the strength and resilience of communities, it is crucial for individuals, families, and community institutions to prioritize and support healthy marriages. This involves teaching and reinforcing the values that underpin successful relationships, providing resources for relationship education and support, and fostering a culture of commitment and mutual respect.

    By recognizing and valuing the role of marriage as the foundation of communities, we can work towards building stronger and more cohesive societies that thrive on shared values and meaningful connections.

    Community and proper child rearing should be fundamental priorities in society. Reliance on government assistance can limit individual autonomy and foster a sense of belonging to a client class. Receiving financial assistance from the government means becoming a recipient within its support system, which can affect one’s sense of personal freedom until a shift in mindset occurs. [17][18] That shift involves reassessing one’s dependence on external support, taking personal responsibility, and pursuing reform to overcome any perceived mental servitude to an overarching authority. Building strong communities and fostering effective child rearing, by contrast, contribute to self-reliance, mutual support, and a sense of freedom. CLIENTELISM.

    In the upcoming weeks, we will explore several related topics, including the origins of population regulation through birth control, the Mandela effect on the civil rights movement, and the phenomenon of religious leaders becoming followers of trends.


    [1] https://www.britannica.com/topic/clientelism

    [2] https://books.google.com/books?id=2CQ46PgqyKsC

    [3] https://welfareacademy.umd.edu/pubs/family/Marriage_Penalties_in_the_Modern_Social-Welfare_State.pdf

    [4] https://go.gale.com/ps/i.do?p=AONE&u=googlescholar&id=GALE|A20952678&v=2.1&it=r&sid=AONE&asid=58b71d99

    [5] https://crsreports.congress.gov/product/pdf/IN/IN11843#:~:text=EITC%20marriage%20penalties%20occur%20when,were%20unmarried%2C%20as%20illustrated%20below.

    [6] https://www.brookings.edu/wp-content/uploads/2016/06/20000920.pdf

    [7] https://www.taxpolicycenter.org/taxvox/bidens-expanded-eitc-adds-significant-marriage-penalties

    [8] https://fordschool.umich.edu/news/2021/earned-income-tax-credit-affects-intergenerational-marriage-and-childbirth-decisions-says

    [9] https://www.nber.org/digest/apr99/married-women-work-less-because-eitc

    [10] https://www.acf.hhs.gov/sites/default/files/documents/ofa/hmrf_marriagepenalties_paper_final50812_6_19.pdf

    [11] https://www.aei.org/articles/welfare-reform-and-marriage/

    [12] https://www.wf-lawyers.com/divorce-statistics-and-facts/

    [13] https://www.asanet.org/women-more-likely-men-initiate-divorces-not-non-marital-breakups/

    [14] https://www.pewresearch.org/short-reads/2022/03/11/rising-share-of-americans-see-women-raising-children-on-their-own-cohabitation-as-bad-for-society/

    [15] https://www.aei.org/articles/welfare-reform-and-marriage/

    [16] https://www.forbes.com/sites/denizcam/2021/04/06/the-top-richest-women-in-the-world-in-2021/?sh=7863879e4598

    [17] https://www.sciencedirect.com/topics/social-sciences/clientelism

    [18] https://academic.oup.com/edited-volume/35474/chapter-abstract/303821587?redirectedFrom=fulltext

  • The Collapsing Betrothal: Part 1, Marriage Traditions, and Dating

    June 18th, 2023

    Mawage, Mawage is wot bwings us together today. Mawage, that blessed awangement, that dweam wifin a dweam. And wuv, tru wuv, will fowow you foweva. So tweasure your wuv.

    What is marriage? Who is it between? What is its purpose?

    Questions that now seem complex, although they were all too simple for most of history. Each of these questions and their answers now seems to carry political and ideological connotations. So let’s approach it from a functional sense.

    Marriage has long been considered fundamental in community building. Throughout history, communities have been formed by married couples who raise children together. The presence of two parents in a child’s life, each bringing their unique perspectives and roles, has often been seen as beneficial for the child’s development. However, even before considering the aspect of raising children, the establishment of the institution of marriage itself requires certain social conventions.

    In contrast to the fictional scenario depicted in “The Princess Bride,” where individuals may be kidnapped and forced into marriage, most of our ancestors entered into marriage through more conventional means. When speaking to my grandparents, for example, they often mentioned the importance of ensuring that the person they dated had stable employment and owned a car. In the generation of my great-grandparents, factors such as the communities individuals came from, societal expectations, and cultural norms played a significant role in determining whom one could or should date. Additionally, the duration of the courting period was often a consideration, as it reflected the seriousness and commitment of the relationship.

    Taking a functional perspective, we can understand marriage as a primary social institution that serves several purposes in community building. It provides a framework for establishing committed partnerships, fostering stability, facilitating the formation of families and providing the basis for the best support system for childrearing, a connected extended family. The social conventions surrounding marriage, such as employment and societal expectations, have historically contributed to the maintenance of social order and cohesion within communities. While societal norms and expectations have changed over time, the functional role of marriage in community building remains an imperative.

    The discussion now focuses on the interplay between marriage and the community, rather than delving into the morality and ethics of shifting societal norms in recent generations. In this context, the term “community” refers to the individuals who attend a wedding, appear in wedding photos, and express happiness for the couple. However, an issue arises when these same individuals, who were once an integral part of the wedding event, become distant figures in the ongoing union known as marriage.

    The transition from a wedding ceremony to the reality of married life often involves a shift in the dynamics between the couple and their community. During the wedding, the community plays a supportive and celebratory role, rejoicing in the couple’s union and offering well-wishes for their future. However, as time progresses, the level of involvement and interaction between the couple and their community can diminish. The once-prominent figures in the wedding photos may become distant acquaintances or mere spectators in the couple’s married life.

    This shift can be attributed to various factors. In some cases, geographical distance or changes in personal circumstances may contribute to a decrease in regular contact with the community. Additionally, the initial excitement and novelty of the wedding may fade, leading to a natural waning of the community’s active involvement. The couple, too, may become more focused on their own journey as they navigate the challenges and joys of married life.

    While it is not uncommon for the community’s role to change over time, the issue arises when there is a sense of disconnect or isolation felt by the couple. They may long for the continued support, understanding, and camaraderie that was prevalent during the wedding celebration. This highlights the importance of nurturing and maintaining strong social connections beyond the wedding event, both within the community and within the couple’s own efforts to foster meaningful relationships.

    Ultimately, the interplay between the community and the marriage is an evolving dynamic that requires attention and effort from both sides. Cultivating a sense of community and fostering ongoing connections can contribute to the longevity and fulfillment of a marriage, as the support and engagement of loved ones can provide a valuable foundation of emotional support and shared experiences throughout the journey of married life.

    There is a compelling argument that couples need time and space to develop and establish their own identities within the context of marriage. However, it is important to acknowledge that this approach deviates from the traditional practice of mentoring both the husband and wife, guiding them through the process of understanding and modeling their roles. The assumption that individuals will naturally know what is right or how to navigate the complexities of relationships, especially in a context encompassing hormones, sexuality, intimacy, and youth, can be seen as a combination of hubris, neglect, and laziness.

    Traditionally, societies recognized the significance of mentorship and guidance in shaping individuals’ understanding of their roles and responsibilities within marriage. Elders, experienced couples, and the broader community played an active role in imparting wisdom, providing advice, and setting examples for newly married couples. This mentoring process served as a valuable resource for young couples, offering insights and guidance on matters related to communication, conflict resolution, emotional intimacy, and the intricacies of a committed partnership.

    In the present era, the emphasis on individualism and personal autonomy has led to a shift away from the structured mentorship model. Many couples are encouraged to discover their own paths and define their roles without extensive external guidance. While this approach promotes independence and self-discovery, it can also neglect the importance of collective wisdom and communal support in navigating the challenges and nuances of a successful marriage.

    Expecting individuals to instinctively grasp what is right or to effortlessly navigate complex relationship dynamics may be unrealistic. The primordial mix of emotions, desires, and youth can create a volatile environment where guidance and mentorship are invaluable. By neglecting the role of mentors, society risks leaving young couples to stumble through the early stages of their marriage, potentially encountering avoidable pitfalls and challenges.

    To strike a balance between individual growth and communal guidance, it is essential to reconsider the value of mentorship within marriage. Encouraging experienced couples and trusted community members to serve as mentors can provide invaluable support for newlyweds. By offering advice, sharing personal experiences, and modeling healthy relationship behaviors, mentors can contribute to the development of strong, fulfilling marriages. Recognizing the complexities involved in navigating intimate relationships and the transitional phase of youth, societies should place renewed emphasis on mentoring couples, allowing for a healthy interplay between personal development and the wisdom of communal support.

    It is often overlooked why the ancient tradition of having witnesses for a marriage has persisted throughout history. Some may assume that witnesses are simply present to substantiate and confirm the occurrence of the marriage and the consent given by both parties. While this practical aspect of having witnesses for confirmation holds true, there is a deeper significance to their role. Witnesses are meant to serve as guardians for the married couple, protecting them not only from each other but also from their own destructive thoughts, patterns, innate tendencies, and learned behaviors.

    Marriage is a profound union that involves two individuals intertwining their lives, emotions, and vulnerabilities. It is a journey that requires continuous support, guidance, and accountability. The presence of witnesses signifies a commitment from the community to stand alongside the couple, serving as protectors and mentors in their marital journey.

    Witnesses, ideally chosen from those who care deeply for the couple, play a crucial role in holding them accountable to their vows and commitments. They provide a source of guidance and wisdom, offering a perspective beyond the immediate emotions and challenges that may arise within the marriage. By acting as guardians, witnesses help the couple navigate the complexities of their relationship and prevent them from succumbing to self-destructive behaviors or negative patterns that could harm their bond.

    In addition, witnesses act as a source of external affirmation and encouragement for the couple. Their presence and support serve as a reminder that the couple is not alone in their journey. They provide a sense of community, reminding the couple of the collective wisdom and experience that surrounds them. It is essential to recognize the significance of witnesses beyond their role as mere confirmers of the marriage. They are entrusted with the responsibility of safeguarding the well-being and growth of the couple, fostering an environment of love, trust, and accountability. Their presence symbolizes the interconnectedness of individuals within a community and reinforces the importance of collective support in sustaining and nurturing healthy marriages.

    In today’s fast-paced and individualistic society, the role of witnesses in marriages has somewhat diminished or been overshadowed. However, revisiting the essence of this ancient tradition can remind us of the importance of communal involvement and guidance in sustaining strong and fulfilling marriages.

    *WHY DO WE ALWAYS THINK WE KNOW BETTER THAN THE PAST?* (Perhaps it’s the curse of youth and lazy parenting.)

    The issue arose when the role of witnesses and participants in a wedding ceremony became merely ceremonial. Many individuals attending weddings started focusing solely on the perks of free food, drinks, music, and good vibes, rather than recognizing the deeper meaning behind the exchange taking place. In essence, by partaking in the celebration, there is an unspoken agreement that you owe a duty to the bride and groom to support and encourage them in their commitment to be the best versions of themselves within the sacred unity they have entered into with each other and a higher power they believe in.

    Attending a wedding goes beyond simply enjoying the festivities; it involves embracing a responsibility to uplift and guide the couple in their marital journey. The exchange of food and drinks symbolizes a bond of mutual support and accountability. By accepting the hospitality of the couple, guests are, in turn, expected to play an active role in helping the newlyweds navigate the challenges and joys of their union.

    This duty extends beyond the wedding day itself. It requires actively pushing the couple to grow, thrive, and honor the vows they have taken. It entails encouraging them to uphold their commitment to each other and reminding them of the sacred nature of their union. By fulfilling this duty, witnesses and participants contribute to the long-term success and happiness of the couple, fostering an environment of love, respect, and personal growth. It is important for both guests and the couple themselves to remember the significance of the exchange that takes place during a wedding ceremony. Beyond the celebration, there exists a profound commitment and a sacred bond that calls for ongoing support, guidance, and encouragement. By upholding their duty, witnesses and participants contribute to the transformative power of marriage, nurturing a strong foundation for the couple’s lifelong journey together.

    The oversight of the duties owed to the couple by witnesses and participants in a wedding ceremony may be attributed to a lack of emphasis and communication within communities. Perhaps the oversight took place because the obligation was not something often portrayed in movies, but it is more likely that communities simply forgot to talk about it. The focus on the superficial aspects of the celebration, rather than the deeper commitments and responsibilities, has overshadowed the true essence of a wedding. It is crucial to remember that supporting the couple, providing guidance, and nurturing their growth is an integral part of the wedding journey. Restoring the understanding of these duties is essential to ensuring a meaningful and transformative marital experience.

    This structure was intended to serve as the foundation for building families. The designated witnesses acted as guardians, while the officiator assumed the role of a high counselor. When spouses encountered internal issues, they sought guidance from their chosen witness, who shared their values, to gain perspective, address concerns, or find a healthy outlet. The officiator, as the judge and mediator, resolved conflicts, not just determining who was right, but also providing guidance on what each party needed to do to restore harmony. By respecting the officiator’s role in the marriage ceremony, their decisions held authority. Without this structure, couples risk ongoing disputes or allowing issues to fester, akin to an untreated, festering wound. Either they painstakingly address and resolve the issues or they risk the consequences of an unresolved rupture.

    To say the least, this structure was nonexistent in many (failed) marriages. There are numerous reasons for the breakdown of communities and structures, which I attempt to address in my other pieces. I cannot pinpoint the initial domino, but these were once upright dominoes firmly interlocked.

    However, in the absence of these structures—whether it be family, friends, religion, or community—the mundane and inherent conflict of opposing forces attempting to unite can result in leakage. Without a robust structure to offer external support when needed, the united forces redirect their energies towards a civil war instead of conquering the world together.

    As communities started to crumble and shifted their focus towards the nuclear family and secularism, and as the “great society” disrupted traditional roles within communities, the integrity of marriages weakened. It is difficult to determine which came first, the weakening of communities or of marriages. However, this collapse led to a devastating divorce rate that coincided with declines in intentional dating, marriage, and childbearing, though admittedly other factors were at play. Unfortunately, the decline did not stop there.

    The Age of Dating with a Vague Purpose

    As mentioned earlier, dating used to serve as the pathway to marriage, and it still does in the Western world, even in modern “arranged marriages.” However, the concept of arranged marriages has evolved. It is no longer about trading one’s daughter for a goat; instead, it involves going through a matchmaker who assesses your personality, values, and interests to find a compatible match. The matchmaker system goes beyond superficial traits and delves into the core values, ensuring that potential matches already demonstrate compatibility on a fundamental level. Thus, even within the context of dating for personality or love, there exists a structured approach that does the groundwork of assessing compatibility based on shared values.

    Then modernity weakened communities and the traditions surrounding marriage. The resulting shift in dating dynamics led individuals to pursue relationships with the indirect goal of eventually reaching the right place, but without clear guidance on how to navigate the process. Parents and communities often failed to provide comprehensive guidance beyond superficial criteria like looks, wealth, and emotions. Consequently, the focus shifted towards the vague objective of seeking marriage and starting a family, without the structured roadmap or step-by-step goals that are essential for long-term endeavors.

    In the modern era, the shift in dating has resulted in a common response when asked about dating preferences: “I’m looking for a connection, love, or a good vibe.” However, this approach often resembles fishing, using one’s physical appearance as bait. People are uncertain if they are caught in someone else’s grasp or if someone else is caught in theirs until months or even years later, once the initial allure of pheromones and primal instincts fades away.

    In contrast, the traditional model of dating prioritized discovering shared values as the primary objective, with compatibility and mutual understanding considered secondary to the alignment of core principles.

    However, since the latter half of the 20th century, the focus in dating has shifted towards a more feelings-based approach. The emphasis became centered on whether a person makes you feel good, gives you butterflies, and creates a sense of specialness or love. Yet, these factors hold little consequence if the individuals involved do not share similar values.

    When we speak of similar values, we refer to the presence of reconcilable differences and a shared understanding of how life should be lived. It means that both individuals can make slight adjustments to their viewpoints without feeling inauthentic. Alternatively, one of the parties may reframe their prior beliefs to align with the collective direction the couple wishes to move forward in their familial unit.

    However, in a society where only a nuclear family or a partially extended family exists, and genuine community connections are lacking, external guidance becomes minimal due to physical distance or a new norm of hiding problems from potential helpers. Uncomfortable conversations, necessary for addressing any issues, are often avoided. Instead, individuals turn to television, radio, or sports to distract themselves.

    As a result, when seemingly irreconcilable topics arise, couples rarely have experience working together to find healthy resolutions. Their values were never thoroughly discussed or prioritized in the relationship; instead, good feelings took precedence, making negative emotions the moral enemy of their relationships. Most significantly, they lacked a “higher authority” to turn to for guidance, someone who could help resolve conflicts without assigning blame to individuals but instead referring to established order, rules, or customs that explain the reasons behind the conflicts.

    Without the support of an extended family or community leaders who could mediate such disputes, which arise in almost all marriages, the weakest marriages that failed to establish a solid foundation or closed themselves off from much-needed assistance found themselves in dire straits.

    At the very least, individuals in this stage had a vague understanding that they wanted to find a partner with whom they could marry and start a family. Although the specifics may not have been discussed thoroughly, intuition, social conditioning, or outside influences guided them towards something difficult to articulate. Nevertheless, there was an underlying, unconscious goal that provided purpose and meaning, regardless of the unpredictable nature of the path it led them on.

    Dating with No Purpose: Relationship Masturbation

    Masturbation, beyond its literal definition, involves using something of inherent value solely for the purpose of self-gratification. It manifests in various forms: physical masturbation, often discovered during adolescence; verbal masturbation, where individuals gather to use words to boost their own ego; and relationship masturbation, both platonic and “romantic,” where people seek validation from one another without actively pushing each other towards their ideals or striving for personal growth within their shared value system.

    Now for the etymology of the word “date.” The word derives from the Latin root “dare,” meaning “to give or deliver,” and the related root “data,” which encompasses qualities, characteristics, or symbols. Combined, the concept of dating implies exchanging information about oneself with another person who is doing the same. Therefore, if one engages in casual dating without serious intent, one is essentially undervaluing and giving away one’s true self for fleeting enjoyment. This pursuit of short-term pleasure can have lasting effects on both male and female psychology, influencing their perception of risk, reward, pleasure-seeking, and avoidance of pain.

    However, rather than risk coming across as prudish by discouraging this behavior, the government began to encourage and subsidize it. There is an old saying that trends usually start at the bottom and emerge out of necessity; it is when influential individuals with money start imitating these behaviors that they become popularized, leading more people to adopt them.

    It is argued that aimless dating has always existed among the upper echelons, but the difference lies in the fact that if individuals of status made a mistake, they often had obligations towards the child and maintained their marital duties. However, the concept of single individuals dating without specific goals emerged and gained momentum through welfare programs.

    In my article on The Ethos of the Great Society, I touched upon the general aspects of these programs, including the roles created by subsidizing many lifestyles, which in turn produced a new form of clientelism. In next week’s article, I will provide a more detailed analysis of the government programs that discouraged, and sometimes even financially penalized, couples for getting married or living together.

    End of Part 1

  • The Ethos of the GREAT SOCIETY

    June 11th, 2023

    The 36th president, Lyndon B. Johnson, called for the creation of the Great Society. The Great Society program became Johnson’s agenda for Congress in January 1965: aid to education, an attack on disease, Medicare, urban renewal, beautification, conservation, development of depressed regions, a wide-scale fight against poverty, control and prevention of crime and delinquency, and removal of obstacles to the right to vote.

    However, these programs ultimately undermined the essence of the American dream, which envisioned individuals living within their communities free from excessive government interference or the imposition of others’ beliefs. Most research on the Economic Opportunity Act of 1964 and the titles under the Great Society relies primarily on statistical analysis to assess their impact, yet statistical models fail to capture the full spectrum of values at play.

    I engaged in a significant internal struggle when contemplating the Great Society initiative. In my quest for understanding, I delved into various papers and books that discussed the program’s incentives and penalties, weighing the alleged positive and negative effects, as well as the intended versus unintended consequences. However, I found myself torn because most researchers and authors primarily emphasized tangible outcomes while acknowledging the substantial impact of intangible factors. This juxtaposition left me conflicted.

    The dilemma I faced was how to effectively communicate these intangible aspects in a meaningful manner. Within academia, there are scholars who meticulously analyze this information and data, with varying degrees of accuracy, avoidance, or even dishonesty. Moreover, depending on a researcher’s affiliations, some individuals may dismiss anything they say outright. Therefore, providing data alone holds limited value, as I am attempting to address something intangible.

    A grasp on the INTANGIBLE

    Quantifying the value of a community in personal development is a challenging task. The saying “it takes a community to raise a child” encapsulates this concept, but it also raises a multitude of questions. How many individuals constitute a community? What defines the boundaries and characteristics of a community? Why is a community essential for raising a child? And what happens if a community is absent? These inquiries all attempt to quantify something that resides on the fringes of the intangible realm. Nonetheless, despite the difficulties involved, the pursuit of finding answers to these questions remains a worthwhile endeavor.

    However, in order to address those questions, it is necessary to paint a broader and more vivid picture that captures the stark contrasts. The specifics of answering these questions demand a shared comprehension of concepts, ideas, traditions, history, and even our instinctual, animal-like reactions to various stimuli and systems. By encompassing these elements, we can gain a deeper understanding of the complexities involved in the dynamics of community and its impact on personal development.

Thus, the creation of the “Great Society” had a detrimental impact on the intangible value of communities, a value that is challenging to articulate. Despite that difficulty, the aim here is to lend a semblance of concreteness and clarity to a subject matter that is elusive and hard to define.

    The Hidden Price Tag

    The “Great Society” initiative imposed a highly liberalized and government-centered structure that influenced the mindset of federal employees, communities, and individuals to become overly dependent on government. While Franklin D. Roosevelt laid the foundation for welfare programs like social security and a few others, even after World War II, Americans did not perceive the government as their “Big Brother.” There was still a sense of unease regarding governmental interference or excessive involvement in everyday life. Therefore, Lyndon B. Johnson had to first sell a dream before implementing these programs. Consequently, it is important to examine the underlying themes of its creation and the resulting outcomes when evaluating the impact of the “Great Society.”

    When examining the ripple effects of a policy or initiative, the intentions behind it often hold little significance compared to tracing the actual causes and outcomes. As the saying goes, “the road to hell is paved with good intentions.” Therefore, it is crucial to consider the underlying ideas on which something is based and whether the resulting outcomes align with those ideas. This alignment can provide valuable insights into the true intentions beyond the use of eloquent language designed to garner unquestioning adherence and reverence, masking policies that ultimately benefited only a select few. It is akin to a poisonous sugar coating meant to attract the diligent worker ants and middle management, while the true implications remain obscured.

    To be fair, I have experienced both the role of a sheep and an ant. There have been instances where I have embodied the qualities of an ant, diligently working within certain contexts. However, the sheep-like tendencies within me have shed their wool, and I have sharpened my teeth, metaphorically speaking. This transformation has allowed me to approach things with a greater sense of purpose.

    Sheep and Ants

    The question that calls out from the page is “Why use Sheep and Ants, isn’t that dehumanizing?”

                Sheep and ants serve as analogies to illustrate the interplay of nature and nurture in the development of personalities. The comparison stems from the fact that both sheep and ants possess behavioral mechanisms that we also have, albeit with some differences. While ants lack frontal lobes like ours, their behavior is influenced by their environment. On the other hand, sheep do possess frontal cortexes, but their primary drive is the pursuit of security and a sense of safety.

    The prefrontal cortex sets us apart, granting us advanced capabilities and the potential for more meaningful societal interactions, among other things. However, this does not imply that the more primitive parts of our brain cannot be activated. These regions hold precedence in how most brains interpret information. When confronted with danger, fear, or disgust, these primal responses can take over, resembling the constant state of being for sheep and ants. Sheep and ants serve as powerful examples of the outcomes that arise when these responses dominate in individuals.

    Nevertheless, it is crucial for everyone to strive to rise above these innate aspects of our nature. While some individuals may be predisposed to succumb more readily, proper nurturing and a supportive community can mitigate these tendencies and teach individuals how to control their animal instincts. It is through this process that we can develop self-regulation and transcend our primal inclinations.

    Sheep exhibit a tendency to seek a source of authority that can guide them and provide direction. They are not particularly concerned about the identity of the authority figure, as long as they can follow their own way and find comfort within the safety of a larger structure. Their desires are modest, content with what they have, often reflecting the sentiment of “I love and trust the government.” Such individuals possess an inherent trust and a sense of security within the system they belong to, as long as their basic needs are met and physical threats are minimal. They may often express a desire for change, as the monotony of the same old grass can become tiresome to them.

    In contrast, ants exhibit unwavering loyalty to their queen. They possess a deep understanding of their roles within the colony and respond to authority with a devotion akin to religious fervor. However, this authority cannot be feigned or manipulated. At a subconscious level, ants possess the ability to discern genuine authority without being able to articulate it. This innate instinct is the reason why these creatures do not aspire to be more than what they are. They diligently carry out their intended tasks and find contentment and happiness in serving the hive. These ants represent individuals who are resistant to change, preferring stability. However, if change is to occur, they desire it to come from an authority figure they respect.

    It is rare to find individuals who solely embody the characteristics of either an ant or a sheep. Instead, most people tend to lean towards ant-like or sheep-like justifications depending on the topic, issue, or occasion at hand. Presently, there is a prevailing inclination towards sheep-like tendencies among many people, whereas in historical context, individuals were often shaped into embodying more ant-like qualities.

    This observation leads to the conclusion that being an ant is generally regarded as more respectable than being a sheep. The American dream, at its core, promotes autonomy and the freedom to exist within one’s community without being dictated by a shepherd or an alleged queen ant. Unfortunately, in contemporary American politics, many individuals on the right exhibit ant-like tendencies, while many on the left lean towards sheep-like tendencies.

    Throughout history, feudal systems and other rigid hierarchies have exploited both the ant-like and sheep-like mindsets prevalent among people. The sheep, in this context, are the ones who revolt when they encounter a new shepherd promising greener pastures. This revolt can occur due to the current shepherd’s incompetence, corruption, or simply because they perceive better opportunities elsewhere. On the other hand, the ants revolt when their leader fails to demonstrate competence or stops emitting the appropriate pheromones at the right frequency. These revolts among ants often arise from a desire for effective leadership and the maintenance of a well-functioning colony.

    While there is a highlighted division between these two groups, it is important to note that individuals or groups do not exclusively fall into one category. As mentioned earlier, most people react to different situations in contrasting ways, exhibiting traits of both ants and sheep depending on the circumstances.

    Moreover, some of the bloodiest revolts in history have occurred when a charismatic figure manages to unite people from both sides, overthrowing the old system. However, the aftermath of such revolutions presents a challenging picture. The new leader may attempt to establish a new order, but if the ants do not respect this person or if the new leader was simply a means to remove the previous incompetent leader, the sheep may not embrace the replacement. This can result in a tumultuous period where ants and sheep clash in their visions for the future. Achieving a balance and stability in such situations often takes decades or even centuries as the pendulum swings back and forth.

    SO why go into these weird examples of free individuals, sheep and ant mindsets?

    The argument presented here revolves around the American philosophy and ethos, which emphasizes a government built upon the principle of separation of powers. Each branch of government was designed to be a rival to the others, driven by its own self-interest, with the overarching goal of maintaining the nation’s integrity from its unique perspective. This philosophy aimed to transcend the false dichotomy of sheep and ants.

    However, the concept of the Great Society reintroduced a dual system that contradicted the intentions behind the Constitution and Bill of Rights. It established a more unified governmental body and allocated funds to support the roles of both sheep and ants within society. This approach encouraged and embraced the division between these two groups, recognizing that their differing perspectives and contributions provide the society with the variety necessary to propel the government towards its goals.

    A Unified Governmental Body

    The term “unified” in relation to the government can sometimes be seen as ironic, given the bureaucratic complexities involved. The Madisonian concept of the federal government was based on the idea that the executive, legislative, and judicial branches should act as checks and balances on one another, maintaining separation and conflict. However, in practice, there has been a shift in the dynamics.

The judicial branch, instead of strictly evaluating the constitutionality of matters, has often engaged in addressing social issues and appeasing the executive and legislative branches. The legislative branch, reluctant to delve into the intricacies of policymaking, often passes broad legislation, granting significant power to administrative agencies. These agencies, represented by boards and committees, essentially interpret the scope and intent of congressional bills, allowing for substantial influence over the laws they are responsible for enforcing. This concern, that Congress hands much of its lawmaking authority to agencies, is commonly discussed under the non-delegation doctrine.

    Overall, the original intention of the Madisonian framework for the separation of powers has undergone significant changes, with the executive agencies, legislative branch, and judiciary interacting in ways that were not initially envisioned.

    All the aforementioned issues can be seen as challenges to the principle of separation of powers. The increased use of executive orders by presidents has blurred the line between legislative and executive actions, creating concerns about the concentration of power. This trend moves the presidency closer to resembling a monarch or autocrat rather than a legal enforcer of the executive branch’s responsibilities.

While the president does hold the role of commander-in-chief of the military, this primarily pertains to matters beyond the continental states, serving as a practical necessity rather than reflecting the spirit of the position. However, since the time of FDR, the presidency has gradually shifted from its executive nature toward actions more akin to those of a king, emperor, or Caesar. This transformation raises concerns about the expanding powers of the presidency and its resemblance to authoritarian leadership rather than a purely executive role.

    Indeed, there are additional social, political, and corruption issues associated with the expansion of executive powers, particularly through the proliferation of enforcement agencies. These concerns can raise alarms for individuals who oppose authoritarian regimes, and debates about these matters often involve discussions among libertarians and proponents of limited government. However, shifting the focus to the “Society” rather than the government itself, it is worth noting that the role of the commander-in-chief serves as a transition to the main topic at hand.

    Government Funded Roles

    The founding of this country was rooted in a strong opposition to aristocracies, where privileges and special status were inherited through lineage, whether it be in the form of titles or surnames. However, it is important to recognize that aristocracies can take various forms beyond mere titles. They can be perpetuated through rigid scholastic customs, religious beliefs, cultural practices, and traditions. In American history, we can observe that these forms of aristocracy have often been transmitted through religious affiliations. In fact, non-Protestant presidents faced scrutiny and skepticism until relatively recently in the 21st century. This highlights how deeply ingrained religious traditions and biases have historically influenced perceptions of leadership and eligibility for public office.

    Indeed, in a more secular context, the transmission of privilege and influence has occurred through wealth, industry, and social connections, rather than solely based on one’s surname or heritage. It is important to acknowledge that aristocracies thrive on maintaining distinct roles for individuals to fulfill and care for, as their reputation and position within society hold significant value.

    The perpetuation of aristocratic systems often relies on the preservation of these roles and the expectations associated with them. This includes the preservation of wealth and social status, as well as the cultivation of a certain image and reputation within the elite circles.

    Various systems of governance do involve the passing on of status between families, but the degree of mobility within these systems can vary. In an aristocracy, there is often a lack of social mobility for certain groups based on their birth or membership in those privileged circles. In feudal systems, it was indeed challenging for individuals of lower social status to change the class into which they were born. While it was difficult, it was not entirely impossible.

    It’s important to note that aristocracy does not necessarily imply a rigid and unchanging social structure. In some cases, exceptional individuals from lower classes could perform heroic acts or demonstrate remarkable talents, which occasionally led to their being rewarded with knighthood, titles, or even lands. However, such instances were rare, even though they were not completely unheard of.

    Indeed, one of the issues with feudal systems was that the entrenched aristocracies often made it difficult to replace incompetent leaders, as power and land were concentrated within their ranks. However, in modern Western societies, the ideal is to establish a meritocracy where status and positions are based on individual merit rather than inherited privileges.

    A key aspect of a meritocratic system is the independence of land, property, banking, and the treasury from the government. This separation is essential because in feudal systems, the lands and resources belonged to the monarch, and the titles and positions were granted in exchange for loyalty and service. By separating these entities from the government, a meritocracy aims to ensure that individuals can rise or fall based on their own abilities and achievements, rather than relying solely on inherited wealth or connections.

In a true meritocracy, opportunities should be available to all individuals regardless of their social background, allowing them to compete on a level playing field and be rewarded based on their own talents, skills, and hard work. However, government-provided subsidies degrade a merit-based system, since the merit of a subsidy is decided by the current opinions of elected officials and by what those officials expect in exchange for getting an industry subsidized.

Indeed, many feudal systems were corrupt, but the king would delegate authority and land ownership to different ranks of nobility, creating a hierarchical structure within the society. Each rank, from duke to marquess, earl, viscount, and baron, had its own lands and responsibilities. Nobles were motivated to cultivate their lands and generate revenue not only for themselves but also for the higher-ranking nobles to whom they owed allegiance. This system of land ownership and loyalty created a chain of command and ensured a flow of resources and taxes up to the king.

    While there might have been some meritocracy within the peerage, where individuals could rise in rank based on their achievements and service, ultimately, all nobles were indebted to the king. Loyalty to the monarch was paramount, and they were expected to prioritize the king’s interests above their own. Criticizing or speaking ill of the king was generally considered taboo, and their actions and efforts were aimed at furthering the king’s agenda and maintaining the stability of the realm. So, in this feudal system, although there might have been some elements of meritocracy within the ranks of the nobility, the ultimate power and ownership rested with the king, and their actions and duties were ultimately in service to the monarchy.

    The “Great Society” in the USA indeed led to a significant expansion of the government’s role in various aspects of society. It resulted in increased reliance on government subsidies, funding, and assistance across different industries and among certain groups of people. This shift represented a departure from the earlier notion that individuals primarily needed protection from external threats and basic law enforcement.

With the implementation of various programs and policies, the government became more involved in providing financial support, welfare services, and regulatory oversight in different sectors. This brought back the feudalism and clientelism of old that all great societies relied on to achieve their larger goal of being the Pax. It is especially evident now that almost every industry has become highly dependent on government subsidies and funding, while certain segments of society find themselves trapped in a cycle of reliance on government assistance, struggling to break free from the “government handout gap.”

WHAT IS THE GOVERNMENT HANDOUT GAP?

The trap of the government handout gap arises from the vast disparity between the eligibility criteria for government assistance and the level of self-sufficiency required to break free from it. This creates a strong incentive to remain reliant on government support, as escaping the coverage gap between subsidized and market-based options demands two to three times the effort. Take the example of an egg farmer: while an independent farmer must produce 20,000 tons of eggs, a farmer receiving government subsidies is limited to 7,000 tons due to government-mandated price controls. However, for a farmer seeking to reenter the regular market, the process can be prolonged and prohibitively expensive. They would need to acquire additional chickens, equipment, and resources necessary for egg farming, effectively leaving many farmers trapped by the subsidy. Similar dynamics can be observed in government-subsidized housing and welfare programs, where the significant disparity between benefits received and the threshold for self-sufficiency discourages mobility.
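    To put rough numbers on that cliff, here is a minimal sketch of the egg farmer’s arithmetic. The supported price, market price, and re-entry cost below are hypothetical assumptions chosen only for illustration; none of them come from an actual subsidy program.

```python
# A rough sketch of the handout-gap arithmetic using hypothetical numbers.
# The supported price, market price, and re-entry cost are illustrative
# assumptions, not figures from any real program or dataset.

SUBSIDIZED_TONS = 7_000      # output cap under the subsidy, from the example above
SUPPORTED_PRICE = 2_000      # dollars per ton at the supported price (assumed)
MARKET_PRICE = 900           # dollars per ton on the open market (assumed)
REENTRY_COST = 4_000_000     # one-time cost of extra chickens and equipment (assumed)

subsidized_income = SUBSIDIZED_TONS * SUPPORTED_PRICE

# Output an independent farmer must sell just to match the subsidized income.
breakeven_tons = subsidized_income / MARKET_PRICE

# Output needed in the first year back on the market, once the re-entry cost
# of new chickens and equipment is added on top.
first_year_tons = (subsidized_income + REENTRY_COST) / MARKET_PRICE

print(f"Subsidized income:        ${subsidized_income:,.0f}")
print(f"Break-even market output: {breakeven_tons:,.0f} tons "
      f"({breakeven_tons / SUBSIDIZED_TONS:.1f}x the subsidized output)")
print(f"First-year output needed: {first_year_tons:,.0f} tons "
      f"({first_year_tons / SUBSIDIZED_TONS:.1f}x)")
```

    With these assumed figures, merely matching the subsidized income on the open market takes a bit over twice the subsidized output, and covering the re-entry costs pushes the first year toward three times, which is the shape of the trap described above.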

    Through these programs, many individuals have become dependent on the locality they were born into, akin to serfs tied to the land. While ants and sheep follow certain paths out of competency and security, in this context, people have started relying on handouts simply because they are available. This has effectively resurrected a modern version of the peasant or plebeian classes, with government assistance serving as their primary source of support.

One of the advantages political parties gain is that both sides now have automatic constituents who are obliged to vote a certain way in order to maintain their subsidies. Policies like the Economic Opportunity Act of 1964 laid the groundwork for corrupt aristocracies to exploit the power of the state, consolidating their positions based on patronage rather than merit or the value they bring to their communities. When the government realizes it can effectively buy its constituents, the aristocracy becomes a contest to see who can appear the fairest to their clientele.

    As we delve deeper into this topic, it becomes evident that natural hierarchies emerge in human societies. However, in proper systems, these hierarchies consist of two components: one that can be inherited and another that is based on merit and deservingness. In a balanced structure, even those born into high status can lose it all through imprudence and degeneration, while those with little or no initial advantages can ascend to great wealth or raise their descendants to a “wealthy” status. Therefore, status should be earned through individual effort and merit, but the value of legacy should not be disregarded solely based on its lack of popularity.

The Great Society, however, implemented a system of categorizing individuals into groups eligible for government assistance. Relying on that assistance can be likened to an addiction. It starts off as a free and seemingly helpful resource, but eventually it becomes a dependency that can dictate your actions and livelihood. If you don’t comply with the requirements or regulations tied to the assistance, you risk losing your business, livelihood, and all the hard work you’ve invested over the years. Without the constant subsidy, the model you’ve built for yourself becomes unsustainable, leaving you reliant on the government as a sort of “big brother” figure.

Each title can be seen as a chink in the armor of communities. Communities had already been weakened by letting secularism in and by the splintering of extended families into nuclear ones, which made the infusion of these programs seem acceptable rather than as corrosive as it is.

Take the Job Corps as an example. When a group of people or a community becomes reliant on the government to provide them with job opportunities or job training, it indicates a failure in the traditional roles of parents and the community. It is the responsibility of parents and the community to impart skills that can lead to employment. Even if these skills are as basic as digging a 6-by-6 hole or chopping down a tree, they still serve as a foundation. With a developed skill, individuals can leverage it to acquire additional skills. Mastering one skill equips individuals with the tools and mindset needed to learn and excel in other areas.

The establishment of the Job Corps had the potential to be beneficial if it focused on promoting community service and encouraging participation in meaningful activities that contribute to the country or local communities. However, it fell short of its potential. The need for such a program arose because communities had not effectively organized themselves to provide job education and skills training. This allowed the government to step in and assume a role that was traditionally the responsibility of parents, community members, and religious leaders in shaping the youth.

The other programs of the Great Society aimed to provide assistance to the undereducated and impoverished, as well as subsidies and loans to farmers and agricultural workers. Banks, manufacturers, miners, and just about every other field became able to kiss the government ring for lumps of money or special privileges. Additionally, Medicare was introduced to cater to the healthcare needs of older individuals, solidifying their support as a voter base. While these programs were designed to offer help to those in need, they also created a system of dependency. Individuals who accepted these programs often found themselves trapped in a cycle of reliance, akin to peasants in a feudal system.

    However, it’s important to note that individuals still have the choice to reject these handouts and not trade their class mobility for a regular stipend. Though the decision may be challenging, there remains an opportunity for upward mobility. It requires a change in mindset, a shift in social circles, and the search for a supportive community that fosters self-reliance. By making these choices, individuals can navigate a path towards autonomy and independence.

    On Military Powers

    The Great Society drew inspiration from other historical societies and established a distinct class structure, assigning specific roles and responsibilities to individuals. In this system, there are no free benefits or handouts from the government that elevate one’s position fairly. Rather, everything provided by the state is contingent upon fulfilling the duties and expectations associated with one’s current station.

    As mentioned earlier, feudal systems operated on a hierarchical chain of command, where individuals with titles reported to higher-ranking officials. Similarly, many other great societies or nations throughout history implemented similar structures, creating hierarchies within their respective states.

    Every great society throughout history has been characterized by its military prowess. A successful military organization not only comprises various divisions with distinct roles, but even within those divisions, there are further subgroups with specific functions. This atomized structure ensures that each individual knows their role and understands the responsibilities of the person standing next to them. Through practice and training, these individuals develop a bond and the ability to move as a unified and cohesive unit. This level of unity is achieved through a deep understanding of their individual tasks and how they contribute to the overall mission.

    The remarkable aspect lies in the transformation of civilians into effective members of a squad, forming part of a platoon, which is a crucial component of a larger company working in coordination with other battalions. This process requires exceptional skill and strategic thinking. Generals and colonels play a pivotal role in delegating responsibilities downstream, but at every level of command, the leader is not only reliant on their own expertise but also responsible for ensuring the competence of the individual immediately below them, who leads those under their command. This chain of accountability extends throughout the hierarchy, with each leader diligently assessing the capabilities of their subordinates and expecting the same level of assessment to be carried out by every level of command.

    The intricacy of such systems lies in the effective transmission of knowledge, the elimination of ineffective practices, and a steadfast commitment to the established plan. Additionally, it is a remarkable achievement to bring together individuals from diverse backgrounds, potentially hailing from different states and following different faiths, and unite them towards a common objective. This phenomenon finds resonance in various team sports, making them valuable tools for fostering teamwork and instilling a sense of responsibility in both youth and adults alike.

    The essence of nation or society building lies in instilling a belief in something greater than oneself, where each person’s work contributes to the growth and greatness of the nation. While the military aspect may be more straightforward, as individuals are taught their roles and expected to follow orders, the challenge in both military and civilian contexts is convincing people to dedicate themselves to a cause worth sacrificing for. It requires effectively selling the dream to individuals and the various factions they belong to, demonstrating how their participation will lead to a better life for themselves and future generations. This involves appealing to their sense of honor, pride, and the opportunity to be part of the best nation that has ever existed.

    This sentiment can be observed in the Cold War era, where Americans held a deep fear and apprehension towards the USSR and communism. It was a time when the prevailing belief was that the values and way of life cherished by Americans were under threat. This fear played a significant role in shaping the collective consciousness of the nation throughout the 20th century.

    To establish a cohesive nation or society consisting of diverse individuals with varying backgrounds, beliefs, and lifestyles, there needs to be a system that accommodates different types of people. In these divisions, individuals are recognized and appreciated for the unique contributions they make to society as a whole. This is why terms like working class, middle class, blue collar, white collar, and others were coined. Additionally, it is essential to foster a shared understanding that hierarchies exist in the world. However, these hierarchies are not about one group being inherently superior or inferior, but rather about different individuals excelling in specific roles or areas of expertise.

    The unique aspect of the American concept of hierarchies, in contrast to many other “great societies,” is that it does not rely on strict hereditary aristocracy based on one’s bloodline. Instead, merit, ability, and competence play a significant role in determining one’s position within the hierarchy, depending on the field, skill, or realm they are engaged in.

    A valid criticism of traditional hierarchies is that they often lacked mobility, preventing the removal of elderly or unfit individuals from positions they were no longer suited for.

The founding fathers of this country deliberately avoided the concept of a “great society” for two primary reasons. Firstly, they intended for the individual states to have the freedom to govern their respective regions as they saw fit, without imposing a unified societal framework or moral compass apart from the fundamental principles outlined in the Bill of Rights. Unlike in other powerful nations, where local leaders were primarily accountable to their superiors in the capital rather than to the people they governed, local state leaders here were granted significant autonomy.

The second reason, which forms the core argument of this piece, is that the founding fathers recognized that all “great societies” inevitably require the sacrifice of individual freedoms in order to achieve their ambitious goals and aspirations. They understood that the pursuit of a collective vision often entails limitations on personal liberty and the consolidation of power in the hands of the governing authority.

    NOTHING IS FREE.

    From the very basic necessities of life, such as the air we breathe and the water we drink, to the more complex aspects of our existence, everything carries a trade-off. Even something as fundamental as breathing involves inhaling oxygen and exhaling carbon dioxide, which affects the balance of gases in the atmosphere. Similarly, the availability of clean drinking water often requires extensive processes like desalination, which have their own environmental implications. These examples demonstrate that even the most essential elements of our lives come with complexities and trade-offs that need to be considered.

    WHAT MAKES PEOPLE THINK THE GOVERNMENT IS NOT GETTING THEIR CUT!?!

The leaky bucket theory offers a useful framework for understanding this concept: every bureaucratic machine carries a cost, one many people do not realize exists, particularly with federal governments. Consider a scenario where the government collects some X amount of dollars in taxes. A significant portion of this revenue is allocated towards the operation of various agencies such as the IRS and other administrative bodies, which incur significant costs. Additionally, if there are programs in place to distribute funds, there are costs associated with overseeing and managing those programs, as well as compensating the individuals working at the ground level. It is also important to consider the potential externalities that may arise from the utilization of these funds. In this way, the leaky bucket theory highlights how resources allocated by the government can encounter various costs, both visible and hidden, as they flow through different channels and sectors of society.[1]
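    As a back-of-the-envelope illustration of that flow, here is a minimal sketch of the leaky bucket. The leakage rates are hypothetical assumptions chosen for the example; they are not taken from the cited source or from any budget data.

```python
# A minimal sketch of the leaky-bucket idea with hypothetical leakage rates.
# The rates below are illustrative assumptions, not estimates from the cited
# Reed College note or from any official figures.

taxes_collected = 1_000_000_000  # one billion dollars collected (assumed)

# Each stage skims a fraction of whatever is still in the bucket.
leakages = [
    ("tax collection / IRS overhead", 0.02),
    ("program administration",        0.10),
    ("ground-level delivery costs",   0.08),
    ("work and saving disincentives", 0.15),  # behavioral leakage, assumed
]

remaining = taxes_collected
for name, rate in leakages:
    lost = remaining * rate
    remaining -= lost
    print(f"{name:32s} -${lost:,.0f}")

print(f"\n{'reaches intended recipients':32s}  ${remaining:,.0f} "
      f"({remaining / taxes_collected:.0%} of what was collected)")
```

    Even with modest assumed rates at each stage, a noticeable share of the original tax dollar never reaches its intended recipients, which is the point of the analogy.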

    This brings us back to the analogy of ants and sheep. Ants understand the concept that nothing comes for free. They work diligently and know that they must earn everything they receive. Even if they didn’t personally forage for food, they contribute to the hive in other ways. On the other hand, sheep have an expectation of being taken care of simply because they belong to a group. They fail to realize that the shepherd’s actions are not solely driven by benevolence. While shearing their wool may benefit their health, the shepherd ultimately benefits by using the wool to create clothing. Sheep are nurtured and protected by the shepherd to maintain their value, as without this protection, they would be vulnerable to harm or even death. It’s important to note that this protection provided by the shepherd comes at a cost to the sheep.

    The government, if given the opportunity, will try to shape people into sheep. Social conditioning affects everyone, even those with a strong sense of individualism. Even the most diligent and hardworking individuals can be tempted to accept things they know they shouldn’t have if those things are readily available. They may even convince their fellow ants to do the same, just like the ant that carries back refined sugar to the hive, despite knowing its potential drawbacks.

    Industries.

The military-industrial complex thrives on subsidies and profits derived from war; its contractors are awarded medals and badges of honor based on securing lucrative contracts. The media, on the other hand, serves as the government’s public relations division. When it becomes challenging to justify the corrupt actions of the government, the media diverts attention by pitting different groups against each other or by highlighting unrelated issues in distant locations. For instance, when the Occupy Wall Street movement gained momentum, there was a sudden and significant increase in news coverage of racial issues, which has continued to escalate ever since.

    Many great societies throughout history were held together either by powerful leaders or by oppressive hierarchies where disobedience to the leader meant certain death. This historical context is important to consider when we hear the FBI and DHS labeling various groups they disapprove of as domestic terrorists.

    It is worth noting that most great societies faced significant challenges when their leaders passed away, when their leaders’ successors proved to be unfit or corrupt, or when regional leaders prioritized their own interests over the well-being of the society as a whole. These factors often led to the fragmentation or decline of the society.

    However, corruption has always been a concern within large governmental bodies, regardless of their form. In the absence of hereditary power, which once served as a standard for upholding a family name or risking its descent into oblivion under greedy lords, politicians now have different motivations to solidify their positions. Whether it be through amassing wealth, gaining status, seeking fame, or enacting laws that favor their interests, politicians recognize that they likely only have one opportunity to leave their mark. Consequently, they may resort to employing ruthless tactics to secure their positions and advance their agendas.

    The American political landscape appears to suffer from a lack of accountability stemming from the relatively short terms of elected officials. When leaders know that their time in power is limited, there is a temptation to make the most of it without considering the long-term consequences. However, accurately gauging the impact of policies over time is a challenging task.

    Each successive leader must contend with the policy mistakes and challenges left behind by their predecessors, making it difficult to fully assess the intentions behind past decisions. Even granting the benefit of the doubt to leaders like LBJ, it becomes apparent that their ambitious visions may have been driven, at least in part, by a sense of hubris.

    A religious education can be seen as a modern form of aristocracy. Throughout history, societies were often governed by nations, empires, or religious institutions that imposed their principles and beliefs on their people. These entities sought to expand their influence by conquering new territories and either enforcing strict adherence to their beliefs or collecting taxes for their sustenance. The success of these “great ideas” or empires was tied to their ability to push boundaries and engage in battles with those who opposed them. Additionally, they relied on hierarchical structures where individuals understood their rank, roles, class, and position in the social order.

    Thus, the concept of a great society necessitates individuals being aware of their social class or role, such as nobles, plebeians, workers, or senators. In many cases, these positions are relatively fixed, although there is still some degree of social mobility, particularly driven by economic incentives. However, when the government excessively protects industries like banks, car companies, technology firms, and pharmaceutical companies, these industries become intertwined with the government itself. Consequently, the titles and programs established within the framework of the great society, as devised by LBJ, serve as tools to perpetuate the existing power structures found in nation-states and empires that have existed globally for thousands of years.

    The intention behind these programs was not necessarily to uplift people out of poverty, but rather to maintain certain individuals or groups in their respective places. They give the illusion of change while subverting certain groups through an educational system that LBJ would not have subjected his own relatives to, as it propagates a secular indoctrination.

    The New Age of Aristocracies

    If we examine elected officials, it is noticeable that the majority, if not all, have had a significant religious upbringing. Whether they attended private schools, Sunday schools, or were exposed to religion in various ways, it has shaped their early experiences. While many may not actively practice their religion now, it is still consequential as they have adopted a secular approach while being influenced by the religious principles they were raised with. This influence is evident in how information is conveyed in the world, particularly in the United States. Religious precepts, such as the notion of the end of days, the distinction between good and bad actors, the identification of enemies, and adherence to moral imperatives, continue to shape the way information is presented and perceived.

    In the realm of American politics, one way to analyze the ongoing battles is by examining the concept of aristocracy. The right-wing tends to prioritize religious affiliation or competence in specific issues as qualities that should define elected officials. They argue that these attributes align with their vision of leadership. On the other hand, the left-wing focuses on identity markers such as race, gender, and sexual orientation, aiming to include historically marginalized groups within the ranks of power. This perspective stems from a discourse of oppressor versus oppressed, where the left seeks to empower the previously oppressed as a form of equity and fairness, challenging the traditional aristocracy.

However, it is important to recognize that government programs can inadvertently strengthen the aristocracy of religious groups. This is because religious individuals tend to have larger families and prioritize passing on their faith to their children. Conversely, if you belong to a group that perceives itself as oppressed, the education or belief system becomes solely focused on being anti-establishment. As a result, there is often a lack of substantive values or practices being transmitted to the next generation. While the left’s fight for social justice may gain momentum over time, their communities can face internal divisions as each new generation sees the previous generation as not being extreme enough. This can lead to the creation of terms such as TERF (trans-exclusionary radical feminist) or MAPs (minor-attracted persons) as each generation seeks to push the boundaries further and labels former allies as enemies.

This is why religious traditions often exhibit characteristics similar to aristocracies in this country. While financial wealth tends to dissipate over time, religious legacies have proven to be more enduring. The 3rd generation curse affects most secular or lazy families, causing their financial legacies to crumble. While various theories attempt to explain the persistence of religious groups, a more secular understanding is that religious values withstand the test of time by providing strict traditions and rules for established families and their lives. These religions have codified and canonized teachings on how to navigate the external world and resolve internal conflicts of the soul, spirit, and flesh. Virtually all notable religions encourage their followers not to trust their base desires and to prioritize long-term goals over immediate pleasures.

    However, even these achievable aristocracies, open to individuals from any walk of life, rely on communities that prioritize a specific set of values above all else. Without such a foundation, we risk seeing degenerate individuals in positions of power, who are essentially secular versions of the faiths they were raised in. These individuals may possess the ability to set aside distractions, but they still prioritize personal pleasure over true values. Even those who do not actively follow the rules or traditions they were raised with still possess a toolbox of principles that non-theists are attempting to construct for themselves through various ideologies based on “why not” isms.

In conclusion, it is essential to recognize ourselves as autonomous individuals who uphold the traditions of our ancestral culture. We should embrace the values that provide meaning to our genealogy, encompassing our past, present, and future. By actively working hard every day, we contribute to the strength and growth of the community that gives our lives purpose and significance. Rather than being defined solely as an ant or a sheep, a serf or a lord, or a soldier or a citizen, our true identity lies in being individuals who cherish and actively nurture the traditions and values that shape our sense of belonging.

    I embrace both the positive and negative aspects that have been mentioned, as they exist in different areas of my life. However, my commitment lies in striving every day to move closer to the ideal and to rise stronger each time I encounter setbacks. Even if it may seem unconventional, I believe it is worthwhile to express my beliefs passionately if it helps just one person to gain a new perspective and find their own enlightenment.


[1] https://www.reed.edu/economics/parker/201/cases/leaky.html: (1) Reductions in work effort, both by the rich, who have lower after-tax wages due to the tax, and by the poor, who now have additional non-labor income and who may be dissuaded from work if earning more disqualifies them from the transfer program. (2) Saving and investment may be discouraged by high tax rates on income, both because the incentives to accumulate wealth are reduced and because the wealthy typically save more of their income than the poor. (3) Socio-economic leakages due to the possible stigmatization of wealth accumulation, which may cause individuals to try less hard to be productive and get rich.

  • Miseducation: State run, and Neglected by Communities of Parents

    June 4th, 2023

Education has undergone significant changes throughout history. The traditional form of education, where individuals learned a trade from their parents or community members through apprenticeships, contrasts with the modern education system as we know it today. In the past, many children, including myself, had to contribute to their family’s livelihood by working alongside them. The concept of a carefree childhood filled with play, games, and leisure is a relatively recent development that emerged in the 20th century. Unlike today, the majority of children did not have the luxury of spending their days in classrooms. Instead, they actively participated in supporting their families, communities, or employers by assisting with labor-intensive tasks, such as harvesting in the summer the crops that had been cultivated in the spring. By understanding the historical context of education and the different roles children played, we gain a deeper appreciation for the evolution of educational systems and the privileges we enjoy today.[1]

    In contrast to the traditional approach of learning a trade within one’s immediate surroundings, the modern education system is accurately criticized for its resemblance to a factory. The current model for public schools was initially established during the Victorian era, with the intention of preparing students for future work in factories or assembly lines. However, it is important to acknowledge that the educational model has undergone minimal adjustments since then, which raises the question of why this is the case.

    The static nature of the education system should prompt us to reflect on the reasons behind its lack of significant changes. It is crucial to consider various factors that contribute to the system’s resistance to change. Some possible explanations may include a combination of institutional inertia, bureaucratic processes, societal expectations, and the complex nature of implementing comprehensive reforms in a large-scale educational system.

    Critics argue that the factory-like structure of education fails to address the diverse needs and potential of individual students. The one-size-fits-all approach may not adequately cater to the unique talents, interests, and learning styles of each student. As our society evolves, and with the advancement of technology, it becomes increasingly important to reassess and adapt educational systems to better prepare students for the challenges and opportunities of the future.

Examining the reasons behind the current model of education can lead to insightful discussions and potential avenues for reform. By critically evaluating the system, the question arises as to why it has not been improved over the years.

    I contend that education was originally not intended for the masses but rather served a different purpose. It can be argued that the average person today possesses an abundance of knowledge on topics that hold little significance to their lives or areas where they have agency. Consequently, this surplus of information becomes a distraction from focusing on meaningful aspects that individuals can actively engage with and effect change upon. In our contemporary society, there is an ongoing battle for attention on a grand scale, with billions of dollars invested in capturing people’s focus.

    The proliferation of information and the constant influx of content vying for our attention have created a dynamic in which individuals are bombarded with trivial or irrelevant knowledge. This inundation can divert people’s energy and resources away from areas where they have the power to make a tangible impact. As a result, valuable time and effort may be wasted on absorbing information that does not contribute significantly to personal growth or the betterment of society.

    In this context, it becomes imperative to critically evaluate the purpose and impact of education in today’s world. Are we equipping individuals with the necessary skills and knowledge to navigate the complexities of life and contribute meaningfully to their communities? Or are we inadvertently perpetuating a system that prioritizes superficial knowledge over practical application and genuine growth beyond that of a large GDP?

Recognizing the role of attention as a valuable and finite resource is essential. Various entities, such as corporations, advertisers, and media outlets, compete for people’s attention because it holds immense economic and social value. Consequently, individuals must actively discern which information and pursuits are worthy of their attention and align with their goals and values. This is one of the most valuable skills one can ever learn, yet it is no longer being taught in schools.

    By questioning the purpose of education and the forces that shape our attention, we can foster a more critical and discerning approach to learning. This can empower individuals to focus on areas where they can make a real difference, contribute to meaningful causes, and participate in the ongoing battles that shape our society.

Before I delve into the controversial statement that “education was never meant for the masses,” let me clarify my perspective. The term “education” has evolved to encompass various meanings, including the acquisition of knowledge and skills, as well as the pursuit of specific subjects or areas of study. In contemporary usage, education often revolves around the dissemination of information and the expectation that students will memorize and reproduce it in a prescribed manner. However, it is important to acknowledge that people often forget the specifics of what they were taught over time. It should be noted that certain subjects, such as mathematics and language skills, are cumulative, with each lesson building upon previous concepts.

    It is crucial to recognize that when I suggest education was not intended for the masses, I am referring to the mass production model of education that prioritizes rote memorization and conformity alongside standardized tests. This model, influenced by the Industrial Revolution and the need to prepare individuals for factory-like settings, does not effectively cater to the diverse needs and aspirations of all persons. It teaches people how to be workers. While this mass production approach has its merits, it is limited in its ability to foster critical thinking, creativity, and adaptability—qualities that are increasingly valued in our rapidly changing world.[2]

In today’s information age, where knowledge is readily accessible, education should evolve to emphasize skills such as critical thinking, problem-solving, collaboration, and innovation. Rather than focusing solely on the retention of specific information, education should empower individuals to analyze, evaluate, and apply knowledge to real-world situations. This shift in emphasis acknowledges that the ability to adapt, learn independently, and think critically is often more valuable than memorizing facts. However, skills like these are usually imparted through meticulous instruction and are best cultivated as a practice passed down within a community.

    Furthermore, it is essential to recognize that education is not solely confined to formal classrooms and institutions. Learning occurs in various contexts, including personal experiences, interactions with others, and practical application of skills. The concept of lifelong learning acknowledges that education is a continuous process that extends beyond traditional schooling, and individuals should be encouraged to pursue knowledge and skills throughout their lives.

One of the terrible products of the rigid way public education is set up is that there are people who believe learning, expanding their knowledge, or reading books is no longer necessary for their development of self, because they finished the courses they were mandated to take.

    However, it is worth noting that the current education system, which has been in place for several decades, may not adequately teach children how to critically analyze and think about the information they receive. Instead, the focus often lies on memorization techniques that work best for each individual. It is not uncommon to hear the saying that “schools are not meant for learning but for socialization,” implying that given the limited amount of information covered in middle schools and high schools, most children could complete their high school education by the age of 12 or 13.

    While this statement may be provocative, it highlights the concern that the education system is not equipping students with the necessary critical thinking skills and the ability to evaluate and analyze information independently. The emphasis on rote memorization and adherence to specific formulas or methods can limit students’ capacity to engage with knowledge in a meaningful and intellectually stimulating manner.

    Education should ideally encompass a holistic life approach that nurtures not only academic knowledge but also fosters community, culture, curiosity, creativity, and problem-solving abilities. By encouraging students to think critically, ask questions, and engage in meaningful discussions, we can better equip them to navigate the complexities of the world and contribute to society in a more profound and meaningful way.

    Furthermore, the notion that schools primarily serve as socialization hubs emphasizes the importance of culture, community, interpersonal skills, collaboration, and the development of social competencies. Education should provide opportunities for students to interact, engage in group projects, and cultivate skills such as empathy, teamwork, and effective communication.

    Education is not solely achieved by passively absorbing information in a confined classroom setting. The true essence of education lies in the rich tapestry of personal conversations, interactions, and hands-on experiences. While diplomas and certificates may be perceived as symbols of education, they oversimplify its true nature.

Genuine learning occurs through meaningful engagement with others in one-on-one or group conversations. These discussions allow us to explore diverse perspectives, challenge our own assumptions, and deepen our understanding of the world. When we have conversations with people we love, admire, and respect, the information shared becomes more relevant and meaningful. We can connect new knowledge to our existing frameworks, observations, cultures, and experiences, allowing for a more comprehensive and holistic understanding that connects us with our communities.

    Education should not be confined to the walls of a classroom or reduced to the acquisition of degrees. It is an ongoing process of exploration, discovery, and growth that extends far beyond formal educational institutions and should be part of your everyday life. Whether engaging in discussions with mentors, participating in community activities, or pursuing personal passions, we gain valuable insights and practical skills that shape our understanding of the world.

    Recognizing the significance of personal interactions and experiential learning, we can create environments that encourage curiosity, collaboration, and active engagement. By fostering spaces for open dialogue and providing opportunities for hands-on experiences, we empower individuals to become active participants in their own education. This approach embraces the diversity of learning styles and encourages individuals to find connections between the information presented and their own lived experiences.

    When I mentioned that “education was never really meant for the masses,” I was highlighting a concern regarding the mass production of education. It is essential to recognize that education should not be reduced to a mere transactional process where knowledge is imparted without considering its relevance or value to individuals.

    In contemporary society, there is a prevalent issue where individuals accumulate vast amounts of knowledge without the ability to retain or apply it effectively. This situation often leads to feelings of frustration, resentment, or even financial burden, as many find themselves indebted for an educational experience that did not meet their expectations or provide them with tangible value.

    The concept of debt in education extends beyond financial loans. It also encompasses a relational debt within the civil contract between learners and educational institutions. In this contract, individuals invest their time, energy, and trust, expecting to receive an education that equips them with knowledge, skills, and opportunities they value.

    To address this issue, it is crucial to shift the focus from mass production to personalized and meaningful education. Each individual has unique interests, talents, and aspirations that should guide their educational journey. By emphasizing community-based, personalized learning experiences and fostering a sense of individual agency, we can empower learners to pursue knowledge and skills that align with their cultures and who they are.

    Furthermore, it is important to prioritize the quality and relevance of education over quantity. Instead of pursuing education for the sake of accumulating degrees or certificates, we should encourage a lifelong learning mindset that values continuous growth and personal development. This approach promotes a deeper understanding of subjects, encourages critical thinking, and fosters a sense of curiosity and intellectual exploration.

    In essence, the goal should be to provide individuals with an education that is meaningful, valuable, and aligned with their aspirations.

    THERE AIN’T NO FREE

    The analogy of companies providing free samples to create a positive association with their brand can be applied to the concept of publicly funded education. Just as receiving a free sample can generate a sense of gratitude and influence one’s perception of a product or brand, the provision of “free” education has shaped the perception that education is a necessity without which the world would come to a halt.[3] This phenomenon can be seen as a form of indoctrination that has influenced the mindset of the past two generations.

    When individuals receive publicly funded education, regardless of their opinion about the quality of their school, there is often a sense of indebtedness or obligation associated with it. This perception of education being a “free” product provided by the government creates a narrative that reinforces its importance and indispensability.

    However, it is crucial to critically examine the impact and value of this education. While the intention behind publicly funded education is to provide equal opportunities for all individuals, it is essential to assess whether it is truly meeting the needs and aspirations of students. This evaluation should go beyond the notion of education as a commodity or a free product and focus on its ability to empower individuals, foster critical thinking skills, and prepare them for the challenges of the real world.

    By recognizing the potential indoctrinatory aspects of the “free sample” mentality in education, we can encourage a more nuanced and comprehensive perspective. This includes considering alternative educational approaches, such as community-based personalized learning, vocational training, or apprenticeships, that cater to students’ unique learning styles and individual interests. It is important to view education as a means to develop communities, critical thinking, curiosity, and a passion for lifelong learning, rather than simply an institutionally provided “free” product.

    By examining the parallels between the provision of free samples by companies and publicly funded education, we can gain insights into the potential influence and perceptions surrounding education. It is crucial to foster a critical mindset, promote alternative educational approaches, and prioritize the true value and impact of education on individuals and society as a whole.

    I am not suggesting that education itself is inherently negative. However, in its present state, it often resembles a bureaucratic hydra with dual functions. Firstly, it fails to adequately teach the fundamentals of English and math to a majority of students. The best students typically rely on self-teaching, seeking assistance from others, or having the means to afford private tutoring. Secondly, education also operates as a state-run propaganda machine, particularly in the realm of humanities. These subjects have become narrowly tailored to dictate what is considered right and wrong, but not from a values perspective. Instead, they tend to focus on what is deemed to be socially acceptable or what feels right according to prevailing societal norms.

    The purpose of the humanities has never been to provide definitive answers; rather, they are meant to raise thought-provoking questions that allow individuals to form their own conclusions. Classical literature, for example, does not aim to dictate what is right or wrong. Instead, it explores the intricacies of existence and sheds light on the complexities of the societies we inhabit. These narratives also emphasize the notion of personal agency, highlighting that we have the freedom to approach challenging social issues in various ways.

    However, when equity and racial discourse are introduced into the classroom without room for nuance, discussion, or dissent, it can devolve into a form of secular indoctrination. The problem arises when these topics are presented as absolute truths without allowing for critical thinking or exploration of alternative perspectives. True education should encourage open dialogue, facilitate the examination of multiple viewpoints, and foster a deeper understanding of complex issues, rather than promoting a one-sided narrative.

    Moreover, it is immensely valuable for individuals to explore and gain knowledge about their personal ancestry and the value systems of their cultural and ethnic backgrounds, even if they come from diverse racial, national, or religious origins. This understanding provides a sense of purpose and belonging, connecting them to a rich tradition that has endured through devastating wars, plagues, and oppressive regimes that sought to subjugate those who opposed them. It instills a profound sense of pride rooted in the achievements and resilience of one’s lineage, carrying with it the responsibility to continue advancing in the world. Embracing this knowledge and heritage can provide individuals with a solid foundation from which to navigate their own identities and make meaningful contributions to society.

    This notion brings to mind the old adage, “Are you brave enough to face the prospect of leaving no legacy behind and having made no significant impact during your lifetime?”

    However, in the midst of mass-produced education systems, we often overlook the importance of cultivating a collective national identity and narrative. This ethos and mythos, intended to unite a nation, ultimately falls under the influence of those who hold control over the education system, particularly the teachers. Consequently, the direction and perspective of those in power regarding the country’s origin, trajectory, and desired future can significantly impact the curriculum.

    Valid[4] arguments can be made for adjusting the curriculum to align with a particular vision of national identity. This could involve modifying educational materials to reflect a specific understanding of the country’s historical roots and the desired direction it should take. It becomes a matter of determining which perspectives and values should be emphasized, acknowledging that the choices made in shaping the curriculum have a profound impact on shaping the collective consciousness of the nation.

    As explored in my essays “Weapons of Distractions” and “The Tradition Ender,” the ongoing debates surrounding effective educational methods have existed for some time. However, in the past, these discussions were limited in their reach, lacking the means of widespread dissemination. Moreover, these debates often flourished in the shadows as families and communities became engrossed in what we now refer to as entertainment.

    Consequently, the crucial conversations that should have taken place within communities and between families, allowing for civic disagreement and understanding, were stifled. Additionally, the transfer of essential skills, values, and traditions that form the core of meaningful education were neglected. These are the timeless practices of sharing stories and lessons learned from our ancestors, a tradition that has guided us through all of antiquity.

    By succumbing to distractions, we inadvertently neglected the important intergenerational transmission of knowledge and wisdom. It is through these stories and the collective wisdom of our ancestors that we can truly impart the most meaningful aspects of education and shape a more cohesive and informed society.

    Imagine going back 200 years and informing someone that failing to teach their child according to government mandates would result in criminal consequences. They would undoubtedly perceive such a scenario as a form of tyrannical state indoctrination. Yet, over time, we have justified this encroachment on personal freedoms by asserting that education is essential for societal well-being. We have placed excessive value on our secular education system as the primary catalyst for creating a “Great Society.”

    However, it is important to acknowledge the broader context and consider the repercussions. In the subsequent piece, we will explore how the prevailing zeitgeist of the Great Society perpetuates the remnants of outdated systems, which erect formidable barriers that hinder proper community growth.

    It is crucial to distinguish between the necessity of imparting basic knowledge to children and the government’s monopoly on education. While there is a legitimate need for foundational knowledge to be disseminated, it is not synonymous with the government having exclusive control over education.

    By imposing a specific educational system on individuals who may not inherently value or resonate with that particular form of education, there are trade-offs involved. One consequence is that people have less time to share with their children the traditional knowledge and teachings that were once considered essential within their communities. The initiative taken by the state to intervene in education has weakened the communities that previously fostered education within their own realms. By accepting government funding, there has been a compromise where the valuable time that could have been dedicated to teaching and passing down community values was exchanged for adhering to the dictates of the secular government system.

    It is crucial to consider the impact of these choices and recognize the potential erosion of community-driven education and the loss of valuable traditions and wisdom that were once at the core of educational practices.

    It is important to note that the acceptance of subsidies and federal funding for education may not have been intended as a deliberate act to undermine communities. However, regardless of intent, the current consequences and outcomes are what truly matter.

    In light of this, those who oppose “school choice” reveal their lack of care for genuine community engagement and their absence of appreciation for the cultural diversity and traditions that have been eroded, leading to subpar literacy rates and a disregard for the preservation of people groups. If government funding of education is going to continue, no one should be limited in where they want to send their child to be educated.

    Meaningful Education

    Education holds a significant impact on our biology. Studies in mice have shown that experiences and traumas can alter the gene expression in their gametes, subsequently influencing the behavior and responses of their offspring. This suggests that children born after a traumatic event may exhibit predispositions to be affected by similar stimuli, while those born before may not have the same hardwired response. While human genetics are undoubtedly more intricate than that of mice, it is reasonable to assume that similar complexities exist in our genetic makeup.

    Considering the potential influence of positive and negative educational experiences on our gene expression, it becomes apparent that the learning experiences we provide for future generations can have profound effects on their genetic expression and subsequent behaviors. This understanding underscores the importance of cultivating enriching and nurturing educational environments that support the development of well-rounded individuals. By acknowledging the intricate relationship between education and one’s ancestral memories and knowledge passed through genes, we can approach learning and teaching with a greater appreciation for its profound impact on future generations.

    As beings with genes and developed prefrontal cortexes, our actions often stem from unconscious processes. Many of our hopes, fears, aspirations, and phobias are imprinted in our genetic makeup, shaped by thousands of years of evolutionary development and the impact of collective traumas. The nature versus nurture debate revolves around determining which factor, nature or nurture, has a greater influence on shaping individuals, rather than negating the significance of both in shaping who we are.

    Both nature and nurture play substantial roles in our development, and the crux of the argument lies in assessing the relative impact of each. Our genetic predispositions, influenced by our ancestral heritage, interact with environmental factors and experiences to shape our personalities, behaviors, and beliefs. It is an intricate interplay between our inherent genetic traits and the nurturing environments in which we grow and learn.

    By recognizing the intricate balance between nature and nurture, we gain a deeper understanding of the complexities that contribute to shaping human beings. It allows us to appreciate the importance of creating supportive environments that encourage growth and provide opportunities for individuals to reach their full potential, while also acknowledging the fundamental genetic underpinnings that contribute to who we are.

    I contend that an education disconnected from one’s specific background is inherently lacking compared to an education that is tailored to it. This disconnect is often accompanied by the belief that a secular lifestyle can provide a sense of meaning. However, secularism often borrows certain religious concepts, selectively embracing those that resonate while discarding others.

    Yet, a meaningful life cannot be approached as a simple algebraic equation, where one methodically solves for a desired variable while disregarding everything else. What secularism dismisses as mere bathwater is often the very essence that sustains the foundational principles of society.

    These discarded elements are the water that nourishes and preserves the seeds of society, providing the roots from which meaningful traditions and values emerge. By neglecting or dismissing these vital aspects, we risk losing the depth and richness that they contribute to our individual and collective existence.

    A truly meaningful education acknowledges the importance of preserving and understanding the cultural, historical, and philosophical foundations that underpin our diverse backgrounds. It recognizes the significance of the entire context rather than selectively embracing or rejecting specific elements. By doing so, we can cultivate a more holistic and enriched understanding of ourselves, our communities, and our shared humanity.

    “SO WHAT SHOULD EDUCATION BE?”

    Throughout history, the primary source of knowledge for individuals has been their immediate surroundings rather than the state or government. Communities played a vital role in shaping and imparting knowledge, with individuals learning from their families, neighbors, and close-knit social circles. The education received was often influenced by factors such as family background, temperament, and religious beliefs. In some cases, families hired renowned tutors to provide specialized instruction. Formal education, as we understand it today, was limited and less prevalent.

    It is important to emphasize that the concept of state-run education would have been viewed as a form of indoctrination or even reminiscent of reeducation camps. The notion that the government should dictate and control the educational process would have been met with skepticism and resistance in many historical contexts. The idea of education being centralized and standardized by a governing authority was not the norm, and the responsibility of education largely rested within the immediate community and family structures.

    Understanding this historical perspective highlights the significance of community-driven education and the potential concerns surrounding the concentration of educational power within the state. It encourages us to critically examine the balance between state involvement and the autonomy of local communities in shaping educational systems that best serve the needs and values of society.

    The cultural shift that occurred in the 1960s and continued into the 1970s resulted in the belief that public schools could provide all the necessary education for children. This shift may have been influenced in part by the distractions mentioned earlier. However, as time passed, these issues became more pronounced and problematic.

    During this period, parents increasingly embraced a secular worldview, assuming that the general education and skills provided by the public school system would be sufficient for their children’s development. Additionally, there was a widespread acceptance of the myth of the nuclear family, which further contributed to a distancing from the broader community that traditionally played a significant role in education.

    While subjects like social studies, mathematics, and writing are undoubtedly important, they alone serve as vessels without the deeper theological or teleological meaning that can infuse life with purpose. If one solely relies on secular subjects without exploring broader philosophical or spiritual dimensions, it becomes challenging to find meaning in the inherent struggles and complexities of existence.

    It is crucial to recognize that education extends beyond the acquisition of knowledge and skills. It encompasses the exploration of values, beliefs, and the search for purpose. By embracing a more holistic approach to education that integrates secular subjects with broader existential inquiries, individuals can navigate the challenges of life with a greater sense of meaning and fulfillment.

    Amidst these shifts, there has been an increasing focus on pleasures and distractions as sources of fulfillment. Communities, too, have drifted away from their traditional values, traditions, and faith, contributing to a sense of fragmentation. In an attempt to fill the void left by these eroding foundations, communities have often turned to extracurricular activities such as sports, arts, and academic performance as central focal points.

    While extracurricular activities can offer valuable experiences and opportunities for personal growth, relying solely on them as a replacement for deeper values and shared beliefs can be limiting. These activities, while enjoyable and engaging, often do not provide the same sense of cohesion and spiritual fulfillment that a strong community built on shared values and faith can offer.

    By placing excessive emphasis on extracurricular pursuits, communities may unintentionally overlook the importance of nurturing a sense of shared purpose and a shared moral compass. While these activities can enhance individual development, they should not overshadow the need for a strong foundation rooted in shared values, traditions, and faith. It is through these deeper connections that communities can truly thrive and provide a meaningful sense of belonging and purpose for their members.

    Extracurriculars Are Not Value-Driven (or Virtue-Driven); They Are a Segue for Professional and Social Development.

    While extracurricular activities within the school system offer numerous benefits such as entertainment, team-building, discipline-building, and overall personal development, it is important to recognize that they alone do not provide ultimate meaning in life. These activities serve as hobbies that impart valuable lessons and skills, which can be applied to various aspects of life. They provide an outlet for individual expression and may even lead some individuals to pursue them professionally.

    However, it is essential to distinguish between activities that help sustain our livelihood and those that serve as long-lasting sources of meaning. While certain pursuits may be instrumental in putting food on the table or serving practical purposes, they may not provide profound existential significance to the everyday experiences of life.

    Meaning in life often arises from deeper connections, shared values, and a sense of purpose that goes beyond individual pursuits or hobbies. It encompasses relationships, personal growth, contribution to the greater good, and a search for understanding our place in the world. While extracurricular activities can be enjoyable and contribute to personal development, they should be viewed as complementary to, rather than substitutes for, the broader quest for meaning and fulfillment.

    Moreover, many individuals become captivated by the notion of living vicariously through their children. They strive to elevate their children to a higher social status than they themselves achieved at the same age, fulfilling the long-held aspirations that they had during their own youth. Additionally, for numerous parents, involvement in their children’s extracurricular activities becomes a platform for connecting with other parents, forming a semblance of community centered around supporting one another in nurturing these pursuits.

    While the desire to see one’s children succeed and thrive is natural, it is important to examine the motivations behind these aspirations. Placing excessive emphasis on external achievements and social status can inadvertently overshadow the deeper aspects of personal growth and character development. The pursuit of extracurricular activities should not be solely driven by a desire to fulfill unfulfilled dreams or to seek validation through the accomplishments of one’s children.

    Instead, it is crucial to foster a balanced approach that values personal growth, genuine connections, and the development of well-rounded individuals. Extracurricular activities can be a means to cultivate important life skills, promote teamwork, and encourage individual passions. However, the focus should always be on nurturing the whole person rather than using these activities as a vehicle for personal fulfillment or social status.

    However, it is important to recognize that childhood experiences and friendships formed during that time have their limitations. While these experiences may have been formative, they do not necessarily provide a deep sense of community. High school friendships hold a certain charm as they remind us of simpler times, but with the advent of technology and changing social dynamics, fewer people solely associate their lives with their high school experiences. Unless these relationships were built on mutual support and a drive for personal growth, they often fade away.

    Friendships can wane due to a lack of growth or when individuals find themselves in different life stages. Even if a friendship was initially based on pushing each other to grow, shared geographic location, common hobbies, or attending the same secular school, these factors alone do not provide a profound sense of meaning. They offer only a shallow connection without a solid foundation for pride and enduring legacy.

    Consequently, another source of communal disconnection and fragmentation arises, leaving people feeling detached and unfulfilled. It becomes evident that a more profound and lasting sense of community requires deeper shared values, collective purpose, and a commitment to nurturing meaningful connections that transcend superficial ties.

    Education should be centered around principles and a well-established canon. Understanding the principles that underpin different ideologies or belief systems allows for critical analysis and the ability to articulate counterarguments. It provides a foundation for comprehending different systems of thought and facilitates productive discussions.

    Many individuals, including some of my fellow college peers, often lack a deep understanding of their own beliefs. Their convictions may be driven more by emotions such as anger, entitlement, or anxiety rather than a rational and reasoned comprehension of their own positions. Consequently, they struggle to engage in constructive dialogue, resorting instead to ad hominem attacks, shaming, or guilt-tripping in an attempt to persuade others.

    By embracing a curriculum that focuses on community-oriented principles and a well-rounded canon, education can equip individuals with the necessary tools to think critically, articulate their beliefs effectively, and engage in respectful discourse. This approach encourages intellectual growth, fosters open-mindedness, and empowers individuals to engage in thoughtful and persuasive discussions based on a solid understanding of their own values and those of others.[5]


    [1] While I hold conflicting views on child labor, I recognize the importance of child labor laws and the intention behind safeguarding the well-being of children, allowing them to enjoy their youth. However, it is crucial to acknowledge that when children are not engaged in work, including having a summer job during their teenage years, there can be unintended consequences such as a lack of understanding about the value and cost of things. This can be seen as a form of miseducation.

    When children are provided with expensive and distracting technology without having to undertake meaningful tasks like chores, academic pursuits, or athletic achievements to earn it, they may develop a sense of entitlement. The object’s value becomes trivial to them as they perceive it as something they deserve simply because it was readily available when they desired it, becoming an integral part of their daily life. It is important to note that this issue primarily stems from parenting styles and the education provided at home, rather than state-run education systems.

    While there may be varying opinions on child labor and the impact of not engaging in work during childhood, it is essential to foster a balanced approach that instills a sense of responsibility, work ethic, and an understanding of the value of things in children. By providing opportunities for meaningful tasks and imparting a sense of earning and achievement, we can contribute to their overall development and prepare them for the realities of the world beyond their youth.

    [2] I have encountered individuals who attend highly prestigious universities and only dedicate a mere 10 hours of study to an entire course. Surprisingly, they never bother attending the classes, yet manage to show up for the final examination and receive a respectable B grade. One might initially perceive them as geniuses, and while some of them may indeed possess exceptional intelligence, what is more significant is their ability to manipulate the system. They have effectively demonstrated the futility of many classes. Moreover, they have highlighted how the habits of those who strive for A grades in every class may be misplaced, as achieving a decent grade can be attained with just a fraction of the effort. Consequently, individuals can utilize the remaining time to pursue other endeavors. It’s important to note that this attitude is not typically taught but rather learned through observation and experience.

    [3] There is, of course, the free-rider issue; however, free riders are a minority, and they usually still speak well of the experience until, and sometimes even after, they are cut off.

    [4] Valid vs. sound: know the difference, and learn the wordplay.

    [5] On a personal note:

    My journey with English language education has been challenging and unconventional. From second to ninth grade, I did not take any formal English classes, and even in ninth grade, I struggled and failed in English. In fact, there were times when graduating from high school seemed uncertain due to my English grades. However, everything changed when I enrolled in a community college and took two quarters of English writing; in this period I learned more about the skill of writing than I had prior to the age of 19.

    Community colleges recognized the critical importance of writing skills for student success and retention. Writing is a multifaceted skill that requires several abilities to converge towards a common goal: to effectively communicate ideas in a manner that is clear and comprehensible to readers. Growing up, I had exposure to reading in other languages while primarily speaking English, and I encountered primary texts in three additional languages, all of which conveyed meaning beyond a straightforward interpretation. As a result, my struggle lay in translating abstract concepts into concrete expression, and this challenge persists to some extent.

    However, throughout my journey, I have always known what I am trying to convey because I was taught the values and traditions of my people. This grounding in my cultural heritage has provided me with a sense of purpose and clarity of thought. While my linguistic journey may have presented obstacles, I have always possessed a deep understanding of what I aim to communicate, as it stems from the rich tapestry of my people’s values and traditions.

    Even during the phase when I rejected religion and embraced atheism, I couldn’t escape the deep-seated foundation that was ingrained in my very being. As I observed the chaos and shortcomings of the world, the things that irritated me and seemed hypocritical as a child pale in comparison to the flaws I see within academia and public education. It seems that each teacher or educational influencer (yes, I deliberately chose that cringy term) is solely concerned with their own achievements and how they are perceived in relation to their peers. Their value systems are based on popularity, the latest trends, what will get them noticed, published, or even earn them a promotion. They engage in empty praise and eagerly await their turn for reciprocal recognition.

    Regrettably, many of these individuals fail to embody the true essence of being a teacher. They simply regurgitate what is written in the textbooks, functioning more like social algorithms than educators. They are in lockstep with each other, devoid of any innovative or intellectually stimulating curriculum that challenges their students. While I understand that teachers are bound by mandates and guidelines, incorporating value-based and faith-inspired systems into their lessons at least demonstrates a belief in something beyond their own status in relation to their less-informed colleagues.

  • Weapons of Mass Distractions: The Technologies that Distracted Communities and their Communication Skills

    May 28th, 2023

    Attempting to comprehend the technological advancements that my 80-year-old grandmothers have witnessed throughout their lives is incredibly challenging. They grew up during a time when the introduction of color televisions was a groundbreaking development. They actively participated in the civil rights movement and witnessed the remarkable transition from very few households owning televisions to virtually everyone having large color flat-screens. In their youth, communicating with family across long distances primarily involved writing letters, followed by brief phone calls from their home landlines, which were limited by the cost of long-distance minutes. Now, with the advent of cellphones, they can easily reach out to loved ones and even video conference with anyone, regardless of location or time zone.

    It’s truly mind-boggling to imagine how my grandmothers managed to navigate their lives amidst the constant influx of new technologies. With each passing generation of advancements, the burdens that plagued our ancestors, such as the search for life’s meaning during challenging times, became less pressing. In fact, a significant amount of our ancestors’ waking hours was spent grappling with these existential dilemmas, which is why, throughout history, communities have played a vital role in providing support during difficult times: having people whom you could rely on was essential when faced with adversity. Additionally, these communities offered opportunities for sharing joyful moments, celebrating achievements, and finding solace during the most challenging times. The concept of “communities” is complex and multifaceted, and it will be explored further as we delve into this topic. For now, I will provide subtle hints that point towards its definition and significance.

    However, with the advent of technology, one of my grandmothers found it remarkably convenient to distance herself from family members she did not particularly get along with. She had access to resources that allowed her to selectively interact with only those family members she liked. This newfound ability to curate her social interactions was a significant change compared to the past. Now, let’s consider individuals who grew up during this transformative period but harbored a strong dislike for their family or community. They were no longer bound by circumstances beyond their control. With the aid of technology, they had the means to travel, relocate, pursue new job opportunities, and essentially shape their lives according to their own desires. This “freedom” allowed them to break away from unfavorable family or community dynamics, enabling them to establish new paths based on their newfound beliefs and aspirations. Persistence and dedication to their newfound principles played a crucial role in this process.

    The emergence of new technologies has introduced distractions that can divert people’s attention from the significance of family and community. These technological advancements provide individuals with a convenient way to avoid dealing with familial challenges or confronting situations they cannot control. As a result, the value placed on family and communal ties has been compromised, leading to a slow, but subtle, erosion of these important connections.

    The Advent of the Printing Press.

    In the 15th century, the invention of the printing press revolutionized the dissemination of information. Prior to its arrival, obtaining a book was a laborious process, often requiring commissioned scribes to meticulously transcribe texts. Even for religious works, the individuals who wrote them were sponsored, although not necessarily in the same profit-driven manner as today. These skilled professionals, known as scribes, approached their craft with an artistic touch. Crafting ink was a time-consuming task; feathers had to be carefully sharpened or sourced from specific birds; and parchment or paper required various plants or animal hides, all of which were needed to write and replicate texts word for word. Moreover, spelling lacked standardization, and words were often spelled phonetically, which made reading a much slower process.

    Prior to the invention of the printing press, books and even writing itself were relatively scarce commodities. Owning a book was a rare occurrence, often limited to cases where one inherited or received it as a special gift, such as for marriage. Common people had almost no access to books, and the privilege of possessing a library was primarily reserved for kings, emperors, universities, and major religious institutions. Additionally, the lack of centralized education further hindered widespread literacy, with many individuals having no reason or means to learn how to read beyond basic numeracy.

    Throughout history, the literate class predominantly consisted of deeply devout individuals within religious orders or clergy members. Scholars, too, were part of this educated elite, often originating from longstanding scholarly traditions or aristocratic backgrounds that allowed them the luxury of extensive and in-depth education before assuming leadership roles within their respective realms. These individuals were ordained or destined to follow a path in line with their profession or life’s calling, which entailed a commitment to intellectual pursuits. In essence, reading and literacy were primarily associated with the wealthy, high-born individuals, or those involved in religious affairs.

    The introduction of mass-produced papers and the dissemination of information through the printing press eliminated the need for specialized knowledge to interpret specific words or phrases within historical texts. This development had a profound impact on the accessibility of literature, including biblical texts, Shakespearean works, and the writings of ancient philosophers. As a result, the desire and ability to read became more attainable for a wider population. With the newfound availability of printed materials, even middle-class working individuals gained access to educational resources that could teach them how to read. While books remained relatively expensive, they were no longer completely out of reach for the average person. The increasing affordability and availability of books meant that individuals from various socioeconomic backgrounds could engage with written knowledge, expanding their intellectual horizons and fostering a greater literacy across the world.

    While other mediums, such as radio, television, and the desktop internet, emerged in attempts to supplant the dominance of printed materials, they were limited in their capabilities until the late 1990s. These alternative forms of communication struggled to match the vastness and scope of the printed word. They were constrained by their technological limitations, mobility, and had yet to fully realize their potential. The printed medium, with its tangible books, newspapers, pamphlets and broad distribution, remained a cornerstone of information dissemination and cultural expression for centuries. It fostered a rich literary tradition, allowing writers to convey their ideas, stories, and knowledge to an ever-expanding audience. Only with the advent of advanced digital technologies and the internet’s widespread accessibility did alternative mediums begin to catch up and challenge the supremacy of the printed word.

                Books possessed the remarkable ability to transcend distances and encapsulate entire worlds within their pages. As J.Z. Young astutely noted, “for the medieval type of brain… making true statements depended on fitting sensory experience with the symbols of religion.” Prior to widespread literacy, individuals primarily relied on their personal experiences and the narratives dictated by the dominant religious framework within their culture. However, the advent of books transformed the human brain. It enabled individuals to expand their perspectives beyond the confines of religious precepts and compare their own thoughts and experiences with those articulated by diverse authors through the written word. Reading provided a gateway to a broader understanding of the world, encompassing a myriad of viewpoints and knowledge that extended beyond religious boundaries. It allowed individuals to engage in critical thinking, questioning, and reflection, encouraging them to explore different perspectives and challenge their own beliefs.

    While radio, television, and the internet have their merits in terms of broadcasting information and entertainment, they lack the corporeality and direct engagement that books provide. The ability to hold a book in one’s hands, flip through its pages, and establish a personal connection with the text is an unparalleled experience for those who value the written word. In a world that was becoming increasingly digitized, the book remained a symbol of mobility and a steadfast companion for avid readers. Its unique portability allowed individuals to carry an entire world with them, empowering them to indulge in the pleasure of reading at their convenience and fostering a deep appreciation for literacy.

    THE COMMANDING VOICES FROM THE BOX: RADIO

    The preceding section should not be interpreted as a means to diminish the immense impact that radio, as a form of communication and as a disseminator of information, had on people in the 20th century. To the contrary, radio served as a unifying force that was previously inconceivable. While books allowed for thousands or millions of people to read a text and grasp its general essence or engage in discussions and debates over its message or purpose, the process of reading required individual time and interpretation, varying across different stages of people’s lives.

    In contrast, radio was not a recorded medium but a live, broadly accessible one, whether through car radios or plugged-in receivers. This meant that if you and others gathered around a stereo, you could collectively tune into a shared experience. It extended beyond just you or the people within arm’s reach; anyone who tuned into the station could listen to a sports event, a boxing match, the speeches of world leaders, or personalities providing news, commentary, entertainment, or insights.

    It is important to recognize and appreciate the significant role that radio played in shaping the cultural fabric of the 20th century. Its ability to transmit live events and connect people through shared listening experiences was a transformative phenomenon that united communities and brought the world closer together. Beyond the realm of books, radio played a crucial role in fostering a sense of cohesiveness among people. This was particularly significant because the events that were accessible for national or even regional consumption were limited in scope. Radio served as a unifying medium, bringing together moments of despair, unity, freedom, and entertainment into a single relatable zeitgeist.

    Radio provided a platform to disseminate news and information that impacted the entire country, such as the devastating events of Pearl Harbor. In times of adversity, radio became a rallying point, uniting people in their efforts to rebuild and move forward. Radio also brought people together in moments of triumph, achievement, and calls for civil and social change, like Martin Luther King Jr.’s “I Have a Dream” speech. It allowed the nation to take collective notice and to celebrate milestones and groundbreaking accomplishments.

    In essence, radio had the power to unify diverse worlds under a single wave of a signal. It created a shared understanding and a collective consciousness, enabling people from various backgrounds and regions to connect with a common zeitgeist. It played a pivotal role in shaping national identity and fostering a sense of belonging during pivotal moments in history. The radio brought a sense of immediacy, making it possible for individuals across vast distances to be plugged into the same experience and be part of a collective audience. Radio, with its ability to reach and engage a wide audience, transcended physical boundaries and facilitated a sense of togetherness. It united individuals across vast distances, creating a shared narrative that helped shape and define the cultural fabric of society.

                Indeed, radio, being a verbal form of communication, had the power to spark conversations rather than stifle them. However, the emergence of television and cable broadcasting was on the horizon. While television encompassed many of the same attributes as radio, with the added advantage of visual imagery, it couldn’t quite rival the inherent mobility of radios.

    Television revolutionized the way information and entertainment were delivered, providing a captivating visual experience that engaged viewers on a new level. It brought stories, news, and events to life with vivid pictures and dynamic audio. Yet, despite its visual appeal, television couldn’t match the portability and flexibility offered by radios.

    Over time, radios became easy to carry and move around, allowing people to tune in to their favorite programs and engage with content wherever they went.

    While televisions quickly became a centerpiece in many households, anchoring families to a specific location, radios continued to offer a versatile and portable means of communication. They accompanied people on their daily activities, becoming a constant companion that connected them to the world.

    The mobility of radios, coupled with their ability to stimulate conversations and foster a sense of community, made them a cherished medium. Even with the introduction of television and its visual allure, radios maintained their significance as a portable source of connection and communication that could be enjoyed anytime, anywhere.

    THE VISUAL WAR: TELEVISION

    There is a profound analysis to be made of the impact of television, examining its technological advancements, visual quality, and significance. Television represents an era in which video and recorded content transitioned from being a costly luxury to becoming a staple and a piece of furniture in living rooms across the nation, eventually reaching nearly universal adoption.

    The evolution of television reflects the remarkable journey of accessibility. Initially, owning a television was a privilege limited to those who could afford it. However, as time progressed, television sets became more affordable and accessible to a broader population. What was once considered a luxury item gradually transformed into a common household possession.

    Television not only changed the way we consumed visual content but also revolutionized entertainment and information dissemination. It brought the world into people’s homes, allowing them to witness historic events, enjoy captivating storytelling, and stay informed about the world’s happenings. Television became a centerpiece of family gatherings, providing shared experiences and sparking discussions. As technology advanced, television screens became thinner, colors became more vibrant, and the clarity of images improved significantly. These advancements further enhanced the viewing experience, captivating audiences and immersing them in a visual feast.

    When television first emerged, it was far from being considered a piece of furniture. In its early days, television sets were predominantly black and white and came with a hefty price tag. Only a select few could afford them, and even if you were fortunate enough to own one, it was typically a single unit rather than multiple sets scattered throughout the house.

    At its inception, television shared similarities with the radio in terms of its ability to connect people through shows and news broadcasts. However, unlike the radio, television introduced a visual element that added a new dimension to the viewing experience.

    During the early stages, the number of available television channels and shows was limited. There were no means of recording broadcasts until videocassette recorders arrived a couple of decades later, and even then, recording required manually setting up the machine while the program was live. As a result, the viewing experience was shaped by the finite amount of content, creating a sense of familiarity and shared experiences among viewers.

    The limited selection of shows also gave rise to the concept of primetime, when the most popular programs were scheduled to air. Since on-demand options were virtually nonexistent, viewers eagerly awaited their favorite shows during specific time slots, heightening the anticipation and importance of primetime television.

    The early era of television, characterized by black and white screens and limited programming, fostered a unique sense of connection among viewers. It brought people together around shared entertainment and news experiences, while also reflecting the technological constraints and evolving nature of the medium. Over time, as technology advanced and on-demand options became more prevalent, television would undergo significant transformations, reshaping the way we consume and interact with media.

    The Evolving Problem

    However, with the advent of television, many people became easily distracted during prime hours when the best shows or news anchors were on. This created a dilemma, as entire generations’ worth of cultural content was presented as a form of ritualistic commemoration during these coveted time slots.

    The transition to digital information delivery had a significant impact. While the printing press made reading more accessible, it is a skill that is typically learned and practiced from childhood. It requires focused attention, time, and the ability to retain information in one’s memory. Radio, on the other hand, offered an auditory experience that bypassed the manual process of reading and directly delivered the spoken word into one’s personal space. However, television tapped into a fundamental ability that most individuals possess: the capability to process complex visuals with our eyes.

    The eyes, being the second most intricate organ after the brain itself, have the remarkable capacity to interpret and make sense of the visual world around us. Television harnessed this innate ability, presenting information and entertainment through a medium that engaged our visual senses. It offered a rich sensory experience, combining moving images, colors, and visual storytelling.

    The introduction of television expanded the realm of communication by providing a powerful visual medium that surpassed the limitations of reading or listening, while also incorporating them. It allowed for the transmission of complex ideas, emotions, and narratives through captivating visuals. Yet a person never had to leave their home or go to a theater to experience this, as they previously had to.

    The fusion of sight and sound offered by television revolutionized the way we interacted with media, creating a new form of communication that appealed to the widest of audiences. As a result, television became an influential cultural force, shaping our perceptions, values, and understanding of the world.

    In the present day, when a child is seated in front of a television, their brain, fascination, and imagination are instantly captivated. There is a primal aspect that gets triggered by images, which is why art has always been revered and considered sacred. Throughout history, many religious groups have either worshiped or expressed disdain for images attempting to depict the divine. The presence of an image narrows our focus and invokes a unique response. For those who worship tangible objects, it represents a manifestation of the divine. On the other hand, those who oppose such practices argue that the divine cannot and should not be confined or disrespected through a physical representation, such as a sculpture.

    Television built upon the auditory aspect of radio and added the power of imagery, allowing hours to pass in the blink of an eye. However, for a long time, owning a television screen was an expensive endeavor. Whether because of their lack of portability, their cable connections, or their attached VHS players, televisions were confined to specific locations. As a result, radio and books still held their place in the world, offering alternative sources of information and entertainment. While television became a fixture in homes, radio found its niche in cars and during travel, while books remained a valuable resource for learning or as a means of entertainment when neither radio nor television were readily available.

    However, television’s initial constraints, both in terms of cost and availability, allowed other mediums to maintain their significance. Despite the immersive power of television, the enduring allure of radio and the timeless appeal of books persisted, providing alternative avenues for information, imagination, and storytelling.

    As technology advanced and televisions became more affordable and accessible, their presence expanded, eventually becoming a ubiquitous feature in households worldwide. This transformation reshaped the way we consume media and connect with the world. Television’s ability to instantly transport us to different places, share experiences, and transmit information has made it a cornerstone of modern society.

    However, it is important to recognize that through each of these mediums, people were often seeking distractions from the hardships of life, a practice commonly referred to today as “decompressing.” Distractions in and of themselves are not inherently negative; they can provide much-needed respite and entertainment. Yet, if these distractions begin to isolate individuals from the connections they once relied upon for a sense of purpose, self-identity, belonging, or community, then piece by piece, they erode the foundations that make communities distinct and cohesive entities.

    While technology and various forms of media offer convenient means of escape, it is crucial to maintain a balance between engaging with these distractions and nurturing the relationships and communal ties that foster our well-being and sense of belonging. It is when we prioritize the virtual world over authentic connections that we risk losing the richness of human interaction and the support systems that communities provide and necessitate.

    In our pursuit of decompressing, we must remain mindful of the importance of maintaining genuine human connections and actively engaging with our communities. These connections are the lifeblood of our shared experiences, mutual support, and collective growth. By striking a healthy balance between the allure of distractions and the nourishment derived from genuine human connection, we can preserve the essence of what makes communities thrive and endure in an increasingly digital age.

    It is important to acknowledge that technologies that facilitate connections to others and expand one’s sense of community are not inherently negative. In fact, they can be a positive outcome of technological advancements and the broader access to information and diverse perspectives. However, problems arise when community becomes defined solely by superficial factors such as race, economic status, or secular education.

    When we allow these external markers to define our communities, we risk perpetuating values that were never cornerstones of communities or of life itself. True community should be based on shared values, empathy, and a genuine desire to understand and support one another. It should transcend superficial differences and foster growth, respect, and unity among its members.

    The last two paragraphs raise an obvious question: what is wrong with building a community based on those factors?

    Superficial commonalities such as hobbies, appearance or activities do not necessarily imply shared values or a deep sense of community. Engaging in small talk or participating in a particular activity may provide initial connections, but true community is built on shared values, beliefs, religion, or worldview.

    When we encounter challenges or require support, it is often those who share our fundamental values and beliefs who are most likely to provide the assistance and understanding we need. These shared principles serve as a foundation for trust, empathy, and a sense of belonging within a community.

    While engaging in activities or having common interests can be enjoyable and provide a starting point for connections, it is essential to foster relationships based on shared values in order to create or maintain a genuine and enduring sense of community. By actively seeking out individuals who align with our core beliefs and principles, we can establish communities that go beyond superficial connections and cultivate meaningful relationships that support personal growth and well-being.

    Ultimately, it is the shared values, beliefs, and worldviews that bind individuals together and form the basis for strong, supportive communities.

    The proliferation of entertainment mediums like the printing press, radio, and TV undoubtedly brought about significant changes in society and communal dynamics. However, by the time the 1980s arrived, it became apparent that these distractions had contributed to the fragmented state of communities. Despite the presence of various sources of information and diverse opinions, people were often confined to a limited range of perspectives due to the restricted access to alternative sources.

    This limitation had an impact on both individuals seeking out unconventional ideas and “creators” attempting to disseminate such information. People with unique or unconventional viewpoints found it challenging to connect with like-minded individuals or access resources that could nurture and support their distinctive beliefs. Similarly, creators who aimed to share alternative perspectives were often hindered by the limited means available to reach a wider audience.

    The resulting landscape fostered a sense of isolation for those whose ideas deviated from the mainstream, as well as constrained the diversity of information and viewpoints available to the broader public. Communities became more polarized, as the potential for meaningful dialogue and understanding between different groups was diminished by people being distracted by visual screens.

    At the same time, the digital age has provided new avenues for individuals to explore differing perspectives, connect with like-minded communities, and share their ideas on a global scale. This shift has brought both opportunities and challenges, as the abundance of information now seems to lead to information overload and the proliferation of echo chambers.

    The Web and Its Sticky Fingers.

    Let’s take a step back and examine the earlier days of computers. In the 1960s, computers did exist, but they were significantly different from the compact and powerful devices we have today. They were large, occupying vast spaces, and their computing power was relatively limited compared to modern-day calculators.

    During that time, computer technology was primarily accessible to large organizations: corporations like IBM, academic researchers, and government agencies. These entities had computers that were used primarily within local networks, and the speed at which data could be processed and transferred was considerably slower than what we are accustomed to today.

    Additionally, using computers during that era required a certain level of expertise and the ability to write programs or applications for each specific use, the kind of software some people would later know as floppy disk programs. Unlike now, where we have a wealth of information readily available, computers at that time did not offer an abundance of pre-existing data or resources.

    The internet’s origins can be traced back to the late 1960s with the development of ARPANET, a network created by the U.S. Department of Defense for research and communication purposes. This precursor to the internet laid the foundations for global networking and information exchange, with an initial focus on facilitating information sharing and collaboration between researchers and academics. Over the subsequent decades, advancements in technology, increased computing power, and the evolution of internet infrastructure led to the widespread accessibility and availability of information that we experience today.

    The growth of the internet and technological advancements over time have played a crucial role in transforming computers into the powerful and accessible devices we rely on today.

    As home computers started to become more common and the internet entered its early stages, it faced significant challenges in competing with television. Navigating the internet during this time, from the late 1980s through the early 2000s, required a significant investment of time and effort to master. The learning curve was steep, and it took hours, if not weeks, to become proficient in using the available technology.

    Additionally, the internet had limited content compared to the vast array of programs and visually captivating graphics offered by television and movies. While there were some educational programs and basic applications available, they often lacked the engaging visuals and immersive experience that television provided.

    During the late 1990s and early 2000s, around the dotcom boom, computers and the internet were predominantly used by dedicated groups such as traders, gamers, and writers. Laptops, though becoming more accessible, were still relatively expensive, with limited processing power and short battery life. Being connected to the internet required a physical connection to a landline, which meant that for many people the phone line could not be used simultaneously.

    These factors contributed to the continued dominance of television as the preferred medium for entertainment and information consumption during that time. The internet and computers were primarily seen as tools for specific professional, or advanced leisure purposes rather than mainstream sources of entertainment and communication.

    As computer games started to emerge and become more accessible to the average person, the popularity of computers began to rise. People found enjoyment and entertainment in these games, and the accessibility of computer systems improved, making them easier to use. Messaging services like AOL, Yahoo, and others also played a significant role in increasing internet usage during this period.

    However, it’s important to note that prior to the emergence of platforms like Myspace, Google, and YouTube, internet consumption was still primarily a passing recreation for a relatively small number of people. The internet had yet to fully establish itself as a mainstream source of entertainment and information.

    Around the same time that these popular websites began to gain prominence, computers started to become more commonplace in middle-class households. It became increasingly common for households to have both a television and a computer, reflecting the growing popularity of computers as objects of entertainment or learning.

    This shift marked a significant turning point in the adoption of computers and the internet by a wider audience. With the rise of user-friendly platforms and the increasing availability of online content, the internet started to transform into a more engaging and immersive medium for entertainment, communication, and information dissemination. The combination of accessible computer systems and the emergence of popular online platforms laid the groundwork for the internet’s eventual transformation into an integral part of our daily lives.

    Then the internet introduced a new form of distraction, one characterized by an abundance of choice. Unlike radio and TV, where the available channels and programs were limited, the internet provided a broader range of options, albeit still more limited compared to what we have today. This shift allowed individuals to break free from the constraints of network schedules and broadcaster programming.

    With the internet, people gained access to the specific forms of entertainment they desired. Whether it was reading articles, watching videos, playing games, or engaging in various online activities, the internet provided a customizable experience. It opened up a world of unfettered sources and diverse content, all readily accessible and streamed directly into the mind.

    The internet’s ability to provide tailored experiences and direct access to a vast array of information and entertainment marked a significant shift in the nature of distractions. It allowed individuals to curate their own digital experiences and explore a multitude of interests, opening up new avenues for discovery, connection, and entertainment.

    The proliferation of message boards, diverse media, and easily accessible information further contributed to the erosion of traditional communities. People started to develop a false sense of community by connecting with others who shared similar interests. In the past, finding individuals who shared niche hobbies or passions was challenging and limited in scope. However, with the advent of the internet, these barriers began to crumble, allowing people to connect with like-minded individuals on a larger scale.

    The internet became a breeding ground for groups and faux communities centered around specific interests, allowing individuals to find and engage with others who shared their hobbies, passions, or niche pursuits. The floodgates of superficial connectivity were opened, and people could now tap into a seemingly endless network of individuals who understood and appreciated their interests.

    However, it was not until the widespread adoption of smartphones that the containment of these communities fully broke. Smartphones revolutionized the way people accessed the internet and consumed information. With the internet constantly at their fingertips, individuals could now connect with their online communities anytime and anywhere. The portability and convenience of smartphones enabled a seamless integration of online interactions into daily life, blurring the lines between virtual and physical communities.

    This shift in technology further exacerbated the fragmentation of traditional communities. While the internet offered a sense of belonging and connection through shared interests, the nature of these communities often lacked the depth and substance found in face-to-face interactions. The ease of online connection through smartphones led to a proliferation of virtual relationships, sometimes at the expense of meaningful real-world connections.

    In summary, the internet and its various platforms provided opportunities for individuals to find like-minded communities, but it was the widespread adoption of smartphones that intensified the breakdown of traditional communities, blurring the lines between virtual and physical interactions.

    Smartphones, Dumb Communication.

    Phones, in their essence, are derived from the power of the written letter. Letters carried a profound significance, conveying emotions, thoughts, and experiences in a way that was often ineffable. They held the ability to bridge the gap between individuals who were separated by distance, enabling them to understand and empathize with one another’s joys, sorrows, and the trials they faced in their absence.

    The act of writing a letter to someone you miss had a profound impact. It allowed you to articulate the complexities of your feelings, to share the weight of misfortunes that occurred in their absence, and to provide solace through the written word. The letter possessed a unique ability to preserve and transmit the essence of human connection, serving as a tangible representation of the bond between individuals.

    However, with the advent of modern technology, the traditional art of letter writing has “evolved”. The invention of telephones transformed the way we communicate, bringing real-time conversations to the forefront. While the immediacy of phone conversations provided a new level of connection and the ability to convey emotions through voice, they also brought about a different dynamic.

    With phone calls, the exchange of thoughts and experiences became more fluid, instantaneous, and ephemeral. The spoken word could convey nuances and emotions that were difficult to capture in writing, adding depth and immediacy to conversations. Yet, at the same time, the fleeting nature of phone calls diminished the sense of permanence that letters once held.

    In the digital age, text messages and instant messaging further revolutionized communication. Short, quick messages replaced the longer, heartfelt letters of the past. While this enabled more frequent and convenient exchanges, it also diminished the depth and richness of expression found in traditional letter writing.

    In essence, while phones and digital communication have brought us closer together in many ways, they have also changed the nature of our interactions. The letter, with its unique ability to convey deep emotions and connect souls across distances, has evolved into new forms of communication that prioritize immediacy and convenience. Though the essence of human connection remains, the way we express it has shifted in the face of advancing technology.

    The telephone, while not able to fully replicate the power of a written letter, played a significant role in bridging distances and fostering connections. Initially, telephone services were quite costly, making them inaccessible to many individuals. Even when people did have access to telephones, they were constrained by limitations, such as the number of minutes they could use or specific times when calls were more affordable to avoid excessive bills.

    Like television and the internet, telephones were primarily stationary during their early iterations. They were tethered to a specific location, restricting mobility and limiting the ability to communicate on the go. Additionally, the cost of mobile phones remained high for a considerable time, and their capabilities were constrained. Usage was often limited, with restrictions on minutes and texts.

    These factors slowed widespread adoption and hindered the potential of telephones to revolutionize communication. However, advancements in technology and changes in the telecommunications industry eventually led to increased mobility and affordability. Mobile phones became more accessible to a broader range of people, offering greater freedom to communicate from anywhere.

    As mobile phones evolved, they transformed into multifunctional devices that not only facilitated voice calls but also allowed for text messaging, internet access, and a wide range of applications. This newfound mobility and versatility revolutionized communication by enabling instant connections and convenient access to information.

    Despite their initial limitations and high costs, telephones gradually became indispensable tools for modern communication. They played a pivotal role in connecting people across distances, overcoming the barriers of time and space. While the early generations of telephones had their constraints, they laid the foundation for the transformative power of mobile communication that we experience today.

    The introduction of the iPhone in 2007 marked a new era in communication. While it was considered expensive at the time, especially when adjusted for inflation, it revolutionized the way we interacted with various media forms. Combining the capabilities of radio, television, telephones, and the internet into a single device, it offered unprecedented convenience and accessibility.

    In the early stages, some of these functionalities were limited by technological constraints and fell short of the dedicated mediums they replaced. However, the iPhone’s true breakthrough was its mobility. Unlike previous technologies, it allowed people to carry a diverse range of media and communication tools in their pockets, enabling connections and access to information from anywhere.

    In the years that followed, applications and websites emerged, connecting individuals across all corners of the internet. The barriers to disseminating information and content were shattered, as anyone could now become a sensation. Fame no longer required being propped up by large conglomerates; individuals themselves could captivate audiences through their personality or message. The power of amplification shifted to the people, as they could share, like, comment, or engage in ways that fueled the algorithms and expanded the reach of compelling content.

    This era of mobile devices and widespread internet connectivity opened up new avenues for self-expression, creativity, and community building. The democratization of information and the ability to connect with others on a global scale transformed the way we engage with media and shape our identities in the digital realm. The iPhone and its subsequent iterations paved the way for a dynamic and interconnected digital landscape that continues to evolve and shape our lives.

    The Curtain of Community Fell, and the Wall of Ignorant Miscommunication Was Raised.

    With the rise of mobile devices and the proliferation of online platforms, “marginalized” and previously voiceless factions found a platform to express their perspectives, share their stories, and assert their rights. These platforms became spaces where individuals could proudly proclaim their righteousness, express their grievances, and shed light on the mistreatment they had endured. The digital realm lifted the veil of ignorance that shielded many people from recognizing the diverse experiences and struggles within society.

    Furthermore, the internet shattered the illusion of community that many individuals had previously held. It exposed the limitations of their perceived connections and highlighted the fragmented nature of society. As people engaged with a wider range of voices and perspectives, they became aware of the complexity and diversity of human experiences. This newfound awareness challenged preconceived notions of unity and forced individuals to confront the realities of a world that both hurt and helped them in different ways. Yet many people simply ran back to their echo chambers.

    While this proliferation of voices and perspectives had its benefits in terms of empowering marginalized groups and promoting empathy, it also brought forth new challenges. The vastness and diversity of opinions often led to polarization and heightened conflicts, as different groups clashed in the digital arena. It became increasingly difficult to find common ground and foster meaningful dialogue amidst the cacophony of voices. It brought out the Paradox of Tolerance.

    Over the past half-century, faux communities centered around shared interests such as race, sports, the arts, acting, or other superficial connectors often faced external scrutiny and criticism. However, with the advent of online expression, individuals now have the ability to express themselves in unprecedented ways, which has shattered the imagined consensus within friendships and relationships. The reality of differing beliefs and opinions has disrupted previously held notions of unity and shared values.

    In genuine communities, discussions about diverse beliefs were once commonplace. These communities were rooted in shared values and faith, where even if individuals disagreed on specific matters, they still held a common core authority or power. Within these communities, disagreements could be handled with respect and a recognition that diverse interpretations of shared values exist. Members of the community could agree to disagree without taking it personally, knowing that there were knowledgeable individuals within the community who could provide guidance and resolution.

    However, the current landscape of online expression and the ability to connect with individuals from various backgrounds has highlighted the stark differences in beliefs. The power of interpretation and the absence of a common framework within digital communities have made it challenging to navigate disagreements constructively. The absence of shared values and the abundance of diverse perspectives often lead to personalization of disagreements and a lack of guidance in resolving them.

    In this devolving landscape, it is crucial to recognize the importance of genuine communities that are rooted in shared values and faith. These communities provide a foundation for respectful dialogue and the opportunity to learn from one another. While the digital realm offers the potential for connection and expression, it is essential to seek out spaces where shared values can guide discussions and differences can be addressed with understanding and humility.

    Due to the prevalence of distractions over the past century, such as television, radio, the internet, and now smartphones, people have developed habits of avoiding difficult conversations and lack the necessary skills to navigate them. The skills that were traditionally learned in childhood, through interactions with family and friends, have been neglected or are entirely absent.

    Instead, what has been practiced and reinforced are behaviors of decompressing, dissociating, and seeking escapism. When confronted with disagreements or heated emotions, many individuals have developed a habit of turning to technology as a crutch, rather than engaging in introspection or seeking resolution with the people involved. The convenience of technology allows for postponing or avoiding addressing these conflicts, perpetuating a cycle of unresolved issues.

    As a result, the crucial skills of self-reflection, conflict resolution, and finding common ground have been neglected. The reliance on technology as a means to avoid discomfort or challenging conversations has hindered personal growth and the development of effective communication skills.

    In order to address this issue, it is important for individuals to recognize the patterns of avoidance and actively work towards developing healthier coping mechanisms. This includes engaging in self-reflection, taking time to process emotions, and seeking constructive ways to address conflicts. By fostering a culture of open and honest communication, both online and offline, individuals can begin to rebuild the skills necessary to navigate difficult conversations and promote understanding and resolution.

    It is indeed ironic when people express surprise or dismay at the events of 2016, as it was a time when thoughts and opinions that were once quietly held found a more vocal expression, from all of the “crazy” sides. The advent of technology and social media platforms provided a platform for individuals to openly share their beliefs and perspectives. In this new era, previously unspoken or ignored viewpoints were brought to the forefront, forcing people to confront what they were truly intolerant of.

    While many advocate for values such as liberalism and love, the reality is that exposure to diverse opinions revealed the inherent biases and prejudices that existed within society. What was once left unspoken, brushed aside, or distracted from was now being thrust into the spotlight, demanding a response and prompting individuals to make choices.

    This shift in dynamics, where differing beliefs and ideologies were being openly expressed, led to polarization and a heightened sense of division among people. The clash between contrasting viewpoints often challenged individuals to confront their own biases and reassess their values. It became evident that true acceptance and understanding required more than mere words or superficial declarations.

    Expanding upon the previous essay on secularism, it becomes apparent that the erosion of traditions and values was not only influenced by the gradual shift in societal norms but was also accelerated by the pervasive use of technologies of distraction. This intrusion of technology permeated the attention spans of people across generations, including the youth, adults, and even the older generation who were once considered the guardians of tradition.

    While there is no definitive guidebook on parenting, the lessons imparted and the values instilled by parents greatly influence their children’s perception of what is considered appropriate and valuable. If parents prioritize their engagement with technology over the nurturing and guidance of their children, it is likely that the children will also adopt technology as a means of distraction, mirroring the behavior they observed in their parents.

    The allure of technology, with its endless array of distractions and entertainment, can easily overshadow the importance of fostering meaningful connections, engaging in deep conversations, and preserving cultural and traditional practices. The value systems that were once passed down through generations now face competition from the “why not” mentality that technology often promotes. This shift poses a significant challenge to the continuity of communities, religious groups, and cultural traditions.

    However, it is crucial for parents to recognize their role in shaping their children’s perspectives and behaviors. By consciously allocating quality time for interpersonal connections, open discussions, and emphasizing the significance of cultural heritage, parents can instill a sense of value and appreciation for traditions in their children. Balancing the use of technology with meaningful interactions can help guide the younger generation toward a more grounded and holistic understanding of the world.

    Ultimately, it is a collective responsibility to navigate the intersection of technology and tradition. By fostering an environment that values both the benefits of technological advancements and the preservation of cultural practices, we can strive to create a society that embraces progress while maintaining a sense of identity and tradition.

    Having said that, I hold no blame towards my parents, and I believe it is important for others to refrain from blaming their parents as well. In fact, I find myself in awe of previous generations and their ability to navigate a world that lacked the convenience of modern technology. Reflecting on my own life, I acknowledge that I have wasted precious time pursuing trivial matters. However, I am fortunate to have grown up with access to powerful technologies that were already well-established by the time I reached adulthood. I had the advantage of a religious upbringing that served as an anchor to navigate base desires and emotions, even during moments of foolishness. Still, I am amazed by how individuals from earlier generations, who witnessed the rapid transformation of technology before their eyes, managed to resist being completely consumed by it.

    If you have ever engaged in meaningful conversations with your grandparents or individuals over the age of 70, you may have a faint grasp of the remarkable technological shifts they have witnessed. However, it is important to consider that they experienced these advancements during their formative years, as each new technological feat emerged, gained popularity, and fundamentally altered the way information is consumed.

    It is an abstract concept to imagine growing up with the constant emergence and widespread adoption of groundbreaking technologies. The ability of previous generations to adapt and navigate through these transformative times is truly remarkable. They bore witness to the birth of innovations that shaped the world we live in today, with profound changes in the way we communicate, gather information, and experience the world around us. Their resilience and ability to maintain a sense of self amidst these rapid changes is worthy of admiration and serves as a testament to the strength of human adaptability.

    As we reflect on the advancements of technology and its impact on our lives, let us appreciate the struggles and triumphs of those who came before us. Let us learn from their experiences and find a balance between embracing technological progress and preserving the core values and traditions that ground us. By doing so, we can navigate the ever-evolving landscape with wisdom, resilience, and a deep appreciation for the remarkable journey of human innovation.

    Prior to the advent of smartphones, the impact of these technologies on communities was largely confined to the realm of entertainment. They influenced how people spent their leisure time: whether they were engaged in learning through books, connecting with a shared media experience through television or radio, or exploring the vast, uncharted territory of the internet. These technologies were initially perceived as tools for convenience, facilitating ease in various aspects of life. Little did we realize that they would also contribute to a world that struggles to find meaning, becoming entangled in distractions to the point where the boundaries between reality and illusion blur. Moreover, they have hindered our ability to effectively communicate with those who hold differing opinions.

    At the core of all societies lies the foundation of community. Superficial associations and fleeting connections cannot alleviate the sense of purposelessness or provide the resilience needed to endure through challenging times. In a world filled with pain and uncertainty, true community is essential. It offers support, shared values, and a sense of belonging that transcends shallow interactions. The distractions brought about by technology can often divert our attention away from cultivating meaningful connections and pursuing genuine purpose.

    In this digital age, it is crucial that we reexamine the role of technology in our lives. While it offers convenience and endless possibilities, we must not allow it to overshadow the importance of authentic human connections and the pursuit of meaningful experiences. By nurturing genuine community, engaging in open and empathetic dialogue, and rediscovering the value of deep, thoughtful communication, we can navigate the complexities of the modern world with a renewed sense of purpose and connection.

    The purpose of each generation should be to strive for improvement and draw wisdom from the lessons of the past. We should resist falling prey to the intricate pitfalls presented by technology. Just as each generation of lions becomes swifter, so too should each new generation embrace the challenge to adapt and develop the necessary skills and habits that enhance their chances of survival.

    While we cannot choose the circumstances into which we are born, we have agency over our will to survive and thrive. It is essential to cultivate the qualities and behaviors that have a compounding likelihood of achieving success. This requires a commitment to growth, resilience, and a willingness to learn from the experiences of those who came before us.

    In the face of ever-advancing technology, we must exercise discernment and navigate the complexities with intention. Instead of becoming ensnared in the traps set by digital distractions, we can harness the power of technology to enhance our lives and empower ourselves by strengthening our communities. By prioritizing critical thinking, adaptability, and the cultivation of meaningful connections, we can chart a course that not only benefits us individually but also contributes to the collective progress of society, and future generations.
