
Accountability in Cyberworld: A Corporate Legal Theory of Platform Corporations
Published in the McGill Pre-Law Review (2023)


New Governors, Old Accountabilities, Borrowed Sovereignty 
In 2014, Cambridge Analytica, a political and military consulting agency, illegally harvested data from 50 million Facebook users and devised software that would be used to predict and influence the 2016 US election and the UK’s Brexit vote. While Facebook dismissed claims that this negligence was simply willful ignorance (Cadwalladr and Graham-Harrison), that same year the company made over 26 billion dollars off the sale of user information to advertisers (Tankovska). Today, Facebook’s net worth exceeds the GDP of over 103 countries, effectively crystallizing it into a world power in its own right. Platform corporations hold a well of power that has not seen light in the private realm since the 17th century, when giants of industry like the East India Company ran rampant. The East India Company was governed internally by its shareholders, held limited liability over the consequences of its external governance of a fifth of the world’s population, and produced revenues greater than the whole of England (Robins 2003, 79). Similarly, Facebook’s administrators hold internal governance over the organizational complex of the company and govern a digital space encompassing over a third of the global population, while remaining accountable to none but their private interests and the law which misguidedly insulates those interests. What was once the right to administer justice over colonized lands, engage in war, and trade slaves (Robins 2006, 28) has morphed into the self-appointed legal capacity to invade privacy rights on a global scale, govern free speech rights (Klonick 1604), and destabilize governments and economies (Mozur). Kate Klonick describes these platform corporations as the “New Governors” of cyberspace, “private, self-regulating entities” that exercise their sovereign powers over a system characterized by its “lack of direct accountability to its users” (1603). What separates the corporations of old from these platform corporations is that their unaccountability to the public good was once a shortcoming of the state’s regulatory powers; today, they remain unaccountable as a “point of legal doctrine” (Ciepley 139). Liberal contractual theory dichotomizes the public and private spheres and attempts to assimilate corporations into a narrow definition of voluntary associations of shareholders (140) that ought to be afforded the same constitutional rights as partnerships or legal persons (Horwitz 74). The outcome is a misrepresentation of these corporations’ government-like array of powers (Ciepley 140) and of their dependence on the state to derive their corporate personhood (143). These corporations are treated under legislative schemes granting them the rights of private actors while they effectively behave like public entities with sovereign powers. While the inclination might be to simply shift their status to that of public organizations, subject to the same accountability to their users as state governments are to their constituents, this too seems inadequate. Corporations receive their charter and governance rights from the state and are thus not merely private, and yet they are financed by private capital and thus cannot be entirely public (140). Instead, David Ciepley places corporations such as these platform companies within a distinct category of norms and rules that reflects their status as “franchise governments” (140), to be understood under a political theory of the firm.
The aim of this paper is to explore a question of pertinent concern: how do we ensure that these platform corporations are accountable to the public interests with which they find themselves inextricably entangled? I will argue that by developing a political theory of the corporation to underpin the legal treatment of these organizations and their constitutive charters, we would ensure that digital platform corporations are held accountable to their duty to the public. I will then consider an alternative to this model of accountability, namely the development of a constitution for the internet as a naturally emergent transnational corporation based on the Germanic models of association.


Why Can’t Platform Corporations Hold Themselves Accountable? 
Prior to the nineteenth century, the formation of a corporation was an exclusive right granted by the state (Dodd 1954, 14-15). Charters were granted to organizations that demonstrated a verifiable contribution to the social good, be it in advancing the socio-economic welfare of the nation or providing a service such as building infrastructure (Ciepley 139). These corporations were regularly vetted for consistency with their charter obligations, and governing bodies such as the British Parliament held the right to revise, rescind, or renew the charter based on adherence to its letter (Frug 1980, 1094). At the turn of the century, the United States saw a dramatic shift towards general incorporation acts, which removed both the obstacle of having to petition for a charter and the condition of advancing the commonweal (North, Wallace and Weingast, 27). While this liberalized and expanded markets and inspired a wave of entrepreneurship, it eventually shielded platform corporations from responsibility for those they govern. In part, this can be attributed to the liberal assumption which underpins our “social imaginary” of markets as a collection of interactions amongst moral equals (Taylor 2). The legal treatment of corporations becomes driven by the notion that corporations will be morally consistent with this view, if not out of corporate social responsibility, then out of market pressures and user expectations which threaten the company’s bottom line (Klonick 1627). However, corporations rarely behave as voluntary associations of shareholders in a democratic system of governance, and these market pressures fade when we realize that users are the tradable commodity rather than the consumer in platform corporations’ profit structures (Zuboff 4). Corporations such as Google amass their revenue from the collection and sale of user information. This has devolved into a “raw-material extraction operation” with the goal of automating and instrumentalizing users to serve the wishes of these profit-seeking machines (Zuboff 4-7). When we pair this with the publicly traded nature of large tech corporations, the liberal aspiration of morally inclined corporations fades like a dream.
Corporate managers and their shareholders are afforded the privilege of limited liability over corporate assets. While shareholders might be considered the residual claimants of corporate revenue (Easterbrook and Fischel 1425), they hold neither the rights to nor the liability over corporate assets (Ciepley 146). For platform corporations that produce their revenues by siphoning and selling user information to advertisers and the highest bidder, the assets in question are the rights over user information. These assets engage directly with the privacy rights afforded to users by their respective nations of domicile. When shareholders and managerial decision-makers act on behalf of the corporation’s profit interests, the asset being traded not only engages the legal rights of users but does so without any mechanism of accountability over its misuse. A neoliberal understanding of this phenomenon would seek to protect this translation of privacy rights into corporate assets and the limited liability which hangs over them, shrouding the exchange behind a view of the corporation as a private actor engaged in a “nexus of [bilateral] contracts” with other consenting private individuals (Jensen and Meckling 310). However, the reality of these interactions is a chasm of power imbalance.


Blackstone describes corporations as miniature republics (456) with extensive rights to “command, regulate, adjudicate, set rules of corporation, allocate collective resources, educate, discipline and punish” (Ciepley 142). Meanwhile, users are given the impossible choice of agreeing to non-negotiable terms and conditions or renouncing access to a digital Anthropocene which has become an indivisible aspect of social life. In On the Social Contract, Rousseau argues that “none give up their freedom except for utility,” on the contingency that the sovereign acts on behalf and in advancement of the general will (18, 33). Users accede to platform corporations’ terms of agreement and in turn submit themselves, their rights to privacy, and their rights to free speech to the governance of a private actor in exchange for participation in a digital nation. Yet no mechanism exists to ensure that these rulers of cyberspace are held accountable to the general will. Some legal theorists, such as Balkin, suggest that imposing fiduciary obligations between platform corporations and their users would be an appropriate means of protecting their information (1208). Not only would this circumvent the neo-Lochnerian model which conflates First Amendment freedoms with contractual freedoms, but it would also accurately capture the power dynamic between the two private agents. On the other hand, it is important to note that it is not simply the “activities of corporations” that transgress the divide between the public and private realms, but rather “that their very being transgresses this divide” (Ciepley 152). The very nature of platform corporations under our current dichotomous division of governance powers places them outside the purview of reasonable accountability. Given what Berle and Means describe as the “separation of ownership and control” of the modern corporation (1932), one cannot expect corporations to be driven by social interest when they are neither mandated to do so nor formed for any reason other than capitalist expansion. In fact, Adam Smith admonishes this separation of ownership and accountability as a breeding ground for “negligence and profusion” (264-265).


Why Can’t Governments Hold Platform Corporations Accountable?

If we cannot expect these corporations to effectively govern themselves as corporate entities, then why can’t governments intervene? After all, every sovereign force, governmental or corporate, must derive its legitimacy to govern from somewhere. To concede that this legitimacy is the product of capitalistic privilege and economic hierarchy would be unsatisfactory, to say the least, given that it contradicts the liberal principles which treat private actors as equals. According to David Ciepley, corporations are governmental in provenance given that they depend on the state for their charters, legal contractual personality, and corporate authority (Ciepley 143-144, 149). Thus, if they derive their legitimacy over the social order of cyberspace from the government, would it not be justified for governments to intervene in corporate governance efforts? Ciepley describes a three-tiered hierarchy of constitutional republics in the United States, each tier deriving its legitimacy from the layer above it. The federal government rests at the top and draws its power from the general will of the sovereign people; state governments exercise authority by virtue of federal authorization; and corporations, including towns and business corporations, draw their powers from the states in which they were chartered (Ciepley 151). Although there exists a hierarchical superiority in the scope of legislative capacities across these layers of power, the Supreme Court’s powers of judicial review over corporate charters often fall ultra vires, outside its jurisdiction (Maier 79). In fact, the landmark case of Dartmouth College v Woodward (1819) institutionalized the distinction between public and private corporations, rendering corporate charters immune to legislative revision. Instead, state governments have fallen back on their habitual role of delegating these policy responsibilities to the corporate institutions who fall under them in the echelons of legal supremacy (Wood 9), in part due to legal deadlocks, in part due to social pressure to remain outside the gates of ‘free’ digital spaces, and in part due to the logistical impracticalities of government intervention.


Shortly after the inception of the Internet, the cyberlibertarian political activist John Perry Barlow proclaimed in his Declaration of the Independence of Cyberspace, addressed to the “Governments of the Industrial World, you weary giants of flesh and steel”: “You are not welcome among us. You have no sovereignty where we gather” (1996). Digital spaces have played a crucial role in empowering collective action and power outside the peripheries of civil societies and governmental institutions. In 2010, Facebook played an instrumental role in aiding the revolutionary efforts of the Arab Spring. In Egypt, anti-establishment content could be freely published on the platform without fear of government censorship, and mass protests could be organized days in advance without fear of informants hampering these efforts. Ultimately, Facebook was hailed as a liberating tool essential to the removal of President Hosni Mubarak from office (Zuckerman 1). Inhabitants of the internet have always been cautious about the role governments are allowed to play in the regulation of platforms. Given that these platforms are considered the private property of corporations, courts often find difficulty in curtailing the speech of actors in a space that is not entirely public in nature (Lloyd Corp. v Tanner 1972). Unbeknownst to Barlow, however, the threat of tyranny over cyberspace would come not from the public sphere, but rather from the private corporate powers that have colonized his “civilization of the Mind”. Up to the present day, Western governments have heeded the warnings of activists such as Barlow. In Packingham v North Carolina (2017), Justice Kennedy describes platforms as “modern public squares” in which government intervention would interfere with users’ legitimate exercise of First Amendment rights (2). Instead, having private actors regulate each other’s content allows states to side-step the messy business of intervening in the affairs of cyberspace, which is effectively terra nullius, outside the sovereignty of all nations. Section 230 of the US Communications Decency Act grants the architects of this digital nation, platform corporations, the freedom to govern according to whichever “values they want to protect - or to protect no values at all” (Klonick 1617). In Zeran v America Online, Inc. (1027-28), the court recognized two congressional intents behind the broad immunity that Section 230 sets out:

1) To encourage interactive computer services and users of such services to self-police the Internet for obscenity and other offensive material
2) To encourage the unfettered and unregulated development of free speech on the internet, and to promote the development of e-commerce


However, this capacity to self-govern has fallen short of effectiveness, and the need to restrain speech has become obvious with the plague of hateful speech and misinformation which looms over our screens (Marwick and Lewis 11, 27). Corporations face a conflict of interest between promoting the social good and elevating their bottom line. In addition, they hold limited accountability for their corporate assets and the outcomes of their governance as a consequence of the traditional liberal dichotomization of private and public governance. Alone, however, governments lack the means, legal rights, expertise, and responsiveness to the ever-changing landscape of online challenges needed to inspire policy changes on the web. Yet when the stakes are as high as renouncing governance rights over a digital Anthropocene whose impacts reach into the physical world, there is a dire need to ensure the accountability of the platform corporations whose sovereign role cannot be muted. Platform corporations meet Weber’s famous maxim for the conception of the state in that they hold a monopoly over the legitimate use of violence within their domain, or in this case, control over who has access to the platform. After President Trump incited the Capitol riot in January 2021 through Twitter, a coalition of tech corporations collectively banned one of the most powerful political forces of the time from accessing their platforms. Not only does this demonstrate the unquestionable and immutable power these corporations have over digital spaces, but it also carries with it a poignant question: can these powers have limits so long as corporations continue to be understood as private actors?


A Political Theory of the Platform Corporation
David Ciepley suggests that in order to keep corporate powers aimed at the public interest, they must be framed within a political theory of the corporation. This theory rests on two premises: that corporations are government-like, and that corporations derive their personhood, contractual individuality, and governing authority from governments (142). The first premise has been demonstrated thoroughly by platform corporations’ uninterrupted powers over the governance of digital spaces. They hold the power to administer justice on these platforms, limit or grant accessibility, hamper or promote free speech, and protect or squander privacy rights. The natural conclusion from the deep entanglement of these wide-reaching private interests with public affairs is that it is morally justified to hold platform corporations to more stringent standards of accountability. The second premise is that corporations depend on the state for their legal and governing powers. On one hand, corporations receive their charters from the government, demonstrating a dependence on the state for their provenance. On the other, it is often argued under neoliberal theories of the firm that a corporation can be constituted independently of the state, through a nexus of bilateral contracts (Easterbrook and Fischel 1444). Platform corporations would thus derive their powers of governance from the individual contracts of users, who must sign off on the terms and conditions before accessing the digital space. This would effectively evade the governmental provenance of legitimate rule over cyberspace, leaving the legal justification for heightened corporate accountability wanting. Yet even those contracts, like the right to own platforms as private property, rest on a contractual individuality that is itself dependent on the state’s intervention in the market (Ciepley 143). Corporations’ contractual individuality is formed of three distinct features which distinguish the corporate assets that carry limited liability from personal assets. All three, “asset lock-in, entity shielding, and limited liability”, are the byproduct of governments re-ordering normal market conditions for property and liability (Ciepley 145). Consequently, the very mechanism which allows platform corporations to remove themselves from moral responsibility for the corporate assets of user information is afforded by the state’s interference. This governmental provenance of corporations as privileged entities leads Ciepley to justify an appeal to older models of corporate law (153): a model that reframes corporate rights along 19th-century lines, when corporate charters were granted only once a clear public benefit was established and fulfilled, and were heavily regulated by the state (Maier 75-78). What would this political theory of the corporation look like, and what impact would it have on the accountability of platform corporations?


The primary difficulty of treating corporations within the strict parameters of a liberal private actor is that their constitutional rights are often exaggerated by classifying corporations as either partnerships or persons. Defining corporations as partnerships takes the neoliberal stance of neglecting the governmental provenance of corporate charters and instead paints them as “voluntary association[s] of their members” (Morawetz 2). The consequence is the conflation of corporate property with individual shareholder property, resulting in courts extending individual property rights to the corporation, and those of the corporation to individuals (Ciepley 154). This would insinuate that in Facebook’s supposed shareholder democracy, Mark Zuckerberg, as the majority shareholder, would have ownership over the platform in a way that empowers him to make unilateral decisions over its policies. As dangerous as a corporation holding unaccountable governance over its private property and the policies enacted on it sounds, ceding that power to a handful of individuals is far more menacing. The second definition of the corporation is that of a real person. The common intellectual framework underpinning this definition draws on medieval corporations, which were primarily formed as collections of individuals independent from the state before receiving a charter from the legislative authority. These organisms were considered to be a “real person, with body and members and a will of its own” (Maitland XXVI) and thus claimants to the constitutional rights of ordinary citizens (Ciepley 155). This characterization, however, blurs the form of medieval corporations with that of modern corporations. If the medieval corporation were to dismember itself, so to speak, the association of persons that constituted the corporation outside the legal provenance of the state would cease to exist. The modern corporation, however, maintains personhood and legal continuity by virtue of a fiction sustained by the state’s existence. Much like the ship of Theseus, this corporation may be dismantled and reassembled, and yet it is the story and its namesake that preserve its personality. In the same way, governments breathe legal continuity into a platform corporation, which continues to own property and participate in contracts irrespective of which shareholders sit on the board of directors.


The implication of each of these legal misnomers is an inflated bank of constitutional rights that corporations ought not to have. Santa Clara County v Southern Pacific Railroad (1886) engaged with the conflation of corporate property and individual property that follows from the partnership definition of the corporation. One of the lawyers in the case, Pomeroy, noted that it is in the interest “of protecting rights, [that] the property of all business and trading corporations is the property of the individual corporators” (Horwitz 70). If corporate ownership is treated with limited liability, what little accountability platform companies hold over the corporate assets of user information and platform architecture will dissipate. The naturalization of corporations as real persons also opened avenues for lines of defense that could effectively acquit them of certain legal consequences. In US v Martin Linen Supply Co. (1977), corporations were afforded a right against double jeopardy, that is, against being punished twice for the same act. When platform corporations enact changes to their algorithms, this constitutes one act that holds consequences across multinational jurisdictions. Platform algorithms, contrary to popular belief, are far from neutral and carry with them coders’ biases and corporations’ financial and diversity interests (West, Whittaker and Crawford 15). Given that corporate interests lie formally with increasing revenues, the algorithms that suggest material to users have been designed to capture the widest scope of attention in the most cost-effective and efficient manner. It did not take these algorithms long to recognize that shock value and the promotion of extremist content, such as antisemitic conspiracies, amassed consistent audiences (Townsend). Corporations charged with the facilitation of hate speech in an international court of law could use their protection against double jeopardy to evade sanctions from more than one legislative authority. Even though damages could have been felt in the US, Germany, and Zimbabwe, if the corporation received punishment from one of these courts it could argue against restitution in the others.


In Citizens United v Federal Election Commission (2010), the partnership theory is invoked in order to argue that corporations, as an “association of citizens” (Citizens 925), would be entitled to the same political speech rights as their citizen members. However, as developed previously, corporations cannot arise from bilateral contracts alone, without governmental provenance (Ciepley 145). The majority constructs a separate argument using the real-entity theory, which proves equally problematic in that it stands on the premise that corporations ought to be afforded the same rights as other ‘real persons’ in their use of private funds to contribute to elections in the “open marketplace of ideas” (Citizens 906-907). If platform corporations continue to be granted the constitutional right to use their private funds to promote a political party or policy, or even to lobby governments, the results can be disastrous. In 2012, the US Congress, in concert with civil society groups and social activists, proposed a bill that would tackle online piracy and protect artists’ rights to distribute and monetize their art. If passed, this bill would have meant that Google would be charged with the costs of content moderation and would lose a significant portion of its revenue. In response to this threat to its bottom line, the corporation exercised the constitutional right to free political speech afforded in Citizens (2010) by posting a banner on its search page that read “Tell Congress: Please don’t censor the Web!” Shortly after, Congress was met with a barrage of angry web users raising their voices against an establishment poking its nose into a cyberspace it had no right to approach (Kolbert). Affording corporations the same constitutional rights as citizens increases “their political influence while reducing their political accountability” (Ciepley 140). Ciepley thus pushes forward the alternative view of the corporation as an artifice of government. This view is concisely captured by Chief Justice Marshall in Dartmouth: “A corporation is an artificial being, invisible, intangible, and existing only in contemplation of law” (Dartmouth College v Woodward 1819, 636). Reminiscent of the view of corporations prior to general incorporation acts, the corporation, being a creature of the law, is afforded only the rights which its charter of creation allows it, rather than the constitutional rights afforded to real, independent persons (155).


Government and Corporations as Co-Governors
Reducing corporations to their original form as artifices of government would realign their organizational goals with the public interest. By halting the legal realist attempt to assimilate corporations into liberalism, platform corporations would have to assume accountability for their socialized property. Moreover, by acknowledging within legislative schemes that legitimate corporate governance originates from the state, corporate power would be redirected towards the social aims of governments. In addition, the misrepresentation of corporations as private and real entities owed the same constitutional rights as citizens would be amended and adjusted to fit the extent of corporate sovereignty and power. The natural outcome of this renewed relationship between governments and platform corporations would give way to a partnership that would “facilitate an extraordinary degree of control over behavior on the Net” (Lessig). By assuming their roles as co-governors under the same hierarchy of republics, their collaboration would be akin to federal-state interactions: not without its conflicts, but unified in interest. While this sounds like an ideal solution to the government’s deadlocks in accessing the web and to the lack of corporate accountability, the potential of falling into a digital dystopia hangs over this marriage of industry and state. In the late 19th century, American liberalism sought to accentuate the schism between public and private spheres of power and to distribute their legal rights and responsibilities accordingly (McCurdy 973). This clear line of distinction between the roles of state and corporation limits the interactions between governments seeking to protect user rights and platform corporations tasked with their governance. A political theory of the corporation seeks to challenge this binary of market actor and market regulator, and would in practice invite collaborative governance over cyberspace between governments and these corporations. Under a line of reasoning quite similar to the threat posed by platform corporations holding the right to lobby governments, blurring the separation between each actor’s distinct role as market regulator or market actor could prove disastrous. When global economic powers were competing over trade routes and scrambling for land to colonize in the 17th and 18th centuries, governments and their respective trading companies developed policies in collaborative unison. One of the chief critiques that Adam Smith raises against the existence of joint-stock corporations such as the East India Company is that Parliament was deploying policies that served to make the corporation more economically competitive while neglecting the economic health of Britain (Smith, V). Nations in the 19th century competed internationally over trade routes and land using these corporations as the vehicle for global authority. Today, on the other hand, this claim to power lies in the state-funded war over the collection and sale of data and information. User data has become one of the most prized resources and a metric of power for the corporations that trade in it and for the governments that support them in exchange for a tool of economic development or even social control (Pendergast). This race for cyberspace directly engages users’ rights over their own personal information and their right to privacy.
Under a co-governance model between the state and platform corporations, the degree to which these rights are protected might devolve from a matter of morality and law to a question of political power. 


Three distinct views surrounding these rights have emerged from what Tom Pendergast names “the next cold war”. The first places the rights to personal data in users’ own hands and protects this information from misuse as a matter of the constitutional right to personality, dignity, honor, and private life (Weber 121). Captured by the EU’s “right to be forgotten” under its Data Protection Directive 95/46, it outlines users’ rights to ownership over their personal data and their consequent right to be erased from the digital world (120). Regarded as simply bad business, this type of policy is far removed from the second view of user rights. In the US, user information is a tradable commodity that becomes a corporate asset once users renounce control over their information with the click of an ‘Agree’ button. The third view peers into the future of a state that has completely removed the barrier between government and corporation, designating user data as the property of the state (Pendergast). With the emergence of China’s social credit system, user data and the privacy rights that accompany it have been mutated into their most grotesque form: a tool for social domination. The calamity does not rest solely on the heads of Chinese citizens, however; rather, this effort to convert user data into a weaponizable tool of social control has ignited a global race for data under the banner of national security. After the Cambridge Analytica incident, the power of data became clear: it could be used to liberate users and expand consumer choice, or it could destabilize governments and mobilize political agents. The popularization of Chinese platform companies such as TikTok amongst Western audiences challenges American tech giants’ claim over cyberspace. More than that, however, this access to valuable foreign user information, and the claim of Chinese governance over the Western members of cyberspace, begins to sound the rhymes of history looking to repeat itself. The British Parliament once sought to support the efforts of the East India Company against foreign competition, resulting in heinous crimes against humanity and the crippling of its own economy. A modern clash of international interests would find governments compromising on the privacy rights of citizens in order to maintain the market share and social governance of these corporations.
While co-governance may breed dangers as a consequence of international pressure to take up arms in an information war, this relationship would also have domestic implications. In Smith v Maryland (1979), the third-party doctrine was upheld by the court, outlining that "a person has no legitimate expectation of privacy in information he voluntarily turns over to third parties" (743). When users voluntarily agree to hand over their information to platform corporations, they also renounce their right to privacy. While governments cannot directly breach the privacy rights of citizens, they may subpoena a private corporation for the release of the information users have legally afforded it access to, meaning that all the metadata Facebook holds on a user is accessible to the government if that user is under suspicion of criminal conduct. In 2016, the FBI tried to force Apple to grant it access to an iPhone used by Syed Rizwan Farook, one of the perpetrators of the San Bernardino mass shooting. Apple vehemently resisted these efforts in order to protect users’ privacy from law enforcement (NYT 2016). If governments and corporations take on the mantle of co-governors, we might very well see a collaborative relationship that grows at the expense of citizens’ privacy rights. When governments overstep their bounds on the grounds of protecting national security, corporations hold sufficient resources, influence, and powers of governance to challenge this breach. When corporations overstep their bounds, governments charge to the defense of citizens against the tyranny of the corporate oppressor. US West, Inc. v FCC (1999) offers an example of this patronage: in the Telecommunications Act of 1996, Congress instructed telecommunications corporations to protect consumer privacy, and in response the Federal Communications Commission developed regulations over the collection and sale of customers’ proprietary information. When the government and corporations align their interests, however, it is unclear what direction this partnership would take. A political theory of the corporation would elevate the level of accountability that platform firms hold by creating a category of “franchise government” outside the private and public spheres. By doing so, however, it would misguidedly invite a co-governance that dissolves the government’s role as a regulating force defending the rights of the people and the corporation’s role as a private actor outside the political motivations of states.


The Internet as a Corporation 
Corporations may nevertheless be held accountable through the development of a political theory, so long as the state maintains its role as a public agent applying legislative pressure from outside the system of digital governance rather than from within. However, perhaps a different avenue may simplify the legislative responsibilities of corporate governors, users, and governments. If the internet were considered a corporation in its own right, with a constitution establishing the basic conventions of this digital Anthropocene, that framework could ground the charters of platform corporations. Such a constitution, akin to the social contracts of nations, would delineate the responsibilities of future digital governors irrespective of the policy priorities of corporate executives and shareholders. Lessig suggests the establishment of such a constitution for the internet, “an architecture... that structures and constrains social and legal power, to the end of protecting values” (7). What such a constitution would look like, however, is instrumental to the way legislative schemes interact with the Internet. According to Keith Whittington, the US Constitution ought to be understood “as an actual contract” or at the very least understood through “contract law theory” (Graber 26). If the internet were to have a constitution, which legal theory of the corporation would most aptly describe its nature? Surely we would not be able to classify it as an artifice of government when it transcends borders and any one national jurisdiction. To describe it as a partnership would be to suppose that users have entered into contracts with each other, or with platform corporations as the mediating body of governance that relates all users together; this too would be inadequate. The internet precedes the existence of platform corporations; indeed, they did not become readily available until the Web underwent its third transformation in the late 2000s. Users enter into social relationships with each other as they engage with cyberspace itself. Yet the internet is not a physical place; it does not exist on a single server. Instead, it emerges naturally by virtue of the collective association of users, represented formally as nodes in a network. The image that most accurately describes this organism begins to echo the natural emergence of medieval corporations. Gierke describes the German fellowship as “no piece of the State’s machinery... but a living organism and a real person, with body and members and a will of its own” (Maitland XVI). When masses of individuals consent to associating as a community or a government, “they are thereby presently incorporated, and make one Body Politick, wherein the Majority have a Right to act and conclude the rest” (Locke 331). The sheer scale of a movement of consent over associating into a veritable global civilization that draws its legitimate sovereignty from individuals rather than existing governmental bodies seems impossible. However, such a movement already exists. Platform corporations have already funneled the general will of their users through mandatory terms and agreements which grant them rights over user information. The difference is that this general will would now serve to empower a new form of global government.

 

Citations: 
Sedler, Robert A. “Citizens United v Federal Election Commission Case (US).” Max Planck Encyclopedia of Comparative Constitutional Law, 2010, doi:10.1093/law-mpeccol/e548.013.548.
Balkin, Jack M. “Information Fiduciaries and the First Amendment.” UC Davis Law Review, vol. 49, no. 4, Apr. 2016, pp. 1183–1234.
Barlow, John Perry. “A Declaration of the Independence of Cyberspace.” Commonplace, 1996, doi:10.21428/6ffd8432.ea8cd895. 
Blackstone, William. “Commentaries on The Laws Of England.” The Oxford Edition of Blackstone: Commentaries on the Laws of England, Vol. 1: Of the Rights of Persons, 1753, doi:10.1093/oseo/instance.00248899. 
Cadwalladr, Carole, and Emma Graham-Harrison. “Revealed: 50 Million Facebook Profiles Harvested for Cambridge Analytica in Major Data Breach.” The Guardian, Guardian News and Media, 17 Mar. 2018, www.theguardian.com/news/2018/mar/17/cambridge-analytica-facebook-influence-us-election.
Ciepley, David. “Beyond Public and Private: Toward a Political Theory of the Corporation.” American Political Science Review, vol. 107, no. 1, 2013, pp. 139–158., 
doi:10.1017/s0003055412000536. 
“Dartmouth College v. Woodward (1819).” Encyclopedia of the First Amendment, 1819, doi:10.4135/9781604265774.n404. 
Easterbrook, Frank, and Daniel Fischel. “The Corporate Contract.” Columbia Law Review, vol. 89, 1989, pp. 1416–1448. 
Frug, Gerald E. “The City as a Legal Concept.” Harvard Law Review, vol. 93, no. 6, 1980, p. 1057., doi:10.2307/1340702. 
Gierke, Otto von, and Frederic William Maitland. Political Theories of the Middle Age. Cambridge University Press, 1900. 
Graber, Mark A. “Introduction to American Constitutionalism.” A New Introduction to American Constitutionalism, 2013, pp. 1–13., 
doi:10.1093/acprof:oso/9780199943883.003.0001. 
Horwitz, Morton J. “The Transformation of American Law, 1870-1960: The Crisis of Legal Orthodoxy.” Oxford University Press, 1992, doi:10.2307/2079877. 
Jensen, Michael C., and William H. Meckling. “Theory of the Firm: Managerial Behavior, Agency Costs, and Ownership Structure.” Economic Analysis of the Law, 1976, p. 162., doi:10.1002/9780470752135.ch17. 
Klonick, Kate. “The New Governors: The People, Rules and Processes Governing Online Speech.” Harvard Law Review, vol. 131, 2018, pp. 1598–1670.
Kolbert, Elizabeth, and Nathan Heller. “Who Owns the Internet?” The New Yorker, www.newyorker.com/magazine/2017/08/28/who-owns-the-internet. 
Lessig, Lawrence. Code. Basic Books, a Member of the Perseus Books Group, 2006. 
Lessig, Lawrence. “Code Is Law.” Harvard Magazine, 29 Feb. 2012, 
harvardmagazine.com/2000/01/code-is-law-html.
Livermore, Shaw, and Edwin Merrick Dodd. “American Business Corporations until 1860: With Special Reference to Massachusetts.” Political Science Quarterly, vol. 70, no. 1, 1954, doi:10.2307/2145440. 
“Lloyd Corporation, Ltd. v. Tanner (1972).” Encyclopedia of the First Amendment, 1972, doi:10.4135/9781604265774.n799. 
Locke, John, and Peter Laslett. Two Treatises of Government: a Critical Edition. Cambridge University Press, 1988. 
Maier, Pauline. “The Revolutionary Origins of the American Corporation.” The William and Mary Quarterly, vol. 50, no. 1, 1993, pp. 51–84., doi:10.2307/2947236. 
Marwick, Alice, and Rebecca Lewis. Media Manipulation and Disinformation Online. 2017. 
McCurdy, Charles W. “Justice Field and the Jurisprudence of Government-Business Relations: Some Parameters of Laissez-Faire Constitutionalism, 1863-1897.” The Journal of American History, vol. 61, no. 4, 1975, p. 970., doi:10.2307/1890641.
Berle, Adolf A., and Gardiner C. Means. “The Modern Corporation and Private Property.” New York: Macmillan Company, 1932, doi:10.4324/9781315133188.
Morawetz, Victor. “A Treatise on the Law of Private Corporations Other than Charitable.” Boston: Little Brown, 1882. 
Mozur, Paul. “A Genocide Incited on Facebook, With Posts From Myanmar's Military.” The New York Times, The New York Times, 15 Oct. 2018, 
www.nytimes.com/2018/10/15/technology/myanmar-facebook-genocide.html. 
North, Douglass C., et al. “Open Access Orders.” Violence and Social Orders, pp. 1–147., doi:10.1017/cbo9780511575839.005. 
United States, Supreme Court, Packingham v North Carolina. 2017.
Pendergast, Tom. “The Next Cold War Is Here, and It's All About Data.” Wired, Conde Nast, 27 Mar. 2018, www.wired.com/story/opinion-new-data-cold-war/. 
Robins, Nick. “Loot: in Search of the East India Company, the World’s First Transnational Corporation.” Environment and Urbanization, vol. 14, no. 1, 2003, pp. 79–88., doi:10.1177/095624780201400107. 
Robins, Nick. “The Corporation That Changed the World: How the East India Company Shaped the Modern Multinational.” London: Pluto Press, 2006, doi:10.26530/oapen_625263.
Rousseau, Jean-Jacques, et al. On the Social Contract. Hackett Publishing Company, Inc., 2019. 
United States, Supreme Court, Santa Clara County v Southern Pacific Railroad. 1886.
Smith, Adam. The Wealth of Nations. Seven Treasures Publications, 2009.
Spillenger, Clyde, and Morton J. Horwitz. “The Transformation of American Law, 1870-1960: The Crisis of Legal Orthodoxy.” The Journal of American History, vol. 80, no. 2, 1992, doi:10.2307/2079877. 
Tankovska, H. “Facebook Ad Revenue 2009-2018.” Statista, 5 Feb. 2021, www.statista.com/statistics/271258/facebooks-advertising-revenue-worldwide/.
Taylor, Charles. “Modern Social Imaginaries.” Duke University Press, 2004, doi:10.1215/9780822385806. 
The New York Times. “Breaking Down Apple’s IPhone Fight With the U.S. Government.” The New York Times, 3 Mar. 2016, www.nytimes.com/interactive/2016/03/03/technology/apple-iphone-fbi-fight-explained.html.
Townsend, Mark. “Facebook Algorithm Found to 'Actively Promote' Holocaust Denial.” The Guardian, Guardian News and Media, 16 Aug. 2020, www.theguardian.com/world/2020/aug/16/facebook-algorithm-found-to-actively-promote-holocaust-denial.
United States, Supreme Court, US v Martin Linen Supply Co. 1977.
United States, Court of Appeals for the Tenth Circuit, US West, Inc. v FCC. 1999.
Vile, John. “Smith v. Maryland (1979).” Encyclopedia of the Fourth Amendment, doi:10.4135/9781452234243.n709. 
Weber, Rolf H. “The Right to Be Forgotten More Than a Pandora’s Box.” JIPITEC, 2011, pp. 120–129. 
West, Sarah M, et al. “Discriminating Systems, Gender, Race and Power in AI.” AI Now Institute, Apr. 2019, pp. 1–33. 
Wood, Gordon S. “The Emergence of the Public-Private Distinction in Early America.” Osaka: Japan Center for Area Studies, 1999, pp. 1–12.
“Zeran v. America Online, Inc. (4th Cir. 1997).” Encyclopedia of the First Amendment, 1997, doi:10.4135/9781604265774.n1434. 
Zuboff, Shoshana. The Age of Surveillance Capitalism: the Fight for a Human Future at the New Frontier of Power. PublicAffairs, 2020. 
Zuckerman, Ethan. “Cute Cats to the Rescue? Participatory Media and Political Expression.” Youth, New Media and Political Participation, Apr. 2006.
