Peter Thiel’s Palantir: Carl Schmitt, the Friend/Enemy Distinction, and the Surveillance State
- David Lapadat | Music PhD

The Seeing Stone: Why Palantir Named Itself After Sauron’s Eye
In the minor theology of Silicon Valley, where companies name themselves after fruits, verbs, and mathematical concepts, only one chose to name itself after a weapon belonging to the enemy. Palantir. The seeing stone through which Sauron ensnared those who gazed into it. Sauron did not invent the seeing stones, and they were not simply his weapons. But one palantír falls under his power, and the instrument of vision becomes an instrument of domination. The company took the name in 2003.
In Tolkien’s mythology the palantíri were tools of communication that turned into dark instruments once one will proved stronger than another: the stronger will held the stone and captured the gaze, and whoever looked in saw only what the dominant eye permitted, mistaking the governed vision for knowledge.
The company that took the name inherited that pressure: vision as power, communication as command, knowledge as a field that can be bent before it is believed.
Born from a Failure of Architecture: How 9/11 Created the Surveillance Startup
September 11th exposed failures of imagination, policy, capability, management, and information-sharing. The CIA held fragments, the FBI held fragments, the NSA held fragments, and the state failed to turn scattered signals into a coherent warning. The commission named imagination as part of the failure, not the whole of it. The fatal weakness was architectural: separated institutions could possess pieces of vision without producing vision itself.
At PayPal, Thiel had already seen software used to read fraud patterns across millions of transactions, surfacing anomalies before human analysts could name them.
He looked at the intelligence failure and saw the same problem at a different scale. The solution he funded was shaped around the intelligence community itself: software built for defense and intelligence analysts, refined against operational need until the tool and the institution grew difficult to distinguish.
Conventional venture capital had little appetite for it: no consumer market, no viral loop, no path to a billion users. The objection was irrelevant, because the customer was the state, and the state’s appetite for seeing had grown, in the smoke of that morning, functionally limitless. The mandate carried the weight of three thousand deaths and a national security apparatus that had just discovered, in the worst possible way, that its own architecture was the vulnerability.
What Thiel built was, in its essential function, a prosthetic organ for the intelligence community — a system that extended the state’s capacity to see across its own institutional walls, to lay one agency’s suspicion beside another’s, and to surface patterns that no single analyst could have assembled from the fragments available to any single desk.
Inside Gotham: How Palantir’s Ontology Turns Data into Targets
Gotham integrates what other systems only hold. But we should point out Palantir’s own distinction: the company says it does not collect, mine, or sell data; it builds software that lets customers integrate data they already possess. The political danger, then, lies less in ownership alone than in the architecture that makes separated data operational.
Structured and unstructured data enter a single ontological layer where every entity is registered as an object: a person, a place, a vehicle, a phone number, an event, and the system maps the relationships between them. An analyst who once spent months cross-referencing filing cabinets watches a network of associations light up until the pattern resolves into a name.
Ontology matters here, and not in the philosopher’s sense: philosophy asks what exists, while this system decides what counts.
Consider a phone call between two numbers. On its own it registers as background noise. Rendered as a line connecting two nodes in a graph that also contains a financial transfer, a border crossing, and a known alias, the same call becomes legible through the geometry of its position: part of a pattern no human assembled, which now presents itself as discovered rather than constructed.
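A toy sketch can make the mechanism concrete. The code below builds a miniature property graph of typed entities and scores a node by how many of its neighbors are already flagged; every identifier and value is hypothetical, and nothing here reflects Palantir's actual ontology or software, only the general structure the passage describes.

```python
from collections import defaultdict

class Graph:
    """Minimal property graph: typed nodes, labeled undirected edges."""
    def __init__(self):
        self.node_type = {}              # node id -> entity type
        self.edges = defaultdict(set)    # node id -> {(neighbor, label)}

    def add_node(self, node_id, node_type):
        self.node_type[node_id] = node_type

    def add_edge(self, a, b, label):
        self.edges[a].add((b, label))
        self.edges[b].add((a, label))

    def neighbors(self, node_id):
        return {b for b, _ in self.edges[node_id]}

def context_weight(graph, node_id, flagged):
    """Count a node's already-flagged neighbors.

    In isolation a phone call is noise; placed beside a transfer, a
    border crossing, and a known alias, the same edge acquires weight
    purely from its position in the graph."""
    return sum(1 for n in graph.neighbors(node_id) if n in flagged)

g = Graph()
for nid, t in [("phone_A", "phone"), ("phone_B", "phone"),
               ("transfer_1", "transaction"), ("crossing_1", "event"),
               ("alias_X", "alias")]:
    g.add_node(nid, t)

g.add_edge("phone_A", "phone_B", "called")
g.add_edge("phone_B", "transfer_1", "linked_to")
g.add_edge("phone_B", "crossing_1", "linked_to")
g.add_edge("phone_B", "alias_X", "registered_under")

flagged = {"transfer_1", "crossing_1", "alias_X"}
print(context_weight(g, "phone_B", flagged))  # 3: the call now reads as pattern
print(context_weight(g, "phone_A", flagged))  # 0: same call, different position
```

The point of the sketch is that neither number "did" anything new; the score changes only with the surrounding geometry, which is exactly why the output can feel disclosed rather than chosen.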
The graph displays without arguing, and in the space between display and decision a form of power operates that has no name in any constitution: responsibility is distributed across integration, calibration, thresholds, and human review until the act of choosing becomes difficult to locate. The dashboard can make a target look less chosen than discovered.
The architecture of deniability was built in from the start. The machine never pulls the trigger. Its danger is subtler: it can make the trigger appear already indicated.
Here is the structure’s elegance and its danger. The analyst who acts on the pattern may experience the target as disclosed rather than chosen, because the target appears through a convergence of data points no single analyst assembled alone. The algorithm can be defended as processing the data it was given. The data carries previous priorities, collection practices, mandates, and earlier crises.
Responsibility travels backward through the chain and arrives nowhere. The result is a system in which consequential decisions are made continuously and accountability is distributed to the point of dissolution — dissolved by architecture itself, with no malice required.

The Friend/Enemy Distinction as Software: Carl Schmitt’s Theory Inside the Dashboard
Carl Schmitt, the German legal and political theorist whose work became permanently stained by his support for National Socialism, argued that the fundamental category of politics was the distinction between friend and enemy — beneath justice, beneath freedom, beneath law — and that sovereign power appears most nakedly in the decision over the exception. Citizen, criminal, refugee, ally: each category flows downstream of that cut.
Schmitt himself was discredited after 1945, but the friend/enemy grammar did not vanish from statecraft. It reappeared in bureaucracy, emergency administration, security architecture, and software, often without Schmitt’s name attached.
Palantir’s ontology can operate like a friend/enemy sorting machine below the threshold of explicit political language. The platform renders associations and lets the geometry of connection imply risk, priority, or threat. The decision Schmitt theorized as the essence of politics, the decision over who is the enemy, can arrive as workflow. The translation is not literal. A risk score is not Schmitt’s enemy. The family resemblance appears when association becomes suspicion, suspicion becomes operational priority, and political judgment enters the room disguised as administrative sequence.
A pattern crosses a threshold, a name acquires the outline of a target, and on the dashboard the distinction between friend and enemy appears as nothing more consequential than a color gradient.
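The reduction of that judgment to a color gradient can be shown in a few lines. The function, names, scores, and threshold below are invented for illustration; no claim is made about how any real system computes or displays risk.

```python
def classify(score, threshold=0.7):
    """Map a continuous risk score to a dashboard color.

    The consequential cut (who is treated as a threat) is reduced to a
    comparison against one tunable number; values here are hypothetical."""
    if score >= threshold:
        return "red"
    if score >= threshold / 2:
        return "amber"
    return "green"

# Hypothetical per-node scores, rendered as a ranked, color-coded list.
scores = {"node_17": 0.82, "node_04": 0.41, "node_09": 0.12}
ranked = sorted(scores.items(), key=lambda kv: kv[1], reverse=True)
for name, s in ranked:
    print(f"{name}  {s:.2f}  {classify(s)}")
```

Nothing in the output announces that a decision occurred; changing `threshold` from 0.7 to 0.6 silently moves names across the friend/enemy line, which is the passage's point about power hiding inside calibration.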
In other words, the mildness of the interface is itself a form of power. A system that renders the sovereign decision as an administrative output — a shade of color, a ranked list, a flagged node — has removed from the process the one thing that might occasion resistance: the sense that a decision has been made at all.
The decision arrives in the grammar of data, where nothing argues and nothing is proposed, only presented, and in the space between presentation and action a political judgment of the highest order is executed with the emotional register of a weather forecast.
Schmitt would have recognized the family resemblance. The sovereign decision has been preserved in force while being stripped of its phenomenology — the drama, the declaration, the visible moment of choice.
What remains is workflow: power without costume, command without ceremony, sovereignty lowered into procedure until its operators experience their shifts as ordinary administration. So thoroughly proceduralized, the sovereign decision dissolves into the texture of the working day.
The Exception That Never Ended: From Terrorism to Pandemic to Permanent Readiness
Schmitt’s friend/enemy distinction and his theory of the exception belong to different parts of his work, but they meet at one point: ordinary law is most vulnerable when institutions claim that ordinary conditions no longer apply.
After September 11th, the exception did not simply end. Terrorism justified the system’s birth; later security, policing, immigration, and public-health problems gave similar data infrastructures new domains of use. Each crisis left behind tools the next crisis could inherit, and the infrastructure rarely asked whether the original crisis was still active.
Declaration is unnecessary where inheritance suffices. Each generation of officials receives capabilities built for a crisis that has technically ended and finds the capabilities too useful to dismantle. The justifications shift, but the architecture persists, and each new justification adds a layer of precedent that makes the next expansion easier.
The company was built for the condition of permanent readiness that follows modern national-security crises: a world in which institutions want tools that remain available after the emergency that justified them has passed.
From Screens to Soil: Palantir on the Battlefield in Ukraine
After Russia’s February 2022 invasion, the logic moved from screens to soil. Public reporting places Karp’s meeting with Zelensky roughly three months after the invasion, with Palantir beginning work with the Ukrainian government in summer 2022.
Its software has been reported to fuse satellite imagery, drone footage, open-source data, ground reports, radar, thermal imagery, commercial data, and allied government data for Ukrainian forces. Precision weapons depend not only on range and explosive force, but on the speed and quality of the data environment around them.
Karp has argued that software has become one of the central weapon systems of modern war and has claimed that Palantir is responsible for much of the targeting in Ukraine. That claim gives the software provider a kind of power older defense contractors did not possess in the same way. Lockheed may build the weapon; Palantir helps structure the information environment in which targeting decisions are made.
The grammar by which raw data becomes actionable military judgment can travel across wars, agencies, and alliances. The platform is indifferent to the moral content of the conflict; it cares for structure — nodes, edges, thresholds, clusters — and structure travels.

Monopoly on Seeing: Why the Watchers Cannot Be Watched Back
The architecture belongs to a philosophy that treats competition as weakness and monopoly as the natural terminus of serious innovation. A monopoly on seeing tends toward a monopoly on sorting, and a monopoly on sorting becomes a claim over deciding who belongs to the political order and who does not.
Palantir’s share structure gives its founders unusually durable voting power through Class F stock and related voting arrangements, while the firm remains embedded in defense and government architectures in the United States, the United Kingdom, Israel, and other allied systems. That structure sharpens the oversight problem: public institutions may rely on technical systems they cannot easily reproduce, inspect, or contest on equal terms.
Karp has acknowledged, on stage, in daylight, that Palantir’s work can support lethal operations. The candor is itself a form of power — it performs transparency while the system remains opaque. The acknowledgment does not, by itself, answer how targets are generated, ranked, reviewed, or acted upon; the ontology, thresholds, data sources, and human review process remain largely inaccessible to the public.
Tolkien’s seeing stones had one feature Schmitt never needed: the gaze could be returned by a stronger will. The danger was not only that the user could see, but that the user could be seen, directed, and mastered through the very instrument of sight.
The modern arrangement inverts the old warning. The watcher is real, proprietary, and embedded in the defense architectures it helped create, but its gaze is returned by no one with equivalent instruments. The people whose lives are classified rarely see the ontology that classified them. Oversight bodies may receive documents, briefings, audits, or procurement language, but they often do not possess a rival machine capable of testing the full operational reality on equal terms.
Even the engineers who tune the thresholds operate within constraints set by contracts they did not write, defending outputs they did not design, inside institutions shaped by data they did not collect.
Each tier of the chain has access to its immediate neighbor and to nothing beyond, and the architecture of limited sight-lines, reproduced at every layer, is the shape the palantír takes when it is wired into the administrative state.
Under Sauron’s hand, the old palantír conquered by returning a stronger gaze. Its modern namesake has learned a quieter art: to preserve the watcher, narrow the world, and let the captured field harden into reality itself.
Selected Bibliography
Bergengruen, Vera. “How Tech Giants Turned Ukraine Into an AI War Lab.” TIME, February 8, 2024.
Dastin, Jeffrey. “Ukraine Is Using Palantir’s Software for ‘Targeting,’ CEO Says.” Reuters, February 2, 2023.
Hamilton, Isobel Asher. “‘Our Product Is Used on Occasion to Kill People’: Palantir’s CEO Claims Its Tech Is Used to Target and Kill Terrorists.” Business Insider, May 26, 2020.
National Commission on Terrorist Attacks Upon the United States. The 9/11 Commission Report: Final Report of the National Commission on Terrorist Attacks Upon the United States. Washington, DC: Government Printing Office, 2004.
Palantir Technologies Inc. “Overview • Ontology.” Palantir Documentation. Accessed May 12, 2026.
Palantir Technologies Inc. Registration Statement on Form S-1. U.S. Securities and Exchange Commission. Filed August 25, 2020.
Palantir Technologies Inc. Definitive Proxy Statement. Schedule 14A. U.S. Securities and Exchange Commission. Filed April 23, 2026.
Schmitt, Carl. The Concept of the Political. Expanded ed. Translated by George Schwab. Chicago: University of Chicago Press, 2007.
Schmitt, Carl. Political Theology: Four Chapters on the Concept of Sovereignty. Translated by George Schwab. Chicago: University of Chicago Press, 2005.
Thiel, Peter. “Competition Is for Losers.” The Wall Street Journal, September 12, 2014.
Tolkien, J. R. R. The Lord of the Rings. 3 vols. London: George Allen & Unwin, 1954–1955.
Vinx, Lars. “Carl Schmitt.” Stanford Encyclopedia of Philosophy. First published August 7, 2010; substantive revision March 6, 2025.