You have to have 2 out of 3 of the following traits:
1. Exceptional skill
2. Luck
3. Lack of scruples
"There are three ways to make a living in this business: be first; be smarter; or cheat"
Though I'm sure people who are first tend to view that as skill or being a visionary, not luck
As a technology, it is just database joins. It is just that they are able to pull in data from everything from S3 to SAP to ArcGIS, and provide analytics, visualization etc. on top to provide global visibility into any system.
The visibility can be "show me all illegal immigrant clusters" or "show me bottlenecks and cost sinks in CAHSR construction".
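To make the "just database joins" point concrete, here is a minimal sketch in plain Python/SQLite (every table, column, and figure here is invented for illustration): two independent data sources are pulled into one store, joined, and aggregated, which is all "global visibility" reduces to at the query level.

```python
import sqlite3

# Two in-memory tables standing in for, say, an S3 extract and an ERP export.
# All names and numbers are hypothetical; real integrations use connectors, not sqlite.
db = sqlite3.connect(":memory:")
db.executescript("""
CREATE TABLE shipments (shipment_id INTEGER, site TEXT, cost REAL);
CREATE TABLE sites (site TEXT, region TEXT);
INSERT INTO shipments VALUES (1, 'A', 120.0), (2, 'B', 340.0), (3, 'A', 90.0);
INSERT INTO sites VALUES ('A', 'west'), ('B', 'east');
""")

# "Show me cost sinks by region" is an aggregate over a join.
rows = db.execute("""
    SELECT s.region, SUM(sh.cost) AS total_cost
    FROM shipments sh JOIN sites s ON sh.site = s.site
    GROUP BY s.region ORDER BY s.region
""").fetchall()
print(rows)  # [('east', 340.0), ('west', 210.0)]
```

The hard part, of course, is not the join itself but the connectors, entity resolution, and access control wrapped around it.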
When we offload the moral impetus for society from politics to technology, we also squander control. Tech is tech and can be used for both good and bad. It is not that a strategy that aims to cap downsides by preventing the proliferation of technology is inherently bad, but it is doomed to fail. The evidence for dysfunction is not the existence of Palantir but in the failure of the watchdog layer of society (also called the government).
No doubt Deloitte or any other contractor shop would be able to do the same thing - but they don’t choose to.
But if the major vendor and purveyor happens to be Blow Thine Enemies To Tiny Bits Incorporated, developing faster fuses and embedded shrapnel, then people are right to be concerned about "The TNT stuff".
I'm sorry but I absolutely disagree that the reason say Deloitte is leaving a few hundred billion dollars on the table is the presence of a moral compass.
The power shouldn't be solidified in a few hands period.
> And so its important for us to understand how they can be weaponized and to consider the social cost of that weaponization.
To be clear, I absolutely agree. Plenty of tech is double-edged. And Palantir very much so.
Let me restate my point. Palantir (or that class of tech products) is powerful at enabling visibility over a complex system. But visibility is not decisions; it is an input to decisions. If you had real-time telemetry from every single stomach, you could maybe automatically dispatch drones with food wherever someone is starving. Or you could use the data as a high-frequency indicator for a successful invasion. Morality is downstream of decisions, not data.
"oh it's just database joins" is about like me ripping your arms off and describing it as "chemical reactions"
https://aircraft.airbus.com/en/services/enhance/skywise-data...
They have a thriving commercial business outside of their government work. (Disclaimer: long PLTR)
This argument is both inconsistent and counterproductive.
Inconsistent as in, the harm to me from having my arms ripped off comes from you deciding to effect the intent to harm me. No photograph or x-ray of my arms can produce the intent of wanting to harm me.
Counterproductive as in, the "good vs bad" framing is pointless because it does not help with solutions. If your solution is to ban joins, you will have a hard time gaining traction for your cause. Strategic advocacy requires understanding axes along which you may be able to produce a coherent argument and gain leverage. "Ban joins" does not help.
It's data collection. It's privacy. If we're waiting around for the day people start acting ethically, we'll experience the heat death of the universe.
Governments can always turn evil. Companies can always be compelled. People can always turn evil.
We need to not give them the ammunition. We've cornered ourselves into a situation where we sacrifice our data and privacy, and we are forced to blindly trust it will not be used against us.
If we do not collect data, we cannot have data breaches. If we do not collect data, we cannot have mass surveillance. If we do not collect data, we cannot have wiretapping.
We've simply allowed and encouraged tech companies to collect as much data as humanly possible. That starts with Google, Meta, et al. We then trust they will not abuse it.
But they certainly can, and they certainly will. What is done now cannot be undone. We cannot take back data immortalized. But, what we can do is prevent new data collection.
Use private services. Run software locally when feasible. Deny analytics. Block advertisements. Use end-to-end encryption. Etc.
In truth, the rest of your argument is fully correct. Palantir is often portrayed as the "hacking American businesses" group, but that's NSO. Palantir is merely buying out the data from morally flexible telecoms and capricious cookie-laden websites. There is an uncomfortable truth about networked technology that America has swept under the rug for decades, and now we have entire businesses as a symptom of that failure. It's a sickening precedent for a free society.
I'd like to believe in a political solution to this. I've yet to see one, and the consequences of the Snowden leaks suggest we may never correct course here in America.
It's not that simple, since tech also enables bad that was previously not possible.
>> Sure. And if the bad it enables is worse than the good it enables, then tech is not really that neutral.
> Correct, but then there's all sorts of value judgments involved.
The problem is when the form of "bad" enabled has no remedy. For example, identifying potential dissidents and their "network of associates" to authoritarian regimes. In these cases, there is no amount of "things that are 'good'" which can offset the "bad".
Not to go [again] into the technical debate, I will summarise their stack: they use Spark as their foundation, with a simple pattern of materialisation as tables when needed, and possible synchronisation to an RDBMS or to a graph database, with a strong ontological layer on top.
They then provide a web app stack and a low-code/no-code dev environment. [there are other components in the platform, but let's keep it simple]
So no IT rocket science here, but the UX mostly hides all the IT bricks under a pure data-oriented workflow. [very few of my colleagues know what Python, Spark, or AWS are]
Three comments on this: could anyone rebuild such a platform? Yes. Is it worth it? Most companies will say no. Do SAP analytics tools compete? Time will tell.
Really, in Foundry the scalability is MASSIVE! [but keep in mind that this is an analytics platform, not a write-oriented platform]
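As a toy illustration of the "materialisation as tables" pattern mentioned above, here is a sketch in plain Python/SQLite standing in for Spark (all table and column names are hypothetical): an expensive derived view is written out once as a physical table, so downstream analytics read precomputed data instead of recomputing the pipeline.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE events (user_id INTEGER, kind TEXT);
INSERT INTO events VALUES (1, 'click'), (1, 'buy'), (2, 'click'), (2, 'click');
""")

# Materialise the derived view as a physical table, the way a pipeline
# would checkpoint a Spark DataFrame for downstream consumers.
conn.execute("""
CREATE TABLE user_summary AS
    SELECT user_id,
           COUNT(*)          AS n_events,
           SUM(kind = 'buy') AS n_buys
    FROM events GROUP BY user_id
""")

# Downstream analytics query the materialised table, not the raw events.
summary = conn.execute(
    "SELECT user_id, n_events, n_buys FROM user_summary ORDER BY user_id"
).fetchall()
print(summary)  # [(1, 2, 1), (2, 2, 0)]
```

The trade-off is the usual one for analytics platforms: fast reads over precomputed tables, at the cost of the materialised copy going stale until the next pipeline run.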
Now let’s switch to the political side of [the company called] Palantir:
Can such a platform be used by Santa Claus to monitor the data for the next Christmas? yes
Can someone decide to aggregate all the data of all the citizens and hope to do mass control with that? Probably [but hey, Facebook/Netflix/TikTok are already doing that, plus they are actively hacking your *brain*, and no one complains]
As a business, it is far more. Their FDEs are intrinsic to unlocking capabilities for customers.
Most of the time, companies that have systems like Palantir's (I'm thinking SAP, Oracle, blah blah) have to report earnings to the street through a 10-K or have to comply with regulations like Sarbanes-Oxley.
They will also have in-house IT staff to monitor the logs etc.
The programs installing the Foundry system have an incentive to hide the data from prying eyes, and therefore it never leaves the Palantir ecosystem. The government doesn't hire independent consultants, auditors, etc. to confirm whether it's being used or not.
They simply have to demonstrate trustworthiness to a security officer and hope an IG doesn't have an external equivalent of a forward-deployed engineer.
So while the technology is mediocre, the real issue is the nebulousness, the lack of auditability, and the fact that the people writing the checks are the same people signing them.
So I sympathize with Karp talking about the technology being fine; it's the apparatus surrounding it that says "just trust us" that gives pause, especially in today's culture of conflict.
If I told you that 90% of all transactions get routed through a foreign company's software, you might pause, but it's been like that for years (SAP). The difference is there are controls in place.
Which don't work out all that well in practice.
> on top to provide global visibility into any system.
Global visibility into the data. There's no guarantee your data and your performance match. We have so much data that the quality of much of it is fairly low.
> Tech is tech and can be used for both good and bad.
You can also just lie about what you're doing and use it as a cover for violations of civil rights and federal law.
I mean, if it's just "database joins," then why is the government buying this from a vendor? Shouldn't they just be able to _do that_?
Why does everyone go to Facebook [/or HN] instead of self-hosting their blog?
Because Facebook [/or HN] is a bunch of highly skilled experts that have shaped the proper UX for ubiquitous information exchange between humans [/ geeks].
So we, the users, can concentrate on our own business [/ trolls].
"[/or HN]" !
Jesus, lol.. Reminds me of when folks /emphasize/ words like /this/ in their comments. Anyways, I'm just joshin' ya. Personal blogs /would/ make for a superior internet as opposed to Facebook/X.
I mean, Palantir execs and their employees are part of society too, right? They themselves are making a moral choice when working on such technology instead of joining tables for hospitals.
If the acts of law-abiding individuals (or groups) are a net negative for society, that is not an individual failure. Fiduciary responsibility is a useful parallel: it is not the job of a sugar manufacturer to think about the public health aspects of sugar. Their responsibility to their shareholders is to produce clean, safe, and edible sugar at competitive prices and do a good job with marketing and distribution, that's all.
Or is it a corporation of people that is (I know, try to stifle the laughter) supposed to have at least some morality? I get it. Corporations haven't functioned as any institutional morality since their inception as a legal framework, despite the Supreme Court handing them immortal citizenship with effective privilege over any real citizen.
So far we have:
- masked paramilitary agents chasing down the lowest rungs of "at first they came for ..."
- deployed formal military to democratic cities for intimidation
- cowering, terrified tech CEOs embarrassingly kissing ass
- Threatened, capitulated universities, law firms, and fourth estate tv companies
- Massive amounts of purging of civilian institutions from any oversight
- Purging of military leadership based solely on blind loyalty to the president
- Massive fraud leading to a multibillion-dollar increase in Trump's wealth
- A supreme court that may as well have been disbanded that has handed unlimited privilege to Trump's executive branch
And waiting in the wings is Palantir-enabled TOTAL INFORMATION AWARENESS of the entire populace.
So back to Palantir, the absolved "just a tech firm" that has been providing turnkey authoritarian control to the US government for decades now. Of course it won't function as any bulwark against the coming storm.
Oh, I think I understand Palantir very well. Anyone that works there should know that you exist to set up totalitarianism. That is your function now. "Homeland defense" and all those weak USA PATRIOT act justifications and funding are now far in the rear view mirror.
Up ahead: Mount Totalitarianism.
Cloak yourself in doublespeak, Palantir.
I have likely marked myself for death.
When ChatGPT launched, Palantir's stock started climbing by selling its "AI platform".
The cycle follows a marketing funnel: AIDA - awareness, interest, decision, action. https://www.smartinsights.com/traffic-building-strategy/offe...
FUD: Awareness and interest (AI) - at the initial stages, doomer marketing by big tech to government about its dangers and regulations
https://en.wikipedia.org/wiki/Fear,_uncertainty,_and_doubt
FOMO: Decision and action (DA) - After selling, it is all about investing in infrastructure and adopting the technology
https://en.wikipedia.org/wiki/Fear_of_missing_out
Sentiment shift: https://news.ycombinator.com/item?id=44870777
Their primary technology predates any AI hype by a decade at least, and their strength has always been in deploying great engineers.
Sarah Brayne (2020) Predict and Surveil: Data, Discretion, and the Future of Policing, Oxford University Press.
As the book explains, Palantir is one of the largest companies specializing in surveillance data management services for law enforcement, the military and other corporations. Palantir does not own its data but rather provides an interface that runs on top of other data systems, including legacy systems, making it possible to link data points across separate systems. Palantir gathers its data primarily from "data brokerage firms," including LexisNexis, Thomson Reuters CLEAR, Acxiom, CoreLogic, Cambridge Analytica, Datalogix, Epsilon, Accurint. As Brayne observes, these data brokerage firms "collect and aggregate information from public records and private sources, e.g., drivers licenses, mortgages, social media, retail loyalty card purchases, professional credentials, charities’ donor lists, bankruptcies, payday lenders, warranty registrations, wireless access points at hotels and retailers, phone service providers, Google searches and maps geolocation, and other sources who sell your data to customers willing to pay for it. Yet it is difficult to fully understand the scope of the data brokerage industry: even the FTC cannot find out exactly where the data brokers get their information because brokerages cite trade secrecy as an excuse to not divulge their sources."
Why is this a concern for people living in a democratic society with a legal system that supposedly protects individual freedoms? "Big data companies argue that their proprietary algorithms and data are trade secrets, and therefore they refuse to disclose their data, code and techniques with criminal defense attorneys or the public" (p. 135). This means that, "In many cases it is simply easier for law enforcement to purchase data from private firms than to rely on in-house data because there are fewer constitutional protections, reporting requirements and appellate checks on private sector surveillance and data collection, which enables police to circumvent privacy laws" (pp. 24-5, 41-2).
Another way to phrase this is:
Why transform government into Big Brother[0], with
all the hassle of oversight and accountability
this would entail, when outsourcing to Big Friends
via handsome contracts will achieve the same result
while enabling "plausible deniability" under oath?
0 - https://en.wikipedia.org/wiki/Big_Brother_(Nineteen_Eighty-F...

And those third party companies can, if they choose, tell Palantir to pound sand if they don't have warrants.
The real problem is those third parties know a LOT about us, and it's essentially impossible to opt out of their data gathering. License plate scanners and credit bureaus, anyone?
And then those third party companies, if they're interesting enough to Palantir or those using Palantir, might get cancelled state contracts, or surprise tax audits, and other pressures... totally unrelated "of course"
I know this example is less exciting than spying on everyone but despite how they try to hype it up it's a lot more realistic use case.
Silicon Valley was supposed to do no evil; no wonder this generation hates tech bros.
The social media era (Fbook) is when it started feeling like "majority of new companies are evil". Of course, if Palantir is Sauron, Oracle is Morgoth..
Government contracting is an activity; Silicon Valley is a place. The two should be very far apart, and yet somehow they're joined at the hip.
In large part, government contracting is what created Silicon Valley. https://steveblank.substack.com/p/if-i-told-you-id-have-to-k...
> This work began with the observation that certain expressions have a drive-releasing effect, and this effect occurs not despite but because of their apparent irrationality. Expressions that blatantly contradict their own content offer actors the opportunity to formally acknowledge the normative order of their cultural environment while simultaneously expressing forbidden desires that violate the rules of this order. This, in turn, does not trigger cultural or social sanctions. On the contrary, such expressions solidify integration processes by making integration and its psychological costs bearable. Drawing from Adorno, I refer to such expressions as "Jargon." Jargon is not just a self-deception; it is a particular form of self-deception. It not only relieves the speaker but also integrates them into the circle of those who belong. Through Jargon, the present is embellished, rendered promising for the future, and thus made acceptable.
> However, Adorno's descriptions of aggressive actions expressed in Jargon are conceptually challenging to grasp. They slip away under the scrutiny of a rigorously working scholar. The translation of such impressions into a durable conceptual model encounters the limits of various social scientific traditions and quickly runs into difficulties. As much as the advantages of transferring Adorno's critique into a different conceptual framework are apparent, there is a risk that by relinquishing Adorno's premises, their critical rigor may disappear.
> Furthermore, this raises a series of questions that need to be addressed. For example, how can the complexity of modern society be taken into account without ignoring the instinctual elements of social action? What does an aggressive action expressed in Jargon actually look like, and what cultural significance would an action have that is transmitted through Jargon? Adorno's concept of Jargon can ignite a discussion about this. However, it leaves some problems untouched that I must address from my perspective. Adorno refrains from providing answers to such questions. He can afford to do so because he relies on premises that willingly accept a de-differentiation of the social world. Similarly, he does not discuss the specific cultural framework in which the aggressive action expressed in Jargon acquires its meaning. From the perspective of this work, it takes some imagination to understand how Jargon can play a role in integrating aggressive impulses within a coherent culture. The culture-specific transformation of aggression must also be a part of such an exposition. Adorno only partially acknowledges the cultural context in which this aggression expressed in Jargon acquires any meaning, or he does so in its subliminal form. It is evident that Adorno's approach is built upon precisely such culture-specific elements of the expression of aggression.
Some more discussion on a related story then:
What does Palantir actually do?
Thanks, redirected from https://www.techdirt.com/2025/09/11/how-palantir-is-mapping-...
It should be an absolute red mark to have this company, or any affiliated with it, on your CV. Absolutely anti-societal.
They might change aspects of oversight. They might diversify to avoid contract capture.
Sorry to be blunt, but government tends to be amoral: the public noise about things and the actual choices made are two different matters. Agencies of all kinds, from LEAs out to health, will ask for retained access to the joins over disparate data.
The same across the UK, Europe, and the OECD. Palantir is going to do very well into the future. Some politics will force change. The EU will eventually get robust, onshore, self-controlled data analytics and management.