“Ice is just around the corner,” my friend said, looking up from his phone. We were writing at a coffee shop in one of the oldest neighborhoods of New York City, where schools and churches support thriving migrant communities, as they have since long before the United States existed. Now the agents of this rogue federal agency – known for civil rights abuses including racial profiling, wrongful detention, medical neglect and inhumane detention conditions – were just footsteps away, shaking down our neighbors in their homes and at the park across the street.
A day earlier, I had met with foreign correspondents at the United Nations to explain the AI surveillance architecture that Ice is using across the United States. The law enforcement agency uses targeting technologies that one of my past employers, Palantir Technologies, has both pioneered and proliferated – tools I was once charged with illustrating as a graphic designer and writer, whose consequences I am only now coming to understand. Although largely invisible, technology like Palantir’s plays a major role in world events, from wars in Iran, Gaza and Ukraine to the detainment of immigrants and dissident students in the United States. But despite its ubiquity, lawmakers, technologists and the media are failing to protect people from the threat of this particular kind of weaponized AI and its consequences, partly because they haven’t recognized it by name.
Known as intelligence, surveillance, target acquisition and reconnaissance (Istar) systems, these tools, built by several companies, allow users to track, detain and, in the context of war, kill people at scale with the help of AI. They deliver targets to operators by combining immense amounts of publicly and privately sourced data to detect patterns, and are particularly helpful in projects of mass surveillance, forced migration and urban warfare. Also known as “AI kill chains”, they pull us all into a web of invisible tracking mechanisms that we are just beginning to comprehend, yet are starting to experience viscerally in the US as Ice wields these systems near our homes, churches, parks and schools.
The invisible nature of these surveillance structures – and how they influence our lives – is part of the reason the public understanding of what these tools do is so murky. It is also, however, what drew me to work for Palantir as an architecture writer. It was a chance to get to know the digital spaces where many people spend most of their lives today. Working with cloud software in offices, driving new cars in our commutes, doom-scrolling on social media at home – we all feed vast amounts of data to surveillance and targeting programs created by big tech which we often don’t recognize until it’s too late. This is why I continue trying to convey and illustrate how these Istar applications violate our civil rights and autonomy in increasingly perverse and violent ways.
The dragnets powered by Istar technology trap more than migrants and combatants – their families and connections are swept up as well. They appear to violate first and fourth amendment rights: first, by establishing vast and invisible surveillance networks that limit what people feel comfortable sharing in public, including whom they meet or where they travel; and second, by enabling warrantless searches and seizures of people’s data without their knowledge or consent. They are rapidly depriving some of the most vulnerable populations in the world – political dissidents, migrants, residents of Gaza – of their human rights.
There was a time when I wrote about the row homes in my neighborhood, with ornate windows and star-shaped iron studs, and how they welcomed migrants who sought hard work and opportunity in the US. With shared walls and affordable rents, they created tolerant and prosperous communities and accelerated the rise of the largest middle class in history. Now a new kind of architecture greets migrants and visitors to America and decides their future – one made not of bricks, mortar and lumber, but of invisible and invasive digital surveillance systems.
With names like Investigative Case Management (ICM) and ImmigrationOS, the big data platforms Palantir provides to the Department of Homeland Security – like those it offers the IDF – are fundamentally composed of three shared elements: the underlying data integrated into the system, the interpretation and modeling of that data through analytics, and the execution of automated actions – with or without human involvement. At every layer of this architecture there are significant ethical questions regarding civil rights, data collection, data quality, bias, discrimination, accuracy, automation and, most importantly, accountability.
Ultimately, however, these platforms generate and track targets by exploiting a mind-boggling range of datasets. This can include deeply personal information such as biometric and medical data, social media data involving friends and family, precise location data derived from license plate readers, SIM card data, and surveillance drone data. They can also process data purchased from a thriving ecosystem of private data brokers, or subpoenaed from companies such as Waymo and Meta. The lack of transparency regarding datasets exploited in these applications, and how they are shared across systems, further distorts the picture. That’s why it’s important to focus on the victims.
Soon, Trump’s mass deportation agenda – from targeting and tracking to managing the arrest and removal of migrants from the country – could be seamlessly coordinated using Istar tools. Ice recently paid Palantir tens of millions of dollars to enable “complete target analysis of known populations”, bolstering the Trump administration’s deportation efforts. In Gaza, Palantir provides the IDF with critical data infrastructure for war-related missions. The Israeli armed forces, meanwhile, have developed Istar tools of their own, such as “Where’s Daddy”, which follow targets to their family homes for execution via cheap, unguided “dumb bombs”.
Palantir has contested reports that it conducts widespread surveillance of Americans and says it is “committed to defending human rights”. For all the reasons above, I reject those claims. It is time to embrace the cause of privacy again, or we will witness the unbridled proliferation of these targeting tools in our commercial and public lives. As AI targeting technologies become more normalized in the United States, they are increasingly adopted by the private sector as well: companies build their own data dragnets with platforms like Palantir to target their customers and employees – not to kill or deport them, but to shape their behavior and maximize revenue, further entrenching systems of control.
Unfortunately, the fight for civil rights in the face of AI is faltering at both the federal and state levels. In Colorado, the nation’s first consumer protection laws on AI – which aim to protect state residents from discrimination – are now under threat. That is why, last month, I took to the streets of Denver along with around 40 other activists to march from the state capitol to Palantir’s headquarters. We were joined by protesters from coast to coast – in Washington DC, New York, Palo Alto and Seattle – who, driven by loose connections but a shared cause, picketed Palantir’s offices in their own cities. Four people were arrested in New York, and in Denver our small group was met with an impressive and coordinated show of force: we faced roughly as many police officers as protesters along our two-mile route, as they shut down streets and followed our convoy with drones.
Riding in my truck bed, I yelled out for the release of my neighbors from Ice custody – Eric Sanchez Goitia, Jeanette Vizguerra, and Nixon and Dixon Perez – people who, like me, have built their entire lives in Colorado as immigrants striving to contribute, learn and work for this country. That they are imprisoned and driven out by technologies made in their home state, paid for with their own tax dollars, is a stain on our state’s history. As we approached the capitol on our return, I begged the mayor of Denver, Mike Johnston, the governor of Colorado, Jared Polis, and our representatives to pay attention to the homegrown technologies that are harming our neighbors, to stop blocking the country’s first AI consumer protections from being implemented and to defy the federal government as it seeks to build more detention centers in our state.
The Colorado senate is now meeting for a budgetary special session, where the state’s first-in-the-nation AI consumer protections are at risk of being watered down, delayed or dismantled by venture capital interests. Our protests this week will therefore target not only Palantir’s headquarters but also the state capitol as representatives deliberate on the measure. Protesters will support the new AI Sunshine Bill, a streamlined version of the consumer protection bill, and oppose a bill backed by big business that would strip individuals of their right to sue AI companies. Our movement has also grown, with more than 40 marches against big tech and Palantir planned across the country this weekend.
These days, I’m more worried about when, not if, I’ll become a target. I am a freelance journalist, I am an immigrant and I have demonstrated in support of Palestine – and these technologies have been used to target all three categories of people. Nonetheless, my fears don’t compare to the experiences of those targeted by the IDF or Ice using Istar tools – including journalists killed in Gaza by targeted airstrikes, and migrants suffering inhumane conditions every night in their cells. If anything, I am exactly the kind of person who should try hardest to understand the real-world consequences of this tech, having once participated in its dissemination. I only hope more tech workers and policymakers will do the same.
Juan Sebastian Pinto is a writer, designer and civil rights organizer based in Denver and New York City