Palantir Secures FCA Contract, Raising Privacy Concerns
The Financial Conduct Authority (FCA), the UK’s financial regulator, has awarded a three-month contract to Palantir Technologies to analyze its internal intelligence data in an effort to combat financial crime. The deal, worth over £30,000 per week, has sparked renewed debate over the US company’s expanding role within the British state and over the potential privacy implications.
Palantir’s Role and the FCA’s Objectives
The FCA intends to leverage Palantir’s AI-powered Foundry platform to analyze a vast “data lake” encompassing intelligence related to fraud, money laundering, and insider trading. The regulator oversees 42,000 financial services firms, ranging from major banks to cryptocurrency exchanges, and aims to improve its ability to identify and address rule-breaking within the sector. The Guardian reports that the trial period could lead to a full procurement of an AI system.
Growing Concerns Over Data Privacy
The contract has triggered significant privacy concerns, particularly given Palantir’s history and the sensitive nature of the data involved. The data includes recordings of phone calls, emails, and social media posts, as well as consumer complaints. Sources within the FCA have expressed worries about the potential for Palantir to misuse the information gained from understanding the regulator’s threat detection methods.
Palantir’s Expanding UK Presence
This latest contract adds to Palantir’s already substantial footprint in the UK public sector, where its deals now exceed £500 million in total. The company currently holds contracts with the National Health Service (NHS), the Ministry of Defence (MoD), and various police forces. Newsbytes highlights previous contracts including a £330 million NHS deal and a £240.6 million MoD contract.
Mitigation Measures and Contractual Safeguards
The FCA has implemented several measures to mitigate privacy risks. Palantir will operate as a “data processor” rather than a “data controller,” meaning it can only process data under the FCA’s direct instruction. The regulator will maintain exclusive control over encryption keys for sensitive files, and all data will be hosted and stored within the UK. Palantir is also obligated to destroy the data upon completion of the contract and relinquish any intellectual property derived from it. The FCA considered using synthetic data but ultimately decided real data was necessary for a meaningful test.
Political and Ethical Considerations
Palantir, co-founded by Peter Thiel, a prominent supporter of Donald Trump, has faced scrutiny due to its work with controversial entities like the Israeli military and US Immigration and Customs Enforcement (ICE). Critics have labeled the company “highly questionable” and “ghastly.” The UK government has acknowledged the need to reassess its partnerships with large technology firms, with science minister Patrick Vallance promising a focus on national control and security.
Expert Perspectives
Professor Michael Levi, an expert in money laundering at Cardiff University, acknowledges the potential value of AI in tackling financial crime but raises concerns about the possibility of Palantir sharing its methodologies with associates. Christopher Houssemayne du Boulay, a barrister specializing in financial crime, emphasizes the privacy risks of ingesting vast amounts of personal data into an AI system.
Palantir referred comment requests to the FCA. The FCA stated that effective use of technology is vital in the fight against financial crime and that a competitive procurement process was followed with strict data protection controls in place.