Forum Post: https://forum.logos.co/t/logos-circle-london-chapter-8/1632
Attendees: 10 (including host). 5 cancelled on the last day.
Summary:
Theme: The Cost of Automation & The Cost of Automation Failures.
Working framing: Britain is buying surveillance with money it doesn't have, on rails that don't work, from a state that won't tell us what it's doing.
Logos Circle 8 took the Big Brother Watch report Suspicion by Design (July 2025) as its starting point and used it to open a wider question:
What happens when a state with a 50-year track record of IT failure, the highest tax burden since 1948, and collapsing public trust decides to industrialise algorithmic profiling of its poorest citizens?
The data discussion converged on a single, hard-to-dispute fact: the DWP's Targeted Case Review programme, which is being scaled toward roughly 20 million people, is flagging benefit recipients as suspected fraudsters with a striking error rate - four in five reviewed claimants turn out to be receiving the correct amount of Universal Credit. The circle spent significant time unpacking what this means morally, fiscally, and operationally.
The most consequential outcome of the meeting was a shift in posture. The circle moved beyond a pure campaigning frame ("call out the abomination") toward an action-oriented question: what could Logos actually build for the people on the receiving end of these systems? A working concept emerged for a privacy-preserving reporting tool - in effect, a structured whistleblower channel for benefit claimants caught in the algorithmic net - which could feed a journalist-facing evidence base and, in a later iteration, support a "social loan" mechanism for people whose payments are suspended during review.