UK Firm Not Racist for Rejecting Chinese Applicant Due to Security Concerns, Tribunal Rules
The Guardian
Details
- Date Published
- 17 June 2025
- Priority Score
- 2
- Australian
- No
- Created
- 22 June 2025, 07:13 pm
Description
Judge says refusing to hire people from ‘hostile’ states to roles that deal with national security is not discrimination
Summary
The article reports on a UK employment tribunal ruling that cleared Binary AI Ltd of racial discrimination after it rejected a Chinese applicant on national security grounds. The tribunal found that refusing employment to applicants from 'hostile' states such as China and Russia, where their nationality makes security clearance impossible, does not constitute racial discrimination. The case intersects with AI governance and safety by highlighting the role of national security considerations in AI development and the measures used to mitigate risks from foreign actors. The decision has implications for policies on AI safety in industries exposed to geopolitical conflict or espionage threats.
Body
Binary AI had had a contract with the Defence Science and Technology Laboratory at Porton Down and the Ministry of Defence to develop AI. Photograph: Martin Argles/The Guardian

Refusing to give a job to Chinese and Russian people in companies that deal with issues of national security and require security clearance is not racist, an employment tribunal has ruled.

It is not discriminatory to stop people from “hostile” states taking up certain jobs in the defence sector because of the risk to British security, the judgment says.

The ruling relates to the case of a Chinese scientist who accused a British AI company with ties to the UK and US defence departments of racism after she was not given a job because of security concerns.

Tianlin Xu applied for a role at Binary AI Ltd but the founder of the software company, James Patrick-Evans, turned her down and employed a British man instead.

He emailed her: “Disappointingly I’ve come to the decision not to proceed with your application on the sole basis of your nationality.

“As a company, we work closely in sensitive areas with western governments and wish to continue to do so. We’re simply not big enough of a company to ensure the separation and security controls needed to hire someone of your nationality at this stage.”

Judge Baty, sitting in London, described the email as clumsy and said: “In complete isolation, it looks like an admission of direct race discrimination on the basis of nationality.”

But he said in fact Xu had been turned down because she would not get security clearance owing to her nationality.

The judge said: “That reason would apply to people of any nationality where it was not possible to get security clearance (including Russian, North Korean and Iranian nationality as well as Chinese nationality). The reason is not nationality per se.”

Patrick-Evans was “strongly advised against hiring a Chinese national” by defence officials he worked with, the tribunal heard.

Binary AI had had a contract with the Defence Science and Technology Laboratory – the secret site based at Porton Down in Wiltshire – and the Ministry of Defence to develop AI that could identify hidden “back doors” inside software.

Baty said in his judgment: “It is obvious that software drives the modern world. It underpins our everyday lives and runs every sector of our state.

“Therefore, it is paramount that the security and operational capability of the software that drives our everyday lives should remain intact and free from malicious hackers and state actors wanting to persuade political outcomes or obtain sensitive information.”

Xu’s complaints of direct and indirect race discrimination both failed.