Cambridge, August 2026

Hardware Assurance Programme

Export controls are tightening. The EU AI Act’s core obligations begin to apply in August. International agreements on AI development are being drafted. All of them depend on something we can’t yet do: verify what’s running on AI hardware. This six-day programme brings experienced hardware engineers and strong technical graduates to Cambridge to work on that problem. Fully funded, with a £1,500 stipend.

When

August 2026

Where

Cambridge, UK

Cohort

20 engineers

Support

Fully funded + £1,500 stipend

How it works

Six days of talks, working sessions, and project scoping with researchers and practitioners building hardware assurance infrastructure. By Day 6, you present a technical proposal.

A presentation at Meridian, Cambridge
Pre-work

Context loading

Four weeks out, a kick-off call introduces the programme and distributes the core reading list (O’Gara, FlexHEG, Petrie’s research priorities). Two weeks out, small-group discussion calls work through the readings and start surfacing project interests. You arrive ready to work.

Days 1 – 2

Why this matters

The regulatory and geopolitical landscape. The threat models that make this hard. What’s been proposed so far, from compute accounting to location verification. You meet the residents, form project groups, and start scoping.

Days 3 – 4

Technical deep dives

Tamper resistance and tamper-evident enclosures. Confidential computing and multi-chip TEEs. Attestation protocols, network taps, inference verification. Cryptographic approaches including ZKPs and proof-of-useful-work. Sessions led by people actively building these systems, paired with working time on your own project.

Day 5

Who is building what

Organisations working on hardware assurance present their current work, where they are stuck, and what they need. Amodo, Lucid Computing, Longview Philanthropy, and others. Time reserved for direct conversations about collaboration and hiring.

Day 6

Presentations

Present your scoping document to peers, residents, and representatives from the organisations working in this space. You leave with a technical proposal, feedback on it, and relationships with the people who can help take it further.

People who can do the work

Backgrounds we’re looking for

Experienced hardware engineers with five or more years in chip design, firmware, FPGAs, embedded systems, secure hardware, or cryptographic implementations. PhD students in electrical engineering, computer engineering, or related disciplines are also welcome. If you are willing to tackle a new and unfamiliar problem and can bring technical depth to it, please apply.

What you leave with

A scoped technical proposal you developed during the week. Direct relationships with organisations hiring and funding in this space. A path into further work if you want one.

Open problems

A few examples to give a flavour. The field has many more than these, and new ones are surfacing as the work matures. The programme is designed as an on-ramp: participants who want to keep going after the week will be connected with organisations, funders, and projects where they can spend more time on the problem.

Tamper-evident enclosures for AI hardware

AI accelerators dissipate up to 1,200 W and need liquid cooling. How do you seal them in a tamper-evident enclosure while keeping them cool? Insights from nuclear monitoring and banking HSMs may transfer, but nobody has done this for GPUs.

Mutually trusted attestation hardware

For international agreements, both parties need to trust the attestation hardware. The prover needs confidence it won’t exfiltrate data. The verifier needs confidence the attestations are genuine. How do you build hardware that highly sceptical actors on both sides can trust?

Inference verification at scale

Can you verify that a deployed AI model matches the one that was safety-tested, without disrupting the workload? Approaches include network taps with randomised recomputation and input-output fingerprinting. Prototyping and red-teaming are needed.
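To make the fingerprinting idea concrete, here is a minimal sketch (hypothetical, not a protocol from the programme materials): a deployer logs a hash over sampled input-output pairs bound to a model identifier, and a verifier recomputes the same samples on the safety-tested model and checks the hashes match. The function names and the toy model are illustrative assumptions.

```python
import hashlib
import hmac
import json

def fingerprint(model_id: str, inputs, outputs) -> str:
    """Hash a batch of input-output pairs together with a model identifier."""
    payload = json.dumps({"model": model_id, "io": list(zip(inputs, outputs))},
                         sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def toy_model(x: int) -> int:
    # Stand-in for a deterministic inference pass on the deployed model.
    return x * x + 1

# Deployer: log a fingerprint over sampled traffic.
sampled_inputs = [3, 7, 11]
logged = fingerprint("model-v1.2", sampled_inputs,
                     [toy_model(x) for x in sampled_inputs])

# Verifier: recompute the sampled inputs on the safety-tested model
# and compare fingerprints in constant time.
recomputed = fingerprint("model-v1.2", sampled_inputs,
                         [toy_model(x) for x in sampled_inputs])
matches = hmac.compare_digest(logged, recomputed)
```

The hard parts this sketch ignores, and which need prototyping, are non-determinism in real inference stacks, sampling without disrupting the workload, and binding the fingerprint to the physical hardware.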

Workload classification from hardware telemetry

Can you determine what a chip is doing from external signals like power draw, memory access patterns, and network traffic, without seeing the workload itself?
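As a toy illustration of the telemetry idea (a hypothetical sketch with synthetic numbers, not a result from this field): reduce a power trace to simple features and classify it against reference profiles. Real workload classification would need far richer signals and validated profiles.

```python
import statistics

# Synthetic per-second power draw (watts) for two hypothetical workload types:
# training shows sustained high draw; inference is burstier and lower on average.
TRAINING_PROFILE = [1150, 1180, 1170, 1160, 1175]
INFERENCE_PROFILE = [400, 900, 350, 880, 420]

def features(trace):
    """Reduce a power trace to (mean, standard deviation) features."""
    return statistics.mean(trace), statistics.stdev(trace)

def classify(trace, profiles):
    """Nearest-centroid classification in the (mean, stdev) feature space."""
    m, s = features(trace)
    def dist(profile):
        pm, ps = features(profile)
        return (m - pm) ** 2 + (s - ps) ** 2
    return min(profiles, key=lambda name: dist(profiles[name]))

profiles = {"training": TRAINING_PROFILE, "inference": INFERENCE_PROFILE}
label = classify([1100, 1165, 1140, 1155, 1170], profiles)
```

An adversary can shape power draw to mimic another workload, which is why this problem is posed alongside red-teaming: the open question is which external signals, if any, are robust to deliberate evasion.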

Who you’ll work with

Researchers and practitioners who join the programme to give talks, run sessions, and work alongside participants.

Péter Drótos

Hardware Engineer at the Future of Life Institute. Eight years as an ASIC verification engineer at ARM.

Harrison Gietz

AI Programme Associate at Longview Philanthropy.

James Petrie

Compute Security and Governance Specialist at the Future of Life Institute. Former firmware engineer at Intel.

Yannick Mühlhäuser

Policy Researcher at the Future of Life Institute.

More to be confirmed

Interested?

Twenty spots. Fully funded with a £1,500 stipend. Register interest and we’ll be in touch.

Register Interest

Small cohort. Fully funded. £1,500 stipend.

Questions? hello@cambridgeaisafety.org