Cambridge, 17-22 August 2026
Hardware Assurance Programme
Hardware assurance is a new, unsolved technical field: building chips that can make verifiable claims about what they’re computing, where they are, and at what scale. Six days in Cambridge to work on it together.
The week
How it works
Six days of talks, working sessions, and project scoping with researchers and practitioners building hardware assurance infrastructure. By Day 6, you present a technical proposal.
Pre-work
Context loading
Four weeks out: kick-off call and core reading list.
Two weeks out: small-group discussions to work through the readings and surface project interests.
Day 1
Arrive and orient
Opening session. Meet the residents and your cohort. The regulatory and geopolitical landscape.
Day 2
Threat models and proposals
The threat models that make this hard. What’s been proposed so far, from compute accounting to location verification. Form project groups.
Day 3
Technical deep dive I
Tamper resistance, confidential computing, attestation. Afternoon: working time on your project.
Day 4
Technical deep dive II
Inference verification and cryptographic approaches including zero-knowledge proofs. Afternoon: working time on your project.
Day 5
Who is building what
Organisations working on hardware assurance present their current work, where they are stuck, and what they need. Talks from Future of Life Institute, Amodo, Tampersec, and Seldon Labs. Time reserved for direct conversations about collaboration and hiring.
Day 6
Present your proposal
Present your scoping document to peers, residents, and representatives from the organisations working in this space. You leave with a technical proposal, feedback on it, and relationships with the people who can help take it further.
Residents
Who you'll work with
Residents are domain experts in hardware assurance who join the programme to give talks, run technical sessions, and provide feedback on participant projects.
Who this is for
Who should apply
Backgrounds we’re looking for
Experienced engineers working on chip design, firmware, FPGAs, embedded systems, secure hardware, cryptography, formal verification, or systems engineering. PhD researchers in electrical engineering, computer engineering, cryptography, or related disciplines. People with unusual backgrounds who can bring technical depth to the problem.
What you leave with
A scoped technical proposal you developed during the week. Direct relationships with organisations hiring and funding in this space. A clear path into further work.
What you'd work on
Open problems
A few examples to give a flavour. The field has many more, and new ones are surfacing as the work matures. The programme is designed as an on-ramp: after the week, participants are connected with organisations, funders, and projects to take their work further.
01
Tamper-evident enclosures for AI hardware
Modern AI accelerators can dissipate 1,200W and require liquid cooling. How do you seal them in a tamper-evident enclosure while keeping them cool? Insights from nuclear monitoring and banking HSMs may transfer, but this hasn’t been done for GPUs.
02
Mutually trusted attestation hardware
For international agreements, both parties need to trust the attestation hardware. The prover needs confidence it won’t exfiltrate data. The verifier needs confidence the attestations are genuine. How do you build hardware that highly sceptical actors on both sides can trust?
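The two-sided trust problem can be made concrete with a toy challenge-response sketch. Everything below is illustrative: real attestation uses asymmetric signatures rooted in a hardware key (e.g. TPM or TEE quotes), and the function names, quote structure, and symmetric-key shortcut are assumptions of this sketch, not any vendor's API. The point it illustrates: only a hash of the device's state leaves the chip (the prover's exfiltration concern), while freshness and authenticity remain checkable (the verifier's concern).

```python
import hashlib
import hmac
import secrets

# Toy sketch only. A symmetric key shared with the verifier stands in for
# a hardware-fused signing key; real schemes use asymmetric signatures.
DEVICE_KEY = secrets.token_bytes(32)

def prover_quote(firmware_blob: bytes, nonce: bytes) -> dict:
    """Prover side: attest to a measurement of the device's firmware.
    Only a hash leaves the device, never the firmware or workload itself."""
    measurement = hashlib.sha256(firmware_blob).digest()
    mac = hmac.new(DEVICE_KEY, nonce + measurement, hashlib.sha256).digest()
    return {"measurement": measurement, "mac": mac}

def verifier_check(quote: dict, nonce: bytes, expected_measurement: bytes) -> bool:
    """Verifier side: the nonce proves freshness, the MAC proves the quote
    came from the device, and the measurement must match approved firmware."""
    expected_mac = hmac.new(DEVICE_KEY, nonce + quote["measurement"],
                            hashlib.sha256).digest()
    return (hmac.compare_digest(quote["mac"], expected_mac)
            and quote["measurement"] == expected_measurement)

firmware = b"approved-firmware-v1"
nonce = secrets.token_bytes(16)  # verifier-chosen challenge, prevents replay
quote = prover_quote(firmware, nonce)
assert verifier_check(quote, nonce, hashlib.sha256(firmware).digest())
```

The open problem is precisely what this sketch assumes away: a key, and a measurement process, that both a sceptical prover and a sceptical verifier believe the other side cannot subvert.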
03
Inference verification at scale
Can you verify that a deployed AI model matches the one that was safety-tested, without disrupting the workload? Approaches include network taps with randomised recomputation and input-output fingerprinting. Prototyping and red-teaming are needed.
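One of the approaches mentioned, randomised recomputation against logged input-output fingerprints, can be sketched in a few lines. The "model" here is a stand-in function, and the rounding tolerance and log format are assumptions of this sketch; a real check would recompute sampled calls on the safety-tested reference weights.

```python
import hashlib
import random

def fingerprint(outputs):
    # Round before hashing so benign floating-point jitter between runs
    # doesn't trigger false alarms (the tolerance is an assumption here).
    rounded = tuple(round(x, 3) for x in outputs)
    return hashlib.sha256(repr(rounded).encode()).hexdigest()

def reference_model(x):
    # Stand-in for the safety-tested model.
    return [x * 2.0, x + 1.0]

def spot_check(inference_log, model, sample_size=3, seed=0):
    """Recompute a random sample of logged calls and compare fingerprints.
    Sampling keeps the overhead small relative to the deployed workload."""
    rng = random.Random(seed)
    sample = rng.sample(inference_log, k=min(sample_size, len(inference_log)))
    return all(fingerprint(model(x)) == fp for x, fp in sample)

# Log of (input, output-fingerprint) pairs captured at deployment time.
log = [(x, fingerprint(reference_model(x))) for x in range(10)]
assert spot_check(log, reference_model)

# A quietly swapped model is caught when a sampled entry mismatches.
swapped = lambda x: [x * 2.0, x + 1.1]
assert not spot_check(log, swapped)
```

The red-teaming question is what an adversary can do inside the tolerance: how similar can a swapped model be before sampled fingerprints stop distinguishing it?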
04
Workload classification from hardware telemetry
Can you determine what a chip is doing from external signals like power draw, memory access patterns, and network traffic, without seeing the workload itself?
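As a minimal illustration of the idea, a nearest-centroid classifier over two hand-picked power-trace features can already separate caricatured workload profiles. The traces, labels, and features below are invented for illustration, and real telemetry is far noisier; the open problem is whether signals like these carry enough structure to classify real workloads robustly, and how easily they can be spoofed.

```python
import statistics

def features(trace):
    # Two toy features of a power trace: average draw and variability.
    return (statistics.mean(trace), statistics.pstdev(trace))

# Hypothetical labelled reference traces (watts sampled over time).
reference = {
    "training":  [700, 710, 705, 698, 702],   # sustained, steady draw
    "inference": [300, 650, 280, 640, 290],   # bursty, request-driven
}
centroids = {label: features(t) for label, t in reference.items()}

def classify(trace):
    """Assign a trace to the workload class with the nearest centroid
    in feature space (squared Euclidean distance)."""
    f = features(trace)
    return min(centroids,
               key=lambda label: sum((a - b) ** 2
                                     for a, b in zip(f, centroids[label])))

assert classify([699, 703, 701, 700, 705]) == "training"
assert classify([310, 620, 295, 655, 300]) == "inference"
```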
Next Step
Interested?
Twenty spots. Travel, accommodation, and food covered, plus a £1,500 stipend. Register your interest and we’ll be in touch.