šŸ‘‹ hi, I’m basile

Bridging Journalism and Justice at IJF Perugia

I was back in Perugia in April for the International Journalism Festival, this time as part of a small group co-hosting a closed-door side event called Bridging Journalism and Justice. Starling Lab organised it alongside Airwars, IrpiMedia, Paper Trail Media, and Videre, and we deliberately kept it off the main programme to get a working conversation rather than a panel.

The room held journalists, investigators, and lawyers. The question we came in with was narrow and practical: what does it actually take for documentation of atrocity crimes to survive contact with a legal proceeding?

What I argued

My contribution tried to reframe what “good” looks like when digital evidence meets a courtroom – and to push back against the instinct to reach for a new tool as the answer.

Courts assessing digital material care about three things. Can you establish that this content is what you say it is, traceable to a real source? Has it been altered, and can you prove it? And can you account for everywhere it’s been and everyone who’s touched it? These are the questions of authenticity, integrity, and chain of custody – and they interact in ways that matter. Material can fail on any one of them even when the other two are solid.

What I find underappreciated is how many of these failures happen before anything reaches a legal team. Missing timestamps, unrecorded file transfers, compression applied without logging – none of these look like problems at the time. They become problems when someone needs to reconstruct the history of a file months later and the record simply isn’t there.
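To make that concrete: the fix is rarely exotic. Hashing a file at the moment of collection and appending every subsequent action to a log is enough to leave a reconstructable trail. The sketch below is illustrative only – the function names and the JSON-lines log format are mine, not any particular tool's – but it shows how little machinery the "account for everywhere it's been" question actually requires.

```python
import datetime
import hashlib
import json
import pathlib

def sha256_file(path: pathlib.Path) -> str:
    """Hash a file in chunks so large videos never load fully into memory."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def log_custody_event(logfile: pathlib.Path, asset: pathlib.Path,
                      action: str, actor: str) -> dict:
    """Append one custody event (who did what to which file, when)
    to an append-only JSON-lines log."""
    event = {
        "file": asset.name,
        "sha256": sha256_file(asset),
        "action": action,  # e.g. "collected", "transferred", "transcoded"
        "actor": actor,
        "utc": datetime.datetime.now(datetime.timezone.utc).isoformat(),
    }
    with logfile.open("a") as f:
        f.write(json.dumps(event) + "\n")
    return event
```

Because the hash is recorded with each event, any later alteration – a silent transcode, a compressed re-upload – shows up as a hash that no longer matches the log, which is precisely the record that so often "simply isn't there" months later.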

The tool conversation

We did talk about tools – there are genuinely good ones now. Capture applications that bake metadata in at the moment of documentation. Archiving infrastructure that survives platform takedowns. And emerging standards that carry provenance information with a file across its whole lifecycle – the space that evidx.de, which I’m building, is aimed at.

But the most interesting part of that conversation was where participants felt the gap most acutely. It wasn’t lack of software. It was the absence of simple, shared, consistent practice: an investigation plan people actually follow, a policy for how files get named and stored, someone whose job it is to know where the assets are. The boring stuff that makes sophisticated tools usable.
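As a trivial example of that boring stuff: a naming policy is only real if something checks it. The convention below (date, source, sequence number) is a made-up placeholder – any scheme works, so long as it is written down and enforced rather than left to habit.

```python
import re

# Hypothetical convention: YYYY-MM-DD_source_sequence.extension,
# e.g. "2024-03-17_witness-a_001.mp4". The specific scheme matters
# far less than having one scheme that a script can verify.
NAME_PATTERN = re.compile(r"^\d{4}-\d{2}-\d{2}_[a-z0-9-]+_\d{3}\.[a-z0-9]+$")

def check_names(filenames: list[str]) -> list[str]:
    """Return the filenames that violate the agreed convention."""
    return [n for n in filenames if not NAME_PATTERN.match(n)]
```

Run over an archive folder before each handoff, this turns "someone whose job it is to know where the assets are" from a memory exercise into a five-second check.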

What pulled at the room

A few tensions came up that I don’t think anyone resolved, but which deserve to stay in the conversation.

The people doing this work aren’t looking to add another system to their workflow. The bar for adoption is much higher than developers often assume, and solutions that require parallel processes will be ignored regardless of their merits.

There’s also a real equity problem lurking in any push to raise the evidentiary bar. If “verifiable” becomes a requirement, that requirement lands hardest on witnesses in conflict zones, community documenters, and sources operating without institutional support – exactly the people whose testimony is often most critical. A framework that systematically disadvantages them isn’t serving accountability.

And consent doesn’t end at collection. Managing what people agreed to share, under what conditions, over time, is an obligation that most current workflows aren’t built around at all.


The Starling Lab dispatch has the full write-up, including what the group is working on next.