New blog post: Post-OCSP certificate revocation in the Web PKI.
With OCSP in all forms going away, I decided to look at the history and possible futures of certificate revocation in the Web PKI. I also threw in some of my own proposals to work alongside existing ones.
I think this is the most comprehensive look at certificate revocation available right now.
#security #WebPKI #LetsEncrypt #TLS #OCSP
jacksonchen666 :ms_agender_flag: might just delete mastodon*
in reply to Seirdy: typo (emphasis mine)
Ryan Bolger
in reply to Seirdy: Regarding ACME clients that support notBefore/notAfter, Posh-ACME also supports this via the LifetimeDays parameter.
poshac.me/docs/latest/Function…
I also wasn’t aware ZeroSSL had added support on the server side. So thanks for that.
Seirdy
in reply to Ryan Bolger: @rmbolger Sorry for the delay; updated to mention Posh-ACME.
Aside: I usually associate the term “Posh” with “POSIX Shell”, so the name really threw me for a loop.
Seirdy
Unknown parent: my rationale for using basic security measures as a filter is that I have to efficiently narrow down millions of domains to something I can manually check, and I might as well pick something positive.
after the “good security” filter, I’ll isolate domains with a “main” and “h1” tag with no trackers in a “good page content” filter. Then I’ll figure out how to narrow it down further before cursory accessibility reviews and reading what people post in the Tor Browser.
Seirdy
in reply to Seirdy: Partway through, I decided to start filtering out Nextcloud and Searx(NG) instances. I was already filtering out Masto instances and some others. I ran a second filter to check for the existence of hyperlinks on the page to avoid dead ends, and to ensure they don’t block Tor.
I filtered a subset of duplicates and handled a subset of redirects. I’m down to around 1.1k domains, around 350 of which are the ones that qualified from Tranco’s top 2.6M domains. Many more are from the HSTS Preload list and Internet.nl Hall of Fame. Around a couple dozen more are uniquely from my browsing history, site outlinks, old chatrooms, web directories, and other more obscure locations.
I can manually pare this down over a couple of weeks, but that’s too much work. I need to figure out the right set of additional filters. Maybe a “points system” for privacy, security, and accessibility features, then taking the top 250 domains with the most points.
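A minimal sketch of what such a points system could look like. The feature names and weights below are illustrative assumptions on my part, not the actual criteria or tooling:

```python
# Hypothetical "points system": score each domain by privacy, security,
# and accessibility features, then keep the top N. Feature names and
# weights are made-up placeholders, not the real filter.

WEIGHTS = {
    "tls13": 3,          # security
    "strict_csp": 3,     # security
    "no_trackers": 2,    # privacy
    "allows_tor": 2,     # privacy
    "has_main": 1,       # accessibility
    "meta_viewport": 1,  # accessibility
}

def score(features: dict) -> int:
    """Sum the weights of the features a domain satisfies."""
    return sum(w for name, w in WEIGHTS.items() if features.get(name))

def top_domains(domains: dict, n: int = 250) -> list:
    """Return the n highest-scoring domain names, best first."""
    return sorted(domains, key=lambda d: score(domains[d]), reverse=True)[:n]
```

Taking the top 250 by score avoids having to pick a single pass/fail cutoff for every feature at once.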
Tim Bray
in reply to Seirdy:
Seirdy
in reply to Tim Bray: @timbray Right now the filter requires TLSv1.3, a strict Content-Security-Policy header (with the exception of allowing unsafe-inline styles), no common tracking third parties in the CSP, and Tor access. Then it needs a “main”, an “h1”, an “a”, and a “meta viewport” element. I’ll then add a points system to cut the list to a third and manually review a few domains per day.
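As a rough sketch (not the actual tooling), the markup half of that filter could be checked with Python’s standard-library HTML parser; the TLSv1.3, CSP, and Tor checks would happen separately against the connection and response headers:

```python
# Check that a page contains "main", "h1", and "a" elements plus a
# viewport "meta" tag, using only the standard library. This is an
# illustrative sketch of the markup requirements, not Seirdy's code.
from html.parser import HTMLParser

class ElementFilter(HTMLParser):
    def __init__(self):
        super().__init__()
        self.found = set()

    def handle_starttag(self, tag, attrs):
        # Tags arrive lowercased; void tags like <meta> also land here.
        if tag in ("main", "h1", "a"):
            self.found.add(tag)
        elif tag == "meta" and dict(attrs).get("name") == "viewport":
            self.found.add("meta viewport")

def passes_element_filter(html: str) -> bool:
    """True iff the page has all four required markup features."""
    p = ElementFilter()
    p.feed(html)
    return {"main", "h1", "a", "meta viewport"} <= p.found
```

A crawler would run this on each fetched page and drop domains that return False before any manual review.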
Seirdy
in reply to Seirdy: Or I could run a subset of Axe-Core on every page and let my fans spin up.
Axe-Core is one of the only page-content checkers out there that doesn’t have a ton of false positives. Even the Nu HTML Checker (often incorrectly referred to as the HTML5 Validator; HTML5 can’t be validated) has a ton of them. But some of Axe’s errors, like dupe link names, are way too trivial compared to easy-to-spot manual-only checks like “this ‘h1’ is used for the site name but it should be used for the page title”.
khm
in reply to Seirdy: “main” element. I use “article” at the moment and this is the first I'm hearing of “main”. otherwise I think sciops.net meets these requirements... except not only do I not use HSTS, I expose content over http for accessibility reasons.
Seirdy
Unknown parent: @khm its existence hearkens back to the “standard” page layout most sites settled on early in the Web’s history: a “header”, a “main”, maybe a couple “aside” elements on the side, and a “footer”. A “skip to content” link, if it exists, should typically skip to the first non-decorative thing in “main”.
Viewing your post on the remote instance, I imagine that “main” may begin just before your profile banner.
khm
in reply to Seirdy • • •my activitypub software (snac2) does not use
main
. I'm willing to open a pull request to fix this if I can grasp the intent properly...one
main
tag for the feed body, with each post wrapped inarticle
tags?Seirdy
in reply to Seirdy: I ran an aggressive filter on the sites, but scrapped it because I had already seen too many of the personal sites that passed.
that filter mandated multiple of the following:
and all of the following:
Instead I’ll just manually comb through 100-200 domains a day in the Tor Browser to trim my way down to 500-600 sites or so, then figure out how to proceed. I’ll throw out dead ends, login pages, cryptocurrency, very corporate pages, pages for large organizations without much interesting reading material, LLM-related pages, and anything that doesn’t work in the Tor Browser’s “safest” mode (no media, JS, or a bunch of other features).
When I’m down to a few hundred I’ll probably run a mini version of Axe, decide on an actual points system, and spend more than a few seconds on each site looking for original writing, projects, and/or art and reviewing accessibility.