New blog post: Post-OCSP certificate revocation in the Web PKI.
With OCSP in all forms going away, I decided to look at the history and possible futures of certificate revocation in the Web PKI. I also threw in some of my own proposals to work alongside existing ones.
I think this is the most comprehensive look at certificate revocation in the Web PKI right now.
#security #WebPKI #LetsEncrypt #TLS #OCSP
jackso
in reply to Seirdy: typo (emphasis mine)
Ryan Bolger
in reply to Seirdy: Regarding ACME clients that support notBefore/notAfter, Posh-ACME also supports this via the LifetimeDays parameter.
poshac.me/docs/latest/Function…
I also wasn’t aware ZeroSSL had added support on the server side. So thanks for that.
Seirdy
in reply to Ryan Bolger: @rmbolger Sorry for the delay; updated to mention Posh-ACME.
Aside: I usually associate the term “Posh” with “POSIX Shell”, so the name really threw me for a loop.
Seirdy
Unknown parent: my rationale for using basic security measures as a filter is that I have to efficiently narrow down millions of domains to something I can manually check, and I might as well pick something positive.
after the “good security” filter, I’ll isolate domains with a `main` and `h1` tag, with no trackers, in a “good page content” filter. Then I’ll figure out how to narrow it down further before cursory accessibility reviews and reading what people post in the Tor Browser.
Seirdy
in reply to Seirdy: Partway through, I decided to start filtering out Nextcloud and Searx(NG) instances. I was already filtering out Masto instances and some others. I ran a second filter to check for the existence of hyperlinks on the page to avoid dead ends, and to ensure they don’t block Tor.
I filtered a subset of duplicates and handled a subset of redirects. I’m down to around 1.1k domains, around 350 of which are the ones that qualified from Tranco’s top 2.6M domains. Many more are from the HSTS Preload list and Internet.nl Hall of Fame. Around a couple dozen more are uniquely from my browsing history, site outlinks, old chatrooms, web directories, and other more obscure locations.
I can manually pare this down over a couple weeks but that’s too much work. Need to figure out the right set of additional filters. Maybe a “points system” for privacy, security, and accessibility features and then taking the top 250 domains with the most points.
Seirdy
in reply to Tim Bray: @timbray Right now the filter requires TLSv1.3, a strict content-security-policy header (with the exception of allowing unsafe-inline styles), no common tracking third-parties in the CSP, and that the site allows Tor. Then it needs a `main`, `h1`, `a`, and `meta viewport` element.
I’ll then add a points system to cut it to a third and manually review a few domains per day.
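A minimal page satisfying those element requirements might look like this (an illustrative sketch, not taken from the actual filter):

```html
<!doctype html>
<html lang="en">
<head>
  <meta charset="utf-8">
  <!-- the viewport meta element the filter checks for -->
  <meta name="viewport" content="width=device-width, initial-scale=1">
  <title>Example page</title>
</head>
<body>
  <main>
    <h1>Page title, not the site name</h1>
    <p>Some content with <a href="/posts/">a link</a>.</p>
  </main>
</body>
</html>
```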
Seirdy
in reply to Seirdy: Or I could run a subset of Axe-Core on every page and let my fans spin up.
Axe-Core is one of the only page-content checkers out there that doesn’t have a ton of false positives. Even the Nu HTML Checker (often incorrectly referred to as the HTML5 Validator; HTML5 can’t be validated) has a ton of them. But some of Axe’s errors, like duplicate link names, are way too trivial compared to easy-to-spot manual-only checks like “this `h1` is used for the site name but it should be used for the page title”.
khm
in reply to Seirdy: `main` element. I use `article` at the moment and this is the first I'm hearing of `main`. otherwise I think sciops.net meets these requirements... except not only do I not use hsts, I expose content over http for accessibility reasons
Seirdy
Unknown parent: @khm its existence hearkens back to the “standard” page layout most settled on early in the Web’s history: a `header`, a `main`, maybe a couple `aside` elements on the side, and a `footer`. A “skip to content” link, if it exists, should typically skip to the first non-decorative thing in `main`.
Viewing your post on the remote instance, I imagine that `main` may begin just before your profile banner.
khm
in reply to Seirdy: my activitypub software (snac2) does not use `main`. I'm willing to open a pull request to fix this if I can grasp the intent properly... one `main` tag for the feed body, with each post wrapped in `article` tags?
Seirdy
in reply to Seirdy: I ran an aggressive filter on the sites, but scrapped it because I had already seen too many of the personal sites that passed.
that filter mandated multiple of the following:
and all of the following:
Instead I’ll just manually comb through 100-200 domains a day in the Tor Browser to trim my way down to 500-600 sites or so, then figure out how to proceed. I’ll throw out dead ends, login pages, cryptocurrency, very corporate pages, pages for large organizations without much interesting reading material, LLM-related pages, and anything that doesn’t work in the Tor Browser’s “safest” mode (no media, JS, or a bunch of other features).
When I’m down to a few hundred I’ll probably run a mini version of Axe, decide on an actual points system, and spend more than a few seconds on each site looking for original writing, projects, and/or art and reviewing accessibility.
Seirdy
in reply to Seirdy: `nav`, avoids `div` soup), and a quick run of axe-core. about a minute per site. this will take several more days before i’m ready to build a directory of the survivors and give a proper look at each one.
Seirdy
in reply to Seirdy: I should document how I do these incomplete-but-helpful “lightning audits” more thoroughly. After looking at a hundred sites, the process has become automatic.
biggest things I look for in an automated audit like Axe are skipped heading levels, missing landmarks (`main` is a big one), and missing alt attributes (mainly on non-decorative images, though decorative images should also have an empty `alt`).
with inspect element I also look for some semblance of page structure: is it all `div` soup, or is there a `header`, `nav`, `main`, and `footer` when applicable?
I open the site in a regular browser profile and in my personal profile with an adblocker and forced colors mode, and make sure that tabbing around works in both with focus indicators.
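The landmark structure described above, as opposed to anonymous `div` soup, might be sketched like this (a generic illustration, not any audited site):

```html
<body>
  <header>
    <!-- a "skip to content" link should target the first
         non-decorative thing in main -->
    <a href="#content">Skip to content</a>
    <nav><!-- site navigation --></nav>
  </header>
  <main id="content">
    <h1>Page title</h1>
    <p>Primary content lives here.</p>
  </main>
  <footer><!-- site footer --></footer>
</body>
```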
Automated contrast checks are good but not terribly nuanced. A more nuanced check like APCA, which accounts for font size, font weight, and the type of element (decoration? a spot element like a superscript? fluent text?), is what we should use, but that takes time. For a lightning audit I just eyeball it and flag it if the contrast seems very obviously bad.
Seirdy
in reply to Seirdy: I used to think that contrast was talked about so much only because violations were common and easy to spot, not because it was one of the most important issues.
Then I started using a shitty dim screen at night with screen gamma adjustment and extra-strong nighttime orange-tinted blue-blocking computer glasses and it got personal.
I don’t think everything should be perfect under such extreme conditions; your visited links and unvisited links appear to have the same hue with a low-contrast night-optimized display. but I should be able to read a paragraph of text, and see the beginnings and ends of links.
Seirdy
in reply to Seirdy: www.marginalia.nu
Seirdy
in reply to Seirdy: almost done checking the ten millionth domain lmao
i narrowed 5m domains to around 300. i’m hoping my quality filters will give me 500 sites to work with. then I can start being ✨subjective✨ and narrow it down to 200-300 interesting ones for a directory, plus a hall of fame containing maybe 25 sites.
Seirdy
in reply to Seirdy: a `main` and `h1` element in the raw HTML response. Content outside landmarks and misuse of headings are the most common non-color violations, and a missing `h1` happens almost as often as using an `h1` as a site title instead of a page title.
Seirdy
Unknown parent: www.marginalia.nu
Seirdy
in reply to Seirdy: Some of the most common #accessibility issues I see in the shortlist of 300-400 sites (filtered from 10 million):
- `header`, `main`, `section`, `footer`, and/or `aside` are what you typically want at the top level, directly under `body`. `main` is the most important.
- An `h1` that titles the page, not your entire website. Don’t skip heading levels just to get smaller text. Don’t use headings for stylized text. A lower heading following a higher heading looks like a subtopic of the higher heading, not its own thing.
- Respect `prefers-reduced-motion`.
Link imperceptibility, missing landmarks, and heading misuse are really common.
A common nit-pick: lists of links (e.g. in `nav`) would benefit from `ul` or `ol` parents.
A common issue that isn’t exactly an accessibility issue: toggles like hamburger menus that require JS don’t work in the Tor Browser’s “safest” mode. I’m looking at simple websites that have no need to exclude anonymous visitors.
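The link-list nit-pick, sketched in markup (illustrative only): wrapping `nav` links in a `ul` is what lets screen readers announce item counts and offer list navigation.

```html
<nav>
  <ul>
    <li><a href="/">Home</a></li>
    <li><a href="/posts/">Posts</a></li>
    <li><a href="/about/">About</a></li>
  </ul>
</nav>
```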
Seirdy
in reply to Seirdy: `h1` descendants of other headings, or `h2` descendants of anything other than `h1`. Levels do not reset when you enter a child sectioning element, even `article`.
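A markup sketch of that rule (illustrative): heading levels continue across sectioning elements rather than restarting inside them.

```html
<h1>Page title</h1>
<h2>Recent posts</h2>
<article>
  <!-- continue from the surrounding level: h3, not a fresh h1 or h2 -->
  <h3>Post title</h3>
  <h4>A subsection of the post</h4>
</article>
```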
Seirdy
Unknown parent: @toastal AT users are used to list navigation. Screen readers also do neat things like announce the number of items. “list with 136 items” may not be worth hearing all the way through, but “list with eight items” might be different.
If something semantically makes sense, it should receive the appropriate semantic markup even if the presentation is visually worse in a given browser. Presentation should not be a major concern of the markup.
Seirdy
Unknown parent: @toastal A list of navbar links being marked up as a list is a very standard pattern that people and ATs have come to expect, just like how pagination links or table-of-contents links are list entries.
If you have a list of short non-landmark items, or several consecutive standalone items of the same type (single standalone sentences, images in a gallery, links, entries in a post archive, etc.), they should be a list for consistent navigation.
If each paragraph is its own item and not part of the same work or article (e.g. untitled entries on a microblog), they should also be contained in list entries. See the list of `h-entry` microblogs on tantek.com/ for an example.
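The microblog-as-list pattern could be sketched like this (a rough illustration of `h-entry` entries in a list, not Tantek's actual markup):

```html
<ul>
  <li class="h-entry"><p class="e-content">First untitled note.</p></li>
  <li class="h-entry"><p class="e-content">Second standalone note.</p></li>
  <li class="h-entry"><p class="e-content">Third note in the feed.</p></li>
</ul>
```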