Countering toxicity and fake takes is not just a technical question (moderation, admins, etc.) but one of social curation, both collective and private: which people and groups do you/we trust, and for what? No tool can automate that.

#deltachat is for private e2ee messaging and mutually curated group chats, not random social-media-propelled onboarding. We are with #Signal here, and not with #simplex and #matrix, which offer public anonymous discovery of large chat groups full of unknown folks.

in reply to Delta Chat

How do you reckon with the notion that your app could be used by groups to organise and cause real harm in the world? Is this just the cost of creating secure messaging? Do you think there are ways to cultivate a positive culture around your software, or is this something you think you should stay separate from? (Genuine questions, I'm not trying to “gotcha” you here. I think these are real and difficult considerations for a lot of FOSS projects right now.)
in reply to Aaron Caskey-Demaret

very fair questions! We don't take a neutral position on how our efforts are used and who benefits from them. We choose and curate our audiences, collaborations and designs carefully. The challenges with authoritarianism and violence are real, even if the impact is distributed very unevenly across regions, skin colors and genders. Supremacists tend to thrive on and depend on hierarchical (mob) organizing, and we lack much of the UX for that. Somehow we never quite get to prioritize it :)
in reply to Delta Chat

agreed! Signal groups do have admins though, who can add & remove members.

support.signal.org/hc/en-us/ar…

For medium-sized groups (~30 to ~50 people) this can be essential.

Is this a use case that you would like to support?