What is Hackers' Pub?

Hackers' Pub is a place for software engineers to share their knowledge and experience with each other. It's also an ActivityPub-enabled social network, so you can follow your favorite hackers in the fediverse and get their latest posts in your feed.

V2 BETA IS OUT. Whatever bugs have been plaguing you in Lip Gloss, Bubble Tea, or Bubbles v1 are more than likely already fixed in this release. We have put a lot of design consideration into improving performance and user experience while minimizing the affected API surface.

Let us know whatcha think about it! If there are any changes you'd like to see that may require changes to the existing API, now's the time!

github.com/charmbracelet/lipgl
github.com/charmbracelet/bubbl

Come to think of it, I'd drifted away from tech conferences these past few years, and now I hear a culture has sprung up of waving glow sticks ("blades") during talks. I'm floored.

Oh no! My friend Scot Kamins died! Four months ago. The nicest curmudgeon you'd ever meet. He'd got a bit hermitish in later years and so I didn't get to visit him.

He wrote the big HyperCard book for Apple, back when. Great guy. We were good friends in the '80s and '90s. He wrote FIDO THE BOOK, the printed manual for FidoNet.

We were regulars at Cafe Flore in San Francisco back when Doris Fish used to dance on the counter, and Mahmoud still worked there.

He loved dogs. When I knew him his big dog Walter was the laziest dog in the world. But goofy sweet. I think that was the last time Scot put white carpeting in the living room.

modernlib.com/FAQpages/modernl

I'm excited to share that The Indie Beat has added @futzle (Deborah Pickett)'s "Texts With Void" to the spoken word channel!

theindiebeat.fm/spoken-word/

HT to @limebar for getting the ball rolling ❤️

Originally recorded for the launch of the Radio Free Fedi Word channel in August 2024, these audio dramatizations depict imagined text message conversations with The Void, into which one screams. And they're funny!

French civil engineer Charles Joseph Minard was born in 1781. He was known for his contributions to information graphics, including his famous map of the losses suffered by Napoleon during the 1812 Russian campaign.

Writing about Minard's map, Edward Tufte said “It may well be the best statistical graphic ever drawn.”

Image: Charles Minard / Public domain

Minard’s map famously encodes multiple streams of data describing Napoleon’s disastrous 1812 Russia campaign. A thick beige band, which grows narrower from the left edge (Europe) to the right edge (Moscow) of the image, shows the number of troops making up Napoleon’s force. A black line that narrows from right to left across the page shows the same number on the return journey. A plot at the bottom shows temperature on the journey, and numerous other pieces of information are inscribed on the map.

Tasty decaf beans from Identity Coffee Lab: Colombia Nariño El Tablón Pink Bourbon. Of the decafs I've tried recently, this one expresses fruit-like acidity especially well, so I've been drinking it often. Sadly, it's sold out at the moment..

One of my main gripes: HTML forms can only use GET and POST methods.

If <form> also supported PUT and DELETE, you could easily map CRUD to HTTP methods and that would feel so much cleaner than splitting functionality for the same types over a bunch of different paths.
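
In the meantime, the usual workaround is to intercept the form's submit event and issue the request yourself. A minimal TypeScript sketch, assuming hypothetical markup like <form id="edit" action="/items/42" data-method="PUT">:

    // Emulate PUT/DELETE for an HTML form, since <form method> only allows GET/POST.
    const form = document.querySelector<HTMLFormElement>("#edit");

    if (form) {
      form.addEventListener("submit", async (event) => {
        event.preventDefault(); // suppress the native GET/POST submission

        const method = form.dataset.method ?? "POST"; // the verb <form> cannot express
        const response = await fetch(form.action, {
          method,
          body: new FormData(form), // sent as multipart/form-data
        });
        if (!response.ok) {
          console.error(`${method} ${form.action} failed: ${response.status}`);
        }
      });
    }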

Content Classification System Post Mortem

The IFTAS CCS project was a pilot project to provide CSAM detection and reporting for Mastodon servers. The bulk of the project ran for 26 weeks, and while we cannot afford to maintain the service any longer, the findings below can inform future projects. All numbers are rounded for readability.

Pilot Activity

CCS received posts from eight services hosting roughly 450,000 user accounts, of which about 30,000 were active monthly.

Our participants represented a range of service sizes from <10 to >100,000 accounts, and a range of registration options (open registration, open subject to approval, invitation only).

During the pilot period, CCS received 3.9 million posts via webhook, or 23,000 per day. These posts represent messages that were authored by or interacted with by the participating services’ active users, leading to media being stored on the host service.

Just under 40% (1.55 million) of all posts received included one or more media attachments to classify, leading to 1.86 million media files to hash and match. Posts with no media were discarded by the service.
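
That filtering step is simple to picture. A minimal sketch in TypeScript, where the Post and Attachment shapes and the enqueueForHashing helper are illustrative assumptions, not the pilot's actual code:

    // Ingest step: accept a post from a webhook, drop it if it has no media,
    // otherwise enqueue each attachment for hashing and matching.
    interface Attachment { url: string; mediaType: string }
    interface Post { id: string; attachments: Attachment[] }

    const queue: Attachment[] = []; // stand-in for a real job queue

    function enqueueForHashing(media: Attachment): void {
      queue.push(media);
    }

    function handleWebhookPost(post: Post): void {
      if (post.attachments.length === 0) return; // ~60% of posts carried no media

      for (const media of post.attachments) {
        enqueueForHashing(media); // 1.55M media posts yielded 1.86M files
      }
    }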

Of the 1.86 million files, small numbers were either unsupported formats (~2,000) or no longer available when CCS attempted to retrieve the media for classification (~1,600). An additional ~3,100 media files failed to download.

In total, of the 1.86 million media files sent to IFTAS for classification, 99.665% were hashed and matched.

The hash matcher flags media for human review if it finds a match or a near match, and after review IFTAS filed 53 reports related to 80 media files with NCMEC. This works out to 4.29 matches per 100,000 media files. A further set of matched media files was beyond our reviewers' expertise to classify adequately, so we elected not to report them.
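
The "match or near match" test in systems like this is typically a Hamming-distance comparison between perceptual hashes. A minimal sketch, assuming PDQ-style 256-bit hashes stored as byte arrays; the threshold is illustrative, not the pilot's actual configuration:

    // Flag media for human review when its perceptual hash is within a small
    // Hamming distance of any known-bad hash in the database.
    const NEAR_MATCH_DISTANCE = 31; // illustrative PDQ-style similarity threshold

    function hammingDistance(a: Uint8Array, b: Uint8Array): number {
      let bits = 0;
      for (let i = 0; i < a.length; i++) {
        let xor = a[i] ^ b[i];
        while (xor) { bits += xor & 1; xor >>= 1; } // count differing bits
      }
      return bits;
    }

    function flagsForReview(mediaHash: Uint8Array, database: Uint8Array[]): boolean {
      return database.some(
        (known) => hammingDistance(mediaHash, known) <= NEAR_MATCH_DISTANCE,
      );
    }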

All of the matched media and subsequent reports were of real human victims, none were fictional, drawn, or AI generated. We did not receive matches for “lolicon”.

We elected to match against a broad array of databases to ascertain their effectiveness, and we found that databases maintained by child hotline NGOs (e.g. NCMEC, Arachnid) were far more effective than databases available from commercial service providers. We saw a handful of false positives, and the vast majority of them came from commercial providers. If we had continued, we would have narrowed down the databases in use.

All matched media generated a notification to the affected service provider, and IFTAS performed any necessary media retention for law enforcement.

Context

4.29 matches per 100,000 may not sound like a large number. However, to be clear, this is a higher number than many services would expect to see, and it includes a broad range of media, from “barely legal” minors posted publicly, to intimate imagery shared without consent, to the very, very worst media imaginable. In some cases, it was apparent that users were creating accounts on host services to transact or pre-sell media before moving to an encrypted platform, under the belief that Mastodon would not be able to detect the activity.

There are 1.6 billion posts on the ActivityPub network today, and if our numbers hold, there are currently many tens of thousands of copies of known CSAM on the network. The true figure is likely significantly higher: our pilot participants by definition exclude providers disinclined to mitigate this issue, and criminals looking for anonymous accounts are likely to target less-moderated services.

If IFTAS found it happening so brazenly on the first servers we happened to look at, no doubt this activity is still occurring on servers that have no such protections. Mastodon is – at its simplest – a form of free, anonymous web hosting. The direct messaging feature precludes moderators and administrators from being aware of illegal content (it will never be reported by potential customers), and only a hash and match system is able to find these media and flag them.

Not only does inadvertently hosting CSAM revictimise the children involved, it also serves as an attack vector for the service to be targeted by law enforcement. We are aware of several instances of CSAM being uploaded for the express purpose of causing moderator trauma or triggering an immediate report to law enforcement, creating significant legal exposure. This is essentially a form of swatting: simply upload CSAM, report it to the authorities, and sit back while the server gets taken down, with possible criminal charges for the administrator.

Responsible Shutdown

We ensured that all webhooks were disabled by the host services, and once all review and reporting was completed, we hard-deleted all remaining data on the service, excepting the metadata and media required to be held for one year for possible law enforcement action. The AWS environment was then dismantled, deleted, and removed from service.

All associated staff and consultants were removed from the relevant IT services, and IFTAS retains no data or metadata from the pilot beyond the bare minimum required by law pertaining to the encrypted media stored for law enforcement.

Services we observed that were clearly unmoderated, or so willing to host this content that federating with them would raise legal concerns, were added to the IFTAS DNI denylist.

Next Steps

Moderation Workflow

We hope that Mastodon, Pixelfed, Lemmy and other platform developers will quickly implement safeguards within moderation dashboards to minimise moderator trauma.

Content moderators commonly experience trauma similar to that suffered by first responders. Even though the development team may never have reviewed traumatic content themselves, the app or service will at some point deliver it to users of the moderation workflow. When presenting reported content to a service provider or moderator (a minimal UI sketch follows the list):

  • Always show the report classification clearly, so the moderator is aware of the type of content they are about to review,
  • Blur all media until the moderator hovers to view a greyscale version (re-blur on mouseleave or whenever hover is no longer detected),
  • Greyscale all media until the moderator clicks to toggle full colour (allow toggle state back to greyscale),
  • Mute all audio until the moderator requests audio, and
  • Allow the moderator to reclassify the report.
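
The hover and click behaviours above can be expressed with plain CSS filters. A minimal TypeScript sketch, assuming reported media are rendered as <img class="reported-media"> elements (the class name is hypothetical):

    // Sketch of the blur/greyscale safeguard described in the list above.
    function applySafeguards(img: HTMLImageElement): void {
      const hidden = "blur(16px) grayscale(100%)"; // default: blurred and greyscale
      const preview = "grayscale(100%)";           // on hover: sharp but greyscale
      img.style.filter = hidden;

      img.addEventListener("mouseenter", () => { img.style.filter = preview; });
      img.addEventListener("mouseleave", () => { img.style.filter = hidden; }); // re-blur

      // Full colour only on an explicit click; a second click reverts to greyscale.
      img.addEventListener("click", () => {
        img.style.filter = img.style.filter === "none" ? preview : "none";
      });
    }

    document
      .querySelectorAll<HTMLImageElement>("img.reported-media")
      .forEach(applySafeguards);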

CSAM Detection

If you are a service provider, lobby your web host or CDN provider to perform this service for you, and ask them if they have resources you can use.

Cloudflare offers a free service worldwide; if you are a Cloudflare customer, consider enabling this option.

If you are a web host that hosts a large number of Fediverse providers, consider adding this safeguard at the network level.

Free Support from Tech Coalition

Tech Coalition has a program aimed at small and medium services called “Pathways”, and they are very interested in hearing from Mastodon and other Fediverse service providers. While this does not offer detection, it does offer background, guidance, and access to experts. Sign up to explore these options, and to demonstrate a good faith effort to address this issue. The more providers they hear from, the more likely we are to get better options.

Ongoing Work

We are aware of noteworthy efforts to continue this work. @thisismissem is working on a prototype implementation of HMA, and Roost is exploring an open-source solution for small and medium-sized services.

Consider following and monitoring https://mastodon.iftas.org/@sw_isac to receive alerts when services are confirmed to be sources of this content.

A range of services and resources that can help mitigate this issue are available on our CSAM Primer page in the IFTAS Connect Library. We will continue to research and share resources that can help mitigate this issue for service providers. Please let us know if you are aware of additional resources we can add to this guide.

IFTAS intends to continue its relationships with INHOPE, NCMEC, Project Arachnid, Internet Watch Foundation and other organisations to advocate for the Fediverse, and to ensure these entities understand the network and have someone to talk to if they have questions.

To everyone who participated, asked to participate, or supported this project, thank you! We are extremely sad to have to end this project, but we have safeguarded the underlying codebase and, should the opportunity arise, we will restart with this or another resource to provide this service to any who need it.

[Exclusive] Police fatigue mounts as Yoon's verdict is delayed... 1.3 billion won in lodging costs alone. Published 2025.03.27 19:30. Riot police overtime in January: 113 hours per officer. 20-30 regional police agency riot units brought up to Seoul on a typical weekday. Rallies have continued since the impeachment, with everyone waiting on Yoon's ruling. The cap on overtime pay was lifted, but 'fatigue' remains a concern. www.hankookilbo.com/News/Read/A2...

Two queries for you, fedi friends :blobfoxconfused:

1) Do any of those "erase your data from the internet" tools the YouTube creators are always advertising actually help? I feel like the spam texts are getting oddly specific, and also I have talked my fair share of shit about the Cheeto-in-Charge online, which is seeming more dangerous by the day

2) Also, I unfortunately used my real last name when I jumped on Mastodon. Is there any way to change your screen name? I know I looked a while back and the answer seemed to be no, but thought I'd ask again 🙏

Thanks friends!

I've been accepted as a speaker at @LAS (Linux App Summit) to talk about GNOME Circle. I was excited to travel to Albania and give my talk there, but the Linux App Summit hasn't been answering any of my emails! Since I'm from Iran, I really need an invitation letter to get my visa (e-visa): Albania does not grant visas to Iranians easily, the process requires lots of paperwork, and there is no Albanian embassy in Iran. I emailed LAS on 11th March...

Image: Linux App Summit banner