What is Hackers' Pub?

Hackers' Pub is a place for software engineers to share their knowledge and experience with each other. It's also an ActivityPub-enabled social network, so you can follow your favorite hackers in the fediverse and get their latest posts in your feed.
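Under the hood, "ActivityPub-enabled" means any account here can be discovered and followed from other fediverse servers. As a rough sketch of how that discovery works, here is a hedged Python example that resolves a handle via WebFinger (RFC 7033) and fetches its ActivityPub actor document; the handle `alice@hackers.pub` is a hypothetical example, not a real account:

```python
import json
import urllib.parse
import urllib.request

def fetch_actor(handle: str) -> dict:
    """Resolve a fediverse handle (user@host) to its ActivityPub actor document."""
    user, host = handle.lstrip("@").split("@")
    # Step 1: WebFinger lookup at the host's well-known endpoint.
    resource = urllib.parse.quote(f"acct:{user}@{host}")
    with urllib.request.urlopen(
        f"https://{host}/.well-known/webfinger?resource={resource}"
    ) as resp:
        jrd = json.load(resp)
    # Step 2: find the "self" link that points at the ActivityPub actor.
    actor_url = next(
        link["href"]
        for link in jrd["links"]
        if link.get("rel") == "self"
        and link.get("type") == "application/activity+json"
    )
    # Step 3: fetch the actor document with the ActivityPub media type.
    req = urllib.request.Request(
        actor_url, headers={"Accept": "application/activity+json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

# Hypothetical handle; any fediverse account follows the same flow.
# actor = fetch_actor("alice@hackers.pub")
# print(actor["inbox"], actor["outbox"])
```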

Thank you to Two Rock Software for renewing their membership! Two Rock Software is a custom software development firm focused on increasing operational efficiency and client satisfaction for the legal sector.

Okay, someone is looking for accessibility-focused reading, and it's been so long since I started from zero that I'm going to crowdsource it.

My accessibility network, what work would you point a newer person to in 2026?

(Please feel free to spread! I know the accessibility-focused community isn't a monolith and would also love works outside my corner.)

The Illustrated Transformer

Discussions: Hacker News (65 points, 4 comments), Reddit r/MachineLearning (29 points, 3 comments). Translations: Arabic, Chinese (Simplified, two versions), French (two versions), Italian, Japanese, Korean, Persian, Russian, Spanish (two versions), Vietnamese. Watch: MIT's Deep Learning State of the Art lecture referencing this post. Featured in courses at Stanford, Harvard, MIT, Princeton, CMU, and others.

Update: This post has now become a book! Check out LLM-book.com, which contains (in Chapter 3) an updated and expanded version of this post covering the latest Transformer models and how they've evolved in the seven years since the original Transformer (like Multi-Query Attention and RoPE positional embeddings).

In the previous post, we looked at attention, a ubiquitous method in modern deep learning models. Attention is a concept that helped improve the performance of neural machine translation applications. In this post, we will look at the Transformer, a model that uses attention to boost the speed with which these models can be trained. The Transformer outperforms the Google Neural Machine Translation model in specific tasks. The biggest benefit, however, comes from how the Transformer lends itself to parallelization. It is in fact Google Cloud's recommendation to use the Transformer as a reference model for their Cloud TPU offering. So let's try to break the model apart and look at how it functions.

The Transformer was proposed in the paper "Attention Is All You Need". A TensorFlow implementation of it is available as part of the Tensor2Tensor package. Harvard's NLP group created a guide annotating the paper with a PyTorch implementation. In this post, we will attempt to oversimplify things a bit and introduce the concepts one by one, hopefully making them easier to understand for people without in-depth knowledge of the subject matter.

2025 Update: We've built a free short course that brings the contents of this post up to date with animations.

A High-Level Look: Let's begin by looking at the model as a single black box. In a machine translation application, it would take a sentence in one language and output its translation in another.
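Since the excerpt stops just as the model opens up, here is a minimal NumPy sketch of scaled dot-product attention, the core operation the post goes on to illustrate. This is an illustrative sketch, not code from the post or the paper; the toy shapes and self-attention setup are assumptions:

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V."""
    d_k = Q.shape[-1]
    # Score every query against every key; scaling keeps the softmax well-behaved.
    scores = Q @ K.T / np.sqrt(d_k)
    # Row-wise softmax turns scores into attention weights that sum to 1.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    # Each output row is a weighted average of the value vectors.
    return weights @ V

# Toy self-attention: 3 tokens with 4-dimensional embeddings (made-up shapes).
rng = np.random.default_rng(0)
x = rng.normal(size=(3, 4))
print(scaled_dot_product_attention(x, x, x).shape)  # -> (3, 4)
```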

jalammar.github.io

I'm really happy with the migration to Angular Material M3. The changes in how these components are used are definitely a big improvement: they're much easier to work with and customize to my needs now.

Next up, I'll probably focus on hashtag following. I'm still waiting for a decision from NLnet and really hoping it works out. Getting that grant would help me a lot with many things 🤞

Mh, SA trauma, asking for a bit of a reality check

So I've been meaning to bring up to my therapist that I probably don't need an introduction to trauma, how it affects me, and how to cope in general.
And I don't think this assessment is unreasonable.

It's been only a bit over two years since I made it out of a relationship that was, among other things, quite sexually abusive, and by and large my relationships and intimate life haven't been affected for a while now.

I am still cautious around new people, but that is largely due to more recent events. And the rest of the things I do struggle with on a more regular basis are more anxiety-shaped and have been hammered into me for years, or are more the result of a lack of opportunities.

Is this coherent? I've largely been around very traumatised ppl, and all of this is hard to compare, so I genuinely can't tell.

This piece nails it! The "new" TikTok "deal" is simply an exercise in decorative shape-shifting for a situation that has basically been in place for three years.

This was never about addressing data privacy, propaganda, or national security. It was always about the US grabbing a big chunk of ownership of one of the most popular and successful social media apps. The US Congress just made sure that it was Trump sycophants who got the "win". techdirt.com/2025/12/19/tiktok

I was starting to mope, thinking my fountain pen had gone bad, but it turned out the ink was just running low. Thank goodness!

Touring Thailand with Old Lenses, No. 6 | AsiaCoffeeTrip

Equipment: α7Ⅳ & Canon 35mm f1.8 LTM. I wandered from BTS Saphan Taksin to Yaowarat, Bangkok's Chinatown, shooting as I went, and along the way stumbled onto some kind of religious event. I deliberately didn't look up which religion it was or what it was about: everyone was so devoted to what was clearly an important occasion that I wanted to avoid half-researching it and feeling like I understood. I happened to be wearing a white T-shirt, so they let me into the building at the back; an older man gave me a thumbs-up and waved me in, as if to say I was OK. Tourists in colored or patterned shirts were being turned away. Wrapped in that heat and energy, I felt glad to be there in that moment. That, too, is a kind of faith.

note.com · note（ノート）
