What is Hackers' Pub?

Hackers' Pub is a place for software engineers to share their knowledge and experience with each other. It's also an ActivityPub-enabled social network, so you can follow your favorite hackers in the fediverse and get their latest posts in your feed.

Bicycle riding people, when you get on or off your bike, which side of the bike do you do it from/to?

(Boosts welcome. I could just assume everyone does it from the left like I do, but why not run a poll.)

Secret Sky encrypts the data of the posts you write and publishes it on the AT Protocol; my server stores only the key, and decrypts a post only when someone you've marked as "allowed to see this" asks to view it. If you don't trust that, the code is public, so you can audit it yourself, or fork it and run your own instance. Issues/PRs are very welcome. github.com/2chanhaeng/s...

GitHub - 2chanhaeng/secret-sky
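
For anyone curious what that flow looks like concretely, here is a minimal Python sketch of the scheme as described: publish ciphertext publicly, keep the key server-side, and decrypt only for allowlisted viewers. It uses the third-party cryptography package, and all the names (SecretPostStore, publish, reveal, the DID strings) are illustrative assumptions, not the actual secret-sky API.

# Minimal sketch of the described scheme (NOT the actual secret-sky code).
# The ciphertext is what gets published publicly (e.g. in an AT Protocol
# record); the key never leaves the operator's server.
from cryptography.fernet import Fernet

class SecretPostStore:
    def __init__(self) -> None:
        self._keys: dict[str, bytes] = {}      # post id -> key (server-side only)
        self._allow: dict[str, set[str]] = {}  # post id -> allowed viewer DIDs

    def publish(self, post_id: str, text: str, allowed: set[str]) -> bytes:
        """Encrypt a post and return ciphertext that is safe to publish."""
        key = Fernet.generate_key()
        self._keys[post_id] = key
        self._allow[post_id] = allowed
        return Fernet(key).encrypt(text.encode())

    def reveal(self, post_id: str, viewer: str, ciphertext: bytes) -> str:
        """Decrypt only for viewers the author has allowlisted."""
        if viewer not in self._allow.get(post_id, set()):
            raise PermissionError("viewer is not on the allowlist")
        return Fernet(self._keys[post_id]).decrypt(ciphertext).decode()

store = SecretPostStore()
blob = store.publish("post-1", "secret contents", {"did:plc:alice"})
print(store.reveal("post-1", "did:plc:alice", blob))  # -> "secret contents"

The point of the design is visible here: anyone can mirror the ciphertext, but reading it requires the server to agree that the viewer is on the author's list.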

Thankfully Bluesky seems to have become much more tolerant of quote posts. When I was just starting out on Twitter, people reacted so hostilely to quoting that it was rough. I wrote about it, and people rudely replied, "that's because so many people use quotes to be rude." But if someone is rude in a quote, the rudeness is what's wrong, and if someone hurls insults in a quote, the insult is what's wrong, no?? I never understood why that justifies privately sanctioning a perfectly normal feature of the service. Bluesky had the same debate early on, but since anyone who dislikes quotes can just disable quoting on their posts, it doesn't really come up anymore.

Someone I know set out to build a PC cheaply this year. Saying he was "collecting the Dragon Balls,"
he bought each part he wanted whenever it hit a low price.

He got everything else,
but GPU and RAM prices never dropped, so he kept holding off, and now they cost several times as much. -_-

.... A case of trying so hard to shop wisely that it backfired.

Apparently ICE abducted parents and left their young kids just sitting there alone, unassisted and unattended — in not one but •two• separate incidents in the Twin Cities in the last few days.

In one case, they left school-age kids alone in the house after a pre-dawn raid. In another, they snatched a father driving his kids home and just left the kids there in the car.

In both cases, the kids are OK — but only because strangers came along to help.

And while I'm at it, the Ubuntu 24.04.3 LTS box at my parents' place too:

$ sudo sh -c 'apt update && apt dist-upgrade -y'
:
The following packages will be upgraded:
apparmor libapparmor1 python-apt-common python3-apt
4 upgraded, 0 newly installed, 0 to remove and 0 not upgraded.
2 standard LTS security updates
:
Restarting services...
/etc/needrestart/restart.d/systemd-manager
systemctl restart apt-news.service

Service restarts being deferred:
/etc/needrestart/restart.d/dbus.service
systemctl restart networkd-dispatcher.service
systemctl restart unattended-upgrades.service
:
$ cat /var/run/reboot-required*
cat: '/var/run/reboot-required*': No such file or directory

All good? (I wonder who ends up restarting the services whose restarts were deferred.)

@catsaladCat 🐈🥗 (D.Burch) :blobcatrainbow: It's kinda a limitation of the English language, which uses the one term, “spicy”, to describe a whole range of different chemical reactions.

The most common reactions are caused by capsaicinoids, like in ghost peppers. That's what the popular Scoville scale is trying to measure, and they kinda burn anywhere they touch.

Then there's the sharp sting of isothiocyanates found in wasabi and horseradish, which is more of a scent than a specific taste.

There are also the sanshools, which cause a numbing sensation on your tongue. Those are most commonly found in Sichuan peppers.

And then the fragrance of the piperines in peppercorn is also called spicy by some white folks.

Some languages have specific common terms for the above sensations, like the Chinese “麻”/“má” for the numbing sensation caused by sanshools.

Other languages have even fewer common terms, such as the German “scharf”, which can refer to all of the above and even the warmth from gingerols in ginger.

To celebrate power being restored to my home-server Ubuntu 24.04.3 LTS box:

$ sudo sh -c 'apt update && apt dist-upgrade -y'
:
The following upgrades have been deferred due to phasing:
apparmor libapparmor1
The following packages will be upgraded:
python-apt-common python3-apt
2 upgraded, 0 newly installed, 0 to remove and 2 not upgraded.
2 standard LTS security updates
:
Service restarts being deferred:
systemctl restart unattended-upgrades.service
:
$ cat /var/run/reboot-required*
cat: '/var/run/reboot-required*': No such file or directory

All good?

Let's Encrypt turns 10
------------------------------
- Since issuing its *first publicly trusted certificate* in 2015, Let's Encrypt has grown into the world's *largest certificate authority (CA)* by number of certificates issued
- With *automation-driven scalability* at its core, it issues more than 10 million certificates a day and is approaching roughly 1 billion websites protected
- It raised the global HTTPS encryption rate from *under 30% to 80…
------------------------------
https://news.hada.io/topic?id=24960&utm_source=googlechat&utm_medium=bot&utm_campaign=1834

Post-transformer inference: 224× compression of Llama-70B with improved accuracy

Link: zenodo.org/records/17873275
Discussion: news.ycombinator.com/item?id=4

This paper introduces the first verified method to eliminate transformers from inference while preserving, and in many cases improving, downstream accuracy. We show that a frozen 70-billion-parameter Llama-3.3-70B model can be replaced by a 256-dimensional meaning field extracted from seven internal activation layers. A lightweight compressor (AN1) reduces these fields by 224× with an average +1.81 percentage point gain across classification tasks, including +3.25 pp on low-resource RTE (R² = 0.98 inverse-scaling fit, p < 0.01). A 30M-parameter student then learns to regenerate these fields directly from raw text, enabling full transformer-free inference at 60× higher throughput with only 0.35 pp average accuracy loss. The core insight is that task-aligned semantics in modern transformers occupy a remarkably low-rank manifold. Across layers we observe 72–99 percent of variance in the top one to three dimensions. Once this structure is extracted and learned, the transformer becomes unnecessary. It serves as a one-time sculptor of meaning rather than the permanent home of inference. This work establishes Field Processing Units (FPUs) as a post-transformer compute primitive that replaces deep matrix multiplication with shallow field operations. All results are averaged over five seeds with statistical significance reported. Ablations isolate the causal contributions of field supervision, geometric regularization, and anchor-layer selection. This Zenodo release provides the complete scientific manuscript and the baseline reference implementation for the AN1 Core system. Proprietary optimizations (AN1-Turbo) have been removed to support independent verification and further research into post-transformer inference.
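
To make the low-rank claim concrete, here is a hedged Python sketch of the standard measurement behind statements like "72–99 percent of variance in the top one to three dimensions": take the singular values of a centered activation matrix and see how much variance the leading principal components capture. The activations below are synthetic stand-ins, not real Llama-3.3-70B states, and this is not the paper's AN1 pipeline.

# Synthetic demo: how much variance do the top-k principal components
# of a (samples x hidden_dim) activation matrix explain?
import numpy as np

rng = np.random.default_rng(0)
n, d, k_true = 1000, 512, 3

# Fake "layer activations" with deliberately low-rank structure plus noise.
acts = rng.normal(size=(n, k_true)) @ rng.normal(size=(k_true, d))
acts += 0.05 * rng.normal(size=(n, d))

# Variance explained by the top-k principal components, via SVD.
centered = acts - acts.mean(axis=0)
s = np.linalg.svd(centered, compute_uv=False)
var = s**2
for k in (1, 2, 3):
    print(f"top-{k} PCs explain {var[:k].sum() / var.sum():.1%} of variance")

If real activations behave like this synthetic matrix, a small learned projection can stand in for most of what the full model computes for a given task, which is the intuition the abstract builds on.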
