Here's Our First Gemini Deep Think LLM-Assisted Hardware Design

We've been using LLMs for software and firmware for years... now we're trying hardware. Threw a MAX44009 datasheet at Gemini Deep Think, asked for an EagleCAD library file, and about 10 minutes later it popped out working XML. Loaded it in Eagle, checked the pins and dimensions, rolled with it.

Composite image showing AI-generated EagleCAD library for the MAX44009 lux sensor: upper left shows the schematic symbol editor with VCC, SDA, A0, SCL, EP, GND, and INT pins correctly assigned with power, input, and I/O directions; upper right shows the UTDFN-OPTO-6 footprint with exposed pad and pin 1 indicator dot; lower left shows the finished Adafruit STEMMA QT breakout board render with the MAX44009 mounted, surrounded by pink neon glow; lower right shows a retro bowling alley "FIRST TRY" celebration graphic with chrome 3D text and starburst effects... because the AI-generated footprint actually worked on the first attempt.
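The "checked the pins and dimensions" step is the part worth automating before trusting any AI-generated library. As a rough sketch (not our actual workflow): an Eagle `.lbr` file is plain XML, so you can parse it and diff the symbol's pin names against the datasheet pinout before opening Eagle at all. The `SAMPLE_LBR` stub and `symbol_pins` helper below are hypothetical, hand-made for illustration; real `.lbr` files carry more elements (packages, devicesets, layers).

```python
# Sanity-check a generated EagleCAD .lbr: parse the XML and confirm the
# symbol's pin names match the MAX44009 datasheet pinout.
import xml.etree.ElementTree as ET

# Pinout per the MAX44009 datasheet (UTDFN-OPTO-6 plus exposed pad).
EXPECTED_PINS = {"VCC", "GND", "SCL", "SDA", "INT", "A0", "EP"}

# Hand-made stub standing in for a generated library file -- a real .lbr
# nests symbols alongside packages and devicesets.
SAMPLE_LBR = """<eagle version="9.6.2">
 <drawing>
  <library>
   <symbols>
    <symbol name="MAX44009">
     <pin name="VCC" x="0" y="0" direction="pwr"/>
     <pin name="GND" x="0" y="-2.54" direction="pwr"/>
     <pin name="SCL" x="0" y="-5.08" direction="in"/>
     <pin name="SDA" x="0" y="-7.62" direction="io"/>
     <pin name="INT" x="0" y="-10.16" direction="out"/>
     <pin name="A0" x="0" y="-12.7" direction="in"/>
     <pin name="EP" x="0" y="-15.24" direction="pas"/>
    </symbol>
   </symbols>
  </library>
 </drawing>
</eagle>"""

def symbol_pins(lbr_xml: str) -> set[str]:
    """Return the set of pin names found across the library's symbols."""
    root = ET.fromstring(lbr_xml)
    return {pin.get("name") for pin in root.iter("pin")}

pins = symbol_pins(SAMPLE_LBR)
print("missing:", sorted(EXPECTED_PINS - pins))
print("extra:  ", sorted(pins - EXPECTED_PINS))
```

A check like this catches renamed or dropped pins instantly; pad dimensions and pin `direction` attributes still deserve an eyeball in the footprint editor, since XML that parses cleanly can still be mechanically wrong.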

@adafruit but… why? You talk about how this piece was made but don’t actually describe what it does. Why would I want to buy this hardware? The only advantage you’re giving me is it was designed by AI.

AI sucks, you should feel bad for using such environmentally regressive tech, and you should feel embarrassed that your embrace of this imprecise, unproven technology shows your institution has poor critical thinking.

