Show HN: LocalLLM – Recipes for Running the Local LLM (Need Contributors) https://ift.tt/InbTD9H

Show HN: LocalLLM – Recipes for Running the Local LLM (Need Contributors) I built localLLM: a small community project for running local models. Live: https://ift.tt/lVFI6Wj The goal is simple: if someone has a model + OS + GPU + RAM, they should get steps that actually work (ideally a one-liner). I need help populating and validating the guides. If you run local models, please submit one working recipe (or report what failed). Would love to hear general feedback as well! https://ift.tt/lVFI6Wj April 23, 2026 at 05:01AM

Show HN: Honker – Postgres NOTIFY/LISTEN Semantics for SQLite https://ift.tt/4J0xlRb

Show HN: Honker – Postgres NOTIFY/LISTEN Semantics for SQLite https://ift.tt/OZuk6FN April 23, 2026 at 01:53AM

Show HN: Built a daily game where you sort historical events chronologically https://ift.tt/KEHmI73

Show HN: Built a daily game where you sort historical events chronologically https://hisorty.app/ April 23, 2026 at 12:14AM

Show HN: We built an OCR server that can process 270 dense images/s on a 5090 https://ift.tt/rxTGtVz

Show HN: We built an OCR server that can process 270 dense images/s on a 5090 https://ift.tt/FfLbJyz April 22, 2026 at 11:21PM

Show HN: A free tool for non-technical folks to easily publish a website https://ift.tt/Sgt8LwJ

Show HN: A free tool for non-technical folks to easily publish a website It's easier than ever for anyone to make a website, even without paying for a drag-and-drop builder like Squarespace. But there are still too many barriers for your average non-technical person to publish a site on the web. I'd bet most people don't know there are free ways to host a website, and even if they find an explainer, technical platforms like Cloudflare and GitHub (let alone the command line) can be intimidating. So I made weejur, which is basically a super simple UI front-end for GitHub Pages. You log in with OAuth, and then you can just paste HTML or upload files to publish a website. If you don't have a GitHub account, you can sign up right in the OAuth flow. It's completely free, and you can view the source here [1]. My hope is this makes it easier for people who don't know anything about web hosting to create and share their own websites. Feel free to try it out and please share any questions/ideas/feedback! [1] https://ift.tt/MSZcxqk https://weejur.com April 22, 2026 at 06:06AM
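The flow described above (log in with OAuth, paste HTML, publish via GitHub Pages) can be sketched against GitHub's REST contents API, which commits a file to a repository; with Pages enabled on that repo, the committed index.html becomes the live site. Everything here — the function names and commit message — is illustrative, not weejur's actual code:

```python
import base64
import json
import urllib.request

API = "https://api.github.com/repos/{owner}/{repo}/contents/index.html"

def encode_payload(html: str) -> bytes:
    """The contents API expects the file body base64-encoded in a JSON payload."""
    return json.dumps({
        "message": "publish site",  # commit message (illustrative)
        "content": base64.b64encode(html.encode("utf-8")).decode("ascii"),
    }).encode("utf-8")

def publish(owner: str, repo: str, token: str, html: str) -> dict:
    """PUT index.html into the repo; GitHub Pages then serves it as the site."""
    req = urllib.request.Request(
        API.format(owner=owner, repo=repo),
        data=encode_payload(html),
        method="PUT",
        headers={
            "Authorization": f"Bearer {token}",   # OAuth token from the login flow
            "Accept": "application/vnd.github+json",
        },
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```

A wrapper like weejur would obtain the token via the OAuth flow and hide all of this behind a paste-and-publish UI.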

Show HN submissions tripled and are now mostly "look" vibe-coded https://ift.tt/qDw1F0h

Show HN submissions tripled and are now mostly "look" vibe-coded https://ift.tt/Wazo3vO April 22, 2026 at 04:44AM

Show HN: Ohita – a tool to simplify API key management for AI agents https://ift.tt/MmOR0Qk

Show HN: Ohita – a tool to simplify API key management for AI agents I have been trying out numerous AI agent setups to find out which one I would like to run as my personal assistant. One thing that kept bothering me was dealing with API keys, especially those that need jumping through hoops to keep working. It was not uncommon to ask my agent to fetch some data or post to X/Twitter, only for it to return an error because my API key had stopped working. So I built a tool that you can give to your AI agent, and with one API key it can call all of the services. The tool acts as a central auth layer and handles each API's individual requirements: refreshing tokens, making sure rate limits are adhered to, sending the correct user-agents, and everything else each API might require. At first I wanted to spare users from setting up their own API keys entirely, but that proved impossible: most API providers state in their ToS that proxying the API is prohibited. There was also the problem of identities: if an agent posts to Reddit or X, the post comes from the shared account. So I settled on a bring-your-own-key architecture where you can set up your own keys (if you want to!) while the tool still handles all the token refreshing etc. Some generous services allow pretty lenient use of their API, so I included those ready out of the box — no config required to get started! Right now I am happy using this tool myself, but I wish more people used it so that I could keep improving it. As a single dev there is a lot of work: I am adding new providers every day, fixing bugs, and so on. If anyone would give me their honest thoughts and test the features, I could improve the tool even more. There is an option to pay for usage to cover some running costs, but the free tier is more than enough to get building. https://ohita.tech/ April 22, 2026 at 04:08AM
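The architecture described above — one gateway key on the agent side, per-provider keys stored and refreshed server-side — can be sketched as follows. This is a hypothetical illustration of the idea, not Ohita's actual code; rate limiting and the HTTP forwarding layer are omitted:

```python
import time

class ProviderKey:
    """A provider credential plus a refresh hook, stored server-side."""
    def __init__(self, access_token, expires_at, refresh):
        self.access_token = access_token
        self.expires_at = expires_at  # unix timestamp
        self._refresh = refresh      # callable returning (new_token, new_expires_at)

    def current_token(self, now=None):
        """Refresh transparently when the stored token has expired."""
        now = time.time() if now is None else now
        if now >= self.expires_at:
            self.access_token, self.expires_at = self._refresh()
        return self.access_token

class Gateway:
    """The agent holds one gateway key; provider keys live in the vault."""
    def __init__(self):
        self._vault = {}  # (gateway_key, provider) -> ProviderKey

    def register(self, gateway_key, provider, key):
        """Bring-your-own-key: the user registers a provider credential."""
        self._vault[(gateway_key, provider)] = key

    def authorize(self, gateway_key, provider):
        """Resolve the gateway key to fresh provider auth headers."""
        key = self._vault.get((gateway_key, provider))
        if key is None:
            raise KeyError(f"no {provider} key registered for this gateway key")
        return {"Authorization": f"Bearer {key.current_token()}"}
```

The agent only ever sees its single gateway key; expiry and refresh of each provider's token happen inside `authorize`, which is what lets a stale X/Twitter token stop breaking the agent mid-task.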
