Hacker News | dmingod666's comments

A very good alternative is Wails:

6-10 MB single binary

use any frontend JS tech you want

Go functions are automatically exposed to the JS side, so you don't need to think about making endpoints; it creates and uses a websocket between the binary and the UI

uses WebView2 rather than bundling Chrome


Sounds like Tauri, which is built on Rust instead of Go.

> By using the OS's native web renderer, the size of a Tauri app can be less than 600KB.


Most impacted are technical consultants (outside experts who merely provide high-level consulting) -- their recommendations will be scrutinized more closely.


Maybe it was just a member of Congress and thus no crime was committed


As a full-time dev, I'm on ChatGPT (paid) and use Copilot.

My team's first go-to is ChatGPT as well. SO is no longer 'that site' to go to (I've been on SO for 10+ years). I have zero love for that site. It was 'the resource' before; now it's just another site.


I like the fact that he's still quite approachable on social media


You have things like RocksDB, SQLite, etc. that are simple and have serialization/deserialization built in, so no external app dependencies.
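As a minimal sketch of what "embedded, no external app dependencies" means in practice (using Python's stdlib sqlite3 here purely as an illustration -- the same idea applies from Go, Rust, etc.):

```python
import sqlite3

# SQLite is embedded: the "database" is just a library call, and all
# data lives in a single file (or in memory) -- no server process.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE kv (key TEXT PRIMARY KEY, value TEXT)")
conn.execute("INSERT INTO kv VALUES (?, ?)", ("greeting", "hello"))
conn.commit()

# Serialization to and from disk format is handled by the library.
value = conn.execute(
    "SELECT value FROM kv WHERE key = ?", ("greeting",)
).fetchone()[0]
print(value)  # hello
conn.close()
```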


No, it can't be done in realtime. Each image takes about 20-60 seconds (depending on what you're doing with it).


A frame is like 20-30 seconds for one pass on my 3090, and you may have multiple tools and multiple passes. Say 60 seconds a frame x 300 frames: that's about 5 hours -- a 3090 costs $0.44 per hour on RunPod, so a little more than $2.
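The back-of-the-envelope math above, spelled out (all figures are the ones quoted in the comment, not current RunPod prices):

```python
# Rough rental-cost estimate for the render job described above.
SECONDS_PER_FRAME = 60   # assumes multiple tools/passes per frame
FRAMES = 300
RATE_PER_HOUR = 0.44     # quoted 3090 price on RunPod, USD

hours = SECONDS_PER_FRAME * FRAMES / 3600
cost = hours * RATE_PER_HOUR

print(f"{hours:.1f} hours, ~${cost:.2f}")  # 5.0 hours, ~$2.20
```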


Kinda crazy how averse to ongoing costs I am. I have a 3080 Ti for Blender rendering, sculpting, and general Blender work, so I definitely need something good locally.. but I'm debating buying top of the line on the next iteration so I can do this stuff locally. Thousands of dollars perhaps just to avoid a $2 cost lol.


The $2 adds up, plus it's an ephemeral environment, and these things are super finicky to get working properly (dependencies and initial setup, plus adding tools to your workflow). There's also more of a craft-like, ad-hoc approach to how you want to do things, plus a learning curve around which tools and scripts to use or write yourself. RunPod sounds good on paper, but after a while you want an AWS/Google VM so your environment stays sane and stable. Then if you do the math, you'll come up to the cost of buying a nice card in 6-10 monthly payments. So if you want to play with it in peace, you spend once and only worry about electricity costs :D


Or they could just use the models we are using right now and get the same effect; it's not like the models are going anywhere.


Computationally the early models might be easier/cheaper to run in a few years as well.


You apparently can't use anything to do with the AUTOMATIC1111 webui from Colab. I don't remember the exact restriction, but I hear even a simple print with the word makes it freak out..


You can with paid Colab. You just can't on the free tier.


Cool, I have Pro; haven't tried it myself.

