
Jupyter notebooks are a favorite among our data scientists. However, we've gone back to plain Python scripts for our bigger projects for a simple reason: you have to keep the notebook page alive while running lengthy experiments on a remote server. A couple of rogue Windows updates destroyed experiments for us, which (as these things go) happened at a very inopportune moment.
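
One partial workaround (a sketch only, assuming papermill is available; the file names are made up) is to execute the notebook headlessly on the server, launched under nohup or tmux, so the run and its cell outputs survive a closed browser tab or a dropped SSH session:

    # run_experiment.py -- sketch of headless notebook execution with papermill
    # (hypothetical file names; start it on the server with e.g.
    #  `nohup python run_experiment.py &` or inside tmux so closing the
    #  browser tab or losing the SSH session doesn't kill the run)
    import papermill as pm

    pm.execute_notebook(
        "experiment.ipynb",         # input notebook
        "experiment_output.ipynb",  # executed copy; outputs are saved here as cells finish
    )

You still lose the interactive view, but the results land in the output notebook regardless of what happens to your laptop.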

OTOH for quick experiments notebooks are great, although I feel like the more modern the GUI, the further back we go in terms of experience. The latest updates to VS Code's Jupyter extension, for example, have turned it into a thoroughly frustrating experience for the visually impaired: gray-on-gray-on-gray text, and even more gray, transparent thin lines that are supposed to clearly mark where a cell ends and where the output begins. Unfortunately, no amount of fiddling with the color scheme could fix these 'design' choices...



> However, we've gone back to plain Python scripts for our bigger projects for a simple reason: you have to keep the notebook page alive while running lengthy experiments on a remote server.

Known issue (it's a six-year-old issue, IIRC). They're working on it, if I'm not mistaken. They're also working on real-time collaboration.

Plug: we have long-running notebook scheduling in the background; the output is streamed and saved whether you close your browser or visit from another device.

We run the notebooks on your own Kubernetes cluster: GCP's GKE, AWS's EKS, Azure's AKS, DigitalOcean, and pretty much anything else.

https://iko.ai/static/assets/img/landing/async-notebook-on-c...

The run saves everything as an experiment: it automatically detects model parameters without tagging cells, tinkering with metadata, or calling a tracking library. We also automatically detect the model that is produced and its metrics (again, without you doing anything).

Show HN: https://news.ycombinator.com/item?id=29450940


> A couple of rogue Windows updates destroyed experiments for us, which (as these things go) happened at a very inopportune moment.

PSA: on all process-control equipment running Windows 10, install O&O ShutUp10 and enable the default set of disablements. Finding out that an incubator has been sitting there baking $300,000 of Andor cameras for 61 hours while the organism library died, because the Windows 10 box running the Python control stack decided to update: it's a bad time. https://www.oo-software.com/en/shutup10
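
If installing a third-party tool isn't an option, the auto-update part specifically can usually be covered with the Windows Update group-policy registry key. A minimal sketch (run as Administrator, and only on machines you control; ShutUp10 toggles far more than this):

    # sketch: disable automatic Windows Update via the group-policy key
    # (roughly the auto-update piece of what ShutUp10 handles; needs admin rights)
    import winreg

    key_path = r"SOFTWARE\Policies\Microsoft\Windows\WindowsUpdate\AU"
    with winreg.CreateKeyEx(winreg.HKEY_LOCAL_MACHINE, key_path) as key:
        # NoAutoUpdate = 1: the update agent won't download or install on its own
        winreg.SetValueEx(key, "NoAutoUpdate", 0, winreg.REG_DWORD, 1)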


Did you try running Jupyter locally? You can store notebooks and snippets in a git repo, too.



