
Time to market and developer productivity matter so much more than capital expenses that it is ridiculous. I could buy another server for what it costs to pay a developer for a day's work. Not to mention, not every feature on a web application needs to serve more than 2,500 requests per second. The ones that do can be refactored into a web service with higher throughput, or designed to scale separately from the rest of the features (sketch below). It doesn't make sense to daisy-tank (to pick daisies with a tank) every single feature to support throughput it doesn't need at the expense of developer time. Moreover, until you have analytics showing what your users actually use, deciding what to optimize is pure speculation. Bad science. The best way to get those analytics is to be live, and the best way to be live is to have a built product.
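
To make that concrete, here's a minimal sketch of splitting the hot path out on its own, assuming a Rack stack; the /lookup endpoint, the in-process cache, and the work it stands in for are all invented for illustration, not a prescription:

    # config.ru for a standalone service that handles only the hot endpoint,
    # so it can be deployed and scaled separately from the main app.
    require 'rack'
    require 'json'

    class HotLookup
      CACHE = {} # stand-in; you'd use memcached/redis for real

      def call(env)
        req = Rack::Request.new(env)
        key = req.params['key'].to_s
        value = CACHE[key] ||= expensive_lookup(key)
        [200, { 'content-type' => 'application/json' }, [{ key => value }.to_json]]
      end

      private

      def expensive_lookup(key)
        key.reverse # placeholder for the actual expensive work
      end
    end

    run HotLookup.new

You'd run that with rackup on its own boxes and route just that path to it from the proxy, while everything else in the app stays optimized for developer time instead of throughput.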

Benchmarks of Linux on a physical machine rather than a virtual machine are not representative of performance for (gawd help me) cloud-centric deployments. Those benchmarks assume the maximum benefit from compiler optimizations, which would not be realized on a virtualized machine.

I'm not sure what compilers catch beyond syntactical errors. Those aren't any slower to fix in a dynamic language. I'd recommend checking out Sandi Metz on testing.
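
For what it's worth, the main thing a compiler checks beyond syntax is types, and in a dynamic language the test suite surfaces the same mistakes at about the same point in the feedback loop. A throwaway example (minitest; the function and the names are invented):

    require 'minitest/autorun'

    def total_cents(prices)
      prices.sum # fails fast if handed something that isn't summable
    end

    class TotalCentsTest < Minitest::Test
      def test_sums_an_array_of_integers
        assert_equal 600, total_cents([100, 200, 300])
      end

      def test_rejects_a_bare_string
        # In a static language this call wouldn't compile; here the test
        # catches it the first time the suite runs.
        assert_raises(NoMethodError) { total_cents('100') }
      end
    end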

The rest is just language preference. And my preference is Ruby. Python is fine with me too. JavaScript makes me a sad panda, but it runs in all the browsers, and it's a lot better with the magic of CoffeeScript.


