
Tbh, I'm really not sure why something like this should take 15 seconds to compute. That's roughly a few trillion floating point ops for a problem that has been solvable for decades. I have trouble imagining any reasonable model for mapping price -> sales needing that much compute.
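
Rough back-of-envelope on the scale (the throughput number is a guess, just to make the point):

    # Back-of-envelope: how many floating point ops fit in 15 seconds?
    # The throughput below is a ballpark assumption, not a measurement.
    seconds = 15
    cpu_flops_per_sec = 100e9                   # ~100 GFLOPS for a few modern CPU cores
    budget = seconds * cpu_flops_per_sec
    print(f"{budget:.1e} FLOPs available")      # ~1.5e12

    # Compare: one least-squares fit of price -> sales on n points with k features
    # costs on the order of n * k**2 FLOPs.
    n, k = 1_000_000, 10
    print(f"{n * k**2:.1e} FLOPs for the fit")  # ~1.0e8, four orders of magnitude less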

Also, fwiw, I really wouldn't expect clients' slider clicks to follow a normal distribution. A normal distribution arises when you sum a large number of independent random variables with finite (and bounded) expectation and variance; or alternatively, when you're modelling a process with known expectation and variance but no higher-order moments. If anything, I'd expect human beings to play with the slider more around the extremal points, like the start and end.
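
Quick simulation of the difference (the click distribution is made up, purely to illustrate):

    import numpy as np

    rng = np.random.default_rng(0)

    # CLT: a sum of many independent, finite-variance variables looks normal...
    sums = rng.uniform(0, 1, size=(10_000, 50)).sum(axis=1)

    # ...but slider positions people actually pick could easily pile up at the
    # extremes (simulated here as Beta(0.5, 0.5), an assumption, not data).
    clicks = rng.beta(0.5, 0.5, size=10_000)

    for name, x in [("sum of 50 uniforms", sums), ("simulated clicks", clicks)]:
        hist, _ = np.histogram(x, bins=10)
        print(name, hist)   # first is bell-shaped, second is U-shaped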



Yeah, this is like if you asked your friend how much more productive they think coffee makes them, and they replied with a four-hundred-thousand-degree polynomial over the milliliters. Reality is never that predictable; reasonable people can recognize when they have run out of significant digits. Something has gone severely wrong in your data modeling, your invocation of the model, or both. If it actually is doing compute for 15 s, then to the extent that this works, it is wrapping a vastly simpler function, which I would suggest you graph and use going forward instead. It will save you the runtime, its outputs will be more reliable, and you will get actual insights.
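
Concretely, something like this (the exponential is just a placeholder for whatever the real model actually computes):

    import numpy as np

    rng = np.random.default_rng(0)

    def expensive_model(price):
        # Stand-in for whatever currently burns 15 s per request (hypothetical shape).
        return 5000 * np.exp(-0.02 * price) + rng.normal(0, 10)

    # Sample it once, offline, over the price range you care about...
    prices = np.linspace(1, 200, 50)
    sales = np.array([expensive_model(p) for p in prices])

    # ...then fit a cheap low-degree polynomial as a drop-in surrogate.
    surrogate = np.poly1d(np.polyfit(prices, sales, deg=3))

    print(surrogate(49.99))   # microseconds per call, and you can plot it to see the actual shape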


Also very curious about what kind of model this is and how it could (from the sound of it) take 100% of the hardware for 15 seconds per request.


My first instinct would be to do one or two clicks in the middle, then both of the extremes. The assumption that it would be a normal distribution is so strange to me in this situation.



