For Jupyter, it depends on the workflow, especially in data science. You spend a lot of time playing with the data, testing things, drawing charts, computing statistics, and so on. When you do that, the cost of starting a Python interpreter, loading the imports, and loading the (usually big) data becomes a real pain, because you iterate constantly. Working in a REPL becomes really important.
But even more, working with Jupyter lets you work out a very detailed explanation of your thought process, describing your ideas and your experiments. Being able to mix code and explanations is really important (and reminiscent of literate programming). You get the same kind of flow with R.
As a data scientist, I'm concerned with data, statistics, maths, and understanding the problem (rather than the solution). I don't care about the code. Once I get my understanding of the data right, then comes the time to turn all of that into software that can be used. Before that point, Jupyter really gives a productivity boost.
For the code part, yep, you need other practices, and there Jupyter may not be suitable.
It's interesting, I never feel like I get these exploratory benefits from Jupyter notebooks. I just end up feeling like one hand and half my brain are tied behind my back. I'm most productive iterating in a similar way to what you describe, but in an IPython terminal, running a script and libraries that I'm iterating on in a real editor. If there are expensive computations that I want to checkpoint, I just save and load them as pickle files.
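For what it's worth, that checkpointing workflow can be sketched roughly like this (filenames and the placeholder computation are made up, not from the commenter's actual code): compute once, pickle the result, and reload it on later iterations instead of recomputing.

```python
import os
import pickle

CHECKPOINT = "expensive_result.pkl"  # hypothetical checkpoint filename

def expensive_computation():
    # Stand-in for minutes of real work (model fitting, aggregation, etc.)
    return {"rows": 1_000_000, "mean": 0.5}

if os.path.exists(CHECKPOINT):
    # Later iterations: reload in milliseconds instead of recomputing
    with open(CHECKPOINT, "rb") as f:
        result = pickle.load(f)
else:
    # First run: do the work once and save it for next time
    result = expensive_computation()
    with open(CHECKPOINT, "wb") as f:
        pickle.dump(result, f)
```

The nice property is that deleting the pickle file is all it takes to force a fresh recomputation.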
I have to say, I think the Jupyter notebook format is a 10x productivity improvement over plain IPython. It's just so much easier to work with, and a step more reproducible too: at least my scribbles are all saved there in the notebook!
Really interesting. I may have overlooked IPython a bit (I just thought Jupyter was its improved version). For the moment, maybe like you, I preprocess the data (which takes minutes) into numpy arrays, which then take seconds to load. But once I add the imports, everything I need takes about 5 or 6 seconds to load, so Jupyter remains a good idea. Moreover, I love (and actually need) to mix maths and text, so Markdown + LaTeX maths is really a great combo. I don't know if one can do that in IPython; I'll be sure to look!
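The preprocess-once pattern described above might look something like this sketch (the filename and array shape are invented for illustration): slow preprocessing runs one time, the arrays are saved with `np.save`, and later sessions just reload them.

```python
import numpy as np

# Stand-in for minutes of preprocessing (parsing, cleaning, feature building)
features = np.random.rand(1000, 50)

# Save once; .npy is numpy's native binary format
np.save("features.npy", features)

# At the start of a later session, reload in seconds rather than minutes
loaded = np.load("features.npy")
assert np.array_equal(loaded, features)
```

For several related arrays, `np.savez` bundles them into a single `.npz` file, which keeps the reload step to one call.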