I'm not a commercial software developer, but a scientist working on technology development. I've been programming for 30+ years.
Jupyter has become my lab notebook. In the past I always had illegible, disorganized notebooks, files, and program code all over the place. A Jupyter notebook lets me organize all of that in one place, in a narrative fashion, so I can reconstruct what I did long after I've forgotten the details. The same reasons that favor open communication of methods and results to the public also apply to internal work.
My notebooks become my reports. I've abandoned PowerPoint, and my colleagues, including managers, don't seem to mind. Seeing the actual work might even give them a feeling of involvement, like being invited into the lab. Notebooks are also a good way of communicating a prototype of a process to the software development team when an idea ends up in a product. Even if they don't like Python, the programmers can read and understand it.
I can actually run some of my data acquisition code directly within Jupyter. A code cell that spits out an inline graph is practically the default interface for a lot of this kind of work, so I don't have to build a unique GUI for every kind of test. This speeds up incremental refinement of an experimental technique, even if the routines that I write end up in a "straight" Python program when it's time to let an experiment run for a few hours or days.
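To make that concrete, here's a minimal, hypothetical sketch of the pattern. The signal here is simulated rather than coming from real acquisition hardware, but the shape of the cell is the same:

```python
import numpy as np
import matplotlib.pyplot as plt

# Simulate an acquired signal: a 50 Hz tone plus a little noise.
t = np.linspace(0, 1, 500)
signal = np.sin(2 * np.pi * 50 * t) + 0.1 * np.random.randn(t.size)

# In a notebook, this cell renders the plot inline -- no custom GUI needed.
plt.plot(t, signal)
plt.xlabel("time (s)")
plt.ylabel("amplitude")
plt.show()
```

Tweak the acquisition code, re-run the cell, see the new graph. That loop is the whole "interface."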
Granted, Jupyter won't turn bad programmers into good ones. Learning sound programming practice is still a gap in the education of scientists.
I feel grumpy and old. :(
I'm glad I did in this case because an open-source equivalent of Mathematica is a pretty sweet tool, but the site navigation sucks enough that it's likely limiting your audience a bit.
And yes, the "home" button should go to the project home, not the blog home (or there should be a separate button for that).
(I am looking at you, http://matplotlib.org)
Though I am not sure why their blog doesn't link to the landing page of the site.
One is for R code and one is for asymptote graphs, which are amazing!
The best place to start is probably https://try.jupyter.org/
https://zeppelin.incubator.apache.org
Comparison: https://www.linkedin.com/pulse/comprehensive-comparison-jupy...
Made by the people at https://www.twosigma.com/.
edit: fixed a word
And I think that's the project that should implement this: https://github.com/jupyter/nbdime
They talk about how Jupyter has "evolved from a Python-specific tool to a general data science tool that supports many different languages."
It is definitely not a tool to replace a typical Python workflow.
I use it to ask a whole bunch of exploratory questions about a dataset, then productionize the result in PyCharm (my preference; other workflows work great too :)
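A hedged sketch of what that exploratory loop looks like in practice (the DataFrame and column names here are made up for illustration):

```python
import pandas as pd

# A made-up dataset, standing in for whatever you're exploring.
df = pd.DataFrame({
    "sensor": ["a", "a", "b", "b"],
    "reading": [1.0, 2.0, 3.0, 5.0],
})

df.describe()                                   # quick summary statistics
counts = df["sensor"].value_counts()            # how much data per sensor?
means = df.groupby("sensor")["reading"].mean()  # per-sensor averages
print(means)
```

Each line is a question; the cell output is the answer. Only once the questions stop changing does any of it get promoted to real code in an IDE.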
I would love something that combines this style with support for good software practices: for example, something that lets you seamlessly move snippets of code into functions, classes, and modules, and then create tests for them. RStudio is actually the closest I have found, which is ironic, since as a language R is horrible at encouraging good software practices.
I haven't used it too much, but it looks interesting.
What works for me: take a subset of your data and develop code to process it and generate a handful of graphs. You can then save the code as a plain text file, edit it with your favorite editor, and run it on your full dataset.
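For illustration, a toy version of that loop, with `io.StringIO` standing in for a large CSV on disk (the column names are invented):

```python
import io
import pandas as pd

# io.StringIO stands in for a large CSV file on disk.
big_csv = io.StringIO("t,v\n" + "\n".join(f"{i},{i * 2}" for i in range(1000)))

# Develop against the first 50 rows only...
subset = pd.read_csv(big_csv, nrows=50)
smoothed = subset["v"].rolling(5).mean()

# ...then, once the processing code works, point it at the full file.
```

The `nrows` argument keeps the iteration fast while you're still figuring out what the processing step should be.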
I also used them when we ran a capture-the-flag contest, to help explain visually how a many-time pad vulnerability works.
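The core of that vulnerability fits in a few lines; here's a toy sketch (the key and plaintexts are of course made up):

```python
# Toy demonstration: reusing a one-time pad (the "many-time pad" mistake)
# lets an attacker XOR two ciphertexts and cancel the key out entirely.
def xor_bytes(a, b):
    return bytes(x ^ y for x, y in zip(a, b))

key = bytes([0x5A] * 16)        # the (hypothetical) reused pad
p1 = b"attack at dawn!!"
p2 = b"retreat at nine!"
c1 = xor_bytes(p1, key)
c2 = xor_bytes(p2, key)

# The key drops out: c1 XOR c2 == p1 XOR p2, leaking plaintext structure.
assert xor_bytes(c1, c2) == xor_bytes(p1, p2)
```

In a notebook you can go further and render the XORed bytes as an image or character grid, which is exactly the kind of visual explanation that's painful to do anywhere else.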
I don't think I would ever use this tool as a seasoned software engineer, but I can definitely see the power it has for newer people who want to learn, or simply people like him who know a little bit of code and just wanted to run it.
Congrats to the team building this tool!
It also allows for reproducibility of results, which is arguably even more important, especially in the data science case.
http://jupyter.readthedocs.org/en/latest/install.html#new-to...
use something like:

    pip install jupyter -U