Personally, I think Python's success is down to the productivity of its pseudocode-like syntax letting you hack prototypes out fast and easily. In turn, that makes building libraries more attractive, and these things build on each other. FORTRAN is very fast, but it has a less forgiving syntax, especially coming from Python.
In that regard, I'm surprised Nim hasn't taken off for scientific computing. It has a similar syntax to Python with good Python interop (e.g. Nimpy), but is competitive with FORTRAN in both performance and bit twiddling. I would have thought it'd be an easier move to Nim than to FORTRAN (or Rust/C/C++). Does anyone working in SciComp have any input on this - is it just a lack of exposure/PR, or something else?
Most code in science is written by grad students and postdocs. For them, trying a new language is an enormous career risk. Your advisor might not understand it, and you might be all alone in your department if you try Nim.
That makes any sort of experimentation a really tough sell.
As a rule, I have found scientific computing (at least in astronomy, where I work) to be very socially pressured. Technical advantages are not nearly as important as social ones for language or library choice.
Change does happen, but extremely slowly. I am not exaggerating when I say that even in grant applications to the NSF as recently as 2020, using Python was considered a risky use of unproven technology that needed justification.
So, yeah, Nim is going to need a good 30 years before it could plausibly get much use.
Yep, going against the grain in graduate school is counterproductive unless there's a compelling reason.
Many grad students forget that their main purpose is to generate research results and to publish papers that advance the field, not to play around with cool programming languages (unless their research is about coding).
Here's a bunch of mistakes I made in grad school which unnecessarily lengthened my time in the program (and nearly made me run out of stipend money):
* Started out in Ruby because I liked the language, but my research involved writing numerical codes, and at the time there just wasn't much support for it so I ended up wasting a lot of time writing wrappers etc. There was already an ecosystem of tools I could use in MATLAB and Python but nooo, I wanted to use Ruby. This ended up slowing me down. I eventually gave in to MATLAB and Python and boy everything just became a lot easier.
* Using a PowerPC-based iBook instead of an Intel Linux machine. Mac OS X is a BSD (plus I was on a PPC arch) and Brew didn't exist back then, so I ended up troubleshooting a lot of compile errors and tiny incompatibilities because I liked being seen to be using a Mac. When I eventually moved to Linux on Intel, things became so much easier. I could compile stuff without any breakages in one pass.
I also knew a guy who used Julia in grad school because it was the hot new performant thing when all the tooling was in Python. I think he spent a lot of time rejigging his tooling and working around stuff.
Ah the follies of youth. If only someone had pulled me aside to tell me to work backwards from what I really needed to achieve (3 papers for a Ph.D.) and to play around with cool tech in my spare time.
I guess the equivalent of this today is a grad student in deep learning wanting to use Rust (fast! memory-safe! cool!) even though all the tooling is in Python.
A grad student using a new language definitely does not face any career risk IMO... I can't imagine a single professor or recruiter caring about something like this over material progress in their work.
My guess is that grad students are swamped and are looking for the shortest path to getting an interesting result, and that is most likely done with a tool they already somewhat know.
The question for Nim, like many other new products, is: why is it worth the onboarding cost?
My professor would have asked me what the relevance of Nim is to the actual subject of the research. Going against the grain has a cost, unless you're studying Nim itself.
And not only that, your code is likely to become the next student's code. The professor doesn't need to understand it, per se, but they do need to ensure it's useful for future maintainers/extenders. Will the next Aerospace Engineering grad student coming in understand Nim or be motivated enough to learn Nim and have time to continue the work? They likely already had Fortran, Matlab, or Python experience (which depends on their undergrad and when they went to school). Picking a novel language for the research group needs to have value for the group going forward, not just to satisfy the curiosity or taste of the RA.
Depends on the surrounding body of work. In my case, 99% of papers in my references had Python/PyTorch implementations. Which is the entire point of this post.
Python itself isn't really used for scientific computing. Python's bindings to high-performance libraries, many of which use Fortran under the hood, are used for scientific computing.
Combine that with the ease of displaying results à la Matlab, but with much less of the jank, and you have an ideal general-purpose sci-comp environment.
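The mechanics of such a binding layer are just a thin veneer over compiled code. A minimal sketch with ctypes, using C's libm as a stand-in for a Fortran BLAS/LAPACK build (the library name and lookup are platform assumptions; the wrapping pattern is the same either way):

```python
import ctypes
import ctypes.util

# Load the shared library; the fallback name applies on glibc Linux.
libname = ctypes.util.find_library("m") or "libm.so.6"
libm = ctypes.CDLL(libname)

# Declare the C signature so ctypes marshals doubles correctly.
libm.cos.restype = ctypes.c_double
libm.cos.argtypes = [ctypes.c_double]

print(libm.cos(0.0))  # calls straight into compiled code -> 1.0
```

NumPy's own wrappers do the same thing at scale: the Python layer mostly shuffles pointers and shapes, and the flops happen in the native library.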
Back when I worked in scientific programming, we adopted a similar approach. The heavy lifting functions we wrote in C, but they were called from R which allowed us to plot the results, etc., easily. And the libraries we used (for solving differential equations) were all old school Fortran libraries.
If I were to start again today, I think I'd give Julia a look, though.
There is often no real value in optimizing such code, if the computation finishes in a time that doesn’t mess with your workflow. Spending more time on it will often just take time away from something more valuable to the research.
Ahh yes, that's a good point. If you're, for example working in a Jupyter Notebook, it absolutely doesn't matter if a cell needs 3 seconds or 3 milliseconds to execute.
Frequently because those performance gains aren't actually needed. We live in an age where you can cheaply and quickly scale the hardware for 99% of tasks. Tasks that are too expensive to compute inefficiently are also unlikely to be profitable enough to be doing at all.
I love Nim and would absolutely use it for every piece of native code I need to write. Unfortunately, I find it suffers from a few big problems. First, the developer tooling and documentation is kind of inconsistent. The build tool, which is also used to install packages, supports a different set of args than the main compiler, which causes some weirdness. Second, the network effect. Most libraries are maintained by a single person, who put in a lot of effort, but a lot of bugs, edge cases, missing features and other weirdnesses remain. It's usually best to use libraries made for C or Python instead, really.
I work in scientific computing and I'm a huge fan of nim. I started writing a few tools at work in nim and was quite quickly asked to stop by the software development team. In their eyes, they are responsible for the long term maintenance of projects (it's debatable how much they actually carry out this role), and they didn't want the potential burden of a codebase in a language none of them are familiar with.
It's sad, as I feel nim would be easier to maintain compared to a typical c or R codebase written by a biologist, but that's what's expected.
I second this. There's often a huge difference between the languages and tools we'd love to be using, and those that we are allowed / forced to use in the workplace.
I for instance just moved to a company where the data stack is basically OracleSQL and R. And I dislike both. But as _Wintermute pointed out, a whole company / department won't change their entire tech stack just to please one person.
Python is very easy to teach because syntax doesn't get as much in the way as with other languages. You can basically start with mostly English and then slowly introduce more complex concepts. With C, for example, you would have to delve into data types as soon as you declare the first variable.
I'm trying to switch from traditional software engineering to something sciencier--I've been taking computational biology classes and learning Nim.
I like Nim a lot. And I know that it'll scratch a necessary itch if I'm working with scientists. I also know that it's too much to ask that the scientists just buckle down and learn Rust or something like that.
But as someone who is not afraid of Rust but is learning Nim because of its applicability to the crowd that I want to help... The vibrancy of the Rust community is really tempting me away from this plan.
I've really enjoyed the Nim community also. I even contributed some code into the standard library (a first) and was surprised at how easy they made it.
But I have also written issues against Nim libraries which have gone unanswered for months. Meanwhile, certain rust projects (helix, wezterm, nushell) just have a momentum that only Nim itself can match.
Python benefitted from there being no nearby neighbors which resembled it (so far as I'm aware). If you needed something like python, you needed python.
Rust and Go and Zig are not for scientists, but they're getting developer attention that Nim would get if they didn't exist. Also, Julia is there to absorb some of the scientist attention. It's a Tower of Babel problem.
I can't say why the scientists aren't flocking to Nim, but as someone who wants to support them wherever they go, this is why I'm uncertain if Nim is the right call. But when I stop and think about it, I can't see a better call either.
> I can't say why the scientists aren't flocking to Nim, but as someone who wants to support them wherever they go, it's why I'm uncertain if it was the right call.
Because most scientists are only using programming as a tool and don't care one bit about it beyond what they need it to do. They don't go looking for new tools all the time; they just ask their supervisor or colleague, and then by default/network effects you get Python, Fortran, or C++. You need a killer argument to convince them to do anything new. To most of them, suggesting a new language is like suggesting a hammer of a different color to a smith - pointless. With enough time and effort you can certainly convince people, but even then it's hard. It took me years to convince even just one person to use matplotlib instead of gnuplot when I was working in academia. You can obviously put that on my lack of social skills, but still.
Why is Go often lumped in with languages that don't have garbage collectors? I'm always confused by this. Is Go suitable for systems programming? I myself use Go, but for web development.
It’s advertised as a systems programming language, though the system definition it uses casts a much wider net (think kubernetes) than some people’s understanding of system programming (think bare metal bit banging).
Yes, I agree that Python's success is most probably due to the productivity of its pseudocode-like syntax, which makes building libraries more attractive.
In addition to Nim, the D programming language is also Pythonic due to its GC-by-default approach, and it is a very attractive Fortran alternative for HPC, numerical computation, bit twiddling, etc. D's support for C is excellent: the latest D compiler can compile C code natively, and it is in the GCC ecosystem, similar to Fortran. Heck, D's native numerical library GLAS was already faster than OpenBLAS and Eigen seven years ago [1]. In terms of compilation speed D is second to none [2].
[1] Numeric age for D: Mir GLAS is faster than OpenBLAS and Eigen:
The nim syntax only looks like python on the surface. It actually feels quite different when more complex language features are involved. Nim is more restrictive than python and harder to write. IMHO, nim is not the language that common python programmers would like especially if they only know python.
Absolutely. Fortran takes about 500 lines of code vs. <20 lines for Python. The ease of use and flexibility of Python across so many application types is a good reason for its popularity. The rise of hardware performance makes the speed tradeoff trivial.
For code implementing a numerical algorithm, I think the ratio of lines needed in Fortran vs. Python is much less than 25, maybe 2 or 3. And once the code is written in Fortran you just compile with -O3 etc. to get good performance and don't need to think about translating to Cython, Numba, or some other language.
I think I asked this in a Nim thread a month or two ago, but to me I don't see a chance of competing in scientific computing without a good interactive EDA story, and Python, with a good out-of-the-box IDE, Jupyter notebooks, and IPython, has an amazing story for interactive scientific computing.
IMO the popularity of Python has as much, if not a lot more, to do with the available libraries and frameworks as the language itself. The language itself seems more inherently suited as a successor to Perl - as a powerful scripting language, rather than one really suited to large, complex, multi-person projects.
What seems to have bootstrapped the success of Python for ML and scientific use was early adoption by people in these communities who were not hard core programmers, and found it easy to get started with. Once SciPy and NumPy were available, and NumPy became used for ML, then the momentum of ML helped further accelerate the adoption of Python.
> popularity of Python has as much, if not a lot more, to do with the available libraries and frameworks as the language itself
> What seems to have bootstrapped the success of Python for ML and scientific use was early adoption by people in these communities who were not hard core programmers
What if these people (non-hard-core programmers) were attracted to the language itself because it is almost pseudocode-like? It becomes a gift that keeps on giving: attract domain experts and you get more batteries included for your project.
> hard core programmers
What if these people are 'hard-core' in their specific domain, but not 'hard-core' in whatever hardware architecture carries the day due to historical mishaps and marketing trends of the day?
The availability of libraries is because of the language. It's a (good) flexible general purpose dynamically typed language which makes writing libraries and good code in general easy.
Why it's deemed unsuitable for large, complex, multi-person projects is that enterprise types know only the byzantine OOP mess. And when all you have is OOP, everything is a FactoryFactoryFactory and everything else is "unmanageable".
The reason it's not well suited for larger/etc projects has nothing to do with OOP - it's about things like dynamic typing and indent-based structure that make it good for interactive REPL use, scripting and rapid prototyping, but less suited for more complex cases where you'd prefer more compile-time vs run-time checking, and where ease of lifetime maintenance trumps up-front coding time.
I disagree that the availability of good libraries for Python is because of the language - especially if we're talking about scientific and ML libraries. In many of these cases the Python libraries are just pass-thrus to the underlying libraries written in C, which was chosen because of its performance and suitability to the task.
Because in the real world there are crappy programmers writing 1000 line functions and code nested 10 levels deep. Having explicitly delimited code blocks makes code, especially bad code, easier to read. I'd rather see 3 or 4 "}", maybe with comments, rather than just seeing a large reduction in indent and having to scroll up 3 pages to figure out the code structure.
I guess if you're doing team-based Python development then everyone is going to be forced to be sensitive to indent, and maybe use a Python-aware editor, but often in a large non-Python project there are many different editors and indent configurations being used and the code structure is unreadable - but at least you can use a language-aware formatter or indent tool to recover it. In Python any sort of loss of proper indenting would obviously change the functionality of the code. Maybe this never happens?
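A toy illustration of the failure mode: a single lost dedent still parses fine, it just silently means something else (function names are made up for the example):

```python
def sum_inside(n):
    s = 0
    for i in range(n):
        s += i          # inside the loop: accumulates every i
    return s

def sum_dedented(n):
    s = 0
    for i in range(n):
        pass
    s += i              # same statement, one dedent: runs once, after the loop
    return s

print(sum_inside(5))    # -> 10
print(sum_dedented(5))  # -> 4  (Python leaves i bound to its last value)
```

In a brace language the same mangling usually breaks the block structure and fails to compile; here both versions are valid programs.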
> Why it's deemed unsuitable for large, complex, multi-person projects is that enterprise types know only the byzantine OOP mess. And when all you have is OOP, everything is a FactoryFactoryFactory and everything else is "unmanageable".
Have you ever worked in a large, dynamically typed codebase written by other people?
Yes! Python has worked tremendously well in those scenarios - the things that make it good for small projects can scale up very, very well.
Similarly, I've worked on very large, multi-team projects where the language was statically typed, compiled, etc. and they've been total disasters, and others that have been a huge success.
I've yet to see any language that can fully negate the power of sloppy, undisciplined programmers. Those programmers are like water and always find a way.
> the things that make it good for small projects can scale up very, very well
Sorry, but that makes zero sense.
Dynamic typing is a language design choice where one is trading off automated error detection for faster development. The larger and more complex a project becomes, the more moving parts and interfaces (APIs) it has, and the more potential there is for API errors. Choosing NOT to prioritize automated (compile time) error detection is NOT something that scales up "very, very well".
It's not just about size of project, but also about expected project lifetime. For a one-time use script, or experiment, or thesis project, then maybe development time is the prime concern. You're going to hack it to get it working ASAP, and don't care what happens later (there is no later).
However, for a corporate project that will be in production use for years, then initial coding effort is only a small part of the lifetime project costs, and optimizing this at the expense of ease of maintenance is a poor decision. Years down the road the project will have had staff turnover, developers will have forgotten the details of all the code, the original design has probably been compromised due to feature creep, too many hands, etc, etc. At this stage you want all the help you can get so that people can still maintain the code, and the larger and more complex the project, the more so this will be. Again, the small/throwaway project language choice trade off of optimize development time at the cost of ease of bug detection will NOT scale up "very, very well".
If you anticipate sloppy undisciplined programmers working on a project, then all the more reason to give them less weapons with which to shoot themselves in the foot.
Well, I hate to break it to you, but I've been using Python in large production environments for nearly a quarter of a century now and either I'm just extraordinarily lucky, or it does in fact make quite a bit of sense and really does work quite well. :)
Here's one example: we once had an enterprise Java system of roughly 1 million lines of code. For a number of reasons the whole thing got replaced by a port of it to Python, and the Python version weighed in at just over 100kLOC. Disregarding all of the other benefits (which were many), there were meaningful advantages to maintaining a codebase 1/10th the size.
> Dynamic typing is a language design choice where one is trading off automated error detection for faster development.
No, not at all. First of all, if you are dealing with any sort of production code, you have to be investing in good testing. The idea of dynamic languages hiding bugs that don't show up until production is mostly a bogeyman to cover inadequate testing. We've also found that languages like Python tend to encourage a style of iterative development that lends itself to each section of code being very well tested as it gets written, so in the end it's easy to wind up with code that is both tested more during the dev process but then also well-tested due to the automated tests that all software should end up having anyway. I mean, all software gets tested, it's just a question of whether you do it or your customers do it.
(anecdotally we've seen evidence that people come to over-rely on compile time error checking in lieu of good iterative testing, and that's an interesting topic in and of itself, but beyond the scope of this discussion and IMO falls under the 'sloppy developer' umbrella anyway)
> The larger and more complex a project becomes, the more moving parts and interfaces (APIs) it has
The implicit argument here is based mainly on the assumption of lots of (even exponential) growth in coupling as a project grows, and the reality is that the level of coupling tends to go in the opposite direction as projects grow large, unless it's simply poorly architected. For example, instead of intercommunication between modules, you're dealing with intercommunication between entire systems - the points of contact between systems tend to be over very well-defined interfaces and are small in number relative to the amount of communication internally between modules.
> initial coding effort is only a small part of the lifetime project costs, and optimizing this at the expense of ease of maintenance is a poor decision
Speed of initial implementation is certainly one benefit, but it actually pays dividends over the lifetime of the project. Code is read more than it is written, so having fewer lines of code, in a language that is easy to read, in a language that removes so much of the noise associated with more verbose languages, is always helpful. It helps you implement stuff to begin with, it helps you add new features later, and it helps you fix bugs.
There are also lots of second-order advantages too, such as making it easier to bring new hires up to speed and needing fewer people both initially and long term. It's about a lot more than just dynamic typing though; a higher level language just provides a ton of benefits that are often worth the tradeoffs. There are so many parallels to e.g. when we moved from assembly to C.
> If you anticipate sloppy undisciplined programmers working on a project
That point was just that so many knocks against certain things are actually covering the problem of sloppy developers. Rather than just take them for a given, a better approach is to set them on a path to improvement if they are willing, or let them go if they are not.
Doesn't every successful Rails project work as a counterexample showing that expressive, dynamically typed languages work? Ruby has to be the worst programming language for anyone who likes rigid structure and static typing, given that metaprogramming and monkeypatching are first-class and not a sometimes-feature for library authors. But it nonetheless works well for people.
"Useful" is very relative. I work with python daily and I try typing everything I can, but it can be such a pain in the ass compared to TypeScript that I want to throw my computer at the wall sometimes. And I can't imagine using Python's typing even 3 years ago, as many essential features that I constantly use are very new (typed kwargs for example). Many people that like typed languages just don't bother and don't use typing in Python because of how cumbersome it is.
Can you elaborate on the distinction you're making?
Python is the default language in which people express their scientific computations. It may execute C code in the end, but so does any language that ever executes a system call.
Little fun fact: Numpy doesn't even come with an efficient, blocked matmul procedure. It has to be linked against a BLAS implementation to really provide any decent performance. This also explains why Numpy performance can vary from distribution to distribution: Anaconda ships it with a different BLAS than PyPI.
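For anyone curious what "blocked" means here, a pure-Python toy comparing a naive triple loop with a cache-blocked version (this only illustrates the loop structure; real BLAS kernels add SIMD, packing, and tuned block sizes on top):

```python
def matmul_naive(a, b):
    """Triple loop; strides through b column-wise, so cache misses pile up."""
    n, k, m = len(a), len(b), len(b[0])
    c = [[0.0] * m for _ in range(n)]
    for i in range(n):
        for j in range(m):
            for p in range(k):
                c[i][j] += a[i][p] * b[p][j]
    return c

def matmul_blocked(a, b, bs=2):
    """Same arithmetic, but processed in bs x bs tiles that stay in cache."""
    n, k, m = len(a), len(b), len(b[0])
    c = [[0.0] * m for _ in range(n)]
    for ii in range(0, n, bs):
        for pp in range(0, k, bs):
            for jj in range(0, m, bs):
                for i in range(ii, min(ii + bs, n)):
                    for p in range(pp, min(pp + bs, k)):
                        aip = a[i][p]  # hoisted: reused across the j tile
                        for j in range(jj, min(jj + bs, m)):
                            c[i][j] += aip * b[p][j]
    return c
```

Both produce identical results; the blocked ordering is what a BLAS does (in compiled code) so that each tile of the operands is reused from cache instead of refetched from memory.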
As someone who was at CERN when Python started to be adopted in the early 2000s: Python got popular as a saner Perl alternative for UNIX scripting and build tools (CMT), and as a means to provide a REPL to C++ and Fortran libraries instead of dealing with ROOT.
HEP is a somewhat peculiar community. They tend to rely on their own tools, their own libraries. I wouldn't take them as a model to understand historical python adoption in science.
I'd bet that their python usage is still mostly as a REPL to ROOT (which by the way has its own REPL), so no numpy, maybe little pandas, no matplotlib.
Programming languages are a bit like social networks. There's some network effect. People go where other people are. Python is currently where things happen.
I imagine part of it is also that a lot of the code isn't the science part. It's all the setup, things like parsing data for input or output. Languages like Python and Perl have very rich standard library stuff for massaging strings, data formats, etc.
If your data is big you may be amazed at how expensive string parsing can be. You can do a lot of FLOPS in the time it takes to serialize and deserialize a large matrix to ASCII for instance.
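A quick self-contained check of that claim, timing an ASCII round trip of 100k floats against a pure-Python pass over the same data (the exact ratio varies by machine, so none is asserted here):

```python
import timeit

values = [i * 0.1 for i in range(100_000)]

def to_ascii_and_back(xs):
    """Serialize to text and parse back, as a file-based pipeline would."""
    text = "\n".join(repr(x) for x in xs)
    return [float(line) for line in text.split("\n")]

def do_flops(xs):
    """Two floating-point ops per element, in plain Python for comparison."""
    s = 0.0
    for x in xs:
        s += x * x
    return s

t_parse = timeit.timeit(lambda: to_ascii_and_back(values), number=1)
t_math = timeit.timeit(lambda: do_flops(values), number=1)
print(f"round trip took {t_parse / t_math:.1f}x the compute time")
```

The gap widens further once the compute side runs in a compiled BLAS rather than a Python loop, which is why binary formats (HDF5, .npy) beat ASCII for big matrices.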
I'm so old that Fortran was actually my first language. Over the years I've seen language bindings to the old Fortran numerical libraries we all rely on but Python/numpy is the first wrapper I've actually enjoyed using. It's more than a wrapper in that it brings a compact representation of slices and vectors.
However, if I didn't know how things work underneath I'd be a little uneasy. You can always profile after the fact but it helps knowing how to avoid inefficient approaches.
The slowness of Python meant that nobody thought "it'll be easier just to write this routine" as opposed to looking to re-use existing (most often compiled) code. And if you are doing science, the less time you spend re-inventing code, the more science you will get done.
The date of this discussion (July 2020) really caught my eye. Fortran is actually making quite a comeback (to the extent an ancient scientific language can :-). I casually looked at the TIOBE index[1]; flawed as it may be, it gives you a sense of the trends of the larger languages. To my not-a-surprise, it turns out that precisely in July 2020 Fortran was at its lowest rank on the index since tracking began in 2001, and it has steadily climbed since then. No doubt that is due to the vast number of libraries already existing, and Python is really glue code, not something you write the main code in. As someone joked to me, the first rule of Python programming in scientific computing is to make sure none of your program's heavy lifting actually goes through Python. You cannot avoid Fortran/C++ unless you have very trivial or textbook-type code.
Even with your disclaimer, Tiobe is useless, even as a vague indicator of anything. It ranks Prolog as having higher market share than Typescript for christ's sake, stop poisoning your brain with that snakeoil shop.
Tooling is the thing. To keep older languages popular with people new to the industry (who never get properly introduced to older tools), someone like JetBrains would need to create COBOL and Fortran IDEs. People will always go for shiny and new, like it or not.
Python isn't "shiny and new". Python is "almost" ancient. Julia is "new", but also not as easy to use. It isn't about that kind of tooling, but it is about ease of use, and having access to tools that deal with silly things while you do the actual science.
Python, first appearance: 20 February 1991; 32 years ago [1]
Rust, first appearance: May 15, 2015; 8 years ago [2]
Julia, first appearance: 2012; 12 years ago [3]
Go, first appearance: November 10, 2009; 14 years ago [4]
Java, first appearance: May 23, 1995; 28 years ago [5]
Javascript, first appearance: December 4, 1995; 28 years ago[6]
Please read what I said again, more carefully, starting from the first word, and try not to be so contrary. I was referring to the fact that the shiny and new tooling will be what attracts users to a given platform / language / ecosystem, regardless of the latter's age. Other participants on this thread apparently seem to have understood this.
And I was expressing that it isn't that shiny new tooling that attracts users, nor is it the "new" stuff. Nobody uses python because of jetbrains, or whatever. It has nothing to do with IDE support.
Jetbrains has support because python is popular, and not the other way around.
I was also expressing that Python is an old, almost ancient language, and yet it is more used than languages that, by your reasoning, had better "tooling" in the sense that you are using it. The point is that it has nothing to do with the "age" of the language, the fact that it is perceived as new, or having better tools.
Python isn't popular because of its tooling, but despite it.
It is popular because it makes it easy to leverage the ecosystem and get things done.
I think that it wasn't so much a matter of one replacing another. My interpretation of the history is that Matlab had steady usage, but within relatively mature domains such as traditional engineering. Meanwhile, Python grew in fields that had no prior loyalty to Matlab, such as physics, data science, ML, the life sciences, children and hobbyists, and software development.
The "free" thing is important, because programming tools have all migrated to the open source model. No professional coder is willing to work on paid tools today. If scientists have a basic awareness of their career options, they will borrow tools from the software development world, and not from the engineering world. Those tools are all free.
Also, free software changes how you use it. I install a complete Python toolchain (up to and including Jupyter Lab) on every computer that I touch: In the office, the labs, and at home. This allows me to truly use Python as my brain. No software budget is lavish enough to pay for as many "licenses" as I use in a day.
Matlab also does have relatively bad tooling, though that is partially due to it being non-free (for ex the library installation is so cumbersome because of all the licensing stuff)
I agree with TFA, it's the ecosystem. You're not entirely wrong that free is a component, because impossibly draconian licensing terms means no ecosystem.
I still maintain that Python, when evaluated strictly on its merits as a programming language, is the most ass of the "scripting" bunch, but its ecosystem is such that it more than makes up for the difference and I always end up using it for side projects or personal stuff.
At my alma mater it also replaced all existing programming courses for non-EE/CS students, replacing the Java and Delphi still lingering around from days past. Multi-platform support is very important, which was one of the reasons to use Java.
This has the big advantage that in multi disciplinary subjects everybody knows something of the programming language involved.
And of course it saves on the insane licensing costs since Mathematica is no longer required in all student software packages (MATLAB still is afaik).
I'm not sure what "more multi-platform" means here. It's been a long time since I've run into an environment that didn't have easy access to Java and it's tools.
I think if you account for all of the 2000s devices that had embedded Java but no web browsers (think feature phones), Java is probably "more multiplatform".
I'm a big Fortran fan now for anything fast. If I was doing EDA or any one-off data science stuff I'd be more likely to use Python, though (or coreutils and gnuplot, depending on the circumstances).
I worked hard to replace LabView and Matlab with Python in the lab when I was a PhD student - about 15 years ago. By far the biggest motivation to do this was to remove the friction and cost of obtaining licenses for proprietary software.
I wish Python existed when I was in high school. My school’s compiler wouldn’t remind me that I was missing a semicolon and I would constantly be wasting time looking at syntax rather than grasping bigger concepts.
There's also GNU Octave as a free (and largely compatible) alternative to MATLAB. Andrew Ng's original ML course on Coursera used Octave. As a developer who'd never use it (or MATLAB) before, I found Octave very easy to use.
Newer ML courses use Python because of its subsequent adoption by PyTorch.
Python wins because it’s the lowest common denominator.
We have a bunch of people programming, most of them scientists. Even if Python is poorly suited for us, it’s pretty much the only thing everyone can work with.
When you look at the "monster" that is ROOT-CERN, which is a C/C++ interpreter* with very efficient statistics and plot tools (it can digest tons of data), it is completely logic that Python is a good language for daily lab tasks and analysis.
But I'm not sure if it is really used beyond these daily lab tasks or internal projects where performance is not critical. The big simulations I participated in were written in C/C++.
Nowadays, it seems that at least the particle physics community is enthusiastic about Julia's development.
*: it mimics most core Python features: no strict typing, data structures that can store mixed types, on-the-fly coding, a graphical interface for plotting, no compilation needed.
Python is also a more transferable skill than Fortran.
I bet that plays some role too in its popularity in the scientific community, which has many young anxious grad students/postdocs looking to ensure they are employable.
Many of the introductory programming courses seem to favor Python. If you already know a little Python, I can see the temptation to stick with it instead of learning Fortran.
This is less the difference between Fortran and Python and more the difference between a run-once-for-an-experiment culture and a run-as-part-of-a-pipeline culture.
Languages are mostly fungible; coding culture is not.
Like just about any technology in our industry, people use it because it’s popular. Competitive advantage be damned - just use what your competitors use.
> The miracle of python is that it has standard syntax for vectorized code across many packages.
If you care about performance, you should care about how vectorization is implemented. Python makes vectorization look like magic, but scientists shouldn't do magic. In Steel Bank Common Lisp I can implement SIMD procedures in a straightforward manner. The language is both more high-level and more low-level (yes) than Python, and much, much faster.
I mean that's exactly how python won. Everyone wrote code and got used to slicing idioms without understanding how they were implemented in SIMD, and then when the implementation got changed from SIMD to GPGPU, the high level code and more importantly idioms didn't have to change.
Except I don't think that is a feature. I think optimizing compilers are something numerical scientists should excel at, and that it will be a much sought-after skill. Python seems to do everything it can to hide those details from the user.