Scientific computing and parallel computing C++23/C++26

sfp sfp at cims.nyu.edu
Thu Jan 13 16:10:39 UTC 2022


On Thursday, 13 January 2022 at 15:09:13 UTC, Ola Fosheim Grøstad 
wrote:
> On Thursday, 13 January 2022 at 14:50:59 UTC, bachmeier wrote:
>> If you write code in Python, it's realistically only for the 
>> Python world. Probably the same for Julia.
>
> Does scipy provide the functionality you would need? Could it 
> in some sense be considered a baseline for scientific computing 
> APIs?

SciPy is fairly useful, but it is only one library in a 
constellation of Python scientific computing packages. It 
emulates a fair amount of what MATLAB provides, and it sits 
on top of numpy. Using SciPy, numpy, and matplotlib in 
tandem gives a user roughly the same functionality as a 
vanilla MATLAB installation.

SciPy and numpy are built on top of a substrate of old and stable 
packages written in Fortran and C (BLAS, LAPACK, FFTW, etc.).

Python, MATLAB, and Julia are essentially targeted at 
scientists and engineers writing "application code". These 
languages aren't appropriate for "low-level" scientific 
computing along the lines of the libraries mentioned above. 
Julia does make a claim to the contrary: it is feasible to 
write fast low-level kernels in it, but (last time I 
checked) it is not straightforward to export them to other 
languages, since Julia does much of its work, JIT 
compilation included, at runtime.

Fortran and C remain good choices for low-level kernel 
development because they are easily consumed by Python et 
al. As far as parallelism goes, OpenMP is the most common 
choice, since it is conceptually straightforward. C++ is 
also fairly popular, but it is a less natural choice: 
consuming something like a heavily templatized header-only 
C++ library through e.g. Python's FFI is a pain. (pybind11 
makes this easier, but the compile times will make you 
weep.)

Fortran, C, and C++ are also all standardized. This is 
valuable. The people developing these libraries are, more 
often than not, academics who can't devote much of their 
time to software development. Confidence that their 
programming language isn't going to change underneath them 
is some assurance that they won't be forced to spend an 
inordinate amount of time updating their code just to keep 
it usable. Either that, or they write the library in Python 
and abandon it later.

As an aside, people lament the use of MATLAB, but one of its 
stated goals is backwards compatibility. Consequently, 
rather a lot of old MATLAB code is still floating around in 
active use.

"High-level" D is currently not that interesting for high-level 
scientific application code. There is a long list of "everyday" 
scientific computing tasks I could think of which I'd like to be 
able to execute in a small number of lines, but this is currently 
impossible using any flavor of D. See 
https://www.numerical-tours.com for some ideas.
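
For instance, something like the following, written against 
a purely hypothetical D library stack (sci.signal, sci.plot, 
and every function used here are made up for illustration; 
nothing like them exists today):

    // Hypothetical: denoise a signal and plot the result in
    // a handful of lines. None of these modules or functions
    // exist; this is the MATLAB/SciPy level of convenience D
    // would need to offer.
    import sci.signal, sci.plot;

    void main()
    {
        auto x = readWav("noisy.wav");      // hypothetical I/O
        auto y = butter!8(0.2).filtfilt(x); // hypothetical filter
        plot([x, y], ["raw", "denoised"]);  // hypothetical plotting
    }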

"BetterC" D could be useful for developing numerical kernels. An 
interesting idea would to use D's introspection capabilities to 
automatically generate wrappers and documentation for each 
commonly used scientific programming language (Python, MATLAB, 
Julia). But D not being standardized makes it less attractive 
than C or Fortran. It is also unclear how stable D is as an open 
source project. The community surrounding it is rather small and 
doesn't seem to have much momentum. There also do not appear to 
be any scientific computing success stories with D.
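
To make the betterC idea concrete, here is a minimal sketch. 
The kernel name, the ldc2 flags, and the FFI clients named 
in the comments are assumptions for illustration, not an 
established workflow:

    // kernel.d -- compile as a shared library with, e.g.:
    //   ldc2 -betterC -O -shared kernel.d
    module kernel;

    // extern(C) exports an unmangled, C-ABI symbol, so
    // anything that can call into C (Python's ctypes,
    // MATLAB's loadlibrary, Julia's ccall) can consume it
    // with no special glue.
    extern(C) void saxpy(size_t n, float a,
                         const(float)* x, float* y)
    {
        foreach (i; 0 .. n)
            y[i] += a * x[i];
    }

The wrapper-generation idea would then be a small, ordinary 
D program that walks the extern(C) members of such a module 
via __traits(allMembers, ...) and std.traits and prints the 
matching ctypes or MEX boilerplate.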

My personal view is that people in science are generally 
more interested in actually doing science than in playing 
around with programming trivia. Having to spend time 
understanding something like C++'s argument-dependent lookup 
is generally viewed as a waste of time.

