D for scientific scripting / rapid prototyping
Prokop Hapala
prokophapala at gmail.com
Tue Oct 22 05:58:50 UTC 2019
I'm examining the possibility of moving from Python+C/C++ to D or
Python+D. I read
(https://wiki.dlang.org/Programming_in_D_for_Python_Programmers)
and
(https://jackstouffer.com/blog/nd_slice.html), which mention PyD
and mir-algorithm; it all seems very promising, but I have not
tested it yet.
The main reason I prefer D over Julia (https://julialang.org/) is
that D has a smaller footprint: Julia basically needs a kind of
managed environment (like the Java virtual machine) and does not
really compile to a standalone native executable or library.
Compilation of D also seems, paradoxically, to be faster than
initialization of Julia: despite Julia being JIT-compiled, in
practice many operations (recompiling, installing, importing
libraries) are quite slow, and waiting a minute for them
considerably hampers a fast development cycle.
Nevertheless I have several concerns now:
1) rdmd - it seems like exactly what I want (using D as a
scripting language with very fast compilation). However, I cannot
figure out how to make it work with libraries (e.g. if I use
mir-algorithm, derelict-gl3, derelict-sdl). It does not seem to
work together with dub, and it ignores the dependencies from
dub.json; the kind of single-file workflow I have in mind is
sketched below.
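For concreteness, here is a minimal sketch of that workflow,
assuming the mir-algorithm sources fetched by dub are handed to the
compiler via -I (the path is only a placeholder; I have not
verified that this is the right way to point rdmd at a dub-fetched
package):

// hello_mir.d -- illustrative sketch only
//
// run as a script, forwarding the import path to the compiler:
//   rdmd -I/path/to/mir-algorithm/source hello_mir.d
//
import std.stdio;
import mir.ndslice;          // from the mir-algorithm package

void main()
{
    auto a = iota(3, 4);     // lazy 3x4 slice filled with 0..11
    writeln(a);
}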
2) D footprint - the size of the binaries produced by D is rather
large. This simple project
https://github.com/ProkopHapala/SimpleSimulationEngine/tree/master/Dlang/TEMP/firstTriangle-derelict
produces a 2.9MB binary with dmd and 3.5MB with ldc. I guess this
is because the libraries are linked statically. There should be a
way to link them dynamically
(https://forum.dlang.org/thread/eimyvbojdmanbbshwvmc@forum.dlang.org),
at least with ldc, but I cannot figure out how (sorry for
cross-posting, but I got no answer to the previous post, I guess
because it is more than a year old). I would still like dynamic
linking with dmd, since it compiles faster than ldc; the build
variants in question are sketched at the end of this point.
I'm concerned about binary size because I want to generate many
different small dynamic libraries (.so) on the fly. At 3MB per
library I would easily waste gigabytes of disk space that way.
Basically I want to achieve functionality a bit like Python's weave
(https://docs.scipy.org/doc/scipy-0.18.1/reference/tutorial/weave.html).
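To make the question concrete, the build commands I have come
across for dynamic linking look roughly like the comments in the
sketch below; the exact flags are my reading of the documentation
and of the forum thread above, not something I have confirmed to
work:

// tri_stub.d -- stand-in for the firstTriangle-derelict example,
// just to show the build variants in question (flags assumed,
// not verified):
//
//   dmd  tri_stub.d                             // default: Phobos linked statically
//   dmd  -defaultlib=libphobos2.so tri_stub.d   // supposedly links Phobos dynamically
//   ldc2 --link-defaultlib-shared  tri_stub.d   // supposedly the ldc equivalent
//
import std.stdio;

void main()
{
    writeln("placeholder for the OpenGL/SDL triangle demo");
}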
3) Dynamic loading of D libraries with ctypes - from what I see
in the PyD examples
(https://github.com/ariovistus/pyd/tree/master/examples), they
all use distutils/setup.py. That is perhaps nice for installation
by an end user, but not so convenient for rapid prototyping and
development. There is also this special construction:
import pyd.pyd;

extern(C) void PydMain() {
    def!(hello)();   // expose the D function hello() to Python
    module_init();   // initialize the generated Python module
}
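Here def!(hello) apparently registers an ordinary D function, i.e.
something like the following stand-in (not the exact code from the
example):

string hello()
{
    return "Hello, world!";
}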
I'm not sure what def! exactly does and how limiting it is. I
would prefer to simply generate a standard .so library which
exposes plain extern(C) functions, and then load it dynamically
with ctypes like any other C library. Is that possible? (I have
not yet tried to investigate it in detail.) The reason is that
while I often use Python for prototyping, I'd like to produce .so
libraries which I can then load from anywhere else (not just from
Python, but e.g. from C/C++) without changes. And it seems there
are some problems with loading D library functions from C
(https://dlang.org/spec/betterc.html).
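To illustrate, a minimal sketch of the kind of library I mean is
below; the function name, build command and ctypes calls in the
comments are only illustrative, and I am not sure whether the D
runtime would need explicit initialization when such a library is
loaded from ctypes (this particular function does not touch the
GC, so I assume it does not):

// dsum.d -- minimal sketch of a plain C-ABI function exported from D
//
// build (flags assumed, not verified):
//   dmd -shared -fPIC dsum.d -of=libdsum.so
//
// load from Python via ctypes (illustrative):
//   lib = ctypes.CDLL("./libdsum.so")
//   lib.sum_array.restype = ctypes.c_double
//   s = lib.sum_array(ptr_to_doubles, n)
//
extern(C) double sum_array(const(double)* data, size_t n)
{
    double s = 0.0;
    foreach (i; 0 .. n)
        s += data[i];
    return s;
}

If that works, the same .so could also be loaded from C/C++
without any Python-specific glue, which is exactly what I am
after.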