I just created a dub package. Frankly, the whole thing is backward.
Ola Fosheim Grøstad
ola.fosheim.grostad at gmail.com
Mon Apr 25 09:20:23 UTC 2022
On Monday, 25 April 2022 at 06:43:47 UTC, FeepingCreature wrote:
> I disagree with this post. Dub's requirements seem eminently
> sensible to me, if you publish source code without a license,
> you're basically setting a copyright trap for the unwary.
I don't know. I recently spent quite a bit of time with Faust, and
its standard library has different licenses for different
functions (GPL, LGPL, MIT, etc.). This makes perfect sense when you
want to use the best tools (DSP algorithms) for the job and have
them all at your fingertips. So it isn't obvious that having one
license per library makes sense. For some projects it would make
more sense to have a "license filter" of some sort at the
compiler level.
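To make the "license filter" idea concrete: dub today has only a single top-level "license" field per package, so the following is a purely hypothetical sketch. The "licenses" and "acceptLicenses" fields below do not exist in dub; they only illustrate what per-module license metadata might look like. On the library side:

```json
{
    "name": "numerical",

    "licenses": {
        "numerical.fft.naive":  "MIT",
        "numerical.fft.radix2": "GPL-3.0",
        "numerical.fft.vendor": "LGPL-2.1"
    }
}
```

And on the consumer side, a filter that makes the build fail if a module with an unacceptable license is imported:

```json
{
    "name": "myapp",
    "dependencies": { "numerical": "~>1.0" },
    "acceptLicenses": ["MIT", "BSL-1.0", "LGPL-2.1"]
}
```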
So, for instance, instead of having one FFT function in your
"numerical" package, you could have an FFT interface and many
FFT implementations (with various licenses) in the same
package. Then you can easily compare many of them for
performance/quality in your application with little effort.
In general, it is a problem if D has a package system that
authors don't use. It would probably make more sense to use what
is/becomes common for C++/C and build on top of that. Larger D
projects will use a lot of C++/C code anyway.
> system, just build with that system. Dub is off in its own
> little world of D packages, and that's fine, it limits the
> breakage surface, encourages internal standardization and makes
> it easy to get started for a newcomer.
Ok, so you have to decide what the purpose of Dub is. If it is
primarily for newbies, then you should focus more on quality-testing
packages before making them public, requiring good documentation,
and making sure that they are removed when outdated.
If it is for more advanced projects, you have to make sure that
everybody wants to use it. If one commonly used library chooses
not to use it, then it becomes unworkable.
In Java, JavaScript, Go, and Rust there is a culture of not using
much FFI, so they are not really comparable to a language where
you use a lot of FFI. More importantly, they have a large enough
user base to maintain libraries over time. In some of these
languages you cannot avoid using the leading package
managers/build systems; they are close to mandatory. And some of the
builds that these package managers produce add bloat.
Anyway, there is a reason why there is increasing demand
for header-only libraries in C++. People learn the hard way that
libraries that require setup will suck up time, increase
complexity, and usually add bloat, regardless of the build system used…
*For D I don't really think it matters either way, as the main
issue seems to be getting enough maintained, high-quality libraries
to drive adoption in the first place.*
The best strategic option right now seems to be to have
maintained C-library bindings that use "import C" and ship them
with the compiler (funded by the D Foundation), and to provide an
install option that uses the operating system's package manager
(Linux distributions, MacPorts, Homebrew, etc.) so that the install
becomes 100% build-free.
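For readers who haven't tried it: ImportC lets the D compiler parse C source directly, so a C library can be used without a binding generator or a separate build step. A minimal sketch (file names are illustrative):

```d
// square.c -- a plain C source file, no bindings needed
int square(int x) { return x * x; }

// app.d -- importing the C file directly; ImportC parses square.c
import square;

void main()
{
    assert(square(7) == 49);
}

// Build both in one step, no external build system:
//   dmd app.d square.c
```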
I personally try to avoid libraries, regardless of language, that
require building beyond the standard compiler features. I think
most pure D libraries should be nothing more than a GitHub fetch,
and it would be better to reuse a system that is likely to
dominate C/C++ builds in the future, or rather to use a fork of
such a system that adds information useful for D builds to a
registry, such as classifying existing C libraries as "import C"
compatible.