Lost a new commercial user this week :(

Paulo Pinto via Digitalmars-d digitalmars-d at puremagic.com
Sun Dec 14 10:59:44 PST 2014


On Sunday, 14 December 2014 at 18:25:26 UTC, Joakim wrote:
> On Sunday, 14 December 2014 at 17:09:31 UTC, Paulo Pinto wrote:
>> You mean scale like Twitter and LinkedIn?
>
> Maybe that's why they still lose money hand over fist, 
> especially Twitter, because of all the extra servers they have 
> to buy. :p By comparison, WhatsApp was able to put millions of 
> users on a server with Erlang and become profitable with much 
> less revenue:
>
> http://forum.dlang.org/post/bmvwftlyvlgmuehrtvlg@forum.dlang.org
>
>> In my case, one example of such a project was a software stack 
>> for network monitoring and data aggregation for mobile networks, 
>> all the way down to the network elements.
>>
>> The old system was a mix of Perl, C++/CORBA and Motif. The new 
>> system is all Java, with a small C stack for resource-constrained 
>> elements.
>>
>> Another example was replacing C++ applications in medical image 
>> analysis with a 90% .NET stack and a mix of C++/Assembly for 
>> image filters and driver P/Invoke glue.
>
> It is instructive that you're dropping down to C/C++/Assembly 
> in each of these examples: that's not really making the case 
> for Java/.NET on their own.

I was expecting a comment like that. :)

In the first case, certain network elements are quite resource 
constrained, so you just have a real-time OS doing SNMP and other 
control operations. Only a small C library could be delivered on 
those.

Everything else capable of running a JIT-enabled JVM was doing so.

In the medical imaging example, the C++/Assembly code was being 
used in two cases:

- SIMD image filters (maybe with the upcoming SIMD support in 
.NET, this wouldn't be needed any longer); see the first sketch 
after this list for the kind of inner loop involved.

- COM: many manufacturers provide only COM drivers for their 
devices, so there is no way around it. If the protocol were 
public, .NET networking code could have been used instead, as the 
devices used Ethernet; the second sketch below shows roughly what 
that glue looks like.
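
To make the SIMD point concrete, here is a minimal sketch of the 
kind of inner loop that stayed in C++ with intrinsics. It is not 
the actual production code, just an assumed example: a saturating 
brighten filter over 8-bit grayscale pixels, 16 pixels per 
iteration with SSE2.

#include <immintrin.h>
#include <cstdint>
#include <cstddef>

// Brighten an 8-bit grayscale image by a constant amount.
// Saturating adds clamp pixel values at 255 instead of wrapping.
void brighten_sse2(uint8_t* pixels, std::size_t count, uint8_t delta)
{
    const __m128i add = _mm_set1_epi8(static_cast<char>(delta));
    std::size_t i = 0;
    for (; i + 16 <= count; i += 16) {
        __m128i v = _mm_loadu_si128(
            reinterpret_cast<const __m128i*>(pixels + i));
        v = _mm_adds_epu8(v, add);   // unsigned saturating add
        _mm_storeu_si128(reinterpret_cast<__m128i*>(pixels + i), v);
    }
    for (; i < count; ++i) {         // scalar tail
        unsigned v = pixels[i] + delta;
        pixels[i] = static_cast<uint8_t>(v > 255 ? 255 : v);
    }
}

Nothing like this could be expressed from the managed side at the 
time, which is why those filters were kept in C++/Assembly.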
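
For the COM point, the native glue looks roughly like the sketch 
below. The ProgID and the device interface are made up for 
illustration, since every manufacturer ships its own driver:

#include <windows.h>
#include <objbase.h>

int main()
{
    // Initialize COM on this thread before talking to the vendor driver.
    HRESULT hr = CoInitializeEx(nullptr, COINIT_APARTMENTTHREADED);
    if (FAILED(hr))
        return 1;

    // Hypothetical ProgID registered by the device manufacturer.
    CLSID clsid;
    hr = CLSIDFromProgID(L"Vendor.ImageAcquisition", &clsid);
    if (SUCCEEDED(hr)) {
        IUnknown* device = nullptr;
        hr = CoCreateInstance(clsid, nullptr, CLSCTX_ALL,
                              IID_PPV_ARGS(&device));
        if (SUCCEEDED(hr)) {
            // Query the vendor-specific interface and drive the device here.
            device->Release();
        }
    }

    CoUninitialize();
    return 0;
}

Had the wire protocol been documented, this whole layer could have 
been plain .NET socket code instead, since the devices were on 
Ethernet anyway.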


>
>> The problem is that average coders don't learn to optimize 
>> code, and in the end most businesses will just shell out money 
>> for more hardware rather than for software development time.
>
> Yeah, it's all about the particular job and what the tradeoffs 
> are there.  Most online apps don't need to scale to extremes, 
> which is why they're mostly not written in C++.

Yes, agreed there.

However, there are many places where developers use C and C++ not 
because of speed, but rather because over the last few decades all 
the other programming languages with native code compilers faded 
away.

With the current ahead-of-time native compilation renaissance and 
GPU support in other languages, the need for C and C++ in such use 
cases will decrease.

Still, there are quite a few places in the overall computing 
landscape where C and C++ will continue to matter (e.g. embedded, 
OS drivers, HPC, AAA games, ...).

--
Paulo
