Seg fault when calling C code

Andrew Brown via Digitalmars-d-learn digitalmars-d-learn at puremagic.com
Thu May 15 13:55:52 PDT 2014


I'm trying to calculate residuals after fitting a linear 
regression, and I've got some C code using the GSL which 
should do it. Everything works fine if I use static arrays 
(below, defining double x[15], double y[5], etc.). The trouble 
is, I won't know the number of individuals or covariates until 
runtime, so I'm stuck using dynamic arrays. But passing them to 
C causes a seg fault. I'm very sure I'm doing something (lots 
of things) stupid. If someone could point out what, I'd be very 
grateful. Thanks very much.

Andrew

Here's a toy D example:

import std.stdio;

extern(C) {
   void regress(int nInd, int nCov, ref double[] x, ref double[] y, ref double[] rOut);
}

void main(){
   int nInd = 5;
   int nCov = 3;
   double[] x = new double[nCov * nInd];
   x = [1, 4, 3, 1, 4, 3, 1, 4, 2, 1, 6, 7, 1, 3, 2];
   double[] y = new double[nInd];
   y = [5, 3, 4, 1, 5];
   double[] residuals = new double[nInd];
   regress(5, 3, x, y, residuals);

   writeln(residuals);
}
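
(For anyone reading along, a minimal sketch of what the C side actually receives here, assuming the documented D ABI where a dynamic array is a length/pointer pair passed by value:)

```d
import std.stdio;

void main() {
    double[] y = [5, 3, 4, 1, 5];

    // A D slice is effectively struct { size_t length; double* ptr; }.
    // With `ref double[]` in the extern(C) declaration, the C function
    // gets the address of this two-word struct, not the address of the
    // doubles, and then indexes it as if it were the data: it reads the
    // length field as a double and runs off the end of the struct.
    writeln(y.length);   // number of elements
    writeln(y.ptr);      // address of the actual data
    assert(cast(void*)&y != cast(void*)y.ptr);
}
```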

and the C code it calls:

#include <gsl/gsl_multifit.h>

void regress(int nInd, int nCov, double *x, double *y, double *rOut){
   int i, j;
   gsl_matrix *xMat, *cov;
   gsl_vector *yVec, *c, *r;
   double chisq;

   xMat = gsl_matrix_alloc(nInd, nCov);
   yVec = gsl_vector_alloc(nInd);
   r = gsl_vector_alloc(nInd);
   c = gsl_vector_alloc(nCov);
   cov = gsl_matrix_alloc(nCov, nCov);
   for(i = 0; i < nInd; i++)
     {
       gsl_vector_set(yVec, i, *(y+i));
       for(j = 0; j < nCov; j++)
         gsl_matrix_set(xMat, i, j, *(x + i * nCov + j));
     }

   gsl_multifit_linear_workspace *work = gsl_multifit_linear_alloc(nInd, nCov);
   gsl_multifit_linear(xMat, yVec, c, cov, &chisq, work);
   gsl_multifit_linear_residuals(xMat, yVec, c, r);
   gsl_multifit_linear_free(work);

   for(i = 0; i < nInd; i++)
     rOut[i] = gsl_vector_get(r, i);

   /* free the GSL allocations so we don't leak on every call */
   gsl_matrix_free(xMat);
   gsl_matrix_free(cov);
   gsl_vector_free(yVec);
   gsl_vector_free(c);
   gsl_vector_free(r);
}
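
(And a sketch of the likely fix, going only by the signatures above: the extern(C) declaration should mirror the C prototype exactly, i.e. take plain double* parameters, and the call should pass each slice's .ptr, since the lengths already travel separately as nInd and nCov:)

```d
import std.stdio;

// Matches the C signature exactly: plain pointers, lengths passed
// separately. `ref double[]` would instead pass a pointer to the
// slice struct itself, which is what faults.
extern(C) void regress(int nInd, int nCov, double* x, double* y, double* rOut);

void main() {
    int nInd = 5;
    int nCov = 3;
    double[] x = [1, 4, 3, 1, 4, 3, 1, 4, 2, 1, 6, 7, 1, 3, 2];
    double[] y = [5, 3, 4, 1, 5];
    double[] residuals = new double[nInd];

    // .ptr hands C the address of the slice's data, not the slice.
    regress(nInd, nCov, x.ptr, y.ptr, residuals.ptr);

    writeln(residuals);
}
```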