enum conversions

bearophile bearophileHUGS at lycos.com
Fri May 13 13:48:56 PDT 2011


Some musings, feel free to ignore this post.

Sometimes I have to convert enums to integers or integers to enums. I'd like to do it efficiently (this means with minimal or no runtime overhead), and safely (this means I'd like the type system to prove I am not introducing bugs, like assigning enums that don't exist).
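(Phobos already offers one checked route: std.conv.to converts an integer to an enum, throwing a ConvException at run time for values that aren't members. A minimal sketch, using a throwaway Color enum of my own, and assuming the current std.conv behaviour:)

import std.conv;

enum Color { red = 1, green = 2, blue = 4 }

void main() {
    auto c = to!Color(2);   // fine: Color.green
    // to!Color(3);         // would throw ConvException: 3 isn't a member
}

But that check happens at run time, so it's not free, and it's not a compile-time proof.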


This function classifies every natural number into one of three classes (deficient, perfect, and abundant numbers, according to the sum of its proper divisors), so I use a three-member enum:

import std.algorithm, std.math, std.range;

enum NumberClass : int { deficient=-1, perfect=0, abundant=1 }

NumberClass classifyNumber(int n) {
    auto factors = filter!((i){ return n % i == 0; })(iota(1, n));
    int difference = reduce!q{a + b}(0, factors) - n;
    return cast(NumberClass)sgn(difference);
}
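
A quick sanity check (the proper divisors of 6 sum to 1+2+3 = 6, so 6 is perfect; 12 is abundant; 8 is deficient):

void main() {
    assert(classifyNumber(8) == NumberClass.deficient);
    assert(classifyNumber(6) == NumberClass.perfect);
    assert(classifyNumber(12) == NumberClass.abundant);
}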


std.math.sgn() returns a value in {-1, 0, 1}, so this first version of the function uses just a cast, after carefully giving the NumberClass members those same values. But a cast blinds the type system: it can't guarantee the code is correct or safe, so if I later change the values of the enum members the type system doesn't catch the bug.
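
A partial workaround is to keep the cast but guard its assumption with a static assert, so changing the enum values breaks the build instead of silently breaking the behaviour (a sketch):

NumberClass classifyNumber(int n) {
    // Compile-time check that the cast below is sound:
    static assert(NumberClass.deficient == -1 &&
                  NumberClass.perfect   ==  0 &&
                  NumberClass.abundant  ==  1);
    auto factors = filter!((i){ return n % i == 0; })(iota(1, n));
    int difference = reduce!q{a + b}(0, factors) - n;
    return cast(NumberClass)sgn(difference);
}

The cast still blinds the type system locally, but at least the invariant it relies on is verified at compile time.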

This second version is safer, and works with whatever values are associated with the enum members, but it performs up to two tests at run time:

NumberClass classifyNumber(int n) {
    auto factors = filter!((i){ return n % i == 0; })(iota(1, n));
    int diff = sgn(reduce!q{a + b}(0, factors) - n);
    if (diff == -1)
        return NumberClass.deficient;
    else if (diff == 0)
        return NumberClass.perfect;
    else
        return NumberClass.abundant;
}


This version is about as safe, and costs one access into an immutable array (I have not used an enum array, to avoid wasting even more run time):

NumberClass classifyNumber(int n) {
    static immutable res = [NumberClass.deficient, NumberClass.perfect, NumberClass.abundant];
    auto factors = filter!((i){ return n % i == 0; })(iota(1, n));
    int sign = sgn(reduce!q{a + b}(0, factors) - n);
    return res[sign + 1];
}


Using a switch is another safe option (I can't use a final switch here, because I am switching on an int, so a default case is required). This too has some run-time overhead:

NumberClass classifyNumber(int n) {
    auto factors = filter!((i){ return n % i == 0; })(iota(1, n));
    int sign = sgn(reduce!q{a + b}(0, factors) - n);
    switch (sign) {
        case -1: return NumberClass.deficient;
        case 0:  return NumberClass.perfect;
        default: return NumberClass.abundant;
    }
}
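
(final switch does work on the consuming side, once a value already has the enum type; there it gives the exhaustiveness guarantee I want. A sketch, with a hypothetical describe() helper:)

string describe(NumberClass nc) {
    final switch (nc) {  // compile error if a member is left uncovered
        case NumberClass.deficient: return "deficient";
        case NumberClass.perfect:   return "perfect";
        case NumberClass.abundant:  return "abundant";
    }
}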


In theory a slightly better type system (one with ranged integers as first-class types) would know that sgn() returns exactly the set of values of the NumberClass enum, allowing the first version with no cast and with a compile-time proof of correctness:


NumberClass classifyNumber(int n) {
    auto factors = filter!((i){ return n % i == 0; })(iota(1, n));
    int difference = reduce!q{a + b}(0, factors) - n;
    return sgn(difference);
}
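
Lacking ranged integers, the most generic thing I can write in today's D is a checked int-to-enum conversion built on std.traits.EnumMembers (a sketch with a hypothetical toNumberClass() helper; if I am not wrong the foreach below is unrolled at compile time, so for NumberClass it costs at most three comparisons, much like the switch version):

import std.traits;

NumberClass toNumberClass(int x) {
    foreach (m; EnumMembers!NumberClass)  // unrolled: each m is a constant
        if (x == m)
            return m;
    assert(0, "value is not a NumberClass member");
}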

I don't know what to think.

Bye,
bearophile

