
Wrong type inference during recursive generic lambda

Although the code below is quite complicated, it should be valid C++20. g++ 13.2.1 accepts it, but clang++ 18.0.0 rejects it, complaining that the return type of `self(n - 2)` is `int`.

```cpp
#include <functional>
#include <iostream>
#include <map>
#include <string>
#include <tuple>

template <typename...> struct LambdaTraits;

template <typename F>
struct LambdaTraits<F> : public LambdaTraits<decltype(&F::operator())> {};

template <typename F, typename... TArgs>
struct LambdaTraits<F, TArgs...>
    : LambdaTraits<decltype(&F::template operator()<TArgs...>)> {};

template <typename C, typename Ret, typename... Args>
struct LambdaTraits<Ret (C::*)(Args...) const> {
    using args_type = std::tuple<Args...>;
    using return_type = Ret;
};

struct Any {
    template <typename T> T operator()(T) const;
};

template <typename F>
struct MemoFix {
    F f;
    using Arg = std::tuple_element_t<1, typename LambdaTraits<F, Any>::args_type>;
    using Ret = LambdaTraits<F, Any>::return_type;
    std::map<Arg, Ret> cache{};

    Ret operator()(Arg x) {
        if (!cache.contains(x)) {
            cache[x] = f(std::ref(*this), x);
        }
        return cache[x];
    }
};

int main() {
    auto fibonacci = MemoFix{[&](auto self, int n) -> std::string {
        if (n <= 0) { return "0"; }
        if (n == 1) { return "1"; }

        std::string str = "(" + self(n - 2) + " + " + self(n - 1) + ")";
        return str;
    }};

    std::cout << fibonacci(5) << std::endl;
}
```

compile command (godbolt):

```
clang++ -std=c++20 -g -Wall -Wextra main.cpp -o main && ./main
```

outputs:

```
main.cpp:48:31: warning: adding 'int' to a string does not append to the string [-Wstring-plus-int]
   48 |         std::string str = "(" + self(n - 2) + " + " + self(n - 1) + ")";
      |                           ~~~~^~~~~~~~~~~~~
main.cpp:11:71: note: in instantiation of function template specialization 'main()::(anonymous class)::operator()<Any>' requested here
   11 | struct LambdaTraits<F, TArgs...> : LambdaTraits<decltype(&F::template operator()<TArgs...>)> {};
      |                                                                       ^
main.cpp:27:50: note: in instantiation of template class 'LambdaTraits<(lambda at main.cpp:40:30), Any>' requested here
   27 |     using Arg = std::tuple_element_t<1, typename LambdaTraits<F, Any>::args_type>;
      |                                                  ^
main.cpp:40:22: note: in instantiation of template class 'MemoFix<(lambda at main.cpp:40:30)>' requested here
   40 |     auto fibonacci = MemoFix{[&](auto self, int n) -> std::string {
      |                      ^
main.cpp:48:31: note: use array indexing to silence this warning
   48 |         std::string str = "(" + self(n - 2) + " + " + self(n - 1) + ")";
      |                           ^
      |                           &            [ ]
main.cpp:48:45: error: invalid operands to binary expression ('const char *' and 'const char[4]')
   48 |         std::string str = "(" + self(n - 2) + " + " + self(n - 1) + ")";
      |                           ~~~~~~~~~~~~~~~~~ ^ ~~~~~
1 warning and 1 error generated.
```

g++ compiles this code successfully. compile command (godbolt):

```
g++ -std=c++20 -g -Wall -Wextra main.cpp -o main && ./main
```

and outputs:

```
((1 + (0 + 1)) + ((0 + 1) + (1 + (0 + 1))))
```

Details

This program displays the calculation process of the Fibonacci sequence.

However, for some reason, clang++ seems to instantiate the lambda's body with `self` deduced as `Any` and then infer that `self(n - 2)` is `int`, even though the lambda has an explicit `-> std::string` return type, so (as far as I can tell) the body should not need to be instantiated inside the `decltype` to determine the signature.

Sorry for the complicated code; this bug only occurs when a lot of type deduction is involved, so I think this is close to a minimal case. I hope someone can narrow the problem down further.

Thanks!