Second standard conversion sequence of user-defined conversion

I have a misunderstanding about the term standard conversion sequence. I have come across the following quote in N3797 §8.5.3/5 [dcl.init.ref]:

— If the initializer expression

— is an xvalue (but not a bit-field), class prvalue, array prvalue or function lvalue and “cv1 T1” is reference-compatible with “cv2 T2”, or

— has a class type (i.e., T2 is a class type), where T1 is not reference-related to T2, and can be converted to an xvalue, class prvalue, or function lvalue of type “cv3 T3”, where “cv1 T1” is reference-compatible with “cv3 T3” (see 13.3.1.6),

[..] In the second case, if the reference is an rvalue reference and the second standard conversion sequence of the user-defined conversion sequence includes an lvalue-to-rvalue conversion, the program is ill-formed.

What is the second standard conversion sequence here? I thought there was a single standard conversion sequence that included all the necessary conversions (user-defined and implicit).

Answer by Columbo (best answer):

[over.ics.user]:

A user-defined conversion sequence consists of an initial standard conversion sequence followed by a user-defined conversion (12.3) followed by a second standard conversion sequence. […]
The second standard conversion sequence converts the result of the user-defined conversion to the target type for the sequence.

E.g. for

struct A
{
    operator int();
};

void foo(short);

foo(A());

The second standard conversion sequence converts the int prvalue to the type of foo's parameter, that is, short. The first standard conversion sequence converts the A object to the implicit object parameter of operator int, which constitutes an identity conversion.
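Spelled out by hand, the conversion steps are roughly equivalent to the following sketch (illustrative only; the variable names a, i and s are mine):

A a;                               // stands in for the temporary A() in foo(A())
int i = a.operator int();          // user-defined conversion: A -> int
short s = static_cast<short>(i);   // second standard conversion: int -> short
                                   // foo(s) would then receive the converted value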
For references, the rule you quoted is illustrated by an example from the standard itself:

struct B;   // B is a class type defined earlier in the standard's full example

struct X {
    operator B();
    operator int&();
} x;

int&& rri2 = X(); // error: lvalue-to-rvalue conversion applied to
                  // the result of operator int&

Here, operator int& is selected by overload resolution. Its return value is an lvalue reference to int. For that lvalue to initialize the rvalue reference, an lvalue-to-rvalue conversion would be necessary, and that is precisely what the quoted rule forbids: lvalues shall not be bound to rvalue references.
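To see the rule in action, here is a minimal sketch of my own (the names X2, x2, lr and rr are made up, not from the standard):

struct X2 {
    operator int&();   // conversion function returning an lvalue reference
};

X2 x2;
int&  lr = x2;     // OK: the resulting int lvalue binds to an lvalue reference
// int&& rr = x2;  // error: would require an lvalue-to-rvalue conversion
int&& rr = static_cast<int&&>(x2.operator int&());   // OK: the cast yields an xvalue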

Answer by Piotr Skotnicki:

What the term standard conversion means is covered in the following clause of the C++ Standard:

4 Standard conversions [conv]

  1. Standard conversions are implicit conversions with built-in meaning. Clause 4 enumerates the full set of such conversions. A standard conversion sequence is a sequence of standard conversions in the following order:

    — Zero or one conversion from the following set: lvalue-to-rvalue conversion, array-to-pointer conversion, and function-to-pointer conversion.

    — Zero or one conversion from the following set: integral promotions, floating point promotion, integral conversions, floating point conversions, floating-integral conversions, pointer conversions, pointer to member conversions, and boolean conversions.

    — Zero or one qualification conversion.

    [ Note: A standard conversion sequence can be empty, i.e., it can consist of no conversions. — end note ]

A standard conversion sequence will be applied to an expression if necessary to convert it to a required destination type.

In other words, a standard conversion is a built-in rule the compiler can apply when converting one type to another; a short sketch follows the list below. The built-in conversions include:

  • No conversions
  • Lvalue-to-rvalue conversion
  • Array-to-pointer conversion
  • Function-to-pointer conversion
  • Qualification conversions
  • Integral promotions
  • Floating point promotion
  • Integral conversions
  • Floating point conversions
  • Floating-integral conversions
  • Pointer conversions
  • Pointer to member conversions
  • Boolean conversions
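
As a quick illustration, a single standard conversion sequence may chain several of these conversions; the following sketch is mine, not from the original answer:

int arr[3] = {1, 2, 3};
const int* p = arr;   // array-to-pointer conversion (int[3] -> int*),
                      // then qualification conversion (int* -> const int*)
long l = arr[0];      // lvalue-to-rvalue conversion,
                      // then integral conversion (int -> long)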

A standard conversion sequence can appear twice within a user-defined conversion sequence: once before and once after the user-defined conversion:

§ 13.3.3.1.2 User-defined conversion sequences [over.ics.user]

  1. A user-defined conversion sequence consists of an initial standard conversion sequence followed by a user-defined conversion (12.3) followed by a second standard conversion sequence. If the user-defined conversion is specified by a constructor (12.3.1), the initial standard conversion sequence converts the source type to the type required by the argument of the constructor. If the user-defined conversion is specified by a conversion function (12.3.2), the initial standard conversion sequence converts the source type to the implicit object parameter of the conversion function.

  2. The second standard conversion sequence converts the result of the user-defined conversion to the target type for the sequence. Since an implicit conversion sequence is an initialization, the special rules for initialization by user-defined conversion apply when selecting the best user-defined conversion for a user-defined conversion sequence (see 13.3.3 and 13.3.3.1).

That said, for the following conversion:

A a;
B b = a;

  • the compiler will look for a converting constructor in B that can take an instance of A (the source type), possibly after some initial standard conversion sequence; it then performs the user-defined conversion through the selected constructor, and finally applies another standard conversion sequence, the second standard conversion sequence, to convert the result of the user-defined conversion to the target type (a sketch of this constructor path follows the list);

    or:

  • the compiler will look for a conversion function in A that is callable, possibly after some initial standard conversion sequence applied to the implicit object argument, and whose result can then be converted, through another standard conversion sequence, the second standard conversion sequence, to the target type B.
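
Here is a minimal sketch of the constructor path (the types and values are of my own choosing; the source is deliberately a short rather than a class, so the initial standard conversion sequence has some work to do):

struct B
{
    B(long);   // converting constructor
};

short s = 42;
B b = s;   // initial standard conversion: short -> long (integral conversion)
           // user-defined conversion: B(long)
           // second standard conversion: identity (B -> B)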

As a tangible example of the conversion-function path, consider the conversion below:

struct A
{
    operator int() const;
};

A a;
bool b = a;

The compiler considers the following user-defined conversion sequence:

  1. Initial standard conversion: Qualification conversion of A* to const A* to call const-qualified operator int() const.

  2. User-defined conversion: conversion of A to int, through user-defined conversion function.

  3. Second standard conversion: Boolean conversion of int to bool.
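
Written out by hand, and reusing a from the snippet above, those three steps correspond roughly to (illustrative only; ca, i and b2 are made-up names):

const A& ca = a;             // 1. qualification: reach the const-qualified operator
int i = ca.operator int();   // 2. user-defined conversion: A -> int
bool b2 = (i != 0);          // 3. boolean conversion: int -> bool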

The case you are asking about can be split as follows:

struct A
{
    operator int&();
};

int&& b = A();

  • The source type is A.
  • The target type is int&&.
  • The user-defined conversion sequence is the conversion of A to int&&.
  • The initial standard conversion sequence is No conversion at all.
  • The user-defined conversion is the conversion of A to int&.
  • The second standard conversion sequence (converting the result of the user-defined conversion to the target type) would have to turn the int lvalue returned by operator int& into a value to which the rvalue reference int&& can bind, i.e., it would include an lvalue-to-rvalue conversion. This sequence is considered at all because int& and int&& are reference-compatible. However, according to §8.5.3 [dcl.init.ref]/p5:

[...] if the reference is an rvalue reference and the second standard conversion sequence of the user-defined conversion sequence includes an lvalue-to-rvalue conversion, the program is ill-formed.

that second standard conversion sequence renders the program ill-formed, and the initialization is rejected.
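
For contrast, a conversion function that returns a prvalue avoids the problem entirely; the following sketch is mine (A2 and b2 are made-up names):

struct A2
{
    operator int();   // returns an int prvalue, not a reference
};

int&& b2 = A2();   // OK: the int prvalue binds directly to the rvalue reference;
                   // no lvalue-to-rvalue conversion is involved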