stride

Calculate the length of the UTF sequence starting at index in str.

  1. uint stride(S str, size_t index)
  2. uint stride(S str)
  3. uint stride(S str, size_t index)
  4. uint stride(S str)
  5. uint stride(S str)
  6. uint stride(S str, size_t index)
    uint stride(S)(auto ref S str, size_t index = 0)
    if (is(S : const dchar[]) ||
        (isInputRange!S && is(immutable ElementEncodingType!S == immutable dchar)))
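
The constraint above also admits non-array input ranges of dchar, for which the stride is always a single code unit. A minimal sketch (std.range.only is used here purely for illustration):

import std.range : only;
import std.utf : stride;

assert(only('𐐷').stride == 1);   // any dchar is exactly one UTF-32 code unit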

Parameters

str S         input range of UTF code units. Must be random access if index is passed
index size_t  starting index of UTF sequence (default: 0)

Return Value

Type: uint

The number of code units in the UTF sequence. For UTF-8, this is a value between 1 and 4 (as per RFC 3629, section 3). For UTF-16, it is either 1 or 2. For UTF-32, it is always 1.
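
To make those per-encoding ranges concrete, here is a small illustrative sketch (assuming import std.utf : stride;) using UTF-16 and UTF-32 string literals:

import std.utf : stride;

assert("a"w.stride == 1);   // BMP code point: one UTF-16 code unit
assert("𐐷"w.stride == 2);   // U+10437 needs a surrogate pair: two code units
assert("𐐷"d.stride == 1);   // UTF-32: always exactly one code unit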

Throws

May throw a UTFException if str[index] is not the start of a valid UTF sequence.
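
For instance, a byte that can never begin a UTF-8 sequence triggers the exception. A minimal sketch (the invalid array below is made up for illustration):

import std.exception : assertThrown;
import std.utf : UTFException, stride;

char[] invalid = [cast(char) 0x80];           // 0x80 is a continuation byte
assertThrown!UTFException(invalid.stride(0)); // not a valid sequence start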

Note: stride only analyzes the element at str[index]. It does not fully verify the validity of the UTF sequence, nor even that the complete sequence is present: it does not guarantee that index + stride(str, index) <= str.length.
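
As a sketch of that caveat (again assuming import std.utf : stride;), a slice that cuts a multi-byte sequence short still reports the full stride:

import std.utf : stride;

auto truncated = "λ"[0 .. 1];                 // only the lead byte of the two-byte 'λ'
assert(truncated.stride == 2);                // stride trusts the lead byte
assert(truncated.stride > truncated.length);  // the reported length exceeds the slice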

Examples

assert("a".stride == 1);
assert("λ".stride == 2);
assert("aλ".stride == 1);
assert("aλ".stride(1) == 2);
assert("𐐷".stride == 4);
