In JavaScript, the codePointAt() method is used to get a non-negative integer that represents the Unicode code point value of a character in a string.
The syntax of the codePointAt() method is as follows:
codePointAt(position)
The position parameter specifies the position of the element inside the provided string for which we want the code point value.
If it finds the element at the specified position, the function returns an integer that represents the code point value.
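As a quick illustration (the strings here are arbitrary examples), for characters in the Basic Multilingual Plane the returned value is simply the character's Unicode code point:

```javascript
// "A" is U+0041, so its code point is 65
console.log("A".codePointAt(0)); // 65

// "€" is U+20AC, so its code point is 8364
console.log("€".codePointAt(0)); // 8364
```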
Note: If the element at the specified position is the first (high) surrogate of a UTF-16 surrogate pair (that is, it encodes a character above U+FFFF), the method returns the code point of the whole pair. If it points to the second (low) surrogate, only that surrogate's code unit value is returned.
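A character outside the Basic Multilingual Plane makes this behavior concrete. The musical symbol "𝄞" (U+1D11E) occupies two UTF-16 code units, so the result depends on which half of the pair the position lands on:

```javascript
const clef = "𝄞"; // U+1D11E, stored as the surrogate pair 0xD834 0xDD1E

// Position 0 hits the high surrogate, so the full code point is returned
console.log(clef.codePointAt(0)); // 119070 (0x1D11E)

// Position 1 hits the low surrogate, so only its code unit value is returned
console.log(clef.codePointAt(1)); // 56606 (0xDD1E)
```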
If there is no element at the specified position, the method returns undefined.
The code below shows how the codePointAt() method works:
console.log("Hello, Educative".codePointAt(10));
// since we are working with UTF-16, multibyte characters are valid too
console.log("😀".codePointAt(0));
console.log("😀".codePointAt(1));
// this is out of range
console.log("Hello, Educative 😀".codePointAt(100));
In the first console.log call, the value of the position parameter is 10, which corresponds to the character c in the string "Hello, Educative". Consequently, the codePointAt() method returns the corresponding code point value. Similarly, the next two calls print the code point values at positions 0 and 1 of the multibyte character.
The codePointAt() method in the last call returns undefined, as the position parameter (100) is beyond the end of the string.
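One practical consequence of surrogate pairs is that indexing a string by position can land mid-character. A sketch of a common workaround: the for...of loop iterates a string by code point rather than by code unit, so each step yields a whole character that can be passed to codePointAt():

```javascript
const text = "Hi😀";

// text.length is 4 because the emoji occupies two UTF-16 code units,
// but for...of steps through 3 logical characters
for (const ch of text) {
  console.log(ch, ch.codePointAt(0));
}
// H 72
// i 105
// 😀 128512
```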