Converts a character code from the current Code Page value to its UNICODE representation.
Declaration:
AT_ERRCOUNT ACCUAPI IG_REC_util_codepage_to_unicode(
LPAT_BYTE pCodepageBuffer,
AT_INT codepageBufferLength,
AT_WCHAR * pUnicode,
LPAT_INT pActualLen
);
Arguments:
pCodepageBuffer
    Pointer to the character code to be converted.
codepageBufferLength
    The size, in bytes, of the buffer from which the character code is to be read.
pUnicode
    Pointer to a UNICODE buffer that receives the converted character code.
pActualLen
    Pointer to a variable that receives the actual number of bytes read for the character code.
Return Value:
Returns the number of ImageGear errors that occurred during this function call.
Supported Raster Image Formats:
This function does not process image pixels.
Remarks:
- This function is useful whenever a character or character string from the output must be passed to an API function that requires its UNICODE representation (e.g., IG_REC_output_rejection_symbol_set).
- The current Code Page may have been set or changed by a previous IG_REC_output_codepage_set function call.
Example:
AT_ERRCOUNT ErrCount = 0;
HIGEAR higImage = 0;
HIG_REC_IMAGE higRecImage = 0;
LPAT_BYTE lpBuffer = 0;
AT_INT iBufLen = 0;
AT_WCHAR wcUnicode = 0;
AT_INT iActualLen = 0;
ErrCount += IG_load_file("Image.tif", &higImage);
ErrCount += IG_REC_image_import(higImage, &higRecImage);
ErrCount += IG_REC_image_recognize(higRecImage);
ErrCount += IG_REC_output_codepage_set("Windows ANSI");
lpBuffer = (LPAT_BYTE) malloc(1 * sizeof(AT_BYTE));
iBufLen = 1 * sizeof(AT_BYTE);
lpBuffer[0] = 0xE4;  /* example byte: 'ä' in the Windows ANSI (CP-1252) code page */
ErrCount += IG_REC_util_codepage_to_unicode(lpBuffer, iBufLen, &wcUnicode, &iActualLen);
//...
free(lpBuffer);
ErrCount += IG_REC_image_delete(higRecImage);
ErrCount += IG_image_delete(higImage);