#include "unicode/platform.h"
#include <stddef.h>
#include "unicode/urename.h"
Defines

#define U_CFUNC   extern
    This is used in a declaration of a library private ICU C function.
#define U_CDECL_BEGIN
    This is used to begin a declaration of a library private ICU C API.
#define U_CDECL_END
    This is used to end a declaration of a library private ICU C API.
#define U_CAPI   U_CFUNC U_EXPORT
    This is used to declare a function as a public ICU C API.
#define U_STABLE   U_CAPI
#define U_DRAFT   U_CAPI
#define U_DEPRECATED   U_CAPI
#define U_OBSOLETE   U_CAPI
#define U_INTERNAL   U_CAPI
#define INT8_MIN   ((int8_t)(-128))
    The smallest value an 8-bit signed integer can hold.
#define INT16_MIN   ((int16_t)(-32767-1))
    The smallest value a 16-bit signed integer can hold.
#define INT32_MIN   ((int32_t)(-2147483647-1))
    The smallest value a 32-bit signed integer can hold.
#define INT8_MAX   ((int8_t)(127))
    The largest value an 8-bit signed integer can hold.
#define INT16_MAX   ((int16_t)(32767))
    The largest value a 16-bit signed integer can hold.
#define INT32_MAX   ((int32_t)(2147483647))
    The largest value a 32-bit signed integer can hold.
#define UINT8_MAX   ((uint8_t)(255U))
    The largest value an 8-bit unsigned integer can hold.
#define UINT16_MAX   ((uint16_t)(65535U))
    The largest value a 16-bit unsigned integer can hold.
#define UINT32_MAX   ((uint32_t)(4294967295U))
    The largest value a 32-bit unsigned integer can hold.
#define INT64_C(c)   c ## LL
    Provides a platform independent way to specify a signed 64-bit integer constant.
#define UINT64_C(c)   c ## ULL
    Provides a platform independent way to specify an unsigned 64-bit integer constant.
#define U_INT64_MIN   ((int64_t)(INT64_C(-9223372036854775807)-1))
    The smallest value a 64-bit signed integer can hold.
#define U_INT64_MAX   ((int64_t)(INT64_C(9223372036854775807)))
    The largest value a 64-bit signed integer can hold.
#define U_UINT64_MAX   ((uint64_t)(UINT64_C(18446744073709551615)))
    The largest value a 64-bit unsigned integer can hold.
#define TRUE   1
    The TRUE value of a UBool.
#define FALSE   0
    The FALSE value of a UBool.
#define U_HAVE_WCHAR_H   1
    Indicates whether <wchar.h> is available (1) or not (0).
#define U_SIZEOF_WCHAR_T   4
    U_SIZEOF_WCHAR_T==sizeof(wchar_t) (0 means it is not defined or autoconf could not set it).
#define U_SIZEOF_UCHAR   2
    Number of bytes in a UChar.
#define U_ALIGN_CODE(n)
    This is used to align code fragments to a specific byte boundary.
#define U_INLINE
Typedefs

typedef int8_t UBool
    The ICU boolean type.
typedef uint16_t UChar
    Define UChar to be wchar_t if that is 16 bits wide; always assumed to be unsigned.
typedef int32_t UChar32
    Define UChar32 as a type for single Unicode code points.
This file defines basic types and constants for utf.h to be platform-independent. umachine.h and utf.h are included into utypes.h to provide all the general definitions for ICU. All of these definitions used to be in utypes.h before the UTF-handling macros made this unmaintainable.
Definition in file umachine.h.
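Since utypes.h includes umachine.h (and utf.h), application code normally gets all of these definitions by including unicode/utypes.h rather than this header directly. A minimal sketch, assuming an ICU installation whose headers are on the include path:

    #include "unicode/utypes.h"   /* pulls in umachine.h and utf.h, as described above */

    static const UChar32 kMaxCodePoint = 0x10FFFF;   /* largest Unicode code point */
    static const UBool   kEnabled      = TRUE;       /* UBool with the TRUE macro */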
Define Documentation

#define FALSE   0
    The FALSE value of a UBool.
    Definition at line 202 of file umachine.h. Referenced by BreakIterator::BreakIterator().
#define INT16_MAX   ((int16_t)(32767))
    The largest value a 16-bit signed integer can hold.
    Definition at line 136 of file umachine.h.
#define INT16_MIN   ((int16_t)(-32767-1))
    The smallest value a 16-bit signed integer can hold.
    Definition at line 123 of file umachine.h.
#define INT32_MAX   ((int32_t)(2147483647))
    The largest value a 32-bit signed integer can hold.
    Definition at line 140 of file umachine.h.
#define INT32_MIN   ((int32_t)(-2147483647-1))
    The smallest value a 32-bit signed integer can hold.
    Definition at line 127 of file umachine.h.
#define INT64_C(c)   c ## LL
    Provides a platform independent way to specify a signed 64-bit integer constant. Note: this may be wrong on some 64-bit platforms; make sure your compiler provides INT64_C.
    Definition at line 165 of file umachine.h.
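For example, a 64-bit constant that would not fit in a plain int literal can be written portably with these macros. A minimal sketch (the constant names are illustrative, not part of ICU):

    #include "unicode/umachine.h"

    /* Illustrative only: portable 64-bit integer constants via INT64_C/UINT64_C. */
    static const int64_t  kNanosPerDay = INT64_C(86400000000000);
    static const uint64_t kAllBitsSet  = UINT64_C(18446744073709551615);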
#define INT8_MAX   ((int8_t)(127))
    The largest value an 8-bit signed integer can hold.
    Definition at line 132 of file umachine.h.
#define INT8_MIN   ((int8_t)(-128))
    The smallest value an 8-bit signed integer can hold.
    Definition at line 119 of file umachine.h.
#define TRUE   1
    The TRUE value of a UBool.
    Definition at line 198 of file umachine.h. Referenced by RunArray::RunArray().
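A small sketch of the UBool convention, returning the TRUE and FALSE macros (isAsciiDigit is a hypothetical helper, not an ICU API):

    #include "unicode/umachine.h"

    /* Sketch: a predicate using the UBool convention. */
    static UBool isAsciiDigit(UChar32 c) {
        return (c >= 0x30 && c <= 0x39) ? TRUE : FALSE;
    }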
#define U_ALIGN_CODE(n)
    This is used to align code fragments to a specific byte boundary. This is useful for getting consistent performance test results.
    Definition at line 323 of file umachine.h.
#define U_CAPI   U_CFUNC U_EXPORT
    This is used to declare a function as a public ICU C API.
    Definition at line 106 of file umachine.h.
#define U_CDECL_BEGIN
    This is used to begin a declaration of a library private ICU C API.
    Definition at line 101 of file umachine.h.
#define U_CDECL_END
    This is used to end a declaration of a library private ICU C API.
    Definition at line 102 of file umachine.h.
#define U_CFUNC   extern
    This is used in a declaration of a library private ICU C function.
    Definition at line 100 of file umachine.h.
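Taken together, the declaration macros are used around C declarations in a header. The following is a minimal sketch of such a header; the my_* names are hypothetical, not ICU functions:

    /* Sketch of a header using the declaration macros above. */
    #include "unicode/umachine.h"

    U_CDECL_BEGIN                                  /* open a block of C declarations */

    U_CAPI int32_t my_countDigits(UChar32 c);      /* public, exported C API style */

    U_CFUNC int32_t my_helperFunction(UChar32 c);  /* library-private C function style */

    U_CDECL_END                                    /* close the block of C declarations */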
#define U_HAVE_WCHAR_H   1
    Indicates whether <wchar.h> is available (1) or not (0). Set to 1 by default.
    Definition at line 219 of file umachine.h.
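A minimal sketch of guarding wide-character code on this flag:

    #include "unicode/umachine.h"

    /* Sketch: only pull in <wchar.h> when the platform provides it. */
    #if U_HAVE_WCHAR_H
    #   include <wchar.h>
    #endif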
#define U_INT64_MAX   ((int64_t)(INT64_C(9223372036854775807)))
    The largest value a 64-bit signed integer can hold.
    Definition at line 181 of file umachine.h.
#define U_INT64_MIN   ((int64_t)(INT64_C(-9223372036854775807)-1))
    The smallest value a 64-bit signed integer can hold.
    Definition at line 177 of file umachine.h.
#define U_SIZEOF_UCHAR   2
    Number of bytes in a UChar.
    Definition at line 268 of file umachine.h.
#define U_SIZEOF_WCHAR_T   4
    U_SIZEOF_WCHAR_T==sizeof(wchar_t) (0 means it is not defined or autoconf could not set it).
    Definition at line 230 of file umachine.h.
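The size macros allow compile-time branching, for example on whether wchar_t and UChar code units have the same width. A sketch (whether the two types are actually interchangeable also depends on the platform's wchar_t encoding):

    #include "unicode/umachine.h"

    /* Sketch: compile-time branching on the wchar_t / UChar unit sizes. */
    #if U_SIZEOF_WCHAR_T == U_SIZEOF_UCHAR
        /* wchar_t and UChar are the same width here, e.g. on 16-bit wchar_t platforms. */
    #else
        /* wchar_t has a different width (often 4 bytes); conversion between the types is needed. */
    #endif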
#define U_UINT64_MAX   ((uint64_t)(UINT64_C(18446744073709551615)))
    The largest value a 64-bit unsigned integer can hold.
    Definition at line 185 of file umachine.h.
#define UINT16_MAX   ((uint16_t)(65535U))
    The largest value a 16-bit unsigned integer can hold.
    Definition at line 149 of file umachine.h.
#define UINT32_MAX   ((uint32_t)(4294967295U))
    The largest value a 32-bit unsigned integer can hold.
    Definition at line 153 of file umachine.h.
#define UINT64_C(c)   c ## ULL
    Provides a platform independent way to specify an unsigned 64-bit integer constant. Note: this may be wrong on some 64-bit platforms; make sure your compiler provides UINT64_C.
    Definition at line 173 of file umachine.h.
#define UINT8_MAX   ((uint8_t)(255U))
    The largest value an 8-bit unsigned integer can hold.
    Definition at line 145 of file umachine.h.
Typedef Documentation

typedef uint16_t UChar
    Define UChar to be wchar_t if that is 16 bits wide; always assumed to be unsigned. If wchar_t is not 16 bits wide, then define UChar to be uint16_t. This makes the definition of UChar platform-dependent but allows direct string type compatibility with platforms that have 16-bit wchar_t types.
    Definition at line 285 of file umachine.h. Referenced by UnicodeString::append(), UnicodeString::char32At(), UnicodeString::getChar32Limit(), UnicodeString::getChar32Start(), CurrencyUnit::getISOCurrency(), CurrencyAmount::getISOCurrency(), UnicodeString::getTerminatedBuffer(), UnicodeString::replace(), and Transliterator::setID().
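A minimal sketch of a UChar string as an array of UTF-16 code units (the contents are illustrative):

    #include "unicode/umachine.h"

    /* Sketch: a UChar string is a NUL-terminated array of UTF-16 code units. */
    static const UChar kHi[3] = { 0x0048, 0x0069, 0 };   /* "Hi" */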
typedef int32_t UChar32
    Define UChar32 as a type for single Unicode code points. UChar32 is a signed 32-bit integer (same as int32_t). The Unicode code point range is 0..0x10ffff. All other values (negative or >=0x110000) are illegal as Unicode code points. They may be used as sentinel values to indicate "done", "error" or similar non-code-point conditions. Before ICU 2.4 (Jitterbug 2146), UChar32 was defined to be wchar_t if that is 32 bits wide (wchar_t may be signed or unsigned), or else to be uint32_t. That is, the definition of UChar32 was platform-dependent.
    Definition at line 305 of file umachine.h. Referenced by UnicodeString::char32At(), UnicodeSetIterator::getCodepoint(), and UnicodeSetIterator::getCodepointEnd().
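A short sketch of the range and sentinel conventions described above (isCodePoint is a hypothetical helper, not an ICU API):

    #include "unicode/umachine.h"

    /* Sketch: UChar32 holds any code point; out-of-range values can act as sentinels. */
    static UBool isCodePoint(UChar32 c) {
        return (c >= 0 && c <= 0x10FFFF) ? TRUE : FALSE;   /* e.g. -1 or 0x110000 are not code points */
    }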