Support 32-bit and 64-bit decoding with one binary
It is possible to configure the build process so that decoding of 32-bit and 64-bit instructions can be chosen at runtime via an additional parameter of the decode function. The header file is now entirely architecture-independent and no longer requires any of the previously needed defines. Decoding x86-64 still requires a 64-bit pointer size.
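For illustration, a caller-side sketch of what runtime mode selection could look like. All identifiers here (decode, Instr, the 32/64 mode values) are assumptions for the sketch, not names taken from this commit:

#include <stddef.h>
#include <stdint.h>

/* Hypothetical instruction type; the real decoder's type is not shown here. */
typedef struct Instr Instr;

/* The mode of the code being decoded is now a runtime argument
 * instead of a compile-time architecture define. */
extern int decode(const uint8_t *buf, size_t len, int mode, Instr *out);

int decode_for_target(const uint8_t *buf, size_t len, int is_64bit, Instr *out)
{
    /* 32 or 64, chosen from information available at runtime. */
    return decode(buf, len, is_64bit ? 64 : 32, out);
}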
@@ -261,14 +261,10 @@ def bytes_to_table(data):
     return "\n".join(hexdata[i:i+80] for i in range(0, len(hexdata), 80))
 
 template = """// Auto-generated file -- do not modify!
-#if defined(DECODE_TABLE_DATA)
-#if defined(ARCH_386)
+#if defined(DECODE_TABLE_DATA_32)
 {hex_table32}
-#elif defined(ARCH_X86_64)
+#elif defined(DECODE_TABLE_DATA_64)
 {hex_table64}
-#else
-#error "unknown architecture"
-#endif
 #elif defined(DECODE_TABLE_MNEMONICS)
 {mnemonic_list}
 #elif defined(DECODE_TABLE_STRTAB1)
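On the consumer side, the new section macros replace the old compile-time architecture switch, so one translation unit can pull in both tables and ship them in a single binary. A minimal sketch of that pattern, assuming the generated file is named decode-table.inc (the file name is a placeholder; how the emitted hex data is wrapped into an array depends on the surrounding code, which this commit does not show):

/* Sketch only: defining exactly one section macro before the #include
 * selects the corresponding section of the generated template. */
#define DECODE_TABLE_DATA_32
#include "decode-table.inc"     /* expands to the 32-bit table */
#undef DECODE_TABLE_DATA_32

#define DECODE_TABLE_DATA_64
#include "decode-table.inc"     /* expands to the 64-bit table */
#undef DECODE_TABLE_DATA_64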