The variables Key1 and Key2 returned by the Read_Key macro aren’t ASCII codes; they’re the Windows virtual-key code and the modifier flags, respectively. The former identifies which physical key was pressed, while the latter carries information that affects how the former is interpreted. These are documented in the online help under User Input and Keyboard Functions. You’ll find the virtual-key codes in the Win32.sh macro source file, which is the main home for items imported from Windows, as these are.
To convert Read_Key’s output into a displayable character, you proceed exactly as you would in any Windows program: you test the flags to see which modifiers are in effect, then, if the combination actually represents an ASCII character, you translate it. If you look at the key constants defined in the macro source file Keys.sh, you’ll see that 0x1BD is in fact a dash; the first hex digit is a flag indicating that the key pressed was an alphanumeric key, and the following two digits are the Windows virtual-key code for the top-row key that produces the dash and underscore characters.
The simplest code you could write would be:
if (Key2 == 1)
    AlphaNum_Character = Char(Key1);
which simply verifies that the virtual-key code does, in fact, represent either a letter or a number, then translates it. If you’re going to feed Key1 into Char(), you must first know that it’s valid input for that function; that’s why your code sometimes failed. And, as I noted, the return value is valid; it’s just that, in general, Windows doesn’t use the ASCII representation internally, so don’t expect the codes returned from its input functions (which Read_Key uses) to match anything in ASCII.