18th Jul 2020, 7:41 am   #19
Karen O
Default Re: Orton's rant on serial connection of NIBL computers to modern terminal emulations

I've discovered something which I think should go into this thread:

Quote:
Another 'feature' of NIBL is its internal trick of setting the top bit of a byte to denote the end of a string (it's more memory-efficient than using a terminating CR or NULL).

Only trouble is, these set top bits have a habit of getting out onto the serial line, and this can confuse your terminal emulator. So, if you're getting back a mixture of correct text and strange characters, you need to set your terminal emulator to ignore the top bit (i.e. select 7-bit ASCII).
It is true that NIBL can generate serial characters with bit 7 set, but the reason has nothing to do with how NIBL stores strings. All ECHOED characters have the top bit set! The relevant subroutine 'GECO' (get character and echo it) reads in a typed character and echoes it bit by bit. In other words, it samples each bit of the serial input and immediately echoes that bit without waiting for the whole character to arrive. This is probably to eliminate annoying delay on an already slow comms link (10 characters per second).

It seems that the GECO subroutine is so eager to get back to processing that it begins the stop bit just as soon as bit 7 has been sampled. The result is echoed characters with bit 7 set. This premature bit-7-cum-stop-bit is completed by a delay at the beginning of GECO - a funny way of doing it, but probably with reason behind it.
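To make the timing concrete, here's a minimal sketch (in Python, obviously not SC/MP assembly) of the behaviour described above. The frame layout is standard async serial (start bit, 8 data bits LSB first, stop bit); the function names are my own, not NIBL's:

```python
def make_frame(byte):
    """Async serial frame: start bit (0), 8 data bits LSB first, stop bit (1)."""
    return [0] + [(byte >> i) & 1 for i in range(8)] + [1]

def geco_echo(frame):
    """Echo each sampled bit straight back; after sampling bit 7, drive the
    line to mark (1) immediately. That mark level serves as both the echoed
    bit 7 and the stop bit, so bit 7 of the echo always comes back set."""
    echoed = [frame[0]]          # start bit, echoed as sampled
    echoed += frame[1:8]         # data bits 0-6, echoed as sampled
    echoed += [1, 1]             # bit 7 slot and stop bit: both mark
    return echoed

def decode(frame):
    """What the terminal's receiver reconstructs from the echoed frame."""
    return sum(bit << i for i, bit in enumerate(frame[1:9]))

typed = ord('A')                         # 0x41, bit 7 clear as typed
print(hex(decode(geco_echo(make_frame(typed)))))   # 0xc1: 'A' with bit 7 set
```

The key point is in `geco_echo`: bit 7 is never echoed with its own value - the line has already gone to the stop-bit (mark) level, which a receiver reads as a 1.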

ASR 33 teletypes were upper-case-only machines and doubtless ignored bit 7, but our modern terminal emulators often need to be told to ignore it ('7-bit ASCII').
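For anyone logging the raw bytes instead of using an emulator's 7-bit mode, this is all that setting effectively does - mask off the top bit before display (a sketch, not any particular emulator's code):

```python
def strip_top_bit(data: bytes) -> str:
    """Mimic a terminal emulator's '7-bit ASCII' mode: clear bit 7 of each
    received byte before displaying it."""
    return ''.join(chr(b & 0x7F) for b in data)

# 0xC8 and 0xC9 are 'H' and 'I' with bit 7 set, as GECO's echo would return them
print(strip_top_bit(b'\xc8\xc9'))   # prints "HI"
```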