I am working on serial communication on Windows. In this protocol, the program recognizes the start of a message by the parity bit: a received byte whose parity bit is set marks the start point of a message.
For example, suppose bytes arrive from the serial port like this ([byte] means a received byte, and [byte]p means a byte received with the parity bit set):

Serial port <- [byte] <- [byte] <- [byte] <- [byte] <- [byte]p <- [byte] : sequence of bytes received

I have to discard the 4 bytes received before the byte with the parity bit set and parse the message starting from that byte.
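To make the framing rule concrete, here is a minimal sketch (the byte values and flags are made up) of the logic I need: skip everything until a byte flagged as parity-marked arrives, then treat that byte as the first byte of the message.

```c
#include <stdbool.h>
#include <stddef.h>
#include <stdint.h>
#include <stdio.h>

/* Given received bytes and a parallel array of flags saying whether each
 * byte arrived with the parity bit set, return the index of the message
 * start (the first marked byte), or -1 if no marked byte is present. */
static int find_message_start(const uint8_t *bytes, const bool *parity_set, size_t n)
{
    for (size_t i = 0; i < n; i++) {
        if (parity_set[i])
            return (int)i;      /* everything before index i gets discarded */
    }
    return -1;
}

int main(void)
{
    /* Matches the sequence above: 4 ordinary bytes, the marked byte, then
     * one more data byte belonging to the message. */
    uint8_t bytes[]      = { 0x11, 0x22, 0x33, 0x44, 0xEE, 0x55 };
    bool    parity_set[] = { false, false, false, false, true, false };

    int start = find_message_start(bytes, parity_set, 6);
    printf("message starts at index %d (byte 0x%02X)\n", start, bytes[start]);
    return 0;
}
```

The whole question is how to obtain that per-byte "parity bit set" flag from the operating system.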
On Linux, a byte received with the parity bit set is marked in the input stream with the prefix 0xFF 0x00. So if I receive the single byte 0xEE with the parity bit set, it arrives as 0xFF 0x00 0xEE, and I can pick out the start point.
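For reference, this is the standard termios behavior when INPCK and PARMRK are enabled. A minimal sketch of that configuration (device path, baud rate, and parity mode are placeholders to adjust to the actual protocol):

```c
#include <fcntl.h>
#include <termios.h>
#include <unistd.h>

/* Open the port and ask the driver to mark parity errors in-band, so that
 * a byte received with a parity error is delivered as 0xFF 0x00 <byte>. */
int open_marked_port(const char *dev)      /* e.g. "/dev/ttyS0" */
{
    int fd = open(dev, O_RDWR | O_NOCTTY);
    if (fd < 0)
        return -1;

    struct termios tio;
    if (tcgetattr(fd, &tio) < 0) {
        close(fd);
        return -1;
    }

    cfmakeraw(&tio);                  /* raw mode, 8 data bits */
    tio.c_cflag |= PARENB;            /* enable parity checking */
    tio.c_cflag &= ~PARODD;           /* even parity (adjust to the protocol) */
    tio.c_iflag |= INPCK | PARMRK;    /* check parity, mark errors with 0xFF 0x00 */
    tio.c_iflag &= ~IGNPAR;           /* do not silently drop erroneous bytes */
    cfsetispeed(&tio, B9600);
    cfsetospeed(&tio, B9600);

    if (tcsetattr(fd, TCSANOW, &tio) < 0) {
        close(fd);
        return -1;
    }
    return fd;
}
```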
But on Windows, a parity error seems to be reported only as an EV_ERR event by WaitCommEvent(), while the data is read separately by ReadFile(). I think it is difficult to find out where the parity error happened, and therefore to distinguish the start point of a message.
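A minimal sketch of what I mean (synchronous I/O; the port name and parity settings are placeholders), which shows the problem: EV_ERR / CE_RXPARITY tells me a parity error occurred somewhere, but not at which byte in the buffer returned by ReadFile().

```c
#include <windows.h>
#include <stdio.h>

int main(void)
{
    HANDLE h = CreateFileA("\\\\.\\COM1", GENERIC_READ | GENERIC_WRITE,
                           0, NULL, OPEN_EXISTING, 0, NULL);
    if (h == INVALID_HANDLE_VALUE)
        return 1;

    DCB dcb = { 0 };
    dcb.DCBlength = sizeof(dcb);
    GetCommState(h, &dcb);
    dcb.BaudRate = CBR_9600;
    dcb.ByteSize = 8;
    dcb.Parity   = SPACEPARITY;   /* so a set parity bit is reported as an error */
    dcb.fParity  = TRUE;
    dcb.StopBits = ONESTOPBIT;
    SetCommState(h, &dcb);

    SetCommMask(h, EV_RXCHAR | EV_ERR);

    for (;;) {
        DWORD events = 0;
        if (!WaitCommEvent(h, &events, NULL))
            break;

        if (events & EV_ERR) {
            DWORD errors = 0;
            COMSTAT stat;
            ClearCommError(h, &errors, &stat);
            if (errors & CE_RXPARITY)
                printf("parity error somewhere in the pending data\n");
        }

        if (events & EV_RXCHAR) {
            BYTE buf[64];
            DWORD n = 0;
            if (ReadFile(h, buf, sizeof(buf), &n, NULL) && n > 0) {
                /* buf[0..n-1] holds the received data, but nothing here
                 * says which of these bytes caused the parity error */
            }
        }
    }

    CloseHandle(h);
    return 0;
}
```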
Is there any way to solve this problem? Since I am new to Windows programming, I suspect there must be another way, right?
The documentation seems clear on how parity errors are handled (or not): "Because the operating system determines whether to raise this event or not, not all parity errors may be reported"