I have an application on a Parallax BASIC Stamp board that reads text commands and executes test cases based on them. One test case, which sends data over the SPI bus and reads data back from it, passes or fails depending on the burst rate of the DEBUGIN text.
The Stamp Board is connected to a PC (quad-core, 2+ GHz) through a serial port at 19200 baud.
When I use the BASIC Stamp Terminal or HyperTerminal to send commands to the Stamp Board, the test passes. When I send the same commands from a C# application, the test fails. The primary difference is the burst rate at which the text reaches the Stamp Board.
A human types more slowly than the application sends. With HyperTerminal, characters go out one at a time, as they are typed, at 19200 baud. The C# application sends all 8 characters of a command back-to-back at 19200 baud, with no pause between characters.
I'm looking for an explanation of how the DEBUGIN statement (input through the serial port) affects the SHIFTIN and SHIFTOUT commands, or for any suggestions on how to resolve this issue.
Unfortunately, the baud rate of the DEBUGIN command cannot be changed. The alternative is to write a custom input routine (including the conversion of text to numbers) using the serial port command at a slower speed, which would consume extra program space, of which my project has very little.
If this is the wrong Stack Exchange site for this question, please migrate it and note the reason for the migration.
It sounds like the microcontroller end is not set up to service both the UART and the SPI peripheral well, so characters arriving too rapidly on the UART are either preventing the SPI from being serviced or causing some of the UART characters to be missed.
The robust solution is to understand the problem and fix it in the architecture of the microcontroller code. For example, you may need to use interrupts, with the interrupt service routines moving characters between longer software-managed FIFOs and the often 1- or 2-deep FIFO in the peripheral hardware.
A possibly workable but riskier solution is to have your C# application insert delays between the characters it sends, taking advantage of the fact that human-speed typing apparently works. A variation on this theme is to have the embedded device echo each character and have the C# program wait for that echo before sending the next character (you will also need an escape character that clears the embedded command buffer, so the C# program can start over if it decides the embedded device has timed out).
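As a minimal sketch of the pacing and echo-wait ideas on the C# side, using System.IO.Ports.SerialPort: the port name, the command text, the 5 ms inter-character delay, and the 500 ms echo timeout are all assumptions you would tune for your setup, and the echo variant additionally assumes the Stamp program echoes each character it receives.

```csharp
using System;
using System.IO.Ports;
using System.Threading;

class StampSender
{
    // Port name, delay, and timeout values are assumptions; tune them for your setup.
    const string PortName = "COM3";
    const int InterCharDelayMs = 5;   // pause between characters to mimic human typing speed

    static void Main()
    {
        using var port = new SerialPort(PortName, 19200, Parity.None, 8, StopBits.One);
        port.ReadTimeout = 500;       // per-character echo timeout (milliseconds)
        port.Open();

        // "SPITEST\r" is a placeholder; substitute the actual command your Stamp program expects.
        SendPaced(port, "SPITEST\r");
    }

    // Sends one character at a time with a short pause after each,
    // so the Stamp's DEBUGIN loop has time to consume it.
    static void SendPaced(SerialPort port, string command)
    {
        foreach (char c in command)
        {
            port.Write(c.ToString());
            Thread.Sleep(InterCharDelayMs);
        }
    }

    // Variation: wait for the Stamp to echo each character before sending the next.
    // Assumes the embedded side echoes received characters; throws if the echo is wrong,
    // and ReadChar throws TimeoutException if no echo arrives within ReadTimeout.
    static void SendWithEchoWait(SerialPort port, string command)
    {
        foreach (char c in command)
        {
            port.Write(c.ToString());
            int echoed = port.ReadChar();
            if (echoed != c)
                throw new InvalidOperationException(
                    $"Expected echo of '{c}' but got '{(char)echoed}'");
        }
    }
}
```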
Another idea is to shorten the data that has to be sent. Human-readable command languages for embedded systems are great because, as you've noticed, you can exercise them from a terminal application. However, if the embedded system is extremely constrained, a packed binary or hex format can be easier to parse. The extreme case is single-character commands with pauses in between for their execution (and if you stick mostly to alphanumerics, you keep the ability to drive the device from a terminal program).
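A rough sketch of the single-character variant on the host side: the 'S'/'R' command assignments, the 200 ms execution pause, the CR line terminator, and the port name are all hypothetical assumptions, not anything your Stamp program currently understands.

```csharp
using System;
using System.IO.Ports;
using System.Threading;

class SingleCharCommands
{
    static void Main()
    {
        // "COM3" and the command characters below are assumptions for illustration only.
        using var port = new SerialPort("COM3", 19200, Parity.None, 8, StopBits.One);
        port.NewLine = "\r";      // assuming the Stamp terminates its reply with a carriage return
        port.ReadTimeout = 2000;
        port.Open();

        port.Write("S");          // e.g. 'S' = run the SPI test case
        Thread.Sleep(200);        // give the Stamp time to finish its SHIFTOUT/SHIFTIN work (tune as needed)

        port.Write("R");          // e.g. 'R' = report the result
        Console.WriteLine(port.ReadLine());
    }
}
```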