Sorry about the title. I used quotes in the title and they were removed.
I am testing a class I wrote to send data packets to a device over a TCP connection. I have set up a small, isolated network in my office, and while testing on this network I encounter very few problems, but some remain: about 1 in 20,000 packets seems to be "distorted" when it arrives at the device (according to the designer of the device).
The problem gets worse when I communicate with the device over a larger LAN (several nodes, switches, routers, and greater distances): about 2% of the packets seem to be "distorted" when received by the device. Sending packets to the device over the internet (outside the LAN) produces similar results to the larger LAN.
The packets I send are about 35 bytes long.
I was under the impression that the TCP protocol includes error-checking (and error-correcting) functionality that would prevent a packet from arriving "distorted".
My class also includes methods for sending using the UDP protocol, and with those I see slightly more packets being "distorted" than with TCP.
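For what it's worth, one way I thought I could rule out corruption inside my own code would be to add an application-level CRC32 to each packet, so the device side (or a test harness) can verify the payload independently of TCP's 16-bit checksum. This is only a sketch, not my actual class; `frame_packet` and `verify_packet` are illustrative names:

```python
import struct
import zlib

def frame_packet(payload: bytes) -> bytes:
    """Prepend a 4-byte big-endian CRC32 of the payload so the receiver
    can check integrity independently of TCP's own checksum."""
    crc = zlib.crc32(payload) & 0xFFFFFFFF
    return struct.pack("!I", crc) + payload

def verify_packet(frame: bytes) -> bytes:
    """Return the payload if its CRC matches; raise ValueError otherwise."""
    if len(frame) < 4:
        raise ValueError("frame too short to contain a CRC")
    (crc,) = struct.unpack("!I", frame[:4])
    payload = frame[4:]
    if zlib.crc32(payload) & 0xFFFFFFFF != crc:
        raise ValueError("payload corrupted in transit")
    return payload
```

If packets that fail this check on arrival were framed correctly when sent, the corruption is happening somewhere between my send call and the device, rather than in my class.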
Any insight or thoughts appreciated.
Einar
Semper ubi sub ubi.