Hi,
I have written some server code with a method that handles receiving data from the client. It calls select() to check whether there is data to read on the socket, with a timeout value. The client sends a message to my server every second. If I set my code running and then stop the client, I would have expected select() to return -1, but this doesn't appear to happen. When the client closes, the return value appears to be > 0, but then when recv() is called, the number of bytes received is zero.
I have put the code below, with comments and questions. Could somebody please explain exactly what is going on? I can get my code to do what I want, but not in the way I expected, so I really don't understand why my code works! Many, many thanks.
FD_ZERO(&rxSet);
FD_SET(m_clientSocket, &rxSet);
maxFD = m_clientSocket;
/* Note: the first argument to select() must be the highest fd + 1, not the fd itself. */
sockVal = select(maxFD + 1, &rxSet, NULL, NULL, &timeOutVal);
if(sockVal > 0)
{
    int bytesReceived = recv(m_clientSocket, recvBuf, recvBufLen - 1, 0);
    if(bytesReceived == -1)
    {
        printf("server socket error\n");
        break;
    }
    /* Null-terminate only after the error check: if recv() had failed,
       recvBuf[bytesReceived] would write to recvBuf[-1]. */
    recvBuf[bytesReceived] = '\0';
    if(bytesReceived > 0)
    {
        RespondToAllMessagesFromController(recvBuf);
        noDataReceivedCount = 0;
    }
    if(bytesReceived == 0)
    {
        /* recv() returning 0 means the client closed the connection. */
        break;
    }
}
else if(sockVal < 0)
{
    /* select() itself failed (e.g. bad fd or interrupted call). */
    int test = 0;
}
else
{
    /* sockVal == 0: the timeout expired with no activity on the socket. */
}