This question is related to this other question I asked.
To summarise, I'm playing with websockets at the moment and I'm trying to understand how to authenticate a client connecting to the server using a websocket connection.
On a normal connection I use token-based authentication: after I log in, I get a token from the server. Every time I make a request, I put that token in a custom header called Authentication, and the server reads it from there.
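For context, the scheme described above might look something like this on a normal HTTP request (the token value, endpoint, and helper name are placeholders, not from the original question):

```javascript
// Hypothetical token obtained after login (placeholder value).
const token = "abc123";

// Build request options carrying the token in the custom
// "Authentication" header, as described above.
function buildAuthedRequest(token) {
  return {
    method: "GET",
    headers: { Authentication: token },
  };
}

const opts = buildAuthedRequest(token);
// fetch("https://example.com/api/data", opts) would then send
// the header with every request.
```

The server side simply reads the Authentication header from each incoming request and validates the token.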
With websockets this doesn't work, because the browser WebSocket API gives you no way to set custom headers on the connection. That leaves me with two options for passing the token:
1) putting the token in the query string - obviously not a great option, since the full URL (token included) can end up in server logs, proxy logs, etc.
2) putting the token in a cookie - this works, but only when the client and server sit on the same domain. There are also other restrictions, like the client having to be a browser that supports cookies, etc.
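A minimal sketch of option 1, just to make the trade-off concrete (the endpoint, parameter name, and helper are assumptions for illustration, not part of any standard):

```javascript
// Hypothetical token (placeholder value).
const token = "abc123";

// Append the token as a query parameter. This is exactly what
// makes option 1 risky: the token becomes part of the URL, which
// servers and proxies routinely write to their access logs.
function wsUrlWithToken(endpoint, token) {
  const url = new URL(endpoint);
  url.searchParams.set("access_token", token);
  return url.toString();
}

const url = wsUrlWithToken("wss://example.com/socket", token);
// new WebSocket(url) would then open the "authenticated" connection.
```

Option 2 needs no client code at all: if the cookie is set for the server's domain, the browser attaches it to the websocket handshake automatically, which is also why it only works same-domain.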
Anyway, the other question is about finding a solution; this question is about understanding why this is a problem in the first place. Why don't websockets support custom headers? It's unlikely to be an oversight - websockets and token-based authentication are both fairly mature technologies. Is there some sort of security issue with allowing custom headers when opening a websocket connection?