I have a Spark Streaming program that reads data from a socket I have created using:
nc -lk 9999
The program reads the data from the socket and extracts the "Error" messages. When I write into the socket manually, it works fine.
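The structure of the program is roughly the following (a simplified sketch, not the exact code; the app name, batch interval and filter condition are placeholders):

from pyspark import SparkContext
from pyspark.streaming import StreamingContext

# Simplified sketch of the streaming job; "SocketErrorFilter" and the
# 1-second batch interval are placeholders, not necessarily the real settings.
sc = SparkContext("local[2]", "SocketErrorFilter")
ssc = StreamingContext(sc, 1)

# Read lines from the nc socket and keep only the ones containing "Error".
lines = ssc.socketTextStream("localhost", 9999)
errors = lines.filter(lambda line: "Error" in line)
errors.pprint()

ssc.start()
ssc.awaitTermination()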
I have created a Python script that frequently prints "Error" messages (a simplified stand-in is sketched after these commands). I save its output to a file using:
stdbuf -oL python my_script.py &>> my_file.txt
and read the file from the socket:
nc -lk 9999 | tail -f my_file.txt
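For reference, a hypothetical stand-in for my_script.py (the real script differs; this only mimics its behaviour of printing "Error" lines frequently):

import random
import sys
import time

# Hypothetical stand-in for my_script.py: emit a mix of normal and "Error"
# lines, one per second.
while True:
    if random.random() < 0.3:
        print("Error: something went wrong")
    else:
        print("Info: everything is fine")
    sys.stdout.flush()  # stdbuf -oL already line-buffers stdout; flushing is just extra caution
    time.sleep(1)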
Everything seems fine: the socket reads data from the file while the file is being updated in the background. But the problem is that my Spark program doesn't capture the "Error" messages.
To summarize:
when I write "Error" messages into the socket manually, Spark captures them,
but it does not capture the "Error" messages generated by the Python script and read from the socket.
In fact, the program does not work when I read the file through the socket instead of typing into it.
What is the difference?
The command you typed,

nc -lk 9999 | tail -f my_file.txt

means: take the output of netcat and pipe that to tail -f my_file.txt. But tail doesn't accept any input on stdin; it merely watches the file my_file.txt. Try

tail -f my_file.txt | nc -lk 9999

instead, so that the output of tail is fed to nc.
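If you want to verify that data actually flows through the listener before pointing Spark at it, a small Python client along these lines (my suggestion, using the host and port from your question) will print whatever nc sends:

import socket

# Connect to the nc listener on localhost:9999 and print incoming lines.
with socket.create_connection(("localhost", 9999)) as conn:
    buf = b""
    while True:
        chunk = conn.recv(4096)
        if not chunk:          # listener closed the connection
            break
        buf += chunk
        while b"\n" in buf:
            line, buf = buf.split(b"\n", 1)
            print(line.decode("utf-8", errors="replace"))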