Table of Contents:
1. Introduction
2. Design of the System
3. Implementation of the Server-side
4. Implementation of the Client-side
5. Compiling
6. Experiments & Gallery
7. Summary
1. Introduction

One of the questions most frequently asked by OpenCV users is, "How can I stream video over the network so I can process it on a different computer?" OpenCV does not provide such a function out of the box (or so I thought), so we have to write custom code to accomplish this task. If you have experience with network programming, this should be quite easy. Just as when you send text files over the network, the same applies here, only this time we need to convert the received data into OpenCV's IplImage format. In this tutorial, I will explain the system I built to stream OpenCV video over a TCP/IP network. Keep in mind that there are many ways to achieve this, not to mention freely available video streaming tools such as ffmpeg and VLC. This is not about which one is better; it is just about sharing the knowledge.

2. Design of the System

The system follows the client-server model. The computer that has the video input acts as the server. It waits for a client to connect and streams the video once the connection has been established. The diagram is shown below.

Fig 1. Streaming OpenCV videos over the network.

The diagram above shows several clients connecting to the server and receiving the streamed video simultaneously. However, to keep things simple, I made the server accept only one client at a time. If we look deeper into the server side, it consists of two parts: one that reads the video input in a loop, and one that waits for the client and sends the video frames. It is impossible to have both parts as a single block of code, since they have to run simultaneously. To overcome this, we have to write a multi-threaded program. The same also applies to the client side. But another problem arises: Windows and Unix-like systems handle threads differently. While it is possible to write code that compiles and runs on both systems (using the C preprocessor), it doesn't necessarily have to.
Let's just use Unix and throw Windows away. In addition, I use Berkeley sockets, which are widely available on Unix-like systems, for the networking code. In summary, to keep this as simple as possible, we keep these things in mind:

- Unix-like systems only, with POSIX threads (pthreads) for the multi-threading.
- Berkeley sockets for the networking code.
- The server accepts only one client at a time.
- The server sends raw, uncompressed frame data.
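The Berkeley-sockets side of such a server boils down to the classic socket/bind/listen/accept sequence. Here is a minimal sketch of that sequence; the helper names (`make_listener`, `wait_for_client`) are mine, and the real stream_server.c may inline all of this:

```c
#include <sys/socket.h>
#include <netinet/in.h>
#include <string.h>
#include <unistd.h>

/* Create a TCP listening socket on the given port. Hypothetical helper;
 * stream_server.c may structure this differently. */
int make_listener(unsigned short port)
{
    struct sockaddr_in addr;
    int listener = socket(AF_INET, SOCK_STREAM, 0);
    if (listener < 0)
        return -1;

    memset(&addr, 0, sizeof(addr));
    addr.sin_family      = AF_INET;
    addr.sin_addr.s_addr = htonl(INADDR_ANY);   /* any local interface */
    addr.sin_port        = htons(port);

    if (bind(listener, (struct sockaddr *)&addr, sizeof(addr)) < 0 ||
        listen(listener, 1) < 0) {              /* one client at a time */
        close(listener);
        return -1;
    }
    return listener;
}

/* Block until a single client connects; return the connected socket. */
int wait_for_client(int listener)
{
    return accept(listener, NULL, NULL);
}
```

Once `wait_for_client()` returns, the server can start pushing frames through the returned socket.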
3. Implementation of the Server-side

The server side is the computer that has the video input to be streamed. As I mentioned before, it consists of two parts: one that reads the video input in a loop, and one that waits for the client to connect and sends the video frames.

Fig 2. Stream server diagram.

In the diagram above, we see two threads running on the server side: the frame grabber and the stream server.
The full listing of the server side is in stream_server.c. Next we'll look at the details of both threads.

3.a. Frame Grabber

This is the main thread of the server side. It's just like the usual code to display video from a webcam. Below is the code snippet from stream_server.c.

Listing 1: Frame Grabber ...
... The code above should look somewhat familiar to you. It grabs a frame in a loop and saves it to a global variable shared with the streaming thread. But there are also some additional lines you should notice. When the program starts, it creates a new thread for the streaming server with a call to pthread_create(). After that, the main thread enters the frame-grabbing loop. Also note that the global frame variable is shared between the two threads, so access to it must be synchronized.
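The structure described above can be sketched roughly as follows. The identifiers (`shared_frame`, `frame_mutex`, `publish_frame`, and so on) are illustrative, not necessarily the ones used in the real stream_server.c, and the frame size assumes a 640x480 BGR image:

```c
#include <pthread.h>
#include <string.h>

#define FRAME_W 640
#define FRAME_H 480
#define FRAME_SIZE (FRAME_W * FRAME_H * 3)   /* raw BGR data, no header */

/* Frame buffer shared between the grabber and the streaming thread. */
static char            shared_frame[FRAME_SIZE];
static pthread_mutex_t frame_mutex = PTHREAD_MUTEX_INITIALIZER;

/* Stub for the streaming thread: the real one would accept() a client
 * and send shared_frame in a loop. */
static void *stream_server_thread(void *arg)
{
    (void)arg;
    return NULL;
}

/* Called once at startup, before the grab loop begins. */
pthread_t start_stream_thread(void)
{
    pthread_t tid;
    pthread_create(&tid, NULL, stream_server_thread, NULL);
    return tid;
}

/* Called by the grabber for every captured frame: copy the pixel data
 * into the shared buffer while holding the mutex. */
void publish_frame(const char *pixels)
{
    pthread_mutex_lock(&frame_mutex);
    memcpy(shared_frame, pixels, FRAME_SIZE);
    pthread_mutex_unlock(&frame_mutex);
}
```

In the real server, the grabber would call something like `publish_frame(frame->imageData)` after each cvQueryFrame().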
Just that simple. Now we move on to the stream server.

3.b. Stream Server

This thread waits for a client to connect on a predefined port. Once the connection has been established, it sends the contents of the shared global frame variable to the client. The code snippet of the main loop is shown below.

Listing 2: Stream Server ...
... In the main loop above, the raw data of the current frame (the pixel data only, without any image header) is sent to the client over the socket.
Note that we make it thread-safe by enclosing the lines above in a mutex lock/unlock pair. The return value of the send call is checked so the server notices when the client has disconnected or an error has occurred.
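Two details of that loop are worth sketching in code: send() may write fewer bytes than asked, so the server must loop until the whole frame is out, and the shared frame must be read under the mutex. The sketch below is mine (names like `send_all` and `quit_signal` are assumptions, not the identifiers in stream_server.c):

```c
#include <pthread.h>
#include <sys/socket.h>
#include <sys/types.h>
#include <stddef.h>

/* Quit flag set by the main (grabber) code; illustrative name. */
static int             quit_signal  = 0;
static pthread_mutex_t stream_mutex = PTHREAD_MUTEX_INITIALIZER;

/* Send exactly len bytes, looping because send() may write fewer
 * bytes than requested. Returns 0 on success, -1 if the client
 * disconnected or an error occurred. */
int send_all(int sock, const void *buf, size_t len)
{
    const char *p = buf;
    while (len > 0) {
        ssize_t n = send(sock, p, len, 0);
        if (n <= 0)
            return -1;          /* checked so we notice a dead client */
        p   += n;
        len -= (size_t)n;
    }
    return 0;
}

/* One iteration of the stream server's main loop: send the current
 * frame while holding the mutex, then check whether the main code
 * has issued a quit signal. Returns 0 to keep looping, -1 to stop. */
int stream_one_frame(int client_sock, const char *frame, size_t frame_size)
{
    int rc, done;

    pthread_mutex_lock(&stream_mutex);   /* thread-safe frame access */
    rc = send_all(client_sock, frame, frame_size);
    done = quit_signal;
    pthread_mutex_unlock(&stream_mutex);

    return (rc < 0 || done) ? -1 : 0;
}
```

Holding the lock for the whole send is the simplest correct option; copying the frame out first and sending the copy would shorten the critical section.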
The stream server also checks, on each pass through the loop, whether the main code has issued a quit signal, so the thread can exit cleanly.

4. Implementation of the Client-side

With the server now set up and ready to accept a connection, we want to write a client to receive and display the streamed video. Just like the code on the server side, this one consists of two parts: one that connects to the server and receives the frames, and one that displays the received frame whenever a newer version exists.

Fig 3. Stream client diagram.

From the diagram above, we see that there are two threads running in the client code. Since the client receives frames over a TCP/IP network, it must specify the server's IP address and port number. They are passed as command-line arguments when we run the code:

$ ./stream_client <server_ip> <port> <width> <height>
The last two arguments are the width and the height of the expected frame. Keep in mind that the server sends only the pixel data, without the image header, so the client must know the width and the height of the expected frame in advance. Q: Why don't we make it so that the image header is also sent? The full listing of the client side is in stream_client.c. Now we move on to the details of the client side.

4.a. Stream Client

This thread connects to the server, given its IP address and port number. Once the connection has been established, it receives the frames sent by the server. Below is the code snippet from stream_client.c.

Listing 3: Stream Client ...
... In the loop above, the client receives the data sent by the server with recv().
If nothing went wrong, we now have the raw image data in a buffer, ready to be copied into an IplImage for processing and display.
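Because recv() may return fewer bytes than requested, the client has to loop until a full frame has arrived, and it knows how many bytes to expect from the width and height given on the command line. A sketch of both pieces (my own helper names, not necessarily those in stream_client.c):

```c
#include <sys/socket.h>
#include <sys/types.h>
#include <stddef.h>

/* Size of one raw frame in bytes: width * height * 3 channels (BGR,
 * 8 bits each). The server sends no header, so the client computes
 * this from its command-line width/height arguments. */
size_t expected_frame_bytes(int width, int height)
{
    return (size_t)width * (size_t)height * 3;
}

/* Receive exactly len bytes, looping because recv() may return less
 * than requested. Returns 0 once the whole frame is in, -1 on error
 * or if the server closed the connection mid-frame. */
int recv_all(int sock, void *buf, size_t len)
{
    char *p = buf;
    while (len > 0) {
        ssize_t n = recv(sock, p, len, 0);
        if (n <= 0)
            return -1;
        p   += n;
        len -= (size_t)n;
    }
    return 0;
}
```

Once `recv_all()` succeeds, the bytes can be copied into the `imageData` field of an IplImage allocated with `cvCreateImage(cvSize(width, height), IPL_DEPTH_8U, 3)`.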
The frame has been transmitted successfully! Now it's the video player's turn.

4.b. Video Player

This should be the easiest part of the whole system. Basically, it just displays the image in a loop.

Listing 4: Video Player ...
... First, the player waits for the stream client thread to signal that a new frame has arrived. After that, it displays the frame, just as in an ordinary OpenCV video loop. Don't forget to terminate the player loop and release the window when the program exits.
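A common way to structure this, sketched below with my own names (the real stream_client.c may differ): the receiving thread stores each frame under a mutex and bumps a version counter, and the player copies the newest frame out before displaying it, so drawing (e.g. cvShowImage) happens outside the lock:

```c
#include <pthread.h>
#include <string.h>

#define FRAME_SIZE (640 * 480 * 3)   /* assumed 640x480 BGR frame */

/* Shared between the stream client thread and the player. */
static char            latest_frame[FRAME_SIZE];
static int             frame_version = 0;   /* bumped on every new frame */
static pthread_mutex_t frame_mutex   = PTHREAD_MUTEX_INITIALIZER;

/* Called by the stream client thread whenever a full frame arrives. */
void store_frame(const char *pixels)
{
    pthread_mutex_lock(&frame_mutex);
    memcpy(latest_frame, pixels, FRAME_SIZE);
    frame_version++;
    pthread_mutex_unlock(&frame_mutex);
}

/* Called by the player: copy the newest frame out if it changed.
 * Returns 1 if a newer frame was copied, 0 otherwise, so the player
 * can skip redrawing when nothing is new. */
int snapshot_frame(char *dst, int *seen_version)
{
    int fresh = 0;
    pthread_mutex_lock(&frame_mutex);
    if (frame_version != *seen_version) {
        memcpy(dst, latest_frame, FRAME_SIZE);
        *seen_version = frame_version;
        fresh = 1;
    }
    pthread_mutex_unlock(&frame_mutex);
    return fresh;
}
```

The version counter is what lets the player display a frame "whenever a newer version exists", as described above.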
What else?

5. Compiling

With stream_server.c and stream_client.c now in hand, we want to compile them and try to stream our cool videos over the network. Here it is.

Compiling on *nix:

$ gcc stream_server.c -o stream_server \
    `pkg-config --cflags opencv` \
    `pkg-config --libs opencv` -lpthread

$ gcc stream_client.c -o stream_client \
    `pkg-config --cflags opencv` \
    `pkg-config --libs opencv` -lpthread

Compiling on Windows (under Cygwin):

$ gcc stream_server.c -o stream_server \
    -I"C:\OpenCV\cxcore\include" \
    -I"C:\OpenCV\cv\include" \
    -I"C:\OpenCV\otherlibs\highgui" \
    -L"C:\OpenCV\lib" -lcxcore -lcv -lhighgui -lpthread

$ gcc stream_client.c -o stream_client \
    -I"C:\OpenCV\cxcore\include" \
    -I"C:\OpenCV\cv\include" \
    -I"C:\OpenCV\otherlibs\highgui" \
    -L"C:\OpenCV\lib" -lcxcore -lcv -lhighgui -lpthread

Replace C:\OpenCV with the path of your own OpenCV installation.

6. Experiments & Gallery

I tested the system on both Windows and BSD operating systems. The thing is, I only have one laptop, so I used VMware to create virtual PCs and simulate a TCP/IP network. In my first test, I ran the stream server and stream client on Windows. The input was taken from an AVI file which I obtained from Learning OpenCV's code samples. The screenshot is shown below. Click it to view it at its original size. In my second test, I ran the stream server on Windows and the stream client on BSD. The input was taken from a webcam. Here it is. Note that I added face detection code on the client side. Obviously, you can apply any image/video processing to the received video; that's the point of this project. In my third test, I ran the stream server and stream client on BSD. The input was the movie 300, in MPEG format. Note that you should compile OpenCV with ffmpeg support for reading MPEG files. And last, BSD-to-Windows streaming. The input was the movie Defiance, in MPEG format. Again, I added face detection code on the client side. As you can see, the system runs smoothly. However, this streaming system uses a lot of bandwidth.
This is true since the server streams raw, uncompressed data: a 640x480 BGR stream at 25 fps amounts to 640 x 480 x 3 x 25, roughly 23 MB/s. You may have to add some video compression if you want to use the system over the Internet.

7. Summary

In this article, I have shown you how to stream your OpenCV videos over a TCP/IP network. With this system, you can have your webcam attached to one computer and your video processing program running on a different computer. While the project is far from perfect, it is mostly about the ideas. There are some other things that I'm planning to add to the code.
If you encounter any problems compiling and running the code above, feel free to mail me@nashruddin.com. Feedback is highly appreciated. (Nash)