With the protocol handler in place, let's revisit the RTPSourceStream and StreamingDataSource classes from earlier, which contained only placeholder methods. The StreamingDataSource is simple to code:
```java
import java.io.IOException;
import javax.microedition.media.Control;
import javax.microedition.media.protocol.DataSource;
import javax.microedition.media.protocol.SourceStream;

public class StreamingDataSource extends DataSource {

    // the RTSP locator for this source, e.g., rtsp://host:port/file
    private String locator;

    // this source provides exactly one stream
    private SourceStream[] streams;

    private boolean connected = false;

    public StreamingDataSource(String locator) {
        super(locator);
        setLocator(locator);
    }

    public void setLocator(String locator) { this.locator = locator; }

    public String getLocator() { return locator; }

    public void connect() throws IOException {

        // nothing to do if already connected
        if (connected) return;

        if (locator == null)
            throw new IOException("locator is null");

        streams = new RTPSourceStream[1];

        // creating the stream sends the RTSP DESCRIBE and SETUP commands
        streams[0] = new RTPSourceStream(locator);

        connected = true;
    }

    public void disconnect() {

        if (streams != null) {

            try {
                ((RTPSourceStream)streams[0]).close();
            } catch (IOException ioex) {}
        }

        connected = false;
    }

    public void start() throws IOException {

        if (!connected) return;

        // opens the local datagram port and sends the RTSP PLAY command
        ((RTPSourceStream)streams[0]).start();
    }

    public void stop() throws IOException {

        if (!connected) return;

        // sends the RTSP TEARDOWN command and closes the streams
        ((RTPSourceStream)streams[0]).close();
    }

    public String getContentType() {

        // hard-coded here; should reflect the actual media type
        return "video/mpeg";
    }

    public Control[] getControls() { return new Control[0]; }

    public Control getControl(String controlType) { return null; }

    public SourceStream[] getStreams() { return streams; }
}
```
The main work takes place in the connect() method, which creates a new RTPSourceStream for the requested address. Notice that the getContentType() method returns video/mpeg as the default content type; change it to a content type supported by your system. Of course, this should not be hard-coded; it should be derived from the actual media being streamed.
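As a sketch of how that lookup might work, the content type could be derived from the locator's file extension. The class name and the extension-to-MIME mapping below are assumptions for illustration, not part of the original code; extend the table to match the formats your server and device actually support:

```java
import java.util.Hashtable;

public class ContentTypes {

    // hypothetical extension-to-MIME-type table (adjust as needed)
    private static final Hashtable TYPES = new Hashtable();
    static {
        TYPES.put("mp4", "video/mpeg");
        TYPES.put("mpg", "video/mpeg");
        TYPES.put("3gp", "video/3gpp");
        TYPES.put("amr", "audio/amr");
    }

    // derive a MIME type from a locator such as
    // rtsp://localhost:554/sample_100kbit.mp4
    public static String fromLocator(String locator) {
        int dot = locator.lastIndexOf('.');
        if (dot < 0 || dot == locator.length() - 1) return null;
        String ext = locator.substring(dot + 1).toLowerCase();
        return (String) TYPES.get(ext);
    }
}
```

StreamingDataSource.getContentType() could then return ContentTypes.fromLocator(getLocator()) instead of the hard-coded string. This sketch uses only CLDC-level APIs (Hashtable, String), so it runs unchanged on both J2ME and desktop Java.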
The next listing shows the complete RTPSourceStream class, which, along with RTSPProtocolHandler, does the bulk of the work of connecting to the server and getting the RTP packets from it:
```java
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import javax.microedition.io.Datagram;
import javax.microedition.io.Connector;
import javax.microedition.media.Control;
import javax.microedition.io.SocketConnection;
import javax.microedition.io.DatagramConnection;
import javax.microedition.media.protocol.SourceStream;
import javax.microedition.media.protocol.ContentDescriptor;

public class RTPSourceStream implements SourceStream {

    private RTSPProtocolHandler handler;

    private InputStream is;
    private OutputStream os;

    // the datagram connection on which the RTP packets arrive
    private DatagramConnection socket;

    public RTPSourceStream(String address) throws IOException {

        // open the RTSP control connection
        // (hard-coded to the local server on the default RTSP port)
        SocketConnection sc =
            (SocketConnection)Connector.open("socket://localhost:554");

        is = sc.openInputStream();
        os = sc.openOutputStream();

        // the handler talks RTSP over these streams
        handler = new RTSPProtocolHandler(address, is, os);

        // get the server ready to send packets
        handler.doDescribe();
        handler.doSetup();
    }

    public void start() throws IOException {

        // open the local port on which the RTP packets will arrive
        socket = (DatagramConnection)Connector.open("datagram://:8080");

        // ask the server to start streaming
        handler.doPlay();
    }

    public void close() throws IOException {

        if (handler != null) handler.doTeardown();

        is.close();
        os.close();
    }

    public int read(byte[] buffer, int offset, int length)
        throws IOException {

        byte[] fullPkt = new byte[length];

        // create a datagram to receive the next packet into
        Datagram packet = socket.newDatagram(fullPkt, length);

        // block until a packet arrives
        socket.receive(packet);

        // strip the RTP header to get at the media data
        RTPPacket rtpPacket = getRTPPacket(packet, packet.getData());
        byte[] data = rtpPacket.getData();

        // copy the media data into the caller's buffer; simply
        // reassigning the buffer parameter would not reach the caller
        int len = Math.min(data.length, length);
        System.arraycopy(data, 0, buffer, offset, len);

        System.err.println(rtpPacket + " with media length: " + data.length);

        return len;
    }

    // parses the 12-byte fixed RTP header out of a received datagram
    private RTPPacket getRTPPacket(Datagram packet, byte[] buf) {

        // the synchronization source identifier
        long SSRC = 0;

        // the payload type
        byte PT = 0;

        // the timestamp
        int timeStamp = 0;

        // the sequence number
        short seqNo = 0;

        // payload type: the low 7 bits of the second byte
        PT =
            (byte)((buf[1] & 0xff) & 0x7f);

        seqNo =
            (short)((buf[2] << 8) | (buf[3] & 0xff));

        timeStamp =
            (((buf[4] & 0xff) << 24) | ((buf[5] & 0xff) << 16) |
             ((buf[6] & 0xff) << 8) | (buf[7] & 0xff));

        SSRC =
            (((buf[8] & 0xff) << 24) | ((buf[9] & 0xff) << 16) |
             ((buf[10] & 0xff) << 8) | (buf[11] & 0xff));

        // assemble the RTPPacket from the parsed fields
        RTPPacket rtpPkt = new RTPPacket();

        rtpPkt.setSequenceNumber(seqNo);

        rtpPkt.setTimeStamp(timeStamp);

        rtpPkt.setSSRC(SSRC);

        rtpPkt.setPayloadType(PT);

        // everything after the 12-byte header is media data
        byte payload[] = new byte[packet.getLength() - 12];

        for (int i = 0; i < payload.length; i++) payload[i] = buf[i + 12];

        rtpPkt.setData(payload);

        return rtpPkt;
    }

    public long seek(long where) throws IOException {
        throw new IOException("cannot seek");
    }

    public long tell() { return -1; }

    public int getSeekType() { return NOT_SEEKABLE; }

    public Control[] getControls() { return null; }

    public Control getControl(String controlType) { return null; }

    public long getContentLength() { return -1; }

    public int getTransferSize() { return -1; }

    public ContentDescriptor getContentDescriptor() {
        return new ContentDescriptor("audio/rtp");
    }
}
```
The constructor for the RTPSourceStream creates a SocketConnection to the remote server (hard-coded to the local server and port here, but you can change this to accept any server or port). It then opens the input and output streams, which it uses to create the RTSPProtocolHandler. Finally, using this handler, it sends the DESCRIBE and SETUP commands to the remote server to get the server ready to send the packets. The actual delivery doesn't start until the start() method is called by the StreamingDataSource, which opens up a local port (hard-coded to 8080 in this case) for receiving the packets and sends the PLAY command to start receiving them. The actual reading of the packets is done in the read() method, which receives the individual packets, strips them to create the RTPPacket instances (with the getRTPPacket() method), and returns the media data in the buffer supplied while calling the read() method.
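To make the header parsing concrete, here is a small, self-contained sketch that applies the same bit manipulation as getRTPPacket() to a hand-built 12-byte header. The class name, helper methods, and sample field values are invented for illustration:

```java
public class RTPHeaderDemo {

    // the same extraction logic as getRTPPacket(), factored into
    // helpers so each header field can be checked in isolation
    static byte payloadType(byte[] buf) {
        return (byte)(buf[1] & 0x7f);
    }

    static short sequenceNumber(byte[] buf) {
        return (short)((buf[2] << 8) | (buf[3] & 0xff));
    }

    static int timeStamp(byte[] buf) {
        return ((buf[4] & 0xff) << 24) | ((buf[5] & 0xff) << 16) |
               ((buf[6] & 0xff) << 8) | (buf[7] & 0xff);
    }

    static long ssrc(byte[] buf) {
        return (((long)(buf[8] & 0xff)) << 24) | ((buf[9] & 0xff) << 16) |
               ((buf[10] & 0xff) << 8) | (buf[11] & 0xff);
    }

    // a hand-built fixed header: V=2, PT=96, seq=1000, ts=160,
    // SSRC=0x11223344 (values chosen arbitrarily for the demo)
    static byte[] sampleHeader() {
        return new byte[] {
            (byte)0x80, (byte)96,                     // V/P/X/CC, M/PT
            (byte)(1000 >> 8), (byte)(1000 & 0xff),   // sequence number
            0, 0, 0, (byte)160,                       // timestamp
            (byte)0x11, (byte)0x22, (byte)0x33, (byte)0x44  // SSRC
        };
    }

    public static void main(String[] args) {
        byte[] h = sampleHeader();
        System.out.println("PT=" + payloadType(h) +
            " seq=" + sequenceNumber(h) +
            " ts=" + timeStamp(h) +
            " ssrc=0x" + Long.toHexString(ssrc(h)));
    }
}
```

Note the `& 0xff` masks: Java bytes are signed, so without the mask a byte like 0xE0 would sign-extend and corrupt the shifted value.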
A MIDlet to see if it works
With all the classes in place, let's write a simple MIDlet to first create a Player instance that will use the StreamingDataSource to connect to the server and then get media packets from it. The Player interface is defined by the MMAPI and allows you to control the playback (or recording) of media. Instances of this interface are created by using the Manager class from the MMAPI javax.microedition.media package (see the MMAPI tutorial). The following shows this rudimentary MIDlet:
```java
import javax.microedition.media.Player;
import javax.microedition.midlet.MIDlet;
import javax.microedition.media.Manager;

public class StreamingMIDlet extends MIDlet {

    public void startApp() {

        try {

            // create a Player that uses the custom DataSource
            Player player =
                Manager.createPlayer(
                    new StreamingDataSource(
                        "rtsp://localhost:554/sample_100kbit.mp4"));

            player.realize();

            player.start();

        } catch (Exception e) {
            e.printStackTrace();
        }
    }

    public void pauseApp() {}

    public void destroyApp(boolean unconditional) {}
}
```
So what should happen when you run this MIDlet in the Wireless toolkit? I have on purpose left out any code to display the resulting video on screen. When I run it in the toolkit, I know that I am receiving the packets because I see the debug statements as shown in Figure 2.
Figure 2. Running StreamingMIDlet output
The RTP packets as sent by the server are being received. The StreamingDataSource along with the RTSPProtocolHandler and RTPSourceStream are doing their job of making the streaming server send these packets. This is confirmed by looking at the streaming server's admin console as shown in Figure 3.
Figure 3. Darwin's admin console shows that the file is being streamed (click for full-size image).
Unfortunately, the player constructed by the Wireless Toolkit tries to read the entire content in one go. Even if I were to make a StreamingVideoControl, it would not display the video until it had read the whole file, defeating the purpose of the streaming aspect of this whole experiment. So what needs to be done to achieve the full streaming experience?
Ideally, MMAPI should provide a means for developers to register their choice of Player for the playback of certain media. This could be achieved by providing a new method in the Manager class for registering (or overriding) MIME types or protocols with developer-made Player classes. For example, say I create a Player implementation that reads streaming data, called StreamingMPEGPlayer. With the Manager class, I should be able to say Manager.registerPlayer("video/mpeg", StreamingMPEGPlayer.class) or Manager.registerPlayer("rtsp", StreamingMPEGPlayer.class). MMAPI should then simply load this developer-made Player and use it to read data from the developer-made data source.
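No such registry exists in MMAPI today; purely as a hypothetical sketch of the proposal above, the bookkeeping behind a registerPlayer() method could be as simple as a lookup table. Everything here (the class name, the method names) is invented for illustration:

```java
import java.util.Hashtable;

// hypothetical sketch: the kind of lookup table a proposed
// Manager.registerPlayer() could maintain internally
public class PlayerRegistry {

    // maps a MIME type or protocol (e.g. "video/mpeg", "rtsp")
    // to a developer-supplied Player class
    private static final Hashtable REGISTRY = new Hashtable();

    public static void registerPlayer(String key, Class playerClass) {
        REGISTRY.put(key, playerClass);
    }

    // returns the registered class, or null if there is no override
    public static Class lookup(String key) {
        return (Class) REGISTRY.get(key);
    }
}
```

Manager.createPlayer() could then consult such a table before falling back to its built-in players, instantiating any registered class via Class.newInstance() and handing it the developer-made DataSource.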
In a nutshell, you need to be able to create an independent media player and register it as the choice of instance for playing the desired content. Unfortunately, this is not possible with the current MMAPI implementation, and this is the data consumption conundrum that I had talked about earlier.
Of course, if you can test this code in a toolkit that does not need to read the complete data before displaying it (or for audio files, playing them), then you have achieved the aim of streaming data using the existing MMAPI implementation.
This experiment should prove that you can stream data with the current MMAPI implementation, but you may not be able to manipulate it in a useful manner until you have better control over the Manager and Player instances. I look forward to your comments and experiments using this code.
Resources
Vikram Goyal is the author of Pro Java ME MMAPI.