This is a real long shot, but I figure it can't hurt to throw this out here:
I have an IP camera that produces an RTSP stream with H.264 video and G.711 audio. I set up a Wowza application (myapp) to use the rtp-live StreamType and got Flowplayer with the RTMP plugin correctly playing the video. However, I believe the audio isn't being transcoded (it sounds horrible, and I don't have transcoding enabled).
My Streams/StreamType in conf/myapp/Application.xml is rtp-live. I have the camera's RTSP URL specified in content/camera.stream. That's really it, and it works, putting out a standard RTMP stream at rtmp://myserver/myapp with stream name camera.stream.
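For reference, here's roughly what the relevant bits look like (server names and the camera address are placeholders):

    <!-- conf/myapp/Application.xml (excerpt) -->
    <Streams>
        <StreamType>rtp-live</StreamType>
        <StorageDir>${com.wowza.wms.context.VHostConfigHome}/content</StorageDir>
        <LiveStreamPacketizers></LiveStreamPacketizers>
    </Streams>

and content/camera.stream is just a one-line file with the camera's RTSP URL:

    rtsp://192.168.1.100:554/h264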
So I've been trying to enable transcoding of the audio while passing the video through, but everything I find (the Transcoding Guide and others) just confuses me more, and none of my attempts work; they often fail with errors about missing SMIL files (I tried to follow a guide, but I might have missed something). I found one guide that specifically outlines audio-only transcoding, but of course I had no luck with that either.
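As best I can tell from the guide, enabling the transcoder means adding something like this to Application.xml (this is my attempt; the paths are the defaults I found):

    <!-- conf/myapp/Application.xml (excerpt) -->
    <Transcoder>
        <LiveStreamTranscoder>transcoder</LiveStreamTranscoder>
        <Templates>transrate.xml</Templates>
        <ProfileDir>${com.wowza.wms.context.VHostConfigHome}/transcoder/profiles</ProfileDir>
        <TemplateDir>${com.wowza.wms.context.VHostConfigHome}/transcoder/templates</TemplateDir>
        <Properties></Properties>
    </Transcoder>

and then a template along these lines in transcoder/templates/transrate.xml, passing the H.264 video through untouched and re-encoding only the audio to AAC (the element names are copied from the sample templates, so take them with a grain of salt):

    <Root version="1">
        <Transcode>
            <Encodes>
                <Encode>
                    <Enable>true</Enable>
                    <Name>aac-only</Name>
                    <StreamName>mp4:${SourceStreamName}_aac</StreamName>
                    <Video>
                        <!-- leave the H.264 video as-is -->
                        <Codec>PassThru</Codec>
                    </Video>
                    <Audio>
                        <!-- re-encode the G.711 audio to AAC -->
                        <Codec>AAC</Codec>
                        <Bitrate>96000</Bitrate>
                    </Audio>
                </Encode>
            </Encodes>
        </Transcode>
    </Root>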
Is this even the correct way to approach things? I thought it was, and then tried to follow the Flowplayer example for dynamic streaming with Wowza using the F4M and HTTP Streaming plugins, but no joy.
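One thing I wasn't sure about: from what I've read, F4M/San Jose HTTP streaming needs the sanjosestreamingpacketizer enabled on the application, so I assume the Streams section also needs something like

    <LiveStreamPacketizers>sanjosestreamingpacketizer</LiveStreamPacketizers>

but I don't know whether that applies to the source stream, the transcoded renditions, or both.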
I think a lot of my problem is that it's unclear to me what URL format I should use to reference my streams. Is it

    http://myserver:1935/myapp/smil:camera.stream.smil/manifest.f4m

or maybe

    http://myserver:1935/myapp/_definst_/ngrp:camera.stream_all/manifest.f4m

?
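My understanding is that the first form needs a SMIL file at content/camera.stream.smil listing the renditions, something like this (the src value is a guess; I don't know whether it should point at the source stream or at the transcoder's output name such as camera.stream_aac):

    <smil>
        <head>
        </head>
        <body>
            <switch>
                <video src="mp4:camera.stream_aac" system-bitrate="800000"/>
            </switch>
        </body>
    </smil>

whereas the ngrp: form apparently refers to a stream name group defined inside the transcoder template, in which case no SMIL file would be needed at all. Is that right?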
Any suggestions on where to start? What should my Application.xml and my transcoder template look like? Thanks.