


Android - Decoding h264 raw stream manually


So I am trying to decode a stream of raw h264 data and render it to a surface on Android. Here are the steps:



  1. Receive a packet of the h264 stream.

  2. Accumulate it and extract NAL units: byte sequences that begin with the 00 00 00 01 start code (the NAL header) and run up to the next start code (a minimal splitting sketch follows this list).

  3. For every extracted NAL unit, call feedFrame(data), where data is a byte[] that starts with the NAL header and contains the extracted unit.

  4. See the video rendered on the surface I provided.
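For reference, the splitting described in step 2 can be done with a plain start-code scan. The sketch below is only a minimal illustration of that step, not the code actually used; the accumulator argument and the extractNalUnits/indexOfStartCode helpers are assumed names, and a start code split across two packets is not handled.

import java.io.ByteArrayOutputStream;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

// Minimal sketch of step 2: split the accumulated bytes into NAL units.
// Each returned unit keeps its leading 00 00 00 01 start code, matching what feedFrame() expects.
static List<byte[]> extractNalUnits(ByteArrayOutputStream accumulator) {
    byte[] buf = accumulator.toByteArray();
    List<byte[]> units = new ArrayList<>();
    int start = indexOfStartCode(buf, 0);           // anything before the first start code is dropped
    while (start >= 0) {
        int next = indexOfStartCode(buf, start + 4);
        if (next < 0)
            break;                                  // last unit is incomplete; wait for more data
        units.add(Arrays.copyOfRange(buf, start, next));
        start = next;
    }
    accumulator.reset();
    if (start >= 0)
        accumulator.write(buf, start, buf.length - start);  // keep the unfinished tail
    return units;
}

static int indexOfStartCode(byte[] buf, int from) {
    for (int i = from; i + 3 < buf.length; i++) {
        if (buf[i] == 0 && buf[i + 1] == 0 && buf[i + 2] == 0 && buf[i + 3] == 1)
            return i;
    }
    return -1;
}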

The following code utilizes the AVC decoder:



import java.io.IOException;
import java.nio.ByteBuffer;

import android.media.MediaCodec;
import android.media.MediaCodec.BufferInfo;
import android.media.MediaFormat;
import android.view.Surface;

public class StreamReceiver {

    private final DashCamActivity activity;
    private final MediaCodec decoder;

    public StreamReceiver(DashCamActivity activity, Surface surface, int width, int height,
                          byte[] sps, byte[] pps) throws IOException {
        this.activity = activity;
        decoder = MediaCodec.createDecoderByType("video/avc");
        // Format describing the stream, with the SPS/PPS passed as codec-specific data.
        MediaFormat format = MediaFormat.createVideoFormat("video/avc", width, height);
        format.setByteBuffer("csd-0", ByteBuffer.wrap(sps));
        format.setByteBuffer("csd-1", ByteBuffer.wrap(pps));
        decoder.configure(format, surface, null, 0);
        decoder.start();
    }

    public void shutdown() {
        decoder.stop();
        decoder.release();
    }

    public void feedFrame(byte[] data) {
        BufferInfo info = new BufferInfo();
        // Wait up to 1000 us for a free input buffer; the unit is dropped if none is available.
        int inputIndex = decoder.dequeueInputBuffer(1000);
        if (inputIndex < 0)
            return;
        ByteBuffer inputBuffer = decoder.getInputBuffers()[inputIndex];
        inputBuffer.clear();
        inputBuffer.put(data, 0, data.length);
        decoder.queueInputBuffer(inputIndex, 0, data.length, 0, 0);

        // Wait up to 1000 us for decoded output and render it directly to the surface.
        int outIndex = decoder.dequeueOutputBuffer(info, 1000);
        switch (outIndex) {
            case MediaCodec.INFO_OUTPUT_FORMAT_CHANGED:
                break;
            case MediaCodec.INFO_TRY_AGAIN_LATER:
                break;
            case MediaCodec.INFO_OUTPUT_BUFFERS_CHANGED:
                break;
            default:
                decoder.releaseOutputBuffer(outIndex, true);
                break;
        }
    }
}




For smaller resolutions (1024x768, 1280x800) everything works perfectly. However, at larger resolutions (1920x1080, 1900x600), where the byte array I provide is above 65535 bytes (64 KB), the video starts stuttering and showing artifacts, and Logcat reports strange decoder errors (e.g. IOCTL_MFC_DEC_EXE failed(ret : -2001) on a Galaxy S3).
This also happens on a relatively new device that can play 4K at twice the framerate I provide, so I must be doing something wrong. I don't know whether my 64 KB theory has any truth to it; it's merely an observation.
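One way to probe the 64 KB observation (a diagnostic sketch, not a confirmed fix) is to compare each unit against the capacity of the input buffer the codec actually returns, and to request larger input buffers with MediaFormat.KEY_MAX_INPUT_SIZE at configure time; whether a given decoder honors that hint is vendor-dependent. The lines below would slot into the posted constructor and feedFrame(); the 512 KB maxUnitSize bound is an assumption, and android.util.Log is assumed imported.

// In the constructor: hint at a larger per-buffer input size before configure().
int maxUnitSize = 512 * 1024;                        // assumed upper bound per NAL unit
format.setInteger(MediaFormat.KEY_MAX_INPUT_SIZE, maxUnitSize);
decoder.configure(format, surface, null, 0);

// In feedFrame(), right after the existing getInputBuffers() lookup: verify the unit fits.
if (data.length > inputBuffer.capacity()) {
    Log.w("StreamReceiver", "NAL unit of " + data.length
            + " bytes exceeds input buffer capacity of " + inputBuffer.capacity());
    // put() would overflow this buffer; the dequeued buffer is deliberately not queued,
    // since this is only a diagnostic.
    return;
}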



So to recap:



  • I am providing individual NAL units to the decoder, each starting with the 00 00 00 01 header.

  • The h264 stream is baseline profile, level 4.0.

  • Writing the contents of the NAL units to a file, in the order they arrive, produces a video file that is fully playable in basic media players.

How do I get it to play at high resolutions?







android video h.264 decoder
















asked Mar 23 at 19:31









Vladimir Gazbarov













  • Well, you're only giving it 1000 µs to decode a frame. If the frame isn't ready by then, do you have sufficient logic in place to execute a retry, or do frames just start queuing up in the decoder?

    – greeble31
    Mar 25 at 21:11











  • There is no retry logic, since I don't know whether a frame is ready to be decoded at any given moment. A frame consists of an unknown number of NALs, and I don't know where a new frame starts, so I have no choice but to simply feed them and hope video comes out.

    – Vladimir Gazbarov
    Apr 9 at 22:01












  • Seems to me your choice of architecture has left you with a number of unhandled edge cases (that are exacerbated by higher resolutions). Why don't you switch the codec to asynchronous operation? Saves you from having to block for N microseconds when trying to dequeue an output buffer. You'll have to rethink how you queue input buffers, though.

    – greeble31
    Apr 10 at 18:35
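To illustrate the asynchronous operation suggested in the last comment: since API 21 a MediaCodec can be driven by callbacks instead of timed dequeueInputBuffer/dequeueOutputBuffer calls, so feedFrame() only has to enqueue data. The sketch below is a rough outline rather than a drop-in replacement; the pendingUnits/freeInputBuffers queues and the startAsync method are assumed, setCallback() must run before configure(), and the zero presentation timestamps mirror the original code.

// Rough sketch of the same StreamReceiver fields driven in asynchronous mode.
// (Requires java.util.ArrayDeque and android.util.Log in addition to the imports above.)
private final ArrayDeque<byte[]> pendingUnits = new ArrayDeque<>();
private final ArrayDeque<Integer> freeInputBuffers = new ArrayDeque<>();
private final Object lock = new Object();

void startAsync(MediaFormat format, Surface surface) {
    decoder.setCallback(new MediaCodec.Callback() {
        @Override
        public void onInputBufferAvailable(MediaCodec codec, int index) {
            synchronized (lock) {
                freeInputBuffers.add(index);
                drainInput(codec);
            }
        }

        @Override
        public void onOutputBufferAvailable(MediaCodec codec, int index, MediaCodec.BufferInfo info) {
            codec.releaseOutputBuffer(index, true);      // render straight to the surface
        }

        @Override
        public void onError(MediaCodec codec, MediaCodec.CodecException e) {
            Log.e("StreamReceiver", "decoder error", e);
        }

        @Override
        public void onOutputFormatChanged(MediaCodec codec, MediaFormat newFormat) { }
    });
    decoder.configure(format, surface, null, 0);         // callback is set before configure
    decoder.start();
}

public void feedFrame(byte[] data) {
    synchronized (lock) {
        pendingUnits.add(data);
        drainInput(decoder);
    }
}

// Pair queued NAL units with whatever input buffers the codec has handed back.
private void drainInput(MediaCodec codec) {
    while (!pendingUnits.isEmpty() && !freeInputBuffers.isEmpty()) {
        int index = freeInputBuffers.poll();
        byte[] unit = pendingUnits.poll();
        ByteBuffer buffer = codec.getInputBuffer(index);
        buffer.clear();
        buffer.put(unit);
        codec.queueInputBuffer(index, 0, unit.length, 0, 0);
    }
}

In this mode the codec paces itself, so a slow frame no longer forces the feeding thread to block or silently drop input.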
















