Hi Pavel,
So perhaps there are no changes required to the M3 firmware files themselves, and this issue may be unrelated to the memory map after all. I was able to turn on some more debug output. Below is the output of running "gst-inspect omx_scaler"; gst-inspect hangs on the last line shown. Does anyone have any ideas?
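In case anyone wants to reproduce this, the GStreamer side of the plugin can presumably also be made more verbose with the usual GST_DEBUG variable, e.g. something along the lines of

:~# GST_DEBUG=omx*:5 gst-inspect omx_scaler

(the omx* category pattern is a guess on my part). As far as I can tell, the Module<ti.omx> trace interleaved in the output below comes from the OMX framework itself rather than from GST_DEBUG.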
:~# gst-inspect omx_scaler
Factory Details:
  Long name: OpenMAX IL for OMX.TI.VPSSM3.VFPC.INDTXSCWB component
  Class: Filter
  Description: Scale video using VPSS Scaler module
  Author(s): Brijesh Singh
  Rank: primary (256)

Plugin Details:
  Name: omx
  Description: OpenMAX IL
  Filename: /usr/lib/gstreamer-0.10/libgstomx.so
  Version: 0.3
  License: LGPL
  Source module: gst-openmax
  Binary package: gst-openmax source release
  Origin URL: Unknown package origin

GObject
 +----GstObject
       +----GstElement
             +----GstOmxBaseFilter
                   +----GstOmxBaseVfpc
                         +----GstOmxScaler

Implemented Interfaces:
  GstImplementsInterface
  GstOmx

Pad Templates:
  SINK template: 'sink'
    Availability: Always
    Capabilities:
      video/x-raw-yuv
        format: { NV12 }
        width: [ 1, 2147483647 ]
        height: [ 1, 2147483647 ]
        framerate: [ 0/1, 2147483647/1 ]
      video/x-raw-yuv-strided
        format: { NV12 }
        rowstride: [ 0, 2147483647 ]
        width: [ 1, 2147483647 ]
        height: [ 1, 2147483647 ]
        framerate: [ 0/1, 2147483647/1 ]

  SRC template: 'src'
    Availability: Always
    Capabilities:
      video/x-raw-yuv
        format: { YUY2 }
        width: [ 1, 2147483647 ]
        height: [ 1, 2147483647 ]
        framerate: [ 0/1, 2147483647/1 ]

Element Flags:
  no flags set

Element Implementation:
  Has change_state() function: 0x4063a0c0
  Has custom save_thyself() function: gst_element_save_thyself
  Has custom restore_thyself() function: gst_element_restore_thyself

Element has no clocking capabilities.
Element has no indexing capabilities.
Element has no URI handling capabilities.

Pads:
  SRC: 'src'
    Implementation:
      Has custom eventfunc(): gst_pad_event_default
      Has custom queryfunc(): gst_pad_query_default
      Has custom iterintlinkfunc(): gst_pad_iterate_internal_links_default
      Has getcapsfunc(): gst_pad_get_fixed_caps_func
      Has setcapsfunc(): src_setcaps
      Has acceptcapsfunc(): gst_pad_acceptcaps_default
    Pad Template: 'src'
  SINK: 'sink'
    Implementation:
      Has chainfunc(): 0x40638f4c
      Has custom eventfunc(): 0x4065cdac
      Has custom queryfunc(): gst_pad_query_default
      Has custom iterintlinkfunc(): gst_pad_iterate_internal_links_default
      Has setcapsfunc(): sink_setcaps
      Has acceptcapsfunc(): gst_pad_acceptcaps_default
    Pad Template: 'sink'

Element Properties:
  name : The name of the object
    flags: readable, writable
    String. Default: null Current: "omxscaler0"
  component-role : Role of the OpenMAX IL component
    flags: readable, writable
    String. Default: null Current: null
  component-name : Name of the OpenMAX IL component to use
    flags: readable, writable
    String. Default: null Current: "OMX.TI.VPSSM3.VFPC.INDTXSCWB"
  library-name : Name of the OpenMAX IL implementation library to use
    flags: readable, writable
    String. Default: null Current: "libOMX_Core.so"
  use-timestamps : Whether or not to use timestamps
    flags: readable, writable
    Boolean. Default: true Current: true
  input-buffers : The number of OMX input buffers
flags: Entering OMX_Init: (void)
Module<ti.omx> Entering<DomxInit> @line<142>
Entered function:DomxCore_procInit
SysLink_setup() complete
Module<ti.omx> Entering<DomxCore_mapDomxCore2MultiProcId> @line<269>
Module<ti.omx> Leaving<DomxCore_mapDomxCore2MultiProcId> @line<275> with error<0:ErrorNone>
Module<ti.omx> Entering<OmxRpc_moduleRegisterMsgqHeap> @line<892>
Module<ti.omx> Entering<DomxCore_mapDomxCore2MultiProcId> @line<269>
Module<ti.omx> Leaving<DomxCore_mapDomxCore2MultiProcId> @line<275> with error<0:ErrorNone>
Module<ti.omx> Entering<DmmDelegate_createIpcHeap> @line<181>
Module<ti.omx> Leaving<DmmDelegate_createIpcHeap> @line<195> with error<0:ErrorNone>
Module<ti.omx> @<OmxRpc_moduleRegisterMsgqHeap> @line<927> msg<Before MessageQ_registerHeap>
Module<ti.omx> @<OmxRpc_moduleRegisterMsgqHeap> @line<932> msg<After MessageQ_registerHeap>
Module<ti.omx> Leaving<OmxRpc_moduleRegisterMsgqHeap> @line<935> with error<0:ErrorNone>
Module<ti.omx> Entering<DomxCore_mapDomxCore2MultiProcId> @line<269>
Module<ti.omx> Leaving<DomxCore_mapDomxCore2MultiProcId> @line<275> with error<0:ErrorNone>
Module<ti.omx> Entering<OmxRpc_moduleRegisterMsgqHeap> @line<892>
Module<ti.omx> Entering<DomxCore_mapDomxCore2MultiProcId> @line<269>
Module<ti.omx> Leaving<DomxCore_mapDomxCore2MultiProcId> @line<275> with error<0:ErrorNone>
Module<ti.omx> Entering<DmmDelegate_createIpcHeap> @line<181>
Module<ti.omx> Leaving<DmmDelegate_createIpcHeap> @line<195> with error<0:ErrorNone>
Module<ti.omx> @<OmxRpc_moduleRegisterMsgqHeap> @line<927> msg<Before MessageQ_registerHeap>
Module<ti.omx> @<OmxRpc_moduleRegisterMsgqHeap> @line<932> msg<After MessageQ_registerHeap>
Module<ti.omx> Leaving<OmxRpc_moduleRegisterMsgqHeap> @line<935> with error<0:ErrorNone>
Module<ti.omx> Entering<DomxCore_mapDomxCore2MultiProcId> @line<269>
Module<ti.omx> Leaving<DomxCore_mapDomxCore2MultiProcId> @line<275> with error<0:ErrorNone>
Module<ti.omx> Entering<OmxRpc_moduleRegisterMsgqHeap> @line<892>
Module<ti.omx> Entering<DomxCore_mapDomxCore2MultiProcId> @line<269>
Module<ti.omx> Leaving<DomxCore_mapDomxCore2MultiProcId> @line<275> with error<0:ErrorNone>
Module<ti.omx> Entering<DmmDelegate_createIpcHeap> @line<181>
Module<ti.omx> Leaving<DmmDelegate_createIpcHeap> @line<195> with error<0:ErrorNone>
Module<ti.omx> @<OmxRpc_moduleRegisterMsgqHeap> @line<927> msg<Before MessageQ_registerHeap>
Module<ti.omx> @<OmxRpc_moduleRegisterMsgqHeap> @line<932> msg<After MessageQ_registerHeap>
Module<ti.omx> Leaving<OmxRpc_moduleRegisterMsgqHeap> @line<935> with error<0:ErrorNone>
Module<ti.omx> @<DomxInit> @line<183> msg<Waiting for Ipc_attach to happen b/w slave cores>
Module<ti.omx> @<DomxInit> @line<186> msg<Wait completed for Ipc_attach to happen b/w slave cores>
Module<ti.omx> Entering<OmxRpc_moduleInitServer> @line<793>
Module<ti.omx> Entering<omxrpc_rcm_server_create> @line<220>
@ omxrpc_rcm_server_create: rcmServerName OmxRpcRcmServer_3, priority 14
Module<ti.omx> @<omxrpc_rcm_server_create> @line<225> msg<Before RcmServer_Params_init>
Module<ti.omx> @<omxrpc_rcm_server_create> @line<229> msg<After RcmServer_Params_init>
Module<ti.omx> @<omxrpc_rcm_server_create> @line<232> msg<Before RcmServer_create>
Module<ti.omx> @<omxrpc_rcm_server_create> @line<240> msg<After RcmServer_create>
Module<ti.omx> Leaving<omxrpc_rcm_server_create> @line<241> with error<0:ErrorNone>
Module<ti.omx> Entering<omxrpc_rcm_server_remote_fxn_register> @line<279>
@ omxrpc_rcm_server_remote_fxn_register regFxnCategory 0
Calling RcmServer_addSymbol(OmxRpcGetHandle)
Calling RcmServer_addSymbol(OmxRpcFreeHandle)
Calling RcmServer_addSymbol(OmxRpcCreateProxyLite)
Calling RcmServer_addSymbol(OmxRpcGetHeapMemStats)
Calling RcmServer_addSymbol(OmxRpcDeleteProxyLite)
Module<ti.omx> Leaving<omxrpc_rcm_server_remote_fxn_register> @line<306> with error<0:ErrorNone>
Module<ti.omx> Entering<omxrpc_rcm_server_start> @line<255>
Module<ti.omx> @<omxrpc_rcm_server_start> @line<256> msg<Before RcmServer_start>
Module<ti.omx> @<omxrpc_rcm_server_start> @line<258> msg<After RcmServer_start>
Module<ti.omx> Leaving<omxrpc_rcm_server_start> @line<259> with error<0:ErrorNone>
Module<ti.omx> Leaving<OmxRpc_moduleInitServer> @line<854> with error<0:ErrorNone>
Module<ti.omx> Leaving<DomxInit> @line<208> with error<0:ErrorNone>
Entered function:DomxCore_mapPhyAddr2UsrVirtual
Entered function:DomxCore_mapPhyAddr2UsrVirtual
Entered function:DomxCore_mapPhyAddr2UsrVirtual
Leaving OMX_Init: retVal OMX_ERRORTYPE: 0
Entered: OMX_GetHandle (0xc970c, OMX.TI.VPSSM3.VFPC.INDTXSCWB, 0xc9708, 0x4067bca8)
Component OMX.TI.VPSSM3.VFPC.INDTXSCWB In table OMX.TI.VPSSM3.VFCC idx 0
Component OMX.TI.VPSSM3.VFPC.INDTXSCWB In table OMX.TI.VPSSM3.VFDC idx 1
Component OMX.TI.VPSSM3.VFPC.INDTXSCWB In table OMX.TI.VPSSM3.VFPC.DEIHDUALOUT idx 2
Component OMX.TI.VPSSM3.VFPC.INDTXSCWB In table OMX.TI.VPSSM3.VFPC.DEIMDUALOUT idx 3
Component OMX.TI.VPSSM3.VFPC.INDTXSCWB In table OMX.TI.VPSSM3.VFPC.NF idx 4
Component OMX.TI.VPSSM3.VFPC.INDTXSCWB In table OMX.TI.VPSSM3.VFPC.INDTXSCWB idx 5
Component OMX.TI.VPSSM3.VFPC.INDTXSCWB found idx: 5
In OMX_GetHandle, component OMX.TI.VPSSM3.VFPC.INDTXSCWB, omxhandle 0xcdc30
Module<ti.omx> Entering<OmxProxy_commonInit> @line<2491>
Module<ti.omx> @<OmxProxy_commonInit> @line<2492> msg<OMX.TI.VPSSM3.VFPC.INDTXSCWB>
Module<ti.omx> Entering<omxproxy_map_component_name2info> @line<747>
Module<ti.omx> Leaving<omxproxy_map_component_name2info> @line<764> with error<0:ErrorNone>
Module<ti.omx> Entering<omxproxy_get_component_custom_config_info> @line<784>
Module<ti.omx> Leaving<omxproxy_get_component_custom_config_info> @line<801> with error<0:ErrorNone>
Module<ti.omx> @<OmxProxy_commonInit> @line<2565> msg<Before OmxRpc_Params_init>
Module<ti.omx> Entering<OmxRpc_Params_init> @line<93>
Module<ti.omx> Leaving<OmxRpc_Params_init> @line<99> with error<0:ErrorNone>
Module<ti.omx> @<OmxProxy_commonInit> @line<2569> msg<After OmxRpc_Params_init>
Module<ti.omx> @<OmxProxy_commonInit> @line<2579> msg<Before OmxRpc_create>
Module<ti.omx> Entering<OmxRpc_object_create> @line<109>
Module<ti.omx> Entering<OmxRpc_Instance_init> @line<570>
Module<ti.omx> Entering<omxrpc_module_init_client> @line<324>
Entered function:omxrpc_module_init_client (2)
Module<ti.omx> Entering<OmxRpc_rcmClientCreate> @line<976>
Entered function:OmxRpc_rcmClientCreate (0x40781a78, OmxRpcRcmServer_2, 5)
setting log mask for framework components
Module<ti.omx> @<OmxRpc_rcmClientCreate> @line<985> msg<Before RcmClient_Params_init>
Module<ti.omx> @<OmxRpc_rcmClientCreate> @line<987> msg<After RcmClient_Params_init>
Module<ti.omx> @<OmxRpc_rcmClientCreate> @line<992> msg<Before RcmClient_create>
[t=0x00000004] [tid=0x40603000] ti.sdo.rcm.RcmClient: --> RcmClient_create: ()
[t=0x0000005f] [tid=0x40603000] ti.sdo.xdcruntime.linux.GateThreadSupport: --> enter: (@4074fbfc)
[t=0x000000ad] [tid=0x40603000] ti.sdo.xdcruntime.linux.GateThreadSupport: <-- enter: @00000000
[t=0x000000f8] [tid=0x40603000] ti.sdo.xdcruntime.linux.GateThreadSupport: --> leave: (@4074fbfc, @00000000)
[t=0x00000147] [tid=0x40603000] ti.sdo.xdcruntime.linux.GateThreadSupport: <-- leave: void
[t=0x0000019c] [tid=0x40603000] ti.sdo.rcm.RcmClient: --> RcmClient_Instance_init: (obj=0xcde40)
[t=0x000001f3] [tid=0x40603000] ti.sdo.xdcruntime.linux.GateThreadSupport: --> enter: (@4074fbfc)
[t=0x00000240] [tid=0x40603000] ti.sdo.xdcruntime.linux.GateThreadSupport: <-- enter: @00000000
[t=0x00000289] [tid=0x40603000] ti.sdo.xdcruntime.linux.GateThreadSupport: --> leave: (@4074fbfc, @00000000)
[t=0x000002d9] [tid=0x40603000] ti.sdo.xdcruntime.linux.GateThreadSupport: <-- leave: void
[t=0x00000320] [tid=0x40603000] ti.sdo.xdcruntime.linux.GateThreadSupport: --> enter: (@4074fbfc)
[t=0x0000036b] [tid=0x40603000] ti.sdo.xdcruntime.linux.GateThreadSupport: <-- enter: @00000000
[t=0x000003b3] [tid=0x40603000] ti.sdo.xdcruntime.linux.GateThreadSupport: --> leave: (@4074fbfc, @00000000)
[t=0x00000403] [tid=0x40603000] ti.sdo.xdcruntime.linux.GateThreadSupport: <-- leave: void
[t=0x00000449] [tid=0x40603000] ti.sdo.xdcruntime.linux.GateThreadSupport: --> enter: (@4074fbfc)
[t=0x00000494] [tid=0x40603000] ti.sdo.xdcruntime.linux.GateThreadSupport: <-- enter: @00000000
[t=0x000004db] [tid=0x40603000] ti.sdo.xdcruntime.linux.GateThreadSupport: --> leave: (@4074fbfc, @00000000)
[t=0x0000052b] [tid=0x40603000] ti.sdo.xdcruntime.linux.GateThreadSupport: <-- leave: void
[t=0x00000578] [tid=0x40603000] ti.sdo.xdcruntime.linux.GateThreadSupport: --> enter: (@4074fbfc)
[t=0x000005c3] [tid=0x40603000] ti.sdo.xdcruntime.linux.GateThreadSupport: <-- enter: @00000000
[t=0x0000060c] [tid=0x40603000] ti.sdo.xdcruntime.linux.GateThreadSupport: --> leave: (@4074fbfc, @00000000)
[t=0x0000065c] [tid=0x40603000] ti.sdo.xdcruntime.linux.GateThreadSupport: <-- leave: void
[t=0x000006a2] [tid=0x40603000] ti.sdo.xdcruntime.linux.GateThreadSupport: --> enter: (@4074fbfc)
[t=0x000006ed] [tid=0x40603000] ti.sdo.xdcruntime.linux.GateThreadSupport: <-- enter: @00000000
[t=0x00000734] [tid=0x40603000] ti.sdo.xdcruntime.linux.GateThreadSupport: --> leave: (@4074fbfc, @00000000)
[t=0x00000784] [tid=0x40603000] ti.sdo.xdcruntime.linux.GateThreadSupport: <-- leave: void
[t=0x000007cd] [tid=0x40603000] ti.sdo.xdcruntime.linux.GateThreadSupport: --> enter: (@4074fbfc)
[t=0x00001dd0] [tid=0x40603000] ti.sdo.xdcruntime.linux.GateThreadSupport: <-- enter: @00000000
[t=0x00001e2e] [tid=0x40603000] ti.sdo.xdcruntime.linux.GateThreadSupport: --> leave: (@4074fbfc, @00000000)
[t=0x00001e80] [tid=0x40603000] ti.sdo.xdcruntime.linux.GateThreadSupport: <-- leave: void
[t=0x00001ecb] [tid=0x40603000] ti.sdo.xdcruntime.linux.GateThreadSupport: --> enter: (@4074fbfc)
[t=0x00001f16] [tid=0x40603000] ti.sdo.xdcruntime.linux.GateThreadSupport: <-- enter: @00000000
[t=0x00001f5f] [tid=0x40603000] ti.sdo.xdcruntime.linux.GateThreadSupport: --> leave: (@4074fbfc, @00000000)
[t=0x00001fae] [tid=0x40603000] ti.sdo.xdcruntime.linux.GateThreadSupport: <-- leave: void
[t=0x00001ffb] [tid=0x40603000] ti.sdo.xdcruntime.linux.GateThreadSupport: --> enter: (@4074fbfc)
[t=0x0000204a] [tid=0x40603000] ti.sdo.xdcruntime.linux.GateThreadSupport: <-- enter: @00000000
[t=0x00002093] [tid=0x40603000] ti.sdo.xdcruntime.linux.GateThreadSupport: --> leave: (@4074fbfc, @00000000)
[t=0x000020e3] [tid=0x40603000] ti.sdo.xdcruntime.linux.GateThreadSupport: <-- leave: void
[t=0x0000212a] [tid=0x40603000] ti.sdo.xdcruntime.linux.GateThreadSupport: --> enter: (@4074fbfc)
[t=0x00002175] [tid=0x40603000] ti.sdo.xdcruntime.linux.GateThreadSupport: <-- enter: @00000000
[t=0x000021be] [tid=0x40603000] ti.sdo.xdcruntime.linux.GateThreadSupport: --> leave: (@4074fbfc, @00000000)
[t=0x0000220d] [tid=0x40603000] ti.sdo.xdcruntime.linux.GateThreadSupport: <-- leave: void
[t=0x00002254] [tid=0x40603000] ti.sdo.xdcruntime.linux.GateThreadSupport: --> enter: (@4074fbfc)
[t=0x0000229d] [tid=0x40603000] ti.sdo.xdcruntime.linux.GateThreadSupport: <-- enter: @00000000
[t=0x000022e5] [tid=0x40603000] ti.sdo.xdcruntime.linux.GateThreadSupport: --> leave: (@4074fbfc, @00000000)
[t=0x00002334] [tid=0x40603000] ti.sdo.xdcruntime.linux.GateThreadSupport: <-- leave: void