How to start a service automatically on system startup and on installation

by Ralf » Mon, 13 Apr 2009 13:08:20 GMT


All you need to do is create a BroadcastReceiver that receives the
BOOT_COMPLETED action (with the matching permission).

Example:

<manifest xmlns:android="http://schemas.android.com/apk/res/android"
    package="com.alfray.timeriffic" ...>
  <application ...>
    <receiver android:name=".MyBroadcastReceiver">
      <intent-filter>
        <action android:name="android.intent.action.BOOT_COMPLETED" />
      </intent-filter>
    </receiver>
  </application>

  <!-- Important -->
  <uses-permission android:name="android.permission.RECEIVE_BOOT_COMPLETED" />
</manifest>
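
For completeness, a minimal sketch of what the matching receiver class
could look like (MyService is a hypothetical service to be launched at
boot; the receiver name matches the manifest above):

package com.alfray.timeriffic;

import android.content.BroadcastReceiver;
import android.content.Context;
import android.content.Intent;

public class MyBroadcastReceiver extends BroadcastReceiver {
    @Override
    public void onReceive(Context context, Intent intent) {
        // React only to the boot broadcast this receiver is registered for.
        if (Intent.ACTION_BOOT_COMPLETED.equals(intent.getAction())) {
            // Hypothetical service started once the system has booted.
            context.startService(new Intent(context, MyService.class));
        }
    }
}

The receiver itself does no work at boot; it just hands off to
startService() so the long-running logic lives in the service.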

R/

Other Threads

1. PV omx_test_app missing in Froyo source code

Hi Guys,

In the Froyo source code, the PV omx_test_app and omx_test_app_enc
folders are missing from external/opencore/codecs_V2/omx/, but I can
see these folders in the Android git tree.

I want to run the PV test application to test MP4 files. For that, I
compiled the Froyo code after adding these folders, and the build was
fine.

But when I ran the test app executable for NORMAL_SEQ_TEST, I got a
segmentation fault.

syntax -> test_omx_client /data/test.mp4 -o /data/output.yuv -c mpeg4 -t 11 11

Has anyone tried running the PV omx_test_app on Froyo?

Expecting valuable input from you guys.

Thanks,
Prajeesh

-- 

2. Android 1.6, RTP live supported?

Hi All,

For quite some time now I have been trying to stream files on my machine
over RTP to my Android device. I am using FFmpeg to read the media files
and do the RTP packetization, and FFServer to act as the RTSP server for
my stream.

The stream's video is in H.264 format and its audio is in AAC format; see
below for the SDP returned by FFServer. H.264 and AAC are Android-supported
media formats, but I am getting an error (see below) when trying to play
the RTSP stream from an application built on the MediaPlayer class.
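
A minimal sketch of the MediaPlayer playback path in question (the stream
URL is hypothetical and error handling is omitted):

import java.io.IOException;

import android.media.MediaPlayer;

public class RtspPlaybackSketch {
    // Hypothetical RTSP URL exposed by FFServer.
    private static final String STREAM_URL = "rtsp://192.168.0.10:5554/live.sdp";

    public static MediaPlayer play() throws IOException {
        MediaPlayer player = new MediaPlayer();
        player.setDataSource(STREAM_URL); // MediaPlayer accepts rtsp:// URLs
        player.setOnPreparedListener(new MediaPlayer.OnPreparedListener() {
            public void onPrepared(MediaPlayer mp) {
                mp.start(); // begin playback once RTSP negotiation completes
            }
        });
        player.prepareAsync(); // asynchronous prepare, since RTSP setup can block
        return player;
    }
}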

Is streaming video and audio as separate RTP-packetized streams supported
in Android 1.6? If live RTP is supported by MediaPlayer, can anyone give
tips on which encoder variation Android 1.6 expects? Within H.264/AAC there
are many variations; probably Android does not like the encoder that I am
using :)

I have already gone through a relevant post, "Playing live stream (RTP) in
MediaPlayer":

3. Android Market rules have changed; you must accept them within 30 days but...

4. Numeric Keypad on WebView

5. JNI in android

6. Getting a FileDescriptor from a Socket

7. Blank black screen while transitioning to a new activity.