Media Package in Android source

by Anandi » Sat, 27 Mar 2010 04:55:38 GMT


 Hello all,

I am trying to find the location of the media package in the Android
source. I am curious about it because this package is accessible
everywhere (I can import it in any application in any workspace).

I am also looking for the makefile option that makes the build
system generate this package as globally accessible.

I have also observed that the classes in the "preloaded classes"
list are globally accessible. Is this related in any way?
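
For context, in AOSP checkouts of this era the android.media sources live under frameworks/base/media/java/android/media and are compiled into framework.jar, which init.rc places on the boot classpath; that is why every application can import the package without any per-app makefile option. The excerpt below is approximate and from memory, so verify the paths against your own tree:

```
# Approximate excerpt from system/core/rootdir/init.rc in an AOSP checkout.
# framework.jar (which contains android.media) is on the boot classpath,
# so every application process sees it without any special build option.
export BOOTCLASSPATH /system/framework/core.jar:/system/framework/ext.jar:/system/framework/framework.jar:/system/framework/android.policy.jar:/system/framework/services.jar
```

The preloaded-classes list (frameworks/base/preloaded-classes) is a separate mechanism: Zygote preloads those classes at boot so they can be shared across app processes, which affects startup speed and memory sharing rather than visibility.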

So, can anyone please help me with these queries?

Thanks in advance
Anu

--



Other Threads

1. MediaPlayer(Streaming LiveTV from LinuxBox)

I am working on an application that plays a live TV stream sent from a
Linux box.
I am using ffmpeg and ffserver to send out live feeds via RTSP, but it
does not work, and MediaPlayer gives me the error message "Sorry, this
video cannot be played". I am passing a URL such as
"rtsp://<IP>:5454/test.mp4".

These are the two commands I run on my linux box.

ffserver -d -f ffserver.conf
ffmpeg -i /dev/video1 http://localhost:8090/feed1.ffm

Here's the configuration file (ffserver.conf) I am using.

#Port on which the server is listening. You must select a different
# port from your standard HTTP web server if it is running on the same
# computer.
Port 8090

RTSPPort 5454

# Address on which the server is bound. Only useful if you have
# several network interfaces.
BindAddress 0.0.0.0

# Number of simultaneous HTTP connections that can be handled. It has
# to be defined *before* the MaxClients parameter, since it defines the
# MaxClients maximum limit.
#MaxHTTPConnections 2000

# Number of simultaneous requests that can be handled. Since FFServer
# is very fast, it is more likely that you will want to leave this high
# and use MaxBandwidth, below.
MaxClients 1000

# This is the maximum amount of kbit/sec that you are prepared to
# consume when streaming to clients.
MaxBandwidth 5000

# Access log file (uses standard Apache log file format)
# '-' is the standard output.
CustomLog -

# Suppress that if you want to launch ffserver as a daemon.
NoDaemon

##################################################################
# Definition of the live feeds. Each live feed contains one video
# and/or audio sequence coming from an ffmpeg encoder or another
# ffserver. This sequence may be encoded simultaneously with several
# codecs at several resolutions.

<Feed feed1.ffm>

# You must use 'ffmpeg' to send a live feed to ffserver. In this
# example, you can type:
#
#ffmpeg http://localhost:8090/feed1.ffm

# ffserver can also do time shifting. It means that it can stream any
# previously recorded live stream. The request should contain:
# "http://xxxx?date=[YYYY-MM-DDT][[HH:]MM:]SS[.m...]". You must specify
# a path where the feed is stored on disk. You also specify the
# maximum size of the feed, where zero means unlimited. Default:
# File=/tmp/feed_name.ffm FileMaxSize=5M
File /tmp/feed1.ffm
FileMaxSize 100M

# You could specify
# ReadOnlyFile /saved/specialvideo.ffm
# This marks the file as readonly and it will not be deleted or updated.

# Specify launch in order to start ffmpeg automatically.
# First ffmpeg must be defined with an appropriate path if needed,
# after that options can follow, but avoid adding the http:// field
#Launch ffmpeg

# Only allow connections from localhost to the feed.
#ACL allow 127.0.0.1

</Feed>

##################################################################
# RTSP examples
#
# You can access this stream with the RTSP URL:
#   rtsp://localhost:5454/test1-rtsp.mpg
#
# A non-standard RTSP redirector is also created. Its URL is:
#   http://localhost:8090/test1-rtsp.rtsp

<Stream test2.mp4>
Feed feed1.ffm
Format rtp
#File "/var/videos/testVideos/test.mpg"
VideoFrameRate 15
VideoCodec mpeg4
VideoSize qvga
VideoBitRate 256
VideoBufferSize 40000
VideoGopSize 12
AudioCodec aac
AudioBitRate 32
AudioChannels 1
</Stream>

I know that Orb can send a live TV feed to Android using the RTSP/3gp
format.
I am wondering whether I can do the same using ffmpeg and ffserver.
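
One thing that may be worth trying (a sketch, not tested here): early Android builds are picky about codecs over RTSP, and Orb's RTSP/3gp streams use H.263 video with AMR-NB audio rather than mpeg4/aac. A <Stream> section along those lines would look roughly like this (codec names and parameter values are assumptions to adjust for your ffmpeg build):

```
<Stream test.3gp>
Feed feed1.ffm
Format rtp
VideoCodec h263
# h263 only supports a fixed set of frame sizes (e.g. qcif, cif)
VideoSize qcif
VideoFrameRate 15
VideoBitRate 256
AudioCodec amr_nb
AudioSampleRate 8000
AudioChannels 1
AudioBitRate 12
</Stream>
```

The client would then request rtsp://<IP>:5454/test.3gp. Note that the stream name in the URL must match the <Stream> name in the configuration exactly.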

Can anyone help?

Thanks.

Manabu



2. How to make an intent launch only one activity when more than one activity matches?

All,

How can I make an intent launch only one activity when more than one
activity matches?

That means:

Implicit intents do not specify a component; instead, they must
include enough information for the system to determine which of the
available components is best to run for that intent.

If more than one component is available, but we cannot change the
other applications' code or interfaces, and we just want the intent to
launch our application only, how can we do that?
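
When the intent is sent from our own code, the usual approach is to make it explicit by naming the target component, so the system skips matching against other activities entirely. A minimal sketch, assuming the standard Android SDK Intent API (the package, class, and URI below are hypothetical):

```java
import android.app.Activity;
import android.content.Intent;
import android.net.Uri;

public class LauncherExample extends Activity {
    void launchOurViewerOnly() {
        // Hypothetical data URI for illustration.
        Intent intent = new Intent(Intent.ACTION_VIEW,
                Uri.parse("http://example.com/video.mp4"));
        // Naming the component turns the implicit intent into an explicit
        // one, so no chooser appears even if other activities also match.
        intent.setClassName("com.example.ourapp",
                "com.example.ourapp.ViewerActivity"); // hypothetical names
        startActivity(intent);
    }
}
```

Intent.setComponent() with a ComponentName achieves the same thing. Note that this only helps for intents we send ourselves; we cannot force implicit intents sent by other applications to resolve to our activity.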

Thanks


3. How to port android for at91sam9261EK

4. How to listen for the end of a call?

5. Building internal code into SDK

6. how to count time in android

7. g 1 accelerometer