MMAPI 1.2
Packages:
- javax.microedition.media
- javax.microedition.media.control
- javax.microedition.media.protocol
This document, Mobile Media API (JSR-135) Specification, defines the Multimedia API for the Java™ 2 Platform, Micro Edition (J2ME™).
Date | Version | Description
---|---|---
18-August-2001 | MMAPI, 1st Draft | Initial MMAPI draft proposed for the Aug. 30-31 Expert Group meeting
2-October-2001 | MMAPI, 1st EG Draft | Incorporated changes agreed during the Aug. 30-31 Expert Group meeting
5-October-2001 | MMAPI, Community Draft, v. 0.8 | MMAPI Community review draft version
16-November-2001 | MMAPI, EG Draft v. 0.85 | Incorporated changes agreed during the Nov. 5-6 Expert Group meeting
10-December-2001 | MMAPI, EG Draft v. 0.86 | Incorporated changes made in the EG mailing list discussion
28-January-2002 | MMAPI, Public Draft, v. 0.9 | MMAPI Public review draft version
11-March-2002 | MMAPI, Revised Public Draft, v. 0.95 | MMAPI Revised Public draft version
26-April-2002 | MMAPI, EG Draft v. 0.96 | Incorporated changes agreed in the EG meeting & proposed in public review
02-May-2002 | MMAPI, EG Draft v. 0.97 | Incorporated changes agreed in the EG meeting 30.04.2002
09-May-2002 | MMAPI, EG Draft v. 0.98 | Incorporated changes agreed in the EG meeting 07.05.2002
23-May-2002 | MMAPI, Proposed Final Draft, v. 1.0 | MMAPI Proposed Final Draft Version
24-June-2002 | MMAPI, v. 1.0 | MMAPI Final Version
9-April-2003 | MMAPI, Draft a, v. 1.1 | MMAPI Draft a of Version 1.1
24-April-2003 | MMAPI, Draft b, v. 1.1 | MMAPI Draft b of Version 1.1
4-June-2003 | MMAPI, v. 1.1 | MMAPI 1.1 Final Version
13-March-2006 | MMAPI, Draft a, v. 1.2 | MMAPI Draft a of Version 1.2
13-April-2006 | MMAPI, Draft b, v. 1.2 | MMAPI Draft b of Version 1.2
11-May-2006 | MMAPI, Draft c, v. 1.2 | MMAPI Draft c of Version 1.2
22-May-2006 | MMAPI, v. 1.2 | MMAPI 1.2 Final Version
The audience for this document includes the public Java community reviewing this specification, the Java Community Process (JCP) expert group defining it, implementors of the Multimedia API, and application developers targeting the J2ME platform.
The MMAPI specification is an optional package. An optional package can be adopted by existing J2ME profiles. A profile of J2ME defines device-type-specific sets of APIs for a particular vertical market or industry. Profiles are defined more precisely in the related publication, Configurations and Profiles Architecture Specification, Sun Microsystems, Inc.
- The Java Language Specification, by James Gosling, Bill Joy, and Guy L. Steele. Addison-Wesley, 1996, ISBN 0-201-63451-1
- Connected, Limited Device Configuration (JSR-30), Sun Microsystems, Inc.
- Connected Device Configuration (JSR-36), Sun Microsystems, Inc.
- Mobile Information Device Profile (JSR-37), Sun Microsystems, Inc.
- Mobile Information Device Profile 2.0 (JSR-118), Sun Microsystems, Inc.
- Java Media Framework, Sun Microsystems, Inc.
Many multimedia types and formats exist in today's market, and new types and formats are being introduced all the time. There are also many diverse methods to store and deliver these various media types. For example, there are traditional storage devices (such as disk file systems, CDs, and DVDs), wired protocols (UDP, HTTP, etc.) and wireless protocols (WAP, etc.). J2ME™ devices range from cell phones with simple tone generation to PDAs and web tablets with advanced audio and video rendering capabilities. To accommodate diverse configurations and multimedia processing capabilities, an API with a high level of abstraction is needed.
The MMAPI Expert Group has also contributed to the ongoing JSR-118 (Mobile Information Device Profile 2.0) JCP specification, and the target is to make MMAPI a direct superset of the JSR-118 MIDP 2.0 Media API.
MMAPI v1.2 introduces security definitions for CDC-based implementations.
The MMAPI Expert Group consists of the following companies:
- Nokia (Specification Lead)
- Aplix Corporation
- Beatnik, Inc.
- France Telecom
- Insignia Solutions
- Mitsubishi Electric Corp.
- Motorola
- Netdecisions Holdings United
- NTT DoCoMo, Inc.
- Openwave Systems Inc.
- PacketVideo Corporation
- Philips
- Siemens AG ICM MP TI
- Smart Fusion
- Sun Microsystems, Inc.
- Symbian Ltd
- Texas Instruments Inc.
- Vodafone
- Yamaha Corporation
- Zucotto Wireless
MMAPI features the following:
- Support for tone generation, playback, and recording of time-based media: The API supports any time-based audio and video content by offering tools to control the flow of the media stream. Tone generation is a special media type characterized by frequency and duration.
- CLDC target: The main target for the API is a CLDC-based device. Other environments (e.g., CDC) are not excluded, but CLDC is the lowest common denominator. Note: MMAPI references IllegalStateException, which is not present in CLDC 1.0. It is therefore required that the underlying platform (configuration/profile), such as CLDC 1.0/MIDP 1.0, include that exception.
- Small footprint: CLDC as the target configuration sets strict limits on memory consumption. The design of the API minimizes footprint as much as possible.
- Protocol and content agnostic: The design of the API is not biased towards any specific protocol or content type.
- Subsettable (audio-only vs. general multimedia): A subset of the API can be separated out to support only some types of content (e.g., basic audio). This allows profiles that cannot support all the features of the full API to take only the parts that are needed.
- Extensible: The API is designed so that new features can be added later without breaking existing functionality.
- Optionality for implementations: The API offers a wide range of features for different purposes. The design allows implementations that cannot provide real support for all the features to leave some of them unimplemented.
This section provides a high-level overview of MMAPI. Short code examples in the Usage Scenarios section illustrate practical use of the API.

Basic Concepts: Protocol and Content Handling
Basically, multimedia processing can be broken into two parts:

- Handling the data delivery protocol
- Handling the data content

Protocol handling refers to reading data from a source (such as a file, capture device, or streaming server) into a media processing system. Content handling usually involves processing the media data (parsing or decoding, for example) and rendering the media to output devices such as an audio speaker or video display.
Two high-level objects are used in this API, each encapsulating one of the two parts of multimedia processing:

- DataSource for protocol handling
- Player for content handling

A DataSource encapsulates protocol handling. It hides the details of how the data is read from its source, whether the data comes from a file, a streaming server, or a proprietary delivery mechanism. DataSource provides a set of methods that allow a Player to read data from it for processing.

A Player reads from the DataSource, processes the data, and renders the media to the output device. It provides a set of methods to control media playback and basic synchronization. Players also provide type-specific controls to access features of specific media types.
A factory mechanism, the Manager, creates Players from DataSources. For convenience, Manager also provides methods to create Players from locators and InputStreams.
API Details
The createPlayer method is the top-level entry point to the API:

    Player Manager.createPlayer(String urlString)

The urlString fully specifies the protocol and the content of the data:

    <protocol>:<content location>

The Manager parses the URL and creates a DataSource to handle the specified data delivery protocol. The DataSource derives the content type from the data. The Manager then takes this content type and creates a Player to handle the presentation of the data. The resulting Player is returned for use by the application.
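To make the locator syntax concrete, the following is a small, hypothetical helper that decomposes a locator string into its protocol and content-location parts. It is not part of the MMAPI API; it only illustrates the `<protocol>:<content location>` structure that Manager parses.

```java
// Hypothetical helper illustrating the <protocol>:<content location>
// locator syntax; not part of the MMAPI specification.
final class LocatorParts {

    // Returns the protocol part, e.g. "capture" for "capture://audio".
    static String protocol(String locator) {
        int colon = locator.indexOf(':');
        if (colon < 0) {
            throw new IllegalArgumentException("not a locator: " + locator);
        }
        return locator.substring(0, colon);
    }

    // Returns the content-location part, e.g. "//audio" for "capture://audio".
    static String contentLocation(String locator) {
        int colon = locator.indexOf(':');
        if (colon < 0) {
            throw new IllegalArgumentException("not a locator: " + locator);
        }
        return locator.substring(colon + 1);
    }
}
```

Applied to the locators used elsewhere in this document, "http://webserver/music.mp3" splits into the protocol "http" and the content location "//webserver/music.mp3".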
The Player provides general methods to control data flow and presentation, for example:

    Player.realize()
    Player.prefetch()
    Player.start()
    Player.setMediaTime(long time)

Fine-grained control is an important feature of the API. Therefore, each Player also provides type-specific controls through the getControls and getControl methods:

    Control[] Player.getControls()
    Control Player.getControl(String controlType)

Since each type of media yields different types of controls from its corresponding Player, getControls and getControl expose features that are unique to a particular media type. For example, for the MIDI type, you can obtain a MIDIControl from the Player's getControl method.

System Properties
MMAPI has some properties that can be queried by System.getProperty(String key). Currently, the following keys are defined:
- microedition.media.version: The string returned designates the version of the MMAPI specification that is implemented. For implementations on configurations and profiles that contain java.security.Permission, the value MUST be "1.2"; otherwise, the value MUST be either "1.1" or "1.2".
- supports.mixing: Query for whether audio mixing is supported. The string returned is either "true" or "false". If mixing is supported, the following conditions hold:
  - At least two tones can be played simultaneously with Manager.playTone.
  - Manager.playTone can be used while at least one Player is playing back audio.
  - At least two Players can be used to play back audio simultaneously.
- supports.audio.capture: Query for whether audio capture is supported. The string returned is either "true" or "false". If supports.audio.capture is "true", audio.encodings must not be null or empty.
- supports.video.capture: Query for whether video capture is supported. The string returned is either "true" or "false". If supports.video.capture is "true", video.encodings must not be null or empty.
- supports.recording: Query for whether recording is supported. The string returned is either "true" or "false". If supports.recording is "true", at least one Player type supports recording.
- audio.encodings: The string returned specifies the supported audio capture formats. Each format is specified in the audio encoding syntax. The formats are delimited by at least one space. If audio capture is not supported, audio.encodings returns null.
- video.encodings: The string returned specifies the supported video capture formats. Each format is specified in the video encoding syntax. The formats are delimited by at least one space. If video capture is not supported, video.encodings returns null.
- video.snapshot.encodings: Supported video snapshot formats for the getSnapshot method in VideoControl. The string returned specifies the supported capture image formats. Each format is specified in the image encoding syntax. The formats are delimited by at least one space. The first format returned is the default. If video snapshot is not supported, video.snapshot.encodings returns null.
- streamable.contents: The string returned specifies the supported streamable content types. Media of these content types will be handled by the Player as streamable media, that is, media played as the data is received by the Player. Playing back media of this type does not require the Player to buffer the entire content. The content types are given in MIME syntax, each delimited by at least one space.
Tone Generation

Tone generation is important for games and other audio applications. On very small devices, it is particularly important since it is likely to be the only form of multimedia capability. In its simplest form, tone generation reduces to a single buzzer or simple monophonic tone generation. The Manager class provides a top-level method to handle this simple form of single-tone generation:

    Manager.playTone(int note, int duration, int volume)

The implementation of this method can be mapped directly to a device's hardware tone generator to provide the most responsive sound generation.

In addition, the API provides a way to create a specific type of Player for synthesizing tone sequences:

    Player p = Manager.createPlayer(Manager.TONE_DEVICE_LOCATOR)

The Player created provides a special type of Control, ToneControl, which can be used for programming a tone sequence. This enables more sophisticated applications written for slightly more powerful devices.

The ToneControl interface specifies a tone sequence format to be used for programming a tone sequence. This tone sequence format can also be used as a file format for specifying tone sequences; a file with the ".jts" extension must be used to store tone sequences in this format. "audio/x-tone-seq" designates the MIME type for this format.

Camera
MMAPI provides support for a camera. From the API point of view, a camera attached to a device does not really differ from any other video content. The camera has a special locator, "capture://video", that is used to create its Player. An application uses VideoControl to display the viewfinder on the screen and take pictures.

VideoControl.getSnapshot(String imageType) is used to capture a picture from the camera. The imageType parameter specifies the format of the snapshot. The video.snapshot.encodings key can be used to query the supported snapshot formats from the system.

Scenario 11 shows how to use the camera, and Manager documents the different locators for creating the camera's Player.
Not all implementations of MMAPI support all multimedia types and input protocols. Some implementations may support only a few selected types and protocols; for example, some may support only playback of local audio files. The design of MMAPI allows implementations to provide optional support for different media types and protocols.

Manager provides getSupportedContentTypes and getSupportedProtocols to query the supported media types and protocols. An application can also simply attempt to create a Player from Manager for a given media input; if the content type or protocol specified is not supported, the Player creation methods throw an Exception.
Since MMAPI is an optional package, it does not mandate any particular media types or protocols. Required types and protocols are to be determined by the appropriate profiles adopting MMAPI (e.g. JSR 118). However, an implementation must guarantee support of at least one media type and protocol.
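A defensive application can check capability before attempting Player creation. The following sketch shows the membership test such a check reduces to; the array stands in for the result of Manager.getSupportedContentTypes (the names and values here are illustrative, not mandated by the specification), and MIME types are compared case-insensitively.

```java
// Hypothetical capability check an application might perform before
// creating a Player. The supportedTypes array stands in for the result
// of Manager.getSupportedContentTypes(null).
final class Capability {

    static boolean isSupported(String[] supportedTypes, String contentType) {
        String wanted = contentType.toLowerCase();
        for (int i = 0; i < supportedTypes.length; i++) {
            // MIME content types are matched case-insensitively.
            if (supportedTypes[i].toLowerCase().equals(wanted)) {
                return true;
            }
        }
        return false;
    }
}
```

If the check fails, the application can fall back to another media type instead of relying on the Exception thrown by the Player creation methods.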
MMAPI also allows implementations to support optional features. For example, some implementations may support only media playback, but not recording. Or some devices may support volume control while others may not. MMAPI allows implementations to expose these optional features as applicable.
Optional features are organized as Controls. A Player can be queried for all of its supported Controls by using getControls or a particular type of Control by using getControl.
MMAPI mandates support of some features for some media types; other features are recommended, and still others are entirely optional. This guarantees some uniformity across different implementations. Feature sets are organized as described below. For the purpose of this discussion, the following definitions are used:
Definitions
This document uses definitions based upon those specified in RFC 2119 (See http://www.ietf.org).
- MUST: The associated definition is an absolute requirement of this specification.
- MUST NOT: The definition is an absolute prohibition of this specification.
- SHOULD: Indicates a recommended practice. There may exist valid reasons in particular circumstances to ignore this recommendation, but the full implications must be understood and carefully weighed before choosing a different course.
- SHOULD NOT: Indicates a non-recommended practice. There may exist valid reasons in particular circumstances when the particular behavior is acceptable or even useful, but the full implications should be understood and the case carefully weighed before implementing any behavior described with this label.
- MAY: Indicates that an item is truly optional.

Feature Sets
The following feature sets are defined for five different types of media. When a Player is created for a particular type, it must follow these guidelines and implement the appropriate Control types:
Sampled Audio
- Should implement VolumeControl, StopTimeControl.
MIDI
- Should implement VolumeControl, MIDIControl, TempoControl, PitchControl, StopTimeControl.
Tone Sequence (Player for TONE_DEVICE_LOCATOR)
- Must implement ToneControl.
- Should implement VolumeControl, StopTimeControl.
Interactive MIDI (Player for MIDI_DEVICE_LOCATOR)
- Must implement MIDIControl.
Video
- Must implement VideoControl.
- Should implement FramePositioningControl, StopTimeControl, VolumeControl (if audio is also available).
The following controls belong to none of the above feature sets. Implementations may implement them when applicable: GUIControl, MetaDataControl, RateControl, RecordControl.
A Player has five states: UNREALIZED, REALIZED, PREFETCHED, STARTED, and CLOSED. A Player implementation must allow successful state transitions to each of these states using the six state-transition methods:

- realize
- prefetch
- start
- stop
- deallocate
- close

This means that the implementation must guarantee that these methods succeed under normal runtime conditions. This ensures that an implementation provides Players that are functional.
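The five states and six transition methods can be summarized as a small, hypothetical model. This sketch tracks only the state transitions (including the implicit realize/prefetch behavior of prefetch and start on an unrealized Player); a real implementation also acquires and releases resources in these calls, and the class below is not part of MMAPI.

```java
// A simplified, hypothetical model of the MMAPI Player state machine.
// Only state bookkeeping is modeled; no media processing occurs.
final class PlayerStates {
    static final int UNREALIZED = 0, REALIZED = 1,
                     PREFETCHED = 2, STARTED = 3, CLOSED = 4;

    private int state = UNREALIZED;

    int state() { return state; }

    // realize: UNREALIZED -> REALIZED
    void realize()    { if (state == UNREALIZED) state = REALIZED; }

    // prefetch: implicitly realizes first, then REALIZED -> PREFETCHED
    void prefetch()   { realize(); if (state == REALIZED) state = PREFETCHED; }

    // start: implicitly prefetches first, then PREFETCHED -> STARTED
    void start()      { prefetch(); if (state == PREFETCHED) state = STARTED; }

    // stop: STARTED -> PREFETCHED
    void stop()       { if (state == STARTED) state = PREFETCHED; }

    // deallocate: releases resources, returning to REALIZED
    void deallocate() { if (state == PREFETCHED || state == STARTED) state = REALIZED; }

    // close: terminal from any state
    void close()      { state = CLOSED; }
}
```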
The Multimedia Security Addendum defines the security requirements for MMAPI.
Some J2ME™ devices are very resource constrained. It may not be feasible for a device to support a full range of multimedia types, such as video, on some cell phones. As a result, not all devices are required to support the full generality of a multimedia API, such as extensibility to support custom protocols and content types.

For the MID profile, version 2.0 (JSR 118), the size of the API is another determining factor. A special subset of MMAPI has been derived to address the specific needs of MIDP. This subset fulfills the requirements set by the MIDP 2.0 Expert Group, which include:

- Low-footprint audio playback
- Protocol and content format agnosticism
- Support for tone generation
- Support for general media flow controls: start, stop, seek, etc.
- Support for media-type-specific controls: volume, etc.
- Support for capability query

This subset differs from the general multimedia API in the following ways:

- It is audio-only. It excludes all Controls specific to video or graphics.
- It does not support synchronizing simultaneous playback of multiple Players using a common time base.
- It does not support custom protocols at the application level. The javax.microedition.media.protocol package (DataSource) is excluded.
- A simplified version of Manager is used.

It is important to note that the MIDP 2.0 subset is a proper subset of the full MMAPI specification and is fully forward compatible with MMAPI.
Currently, MMAPI targets general media playback. As such, it does not cover the following areas of multimedia:

- 3D graphics and animation
- General media transcoding
- Editing
- Server-side streaming: transmission

It is our intent that, by staying with a high level of abstraction, we will be able to extend the API to support the above features. This has been demonstrated by the Java™ Media Framework, which employs a similar paradigm.
This section provides some common scenarios of how the API could be used.

Scenario 1: Single-Tone Generation
    try {
        Manager.playTone(ToneControl.C4, 5000 /* millisec */, 100 /* max vol */);
    } catch (MediaException e) {
    }

Scenario 2: Simple Media Playback with Looping
    try {
        Player p = Manager.createPlayer("http://webserver/music.mp3");
        p.setLoopCount(5);
        p.start();
    } catch (IOException ioe) {
    } catch (MediaException me) {
    }

Scenario 3: Fine-Grained Playback Control
    static final long SECS_TO_MICROSECS = 1000000L;

    Player p;
    VolumeControl vc;

    try {
        p = Manager.createPlayer("http://webserver/music.mp3");
        p.realize();

        // Set a listener.
        p.addPlayerListener(new Listener());

        // Grab the volume control for the player and set volume to max.
        vc = (VolumeControl)p.getControl("VolumeControl");
        if (vc != null)
            vc.setLevel(100);

        // Set a start time.
        p.setMediaTime(5 * SECS_TO_MICROSECS);

        // Guarantee that the player can start with the smallest latency.
        p.prefetch();

        // Non-blocking start.
        p.start();
    } catch (IOException ioe) {
    } catch (MediaException me) {
    }

    class Listener implements PlayerListener {
        public void playerUpdate(Player p, String event, Object eventData) {
            if (event == END_OF_MEDIA || event == STOP_AT_TIME) {
                System.out.println("Done processing");
                try {
                    p.setMediaTime(5 * SECS_TO_MICROSECS);
                    p.start();
                } catch (MediaException me) {
                }
            }
        }
    }

Scenario 4: MIDI Playback with Some Fine-Grained Control
    Player p;
    TempoControl tc;

    try {
        p = Manager.createPlayer("http://webserver/tune.mid");
        p.realize();

        // Grab the tempo control.
        tc = (TempoControl)p.getControl("TempoControl");
        if (tc != null)
            tc.setTempo(120000); // 120 beats/min

        p.start();
    } catch (IOException ioe) {
    } catch (MediaException me) {
    }

Scenario 5: Video Playback
    Player p;
    VideoControl vc;

    try {
        p = Manager.createPlayer("http://webserver/movie.mpg");
        p.realize();

        // Grab the video control and set it to the current display.
        vc = (VideoControl)p.getControl("VideoControl");
        if (vc != null) {
            Form form = new Form("video");
            form.append((Item)vc.initDisplayMode(vc.USE_GUI_PRIMITIVE, null));
            Display.getDisplay(midlet).setCurrent(form);
        }
        p.start();
    } catch (IOException ioe) {
    } catch (MediaException me) {
    }

Scenario 6: Playing Back from Media Stored in RMS
    RecordStore rs;
    int recordID;
    :
    // code to set up the record store.

    try {
        InputStream is = new ByteArrayInputStream(rs.getRecord(recordID));
        Player p = Manager.createPlayer(is, "audio/X-wav");
        p.start();
    } catch (IOException ioe) {
    } catch (MediaException me) {
    } catch (RecordStoreException re) {
    }

Scenario 7: Playing Back from Media Stored in JAR
    /** Notice that in MIDP 2.0, the wav format is mandatory only */
    /** in the case that the device supports sampled audio.      */
    try {
        InputStream is = getClass().getResourceAsStream("audio.wav");
        Player p = Manager.createPlayer(is, "audio/X-wav");
        p.start();
    } catch (IOException ioe) {
    } catch (MediaException me) {
    }

Scenario 8: Synchronization of Different Players
    Player p1, p2;

    try {
        p1 = Manager.createPlayer("http://webserver/tune.mid");
        p1.realize();
        p2 = Manager.createPlayer("http://webserver/movie.mpg");
        p2.realize();

        p2.setTimeBase(p1.getTimeBase());

        p1.prefetch();
        p2.prefetch();

        p1.start();
        p2.start();
    } catch (IOException ioe) {
    } catch (MediaException me) {
    }

Scenario 9: Tone Sequence Generation
/** * "Mary Had A Little Lamb" has "ABAC" structure. * Use block to repeat "A" section. */ byte tempo = 30; // set tempo to 120 bpm byte d = 8; // eighth-note byte C4 = ToneControl.C4;; byte D4 = (byte)(C4 + 2); // a whole step byte E4 = (byte)(C4 + 4); // a major third byte G4 = (byte)(C4 + 7); // a fifth byte rest = ToneControl.SILENCE; // rest byte[] mySequence = { ToneControl.VERSION, 1, // version 1 ToneControl.TEMPO, tempo, // set tempo ToneControl.BLOCK_START, 0, // start define "A" section E4,d, D4,d, C4,d, E4,d, // content of "A" section E4,d, E4,d, E4,d, rest,d, ToneControl.BLOCK_END, 0, // end define "A" section ToneControl.PLAY_BLOCK, 0, // play "A" section D4,d, D4,d, D4,d, rest,d, // play "B" section E4,d, G4,d, G4,d, rest,d, ToneControl.PLAY_BLOCK, 0, // repeat "A" section D4,d, D4,d, E4,d, D4,d, C4,d // play "C" section }; try{ Player p = Manager.createPlayer(Manager.TONE_DEVICE_LOCATOR); p.realize(); ToneControl c = (ToneControl)p.getControl("ToneControl"); c.setSequence(mySequence); p.start(); } catch (IOException ioe) { } catch (MediaException me) { }Scenario 10: Capture and Recording
    try {
        // Create a Player that captures live audio.
        Player p = Manager.createPlayer("capture://audio");
        p.realize();

        // Get the RecordControl, set the record location, then
        // start the Player and record for 5 seconds.
        RecordControl rc = (RecordControl)p.getControl("RecordControl");
        rc.setRecordLocation("file:/tmp/audio.wav");
        rc.startRecord();
        p.start();
        Thread.sleep(5000);
        p.stop();
        rc.stopRecord();
        rc.commit();
    } catch (IOException ioe) {
    } catch (MediaException me) {
    } catch (InterruptedException e) {
    }

Scenario 11: Camera
    Player p;
    VideoControl vc;

    // Initialize the camera.
    try {
        p = Manager.createPlayer("capture://video");
        p.realize();

        // Grab the video control and set it to the current display.
        vc = (VideoControl)p.getControl("VideoControl");
        if (vc != null) {
            Form form = new Form("video");
            form.append((Item)vc.initDisplayMode(vc.USE_GUI_PRIMITIVE, null));
            Display.getDisplay(midlet).setCurrent(form);
        }
        p.start();

        if (vc != null) {
            byte[] image = vc.getSnapshot(null);
            // do something with the image ...
        }
    } catch (IOException ioe) {
    } catch (MediaException me) {
    }