Repository: johnsonsu/react-native-sound-player
Branch: master
Commit: 12d2e66ed0f9
Files: 18
Total size: 47.7 KB
Directory structure:
gitextract_5lpaha_7/
├── .github/
│   ├── ISSUE_TEMPLATE/
│   │   └── bug_report.md
│   └── workflows/
│       └── npm-publish.yml
├── .gitignore
├── .prettierrc
├── LICENSE
├── README.md
├── RNSoundPlayer.podspec
├── android/
│   ├── build.gradle
│   └── src/
│       └── main/
│           ├── AndroidManifest.xml
│           └── java/
│               └── com/
│                   └── johnsonsu/
│                       └── rnsoundplayer/
│                           ├── RNSoundPlayerModule.java
│                           └── RNSoundPlayerPackage.java
├── index.d.ts
├── index.js
├── ios/
│   ├── RNSoundPlayer.h
│   ├── RNSoundPlayer.m
│   └── RNSoundPlayer.xcodeproj/
│       ├── project.pbxproj
│       └── project.xcworkspace/
│           └── contents.xcworkspacedata
└── package.json
================================================
FILE CONTENTS
================================================
================================================
FILE: .github/ISSUE_TEMPLATE/bug_report.md
================================================
---
name: Bug report
about: Create a report to help us improve
title: "[BUG]"
labels: ''
assignees: ''
---
**Describe the bug**
A clear and concise description of what the bug is.
**To Reproduce**
Steps to reproduce the behavior:
1. Go to '...'
2. Click on '....'
3. Scroll down to '....'
4. See error
**Expected behavior**
A clear and concise description of what you expected to happen.
**Platform (please complete the following information):**
- OS: [e.g. iOS or Android]
**Additional context**
Add any other context about the problem here.
================================================
FILE: .github/workflows/npm-publish.yml
================================================
# This workflow will run tests using node and then publish a package to GitHub Packages when a release is created
# For more information see: https://help.github.com/actions/language-and-framework-guides/publishing-nodejs-packages
name: Publish NPM Package

on:
  workflow_dispatch:
  release:
    types: [created]

jobs:
  publish-npm:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3
      - uses: actions/setup-node@v3
        with:
          node-version: 16
          registry-url: https://registry.npmjs.org/
      - run: npm publish
        env:
          NODE_AUTH_TOKEN: ${{secrets.NPM_ACCESS_TOKEN}}
================================================
FILE: .gitignore
================================================
*.DS_Store
.AppleDouble
.LSOverride
# Icon must end with two \r
Icon
# Thumbnails
._*
# Files that might appear in the root of a volume
.DocumentRevisions-V100
.fseventsd
.Spotlight-V100
.TemporaryItems
.Trashes
.VolumeIcon.icns
.com.apple.timemachine.donotpresent
# Directories potentially created on remote AFP share
.AppleDB
.AppleDesktop
Network Trash Folder
Temporary Items
.apdisk
# Xcode
#
# gitignore contributors: remember to update Global/Xcode.gitignore, Objective-C.gitignore & Swift.gitignore
## Build generated
build/
DerivedData/
## Various settings
*.pbxuser
!default.pbxuser
*.mode1v3
!default.mode1v3
*.mode2v3
!default.mode2v3
*.perspectivev3
!default.perspectivev3
xcuserdata/
## Other
*.moved-aside
*.xccheckout
*.xcscmblueprint
## Obj-C/Swift specific
*.hmap
*.ipa
*.dSYM.zip
*.dSYM
# CocoaPods
#
# We recommend against adding the Pods directory to your .gitignore. However
# you should judge for yourself, the pros and cons are mentioned at:
# https://guides.cocoapods.org/using/using-cocoapods.html#should-i-check-the-pods-directory-into-source-control
#
# Pods/
# Carthage
#
# Add this line if you want to avoid checking in source code from Carthage dependencies.
# Carthage/Checkouts
Carthage/Build
# fastlane
#
# It is recommended to not store the screenshots in the git repo. Instead, use fastlane to re-generate the
# screenshots whenever they are needed.
# For more information about the recommended setup visit:
# https://docs.fastlane.tools/best-practices/source-control/#source-control
fastlane/report.xml
fastlane/Preview.html
fastlane/screenshots
fastlane/test_output
# Code Injection
#
# After new code Injection tools there's a generated folder /iOSInjectionProject
# https://github.com/johnno1962/injectionforxcode
iOSInjectionProject/
================================================
FILE: .prettierrc
================================================
{
  "tabWidth": 4,
  "semi": false,
  "singleQuote": true
}
================================================
FILE: LICENSE
================================================
MIT License
Copyright (c) 2017 Johnson Su
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.
================================================
FILE: README.md
================================================
# react-native-sound-player
> ⚠️ **This package is no longer actively maintained.** For new projects, I recommend other libraries such as [react-native-track-player](https://github.com/doublesymmetry/react-native-track-player), which provides more comprehensive audio playback features and active community support.
Play audio files and stream audio from URLs with React Native.
## Installation
### 1. `yarn` or `npm`
```sh
# yarn
yarn add react-native-sound-player

# or npm
npm install --save react-native-sound-player
```
### 2. Link
For RN >= 0.60 you can skip this step.
```
react-native link react-native-sound-player
```
## Usage
### Play sound with file name and type
1. Add sound files to iOS/Android.
- On iOS, drag and drop sound file into project in Xcode. Remember to check **"Copy items if needed"** option and **"Add to targets"**.
- On Android, put sound files in `{project_root}/android/app/src/main/res/raw/`. Just create the folder if it doesn't exist.
- When using `playAsset()`, you only need to copy the file into the project's root directory or a subfolder such as `assets`.
2. Import the library and call the `playSoundFile(fileName, fileType)` function:
```javascript
import SoundPlayer from "react-native-sound-player";
try {
// play the file tone.mp3
SoundPlayer.playSoundFile("tone", "mp3");
// or play from url
SoundPlayer.playUrl("https://example.com/music.mp3");
// or play file from folder
SoundPlayer.playAsset(require("./assets/tone.mp3"));
} catch (e) {
console.log(`cannot play the sound file`, e);
}
```
> Please note that the device can still go to sleep (screen goes off) while audio is playing.
> When this happens, the audio will stop playing.
> To prevent this, you can use something like [react-native-keep-awake](https://github.com/corbt/react-native-keep-awake).
> Or alternatively, for iOS, you can add a Background Mode of `Audio, AirPlay, and Picture in Picture` in Xcode. To do this, select your application from Targets, then click on `Signing & Capabilities` and add `Background Modes`. Once the options appear on the `Signing & Capabilities` page, select the `Audio, AirPlay, and Picture in Picture` checkbox. This allows the application to continue playing audio while in the background and even when the device is locked.
## Functions
### `playSoundFile(fileName: string, fileType: string)`
Play the sound file named `fileName` with file type `fileType`.
### `playSoundFileWithDelay(fileName: string, fileType: string, delay: number)` - iOS Only
Play the sound file named `fileName` with file type `fileType` after a delay of `delay` in _seconds_ from the current device time.
### `loadSoundFile(fileName: string, fileType: string)`
Load the sound file named `fileName` with file type `fileType` without playing it.
This is useful when you want to play a large file, which can be slow to mount,
and need precise control over when the sound is played. It can also be used in
combination with `getInfo()` to get the audio file's `duration` without playing it.
You should subscribe to the `FinishedLoading` event to get notified when the
file is loaded.
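As a sketch of that pattern (the file name `intro.mp3` here is only an example, not part of the library):

```javascript
import SoundPlayer from "react-native-sound-player";

// Subscribe before loading so the event is not missed.
const subscription = SoundPlayer.addEventListener(
  "FinishedLoading",
  async ({ success }) => {
    if (success) {
      // duration is available without ever playing the file
      const { duration } = await SoundPlayer.getInfo();
      console.log("loaded, duration (s):", duration);
    }
  }
);

SoundPlayer.loadSoundFile("intro", "mp3");
// later, e.g. on a button press: SoundPlayer.play()
// and on unmount: subscription.remove()
```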
### `playUrl(url: string)`
Play the audio from a URL. Supported formats are:
- [AVPlayer (iOS)](https://stackoverflow.com/questions/21879981/avfoundation-avplayer-supported-formats-no-vob-or-mpg-containers)
- [MediaPlayer (Android)](https://developer.android.com/guide/topics/media/media-formats)
### `loadUrl(url: string)`
Load the audio from the given `url` without playing it. You can then play the audio
by calling `play()`. This is useful when the delay between calling `playUrl()` and
the moment the sound actually starts playing is too long.
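A minimal pre-loading sketch (the URL is a placeholder):

```javascript
import SoundPlayer from "react-native-sound-player";

// Kick off the network fetch early...
SoundPlayer.loadUrl("https://example.com/ambient.mp3");

// ...and start playback only once the stream is ready.
const subscription = SoundPlayer.addEventListener(
  "FinishedLoadingURL",
  ({ success, url }) => {
    if (success) SoundPlayer.play();
    subscription.remove();
  }
);
```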
### `playAsset(asset: number)`
Play the audio from an asset. To get the asset number, use `require('./assets/tone.mp3')`.
For supported formats, see the `playUrl()` function.
### `loadAsset(asset: number)`
Load the audio from an asset as above, but without playing it. You can then play the audio by calling `play()`. This is useful when the delay between calling `playAsset()` and the moment the sound actually starts playing is too long.
### `addEventListener(eventName: SoundPlayerEvent, callback: (data: SoundPlayerEventData) => void) => EmitterSubscription`
Subscribe to any event. Returns a subscription object. Subscriptions created by this function cannot be removed by calling `unmount()`. You **NEED** to call `yourSubscriptionObject.remove()` when you no longer need this event listener or whenever your component unmounts.
Supported events are:
1. `FinishedLoading`
2. `FinishedPlaying`
3. `FinishedLoadingURL`
4. `FinishedLoadingFile`
5. `OnSetupError`
```javascript
// Example
...
// Create instance variable(s) to store your subscriptions in your class
_onFinishedPlayingSubscription = null
_onFinishedLoadingSubscription = null
_onFinishedLoadingFileSubscription = null
_onFinishedLoadingURLSubscription = null

// Subscribe to event(s) you want when component mounted
componentDidMount() {
  this._onFinishedPlayingSubscription = SoundPlayer.addEventListener('FinishedPlaying', ({ success }) => {
    console.log('finished playing', success)
  })
  this._onFinishedLoadingSubscription = SoundPlayer.addEventListener('FinishedLoading', ({ success }) => {
    console.log('finished loading', success)
  })
  this._onFinishedLoadingFileSubscription = SoundPlayer.addEventListener('FinishedLoadingFile', ({ success, name, type }) => {
    console.log('finished loading file', success, name, type)
  })
  this._onFinishedLoadingURLSubscription = SoundPlayer.addEventListener('FinishedLoadingURL', ({ success, url }) => {
    console.log('finished loading url', success, url)
  })
}

// Remove all the subscriptions when component will unmount
componentWillUnmount() {
  this._onFinishedPlayingSubscription.remove()
  this._onFinishedLoadingSubscription.remove()
  this._onFinishedLoadingURLSubscription.remove()
  this._onFinishedLoadingFileSubscription.remove()
}
...
```
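In a function component, the same subscribe/remove pattern is typically written with a `useEffect` cleanup. A sketch (this helper hook is not part of the library):

```javascript
import { useEffect } from "react";
import SoundPlayer from "react-native-sound-player";

// Hypothetical helper hook: subscribes on mount, removes on unmount.
function usePlaybackFinished(onFinished) {
  useEffect(() => {
    const sub = SoundPlayer.addEventListener("FinishedPlaying", ({ success }) => {
      onFinished(success);
    });
    return () => sub.remove(); // cleanup runs when the component unmounts
  }, [onFinished]);
}
```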
### `onFinishedPlaying(callback: (success: boolean) => any)`
Subscribe to the "finished playing" event. The `callback` function is called whenever a file is finished playing. **This function will be deprecated soon, please use `addEventListener` above**.
### `onFinishedLoading(callback: (success: boolean) => any)`
Subscribe to the "finished loading" event. The `callback` function is called whenever a file is finished loading, i.e. the file is ready to be `play()`, `resume()`, `getInfo()`, etc. **This function will be deprecated soon, please use `addEventListener` above**.
### `unmount()`
Unsubscribe the "finished playing" and "finished loading" event. **This function will be deprecated soon, please use `addEventListener` and remove your own listener by calling `yourSubscriptionObject.remove()`**.
### `play()`
Play the loaded sound file. This function is the same as `resume()`.
### `pause()`
Pause the currently playing file.
### `resume()`
Resume from pause and continue playing the same file. This function is the same as `play()`.
### `stop()`
Stop playing; call `playSoundFile(fileName: string, fileType: string)` to start playing again.
### `seek(seconds: number)`
Seek to `seconds` of the currently playing file.
### `setSpeaker(on: boolean)`
Overwrite the default audio output to the speaker, which forces the `playUrl()` function to play from the speaker.
### `setMixAudio(on: boolean)`
Only available on iOS. If you set this option, your audio will be mixed with audio playing in background apps, such as the Music app.
### `setVolume(volume: number)`
Set the volume of the current player. This does not change the volume of the device.
### `setNumberOfLoops(loops: number)`
**iOS**: Set the number of loops. A negative value will loop indefinitely until the `stop()` command is called.
**Android**: 0 will play the sound once. Any other number will loop indefinitely until the `stop()` command is called.
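For instance, to loop indefinitely on both platforms (the file name is a placeholder):

```javascript
import SoundPlayer from "react-native-sound-player";

// -1 loops forever on iOS; any non-zero value loops forever on Android,
// so a negative value covers both platforms.
SoundPlayer.setNumberOfLoops(-1);
SoundPlayer.playSoundFile("rain", "mp3");
// later: SoundPlayer.stop() to end the loop
```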
### `getInfo() => Promise<{currentTime: number, duration: number}>`
Get the `currentTime` and `duration` of the currently mounted audio media. This function returns a promise which resolves to an Object containing `currentTime` and `duration` properties.
```javascript
// Example
...
playSong() {
  try {
    SoundPlayer.playSoundFile('engagementParty', 'm4a')
  } catch (e) {
    alert('Cannot play the file')
    console.log('cannot play the song file', e)
  }
}

async getInfo() { // You need the keyword `async`
  try {
    const info = await SoundPlayer.getInfo() // Also, you need to await this because it is async
    console.log('getInfo', info) // {duration: 12.416, currentTime: 7.691}
  } catch (e) {
    console.log('There is no song playing', e)
  }
}

onPressPlayButton() {
  this.playSong()
  this.getInfo()
}
...
```
================================================
FILE: RNSoundPlayer.podspec
================================================
require 'json'
package = JSON.parse(File.read(File.join(__dir__, "package.json")))
Pod::Spec.new do |s|
s.name = "RNSoundPlayer"
s.version = package['version']
s.summary = package["description"]
s.homepage = "https://github.com/johnsonsu/react-native-sound-player"
s.license = package["license"]
s.author = { "Johnson Su" => "johnsonsu@johnsonsu.com" }
s.platforms = { :ios => "9.0", :tvos => "9.0" }
s.source = { :git => "https://github.com/johnsonsu/react-native-sound-player.git", :tag => s.version }
s.source_files = 'ios/**/*.{h,m}'
s.preserve_paths = "package.json", "LICENSE"
s.dependency 'React-Core'
end
================================================
FILE: android/build.gradle
================================================
apply plugin: 'com.android.library'
def DEFAULT_COMPILE_SDK_VERSION = 31
def DEFAULT_BUILD_TOOLS_VERSION = '31.0.3'
def DEFAULT_MIN_SDK_VERSION = 16
def DEFAULT_TARGET_SDK_VERSION = 31
def safeExtGet(prop, fallback) {
rootProject.ext.has(prop) ? rootProject.ext.get(prop) : fallback
}
android {
compileSdkVersion safeExtGet('compileSdkVersion', DEFAULT_COMPILE_SDK_VERSION)
buildToolsVersion safeExtGet('buildToolsVersion', DEFAULT_BUILD_TOOLS_VERSION)
defaultConfig {
minSdkVersion safeExtGet('minSdkVersion', DEFAULT_MIN_SDK_VERSION)
targetSdkVersion safeExtGet('targetSdkVersion', DEFAULT_TARGET_SDK_VERSION)
versionCode 1
versionName "1.0"
ndk {
abiFilters "armeabi-v7a", "x86"
}
}
lintOptions {
warning 'InvalidPackage'
}
}
dependencies {
implementation 'com.facebook.react:react-native:+'
}
================================================
FILE: android/src/main/AndroidManifest.xml
================================================
<manifest xmlns:android="http://schemas.android.com/apk/res/android"
package="com.johnsonsu.rnsoundplayer">
</manifest>
================================================
FILE: android/src/main/java/com/johnsonsu/rnsoundplayer/RNSoundPlayerModule.java
================================================
package com.johnsonsu.rnsoundplayer;
import android.content.Context;
import android.media.AudioManager;
import android.media.MediaPlayer;
import android.media.MediaPlayer.OnCompletionListener;
import android.media.MediaPlayer.OnPreparedListener;
import android.net.Uri;
import java.io.File;
import java.io.IOException;
import javax.annotation.Nullable;
import com.facebook.react.bridge.ReactApplicationContext;
import com.facebook.react.bridge.ReactContextBaseJavaModule;
import com.facebook.react.bridge.ReactMethod;
import com.facebook.react.bridge.Callback;
import com.facebook.react.bridge.ReactContext;
import com.facebook.react.modules.core.DeviceEventManagerModule;
import com.facebook.react.bridge.WritableMap;
import com.facebook.react.bridge.Arguments;
import com.facebook.react.bridge.Promise;
import com.facebook.react.bridge.LifecycleEventListener;
public class RNSoundPlayerModule extends ReactContextBaseJavaModule implements LifecycleEventListener {
public final static String EVENT_SETUP_ERROR = "OnSetupError";
public final static String EVENT_FINISHED_PLAYING = "FinishedPlaying";
public final static String EVENT_FINISHED_LOADING = "FinishedLoading";
public final static String EVENT_FINISHED_LOADING_FILE = "FinishedLoadingFile";
public final static String EVENT_FINISHED_LOADING_URL = "FinishedLoadingURL";
private final ReactApplicationContext reactContext;
private MediaPlayer mediaPlayer;
private float volume;
private AudioManager audioManager;
public RNSoundPlayerModule(ReactApplicationContext reactContext) {
super(reactContext);
this.reactContext = reactContext;
this.volume = 1.0f;
this.audioManager = (AudioManager) this.reactContext.getSystemService(Context.AUDIO_SERVICE);
reactContext.addLifecycleEventListener(this);
}
@Override
public String getName() {
return "RNSoundPlayer";
}
@ReactMethod
public void setSpeaker(Boolean on) {
audioManager.setMode(AudioManager.MODE_IN_COMMUNICATION);
audioManager.setSpeakerphoneOn(on);
}
@Override
public void onHostResume() {
}
@Override
public void onHostPause() {
}
@Override
public void onHostDestroy() {
this.stop();
if (mediaPlayer != null) {
mediaPlayer.release();
mediaPlayer = null;
}
}
@ReactMethod
public void playSoundFile(String name, String type) throws IOException {
mountSoundFile(name, type);
this.resume();
}
@ReactMethod
public void loadSoundFile(String name, String type) throws IOException {
mountSoundFile(name, type);
}
@ReactMethod
public void playUrl(String url) throws IOException {
prepareUrl(url);
this.resume();
}
@ReactMethod
public void loadUrl(String url) throws IOException {
prepareUrl(url);
}
@ReactMethod
public void pause() throws IllegalStateException {
if (this.mediaPlayer != null) {
this.mediaPlayer.pause();
}
}
@ReactMethod
public void resume() throws IOException, IllegalStateException {
if (this.mediaPlayer != null) {
this.setVolume(this.volume);
this.mediaPlayer.start();
}
}
@ReactMethod
public void stop() throws IllegalStateException {
if (this.mediaPlayer != null) {
this.mediaPlayer.stop();
}
}
@ReactMethod
public void seek(float seconds) throws IllegalStateException {
  if (this.mediaPlayer != null) {
    // Convert seconds to milliseconds before truncating to int;
    // casting `seconds` first would drop sub-second precision.
    this.mediaPlayer.seekTo((int) (seconds * 1000));
  }
}
@ReactMethod
public void setVolume(float volume) throws IOException {
this.volume = volume;
if (this.mediaPlayer != null) {
this.mediaPlayer.setVolume(volume, volume);
}
}
@ReactMethod
public void setNumberOfLoops(int noOfLooping) {
  // On Android, any non-zero value means indefinite looping.
  boolean looping = noOfLooping != 0;
  if (this.mediaPlayer != null) {
    this.mediaPlayer.setLooping(looping);
  }
}
@ReactMethod
public void getInfo(
Promise promise) {
if (this.mediaPlayer == null) {
promise.resolve(null);
return;
}
WritableMap map = Arguments.createMap();
map.putDouble("currentTime", this.mediaPlayer.getCurrentPosition() / 1000.0);
map.putDouble("duration", this.mediaPlayer.getDuration() / 1000.0);
promise.resolve(map);
}
@ReactMethod
public void addListener(String eventName) {
// Set up any upstream listeners or background tasks as necessary
}
@ReactMethod
public void removeListeners(Integer count) {
// Remove upstream listeners, stop unnecessary background tasks
}
private void sendEvent(ReactApplicationContext reactContext,
String eventName,
@Nullable WritableMap params) {
reactContext
.getJSModule(DeviceEventManagerModule.RCTDeviceEventEmitter.class)
.emit(eventName, params);
}
private void mountSoundFile(String name, String type) throws IOException {
try {
Uri uri;
int soundResID = getReactApplicationContext().getResources().getIdentifier(name, "raw", getReactApplicationContext().getPackageName());
if (soundResID > 0) {
uri = Uri.parse("android.resource://" + getReactApplicationContext().getPackageName() + "/raw/" + name);
} else {
uri = this.getUriFromFile(name, type);
}
if (this.mediaPlayer == null) {
this.mediaPlayer = initializeMediaPlayer(uri);
} else {
this.mediaPlayer.reset();
this.mediaPlayer.setDataSource(getCurrentActivity(), uri);
this.mediaPlayer.prepare();
}
sendMountFileSuccessEvents(name, type);
} catch (IOException e) {
sendErrorEvent(e);
}
}
private Uri getUriFromFile(String name, String type) {
String folder = getReactApplicationContext().getFilesDir().getAbsolutePath();
String file = (!type.isEmpty()) ? name + "." + type : name;
// http://blog.weston-fl.com/android-mediaplayer-prepare-throws-status0x1-error1-2147483648
// this helps avoid a common error state when mounting the file
File ref = new File(folder + "/" + file);
if (ref.exists()) {
ref.setReadable(true, false);
}
return Uri.parse("file://" + folder + "/" + file);
}
private void prepareUrl(final String url) throws IOException {
try {
if (this.mediaPlayer == null) {
Uri uri = Uri.parse(url);
this.mediaPlayer = initializeMediaPlayer(uri);
this.mediaPlayer.setOnPreparedListener(
new OnPreparedListener() {
@Override
public void onPrepared(MediaPlayer mediaPlayer) {
WritableMap onFinishedLoadingURLParams = Arguments.createMap();
onFinishedLoadingURLParams.putBoolean("success", true);
onFinishedLoadingURLParams.putString("url", url);
sendEvent(getReactApplicationContext(), EVENT_FINISHED_LOADING_URL, onFinishedLoadingURLParams);
}
}
);
} else {
Uri uri = Uri.parse(url);
this.mediaPlayer.reset();
this.mediaPlayer.setDataSource(getCurrentActivity(), uri);
this.mediaPlayer.prepare();
}
WritableMap params = Arguments.createMap();
params.putBoolean("success", true);
sendEvent(getReactApplicationContext(), EVENT_FINISHED_LOADING, params);
} catch (IOException e) {
WritableMap errorParams = Arguments.createMap();
errorParams.putString("error", e.getMessage());
sendEvent(getReactApplicationContext(), EVENT_SETUP_ERROR, errorParams);
}
}
private MediaPlayer initializeMediaPlayer(Uri uri) throws IOException {
MediaPlayer mediaPlayer = MediaPlayer.create(getCurrentActivity(), uri);
if (mediaPlayer == null) {
throw new IOException("Failed to initialize MediaPlayer for URI: " + uri.toString());
}
mediaPlayer.setOnCompletionListener(
new OnCompletionListener() {
@Override
public void onCompletion(MediaPlayer arg0) {
WritableMap params = Arguments.createMap();
params.putBoolean("success", true);
sendEvent(getReactApplicationContext(), EVENT_FINISHED_PLAYING, params);
}
}
);
return mediaPlayer;
}
private void sendMountFileSuccessEvents(String name, String type) {
WritableMap params = Arguments.createMap();
params.putBoolean("success", true);
sendEvent(reactContext, EVENT_FINISHED_LOADING, params);
WritableMap onFinishedLoadingFileParams = Arguments.createMap();
onFinishedLoadingFileParams.putBoolean("success", true);
onFinishedLoadingFileParams.putString("name", name);
onFinishedLoadingFileParams.putString("type", type);
sendEvent(reactContext, EVENT_FINISHED_LOADING_FILE, onFinishedLoadingFileParams);
}
private void sendErrorEvent(IOException e) {
WritableMap errorParams = Arguments.createMap();
errorParams.putString("error", e.getMessage());
sendEvent(reactContext, EVENT_SETUP_ERROR, errorParams);
}
}
================================================
FILE: android/src/main/java/com/johnsonsu/rnsoundplayer/RNSoundPlayerPackage.java
================================================
package com.johnsonsu.rnsoundplayer;
import java.util.Arrays;
import java.util.Collections;
import java.util.List;
import com.facebook.react.ReactPackage;
import com.facebook.react.bridge.NativeModule;
import com.facebook.react.bridge.ReactApplicationContext;
import com.facebook.react.uimanager.ViewManager;
import com.facebook.react.bridge.JavaScriptModule;
public class RNSoundPlayerPackage implements ReactPackage {
@Override
public List<NativeModule> createNativeModules(ReactApplicationContext reactContext) {
return Arrays.<NativeModule>asList(new RNSoundPlayerModule(reactContext));
}
@Override
public List<ViewManager> createViewManagers(ReactApplicationContext reactContext) {
return Collections.emptyList();
}
}
================================================
FILE: index.d.ts
================================================
declare module "react-native-sound-player" {
import { EmitterSubscription } from "react-native";
export type SoundPlayerEvent =
| "OnSetupError"
| "FinishedLoading"
| "FinishedPlaying"
| "FinishedLoadingURL"
| "FinishedLoadingFile";
export type SoundPlayerEventData = {
success?: boolean;
url?: string;
name?: string;
type?: string;
};
interface SoundPlayerType {
playSoundFile: (name: string, type: string) => void;
playSoundFileWithDelay: (name: string, type: string, delay: number) => void;
loadSoundFile: (name: string, type: string) => void;
playUrl: (url: string) => void;
loadUrl: (url: string) => void;
playAsset: (asset: number) => void;
loadAsset: (asset: number) => void;
/** @deprecated please use addEventListener*/
onFinishedPlaying: (callback: (success: boolean) => unknown) => void;
/** @deprecated please use addEventListener*/
onFinishedLoading: (callback: (success: boolean) => unknown) => void;
/** Subscribe to any event. Returns a subscription object. Subscriptions created by this function cannot be removed by calling unmount(). You NEED to call yourSubscriptionObject.remove() when you no longer need this event listener or whenever your component unmounts. */
addEventListener: (
eventName: SoundPlayerEvent,
callback: (data: SoundPlayerEventData) => void
) => EmitterSubscription;
/** Play the loaded sound file. This function is the same as `resume`. */
play: () => void;
/** Pause the currently playing file. */
pause: () => void;
/** Resume from pause and continue playing the same file. This function is the same as `play`. */
resume: () => void;
/** Stop playing, call `playSound` to start playing again. */
stop: () => void;
/** Seek to seconds of the currently playing file. */
seek: (seconds: number) => void;
/** Set the volume of the current player. This does not change the volume of the device. */
setVolume: (volume: number) => void;
/** Only available on iOS. Overwrite default audio output to speaker, which forces playUrl() function to play from speaker. */
setSpeaker: (on: boolean) => void;
/** Only available on iOS. If you set this option, your audio will be mixed with audio playing in background apps, such as the Music app. */
setMixAudio: (on: boolean) => void;
/** iOS: 0 means to play the sound once, a positive number specifies the number of times to return to the start and play again, a negative number indicates an indefinite loop. Android: 0 means to play the sound once, other numbers indicate an indefinite loop. */
setNumberOfLoops: (loops: number) => void;
/** Get the currentTime and duration of the currently mounted audio media. This function returns a promise which resolves to an Object containing currentTime and duration properties. */
getInfo: () => Promise<{ currentTime: number; duration: number }>;
/** @deprecated Please use addEventListener and remove your own listener by calling yourSubscriptionObject.remove(). */
unmount: () => void;
}
const SoundPlayer: SoundPlayerType;
export default SoundPlayer;
}
================================================
FILE: index.js
================================================
/**
* @flow
*/
"use strict";
import { NativeModules, NativeEventEmitter, Platform } from "react-native";
import resolveAsset from 'react-native/Libraries/Image/resolveAssetSource';
const { RNSoundPlayer } = NativeModules;
const _soundPlayerEmitter = new NativeEventEmitter(RNSoundPlayer);
let _finishedPlayingListener = null;
let _finishedLoadingListener = null;
export default {
playSoundFile: (name: string, type: string) => {
RNSoundPlayer.playSoundFile(name, type);
},
playSoundFileWithDelay: (name: string, type: string, delay: number) => {
RNSoundPlayer.playSoundFileWithDelay(name, type, delay);
},
loadSoundFile: (name: string, type: string) => {
RNSoundPlayer.loadSoundFile(name, type);
},
setNumberOfLoops: (loops: number) => {
RNSoundPlayer.setNumberOfLoops(loops);
},
playUrl: (url: string) => {
RNSoundPlayer.playUrl(url);
},
loadUrl: (url: string) => {
RNSoundPlayer.loadUrl(url);
},
playAsset: async (asset: number) => {
if (!(__DEV__) && Platform.OS === "android") {
RNSoundPlayer.playSoundFile(resolveAsset(asset).uri, '');
} else {
RNSoundPlayer.playUrl(resolveAsset(asset).uri);
}
},
loadAsset: (asset: number) => {
if (!(__DEV__) && Platform.OS === "android") {
RNSoundPlayer.loadSoundFile(resolveAsset(asset).uri, '');
} else {
RNSoundPlayer.loadUrl(resolveAsset(asset).uri);
}
},
onFinishedPlaying: (callback: (success: boolean) => any) => {
if (_finishedPlayingListener) {
_finishedPlayingListener.remove();
_finishedPlayingListener = undefined;
}
_finishedPlayingListener = _soundPlayerEmitter.addListener(
"FinishedPlaying",
callback
);
},
onFinishedLoading: (callback: (success: boolean) => any) => {
if (_finishedLoadingListener) {
_finishedLoadingListener.remove();
_finishedLoadingListener = undefined;
}
_finishedLoadingListener = _soundPlayerEmitter.addListener(
"FinishedLoading",
callback
);
},
addEventListener: (
eventName:
| "OnSetupError"
| "FinishedLoading"
| "FinishedPlaying"
| "FinishedLoadingURL"
| "FinishedLoadingFile",
callback: Function
) => _soundPlayerEmitter.addListener(eventName, callback),
play: () => {
// play and resume has the exact same implementation natively
RNSoundPlayer.resume();
},
pause: () => {
RNSoundPlayer.pause();
},
resume: () => {
RNSoundPlayer.resume();
},
stop: () => {
RNSoundPlayer.stop();
},
seek: (seconds: number) => {
RNSoundPlayer.seek(seconds);
},
setVolume: (volume: number) => {
RNSoundPlayer.setVolume(volume);
},
setSpeaker: (on: boolean) => {
RNSoundPlayer.setSpeaker(on);
},
setMixAudio: (on: boolean) => {
if (Platform.OS === "android") {
console.log("setMixAudio is not implemented on Android");
} else {
RNSoundPlayer.setMixAudio(on);
}
},
getInfo: async () => RNSoundPlayer.getInfo(),
unmount: () => {
if (_finishedPlayingListener) {
_finishedPlayingListener.remove();
_finishedPlayingListener = undefined;
}
if (_finishedLoadingListener) {
_finishedLoadingListener.remove();
_finishedLoadingListener = undefined;
}
},
};
================================================
FILE: ios/RNSoundPlayer.h
================================================
//
// RNSoundPlayer
//
// Created by Johnson Su on 2018-07-10.
//
#import <React/RCTBridgeModule.h>
#import <AVFoundation/AVFoundation.h>
#import <React/RCTEventEmitter.h>
@interface RNSoundPlayer : RCTEventEmitter <RCTBridgeModule, AVAudioPlayerDelegate>
@property (nonatomic, strong) AVAudioPlayer *player;
@property (nonatomic, strong) AVPlayer *avPlayer;
@property (nonatomic) int loopCount;
@end
================================================
FILE: ios/RNSoundPlayer.m
================================================
//
// RNSoundPlayer
//
// Created by Johnson Su on 2018-07-10.
//
#import "RNSoundPlayer.h"
#import <AVFoundation/AVFoundation.h>
@implementation RNSoundPlayer
{
    bool hasListeners;
}

static NSString *const EVENT_SETUP_ERROR = @"OnSetupError";
static NSString *const EVENT_FINISHED_LOADING = @"FinishedLoading";
static NSString *const EVENT_FINISHED_LOADING_FILE = @"FinishedLoadingFile";
static NSString *const EVENT_FINISHED_LOADING_URL = @"FinishedLoadingURL";
static NSString *const EVENT_FINISHED_PLAYING = @"FinishedPlaying";

RCT_EXPORT_MODULE();

@synthesize bridge = _bridge;

+ (BOOL)requiresMainQueueSetup {
    return YES;
}

- (instancetype)init {
    self = [super init];
    if (self) {
        self.loopCount = 0;
        [[NSNotificationCenter defaultCenter] addObserver:self
                                                 selector:@selector(itemDidFinishPlaying:)
                                                     name:AVPlayerItemDidPlayToEndTimeNotification
                                                   object:nil];
    }
    return self;
}

- (void)dealloc {
    [[NSNotificationCenter defaultCenter] removeObserver:self];
}

- (NSArray<NSString *> *)supportedEvents {
    return @[EVENT_FINISHED_PLAYING, EVENT_FINISHED_LOADING, EVENT_FINISHED_LOADING_URL, EVENT_FINISHED_LOADING_FILE, EVENT_SETUP_ERROR];
}

- (void)startObserving {
    hasListeners = YES;
}

- (void)stopObserving {
    hasListeners = NO;
}

RCT_EXPORT_METHOD(playUrl:(NSString *)url) {
    [self prepareUrl:url];
    if (self.avPlayer) {
        [self.avPlayer play];
    }
}

RCT_EXPORT_METHOD(loadUrl:(NSString *)url) {
    [self prepareUrl:url];
}

RCT_EXPORT_METHOD(playSoundFile:(NSString *)name ofType:(NSString *)type) {
    [self mountSoundFile:name ofType:type];
    if (self.player) {
        [self.player play];
    }
}

RCT_EXPORT_METHOD(playSoundFileWithDelay:(NSString *)name ofType:(NSString *)type delay:(double)delay) {
    [self mountSoundFile:name ofType:type];
    if (self.player) {
        [self.player playAtTime:(self.player.deviceCurrentTime + delay)];
    }
}

RCT_EXPORT_METHOD(loadSoundFile:(NSString *)name ofType:(NSString *)type) {
    [self mountSoundFile:name ofType:type];
}

RCT_EXPORT_METHOD(pause) {
    if (self.player != nil) {
        [self.player pause];
    }
    if (self.avPlayer != nil) {
        [self.avPlayer pause];
    }
}

RCT_EXPORT_METHOD(resume) {
    if (self.player != nil) {
        [self.player play];
    }
    if (self.avPlayer != nil) {
        [self.avPlayer play];
    }
}

RCT_EXPORT_METHOD(stop) {
    if (self.player != nil) {
        [self.player stop];
    }
    if (self.avPlayer != nil) {
        [self.avPlayer pause];
        [self.avPlayer seekToTime:kCMTimeZero];
    }
}

RCT_EXPORT_METHOD(seek:(float)seconds) {
    if (self.player != nil) {
        self.player.currentTime = seconds;
    }
    if (self.avPlayer != nil) {
        [self.avPlayer seekToTime:CMTimeMakeWithSeconds(seconds, NSEC_PER_SEC)];
    }
}

#if !TARGET_OS_TV
RCT_EXPORT_METHOD(setSpeaker:(BOOL)on) {
    AVAudioSession *session = [AVAudioSession sharedInstance];
    NSError *error = nil;
    if (on) {
        [session setCategory:AVAudioSessionCategoryPlayAndRecord error:&error];
        [session overrideOutputAudioPort:AVAudioSessionPortOverrideSpeaker error:&error];
    } else {
        [session setCategory:AVAudioSessionCategoryPlayback error:&error];
        [session overrideOutputAudioPort:AVAudioSessionPortOverrideNone error:&error];
    }
    [session setActive:YES error:&error];
    if (error) {
        [self sendErrorEvent:error];
    }
}
#endif

RCT_EXPORT_METHOD(setMixAudio:(BOOL)on) {
    AVAudioSession *session = [AVAudioSession sharedInstance];
    NSError *error = nil;
    if (on) {
        [session setCategory:AVAudioSessionCategoryPlayback withOptions:AVAudioSessionCategoryOptionMixWithOthers error:&error];
    } else {
        [session setCategory:AVAudioSessionCategoryPlayback withOptions:0 error:&error];
    }
    [session setActive:YES error:&error];
    if (error) {
        [self sendErrorEvent:error];
    }
}

RCT_EXPORT_METHOD(setVolume:(float)volume) {
    if (self.player != nil) {
        [self.player setVolume:volume];
    }
    if (self.avPlayer != nil) {
        [self.avPlayer setVolume:volume];
    }
}

RCT_EXPORT_METHOD(setNumberOfLoops:(NSInteger)loopCount) {
    self.loopCount = loopCount;
    if (self.player != nil) {
        [self.player setNumberOfLoops:loopCount];
    }
}

RCT_REMAP_METHOD(getInfo,
                 getInfoWithResolver:(RCTPromiseResolveBlock)resolve
                 rejecter:(RCTPromiseRejectBlock)reject) {
    if (self.player != nil) {
        NSDictionary *data = @{
            @"currentTime": [NSNumber numberWithDouble:[self.player currentTime]],
            @"duration": [NSNumber numberWithDouble:[self.player duration]]
        };
        resolve(data);
    } else if (self.avPlayer != nil) {
        CMTime currentTime = [[self.avPlayer currentItem] currentTime];
        CMTime duration = [[[self.avPlayer currentItem] asset] duration];
        NSDictionary *data = @{
            @"currentTime": [NSNumber numberWithFloat:CMTimeGetSeconds(currentTime)],
            @"duration": [NSNumber numberWithFloat:CMTimeGetSeconds(duration)]
        };
        resolve(data);
    } else {
        resolve(nil);
    }
}

- (void)audioPlayerDidFinishPlaying:(AVAudioPlayer *)player successfully:(BOOL)flag {
    if (hasListeners) {
        [self sendEventWithName:EVENT_FINISHED_PLAYING body:@{@"success": [NSNumber numberWithBool:flag]}];
    }
}

- (void)itemDidFinishPlaying:(NSNotification *)notification {
    if (hasListeners) {
        [self sendEventWithName:EVENT_FINISHED_PLAYING body:@{@"success": [NSNumber numberWithBool:YES]}];
    }
}

- (void)mountSoundFile:(NSString *)name ofType:(NSString *)type {
    if (self.avPlayer) {
        self.avPlayer = nil;
    }
    NSString *soundFilePath = [[NSBundle mainBundle] pathForResource:name ofType:type];
    if (soundFilePath == nil) {
        NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
        NSString *documentsDirectory = [paths objectAtIndex:0];
        soundFilePath = [NSString stringWithFormat:@"%@.%@", [documentsDirectory stringByAppendingPathComponent:name], type];
    }
    NSURL *soundFileURL = [NSURL fileURLWithPath:soundFilePath];
    NSError *error = nil;
    self.player = [[AVAudioPlayer alloc] initWithContentsOfURL:soundFileURL error:&error];
    if (error) {
        [self sendErrorEvent:error];
        return;
    }
    [self.player setDelegate:self];
    [self.player setNumberOfLoops:self.loopCount];
    [self.player prepareToPlay];
    [[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayback error:&error];
    if (error) {
        [self sendErrorEvent:error];
        return;
    }
    if (hasListeners) {
        [self sendEventWithName:EVENT_FINISHED_LOADING body:@{@"success": [NSNumber numberWithBool:YES]}];
        [self sendEventWithName:EVENT_FINISHED_LOADING_FILE body:@{@"success": [NSNumber numberWithBool:YES], @"name": name, @"type": type}];
    }
}

- (void)prepareUrl:(NSString *)url {
    if (self.player) {
        self.player = nil;
    }
    NSURL *soundURL = [NSURL URLWithString:url];
    self.avPlayer = [[AVPlayer alloc] initWithURL:soundURL];
    [self.avPlayer.currentItem addObserver:self forKeyPath:@"status" options:0 context:nil];
}

- (void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object change:(NSDictionary<NSKeyValueChangeKey,id> *)change context:(void *)context {
    if (object == self.avPlayer.currentItem && [keyPath isEqualToString:@"status"] && hasListeners) {
        if (self.avPlayer.currentItem.status == AVPlayerItemStatusReadyToPlay) {
            [self sendEventWithName:EVENT_FINISHED_LOADING body:@{@"success": [NSNumber numberWithBool:YES]}];
            NSURL *url = [(AVURLAsset *)self.avPlayer.currentItem.asset URL];
            [self sendEventWithName:EVENT_FINISHED_LOADING_URL body:@{@"success": [NSNumber numberWithBool:YES], @"url": [url absoluteString]}];
        } else if (self.avPlayer.currentItem.status == AVPlayerItemStatusFailed) {
            [self sendErrorEvent:self.avPlayer.currentItem.error];
        }
    }
}

- (void)sendErrorEvent:(NSError *)error {
    if (hasListeners) {
        [self sendEventWithName:EVENT_SETUP_ERROR body:@{@"error": [error localizedDescription]}];
    }
}

@end
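As the native `getInfo` above shows, the promise resolves with `currentTime` and `duration` in seconds, or with `nil` (JS `null`) when neither player is mounted, and a streamed item can report an indefinite (NaN) duration. A consuming app should guard for both cases; this hypothetical helper (`progressFromInfo` is not part of the library) turns the result into a safe 0..1 progress value:

```javascript
// Convert a getInfo() result into a clamped 0..1 progress fraction.
// Handles null (nothing loaded) and falsy/NaN/zero durations (live streams).
function progressFromInfo(info) {
  if (!info || !info.duration || info.duration <= 0) {
    return 0; // nothing loaded, or duration unknown
  }
  return Math.min(info.currentTime / info.duration, 1);
}

console.log(progressFromInfo(null)); // 0
console.log(progressFromInfo({ currentTime: 30, duration: 120 })); // 0.25
```

Note that `!info.duration` also catches `NaN`, since `NaN` is falsy, so the `<= 0` comparison never sees an invalid number.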
================================================
FILE: ios/RNSoundPlayer.xcodeproj/project.pbxproj
================================================
// !$*UTF8*$!
{
archiveVersion = 1;
classes = {
};
objectVersion = 46;
objects = {
/* Begin PBXBuildFile section */
13BE3DEE1AC21097009241FE /* RNSoundPlayer.m in Sources */ = {isa = PBXBuildFile; fileRef = 13BE3DED1AC21097009241FE /* RNSoundPlayer.m */; };
/* End PBXBuildFile section */
/* Begin PBXCopyFilesBuildPhase section */
58B511D91A9E6C8500147676 /* CopyFiles */ = {
isa = PBXCopyFilesBuildPhase;
buildActionMask = 2147483647;
dstPath = "include/$(PRODUCT_NAME)";
dstSubfolderSpec = 16;
files = (
);
runOnlyForDeploymentPostprocessing = 0;
};
/* End PBXCopyFilesBuildPhase section */
/* Begin PBXFileReference section */
134814201AA4EA6300B7C361 /* libRNSoundPlayer.a */ = {isa = PBXFileReference; explicitFileType = archive.ar; includeInIndex = 0; path = libRNSoundPlayer.a; sourceTree = BUILT_PRODUCTS_DIR; };
13BE3DEC1AC21097009241FE /* RNSoundPlayer.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = RNSoundPlayer.h; sourceTree = "<group>"; };
13BE3DED1AC21097009241FE /* RNSoundPlayer.m */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.objc; path = RNSoundPlayer.m; sourceTree = "<group>"; };
/* End PBXFileReference section */
/* Begin PBXFrameworksBuildPhase section */
58B511D81A9E6C8500147676 /* Frameworks */ = {
isa = PBXFrameworksBuildPhase;
buildActionMask = 2147483647;
files = (
);
runOnlyForDeploymentPostprocessing = 0;
};
/* End PBXFrameworksBuildPhase section */
/* Begin PBXGroup section */
134814211AA4EA7D00B7C361 /* Products */ = {
isa = PBXGroup;
children = (
134814201AA4EA6300B7C361 /* libRNSoundPlayer.a */,
);
name = Products;
sourceTree = "<group>";
};
58B511D21A9E6C8500147676 = {
isa = PBXGroup;
children = (
13BE3DEC1AC21097009241FE /* RNSoundPlayer.h */,
13BE3DED1AC21097009241FE /* RNSoundPlayer.m */,
134814211AA4EA7D00B7C361 /* Products */,
);
sourceTree = "<group>";
};
/* End PBXGroup section */
/* Begin PBXNativeTarget section */
58B511DA1A9E6C8500147676 /* RNSoundPlayer */ = {
isa = PBXNativeTarget;
buildConfigurationList = 58B511EF1A9E6C8500147676 /* Build configuration list for PBXNativeTarget "RNSoundPlayer" */;
buildPhases = (
58B511D71A9E6C8500147676 /* Sources */,
58B511D81A9E6C8500147676 /* Frameworks */,
58B511D91A9E6C8500147676 /* CopyFiles */,
);
buildRules = (
);
dependencies = (
);
name = RNSoundPlayer;
productName = RCTDataManager;
productReference = 134814201AA4EA6300B7C361 /* libRNSoundPlayer.a */;
productType = "com.apple.product-type.library.static";
};
/* End PBXNativeTarget section */
/* Begin PBXProject section */
58B511D31A9E6C8500147676 /* Project object */ = {
isa = PBXProject;
attributes = {
LastUpgradeCheck = 0610;
ORGANIZATIONNAME = Facebook;
TargetAttributes = {
58B511DA1A9E6C8500147676 = {
CreatedOnToolsVersion = 6.1.1;
};
};
};
buildConfigurationList = 58B511D61A9E6C8500147676 /* Build configuration list for PBXProject "RNSoundPlayer" */;
compatibilityVersion = "Xcode 3.2";
developmentRegion = English;
hasScannedForEncodings = 0;
knownRegions = (
en,
);
mainGroup = 58B511D21A9E6C8500147676;
productRefGroup = 58B511D21A9E6C8500147676;
projectDirPath = "";
projectRoot = "";
targets = (
58B511DA1A9E6C8500147676 /* RNSoundPlayer */,
);
};
/* End PBXProject section */
/* Begin PBXSourcesBuildPhase section */
58B511D71A9E6C8500147676 /* Sources */ = {
isa = PBXSourcesBuildPhase;
buildActionMask = 2147483647;
files = (
13BE3DEE1AC21097009241FE /* RNSoundPlayer.m in Sources */,
);
runOnlyForDeploymentPostprocessing = 0;
};
/* End PBXSourcesBuildPhase section */
/* Begin XCBuildConfiguration section */
58B511ED1A9E6C8500147676 /* Debug */ = {
isa = XCBuildConfiguration;
buildSettings = {
ALWAYS_SEARCH_USER_PATHS = NO;
CLANG_CXX_LANGUAGE_STANDARD = "gnu++0x";
CLANG_CXX_LIBRARY = "libc++";
CLANG_ENABLE_MODULES = YES;
CLANG_ENABLE_OBJC_ARC = YES;
CLANG_WARN_BOOL_CONVERSION = YES;
CLANG_WARN_CONSTANT_CONVERSION = YES;
CLANG_WARN_DIRECT_OBJC_ISA_USAGE = YES_ERROR;
CLANG_WARN_EMPTY_BODY = YES;
CLANG_WARN_ENUM_CONVERSION = YES;
CLANG_WARN_INT_CONVERSION = YES;
CLANG_WARN_OBJC_ROOT_CLASS = YES_ERROR;
CLANG_WARN_UNREACHABLE_CODE = YES;
CLANG_WARN__DUPLICATE_METHOD_MATCH = YES;
COPY_PHASE_STRIP = NO;
ENABLE_STRICT_OBJC_MSGSEND = YES;
GCC_C_LANGUAGE_STANDARD = gnu99;
GCC_DYNAMIC_NO_PIC = NO;
GCC_OPTIMIZATION_LEVEL = 0;
GCC_PREPROCESSOR_DEFINITIONS = (
"DEBUG=1",
"$(inherited)",
);
GCC_SYMBOLS_PRIVATE_EXTERN = NO;
GCC_WARN_64_TO_32_BIT_CONVERSION = YES;
GCC_WARN_ABOUT_RETURN_TYPE = YES_ERROR;
GCC_WARN_UNDECLARED_SELECTOR = YES;
GCC_WARN_UNINITIALIZED_AUTOS = YES_AGGRESSIVE;
GCC_WARN_UNUSED_FUNCTION = YES;
GCC_WARN_UNUSED_VARIABLE = YES;
IPHONEOS_DEPLOYMENT_TARGET = 8.0;
MTL_ENABLE_DEBUG_INFO = YES;
ONLY_ACTIVE_ARCH = YES;
SDKROOT = iphoneos;
};
name = Debug;
};
58B511EE1A9E6C8500147676 /* Release */ = {
isa = XCBuildConfiguration;
buildSettings = {
ALWAYS_SEARCH_USER_PATHS = NO;
CLANG_CXX_LANGUAGE_STANDARD = "gnu++0x";
CLANG_CXX_LIBRARY = "libc++";
CLANG_ENABLE_MODULES = YES;
CLANG_ENABLE_OBJC_ARC = YES;
CLANG_WARN_BOOL_CONVERSION = YES;
CLANG_WARN_CONSTANT_CONVERSION = YES;
CLANG_WARN_DIRECT_OBJC_ISA_USAGE = YES_ERROR;
CLANG_WARN_EMPTY_BODY = YES;
CLANG_WARN_ENUM_CONVERSION = YES;
CLANG_WARN_INT_CONVERSION = YES;
CLANG_WARN_OBJC_ROOT_CLASS = YES_ERROR;
CLANG_WARN_UNREACHABLE_CODE = YES;
CLANG_WARN__DUPLICATE_METHOD_MATCH = YES;
COPY_PHASE_STRIP = YES;
ENABLE_NS_ASSERTIONS = NO;
ENABLE_STRICT_OBJC_MSGSEND = YES;
GCC_C_LANGUAGE_STANDARD = gnu99;
GCC_WARN_64_TO_32_BIT_CONVERSION = YES;
GCC_WARN_ABOUT_RETURN_TYPE = YES_ERROR;
GCC_WARN_UNDECLARED_SELECTOR = YES;
GCC_WARN_UNINITIALIZED_AUTOS = YES_AGGRESSIVE;
GCC_WARN_UNUSED_FUNCTION = YES;
GCC_WARN_UNUSED_VARIABLE = YES;
IPHONEOS_DEPLOYMENT_TARGET = 8.0;
MTL_ENABLE_DEBUG_INFO = NO;
SDKROOT = iphoneos;
VALIDATE_PRODUCT = YES;
};
name = Release;
};
58B511F01A9E6C8500147676 /* Debug */ = {
isa = XCBuildConfiguration;
buildSettings = {
HEADER_SEARCH_PATHS = (
"$(inherited)",
/Applications/Xcode.app/Contents/Developer/Toolchains/XcodeDefault.xctoolchain/usr/include,
"$(SRCROOT)/../../React/**",
"$(SRCROOT)/../../node_modules/react-native/React/**",
);
LIBRARY_SEARCH_PATHS = "$(inherited)";
OTHER_LDFLAGS = "-ObjC";
PRODUCT_NAME = RNSoundPlayer;
SKIP_INSTALL = YES;
};
name = Debug;
};
58B511F11A9E6C8500147676 /* Release */ = {
isa = XCBuildConfiguration;
buildSettings = {
HEADER_SEARCH_PATHS = (
"$(inherited)",
/Applications/Xcode.app/Contents/Developer/Toolchains/XcodeDefault.xctoolchain/usr/include,
"$(SRCROOT)/../../React/**",
);
LIBRARY_SEARCH_PATHS = "$(inherited)";
OTHER_LDFLAGS = "-ObjC";
PRODUCT_NAME = RNSoundPlayer;
SKIP_INSTALL = YES;
};
name = Release;
};
/* End XCBuildConfiguration section */
/* Begin XCConfigurationList section */
58B511D61A9E6C8500147676 /* Build configuration list for PBXProject "RNSoundPlayer" */ = {
isa = XCConfigurationList;
buildConfigurations = (
58B511ED1A9E6C8500147676 /* Debug */,
58B511EE1A9E6C8500147676 /* Release */,
);
defaultConfigurationIsVisible = 0;
defaultConfigurationName = Release;
};
58B511EF1A9E6C8500147676 /* Build configuration list for PBXNativeTarget "RNSoundPlayer" */ = {
isa = XCConfigurationList;
buildConfigurations = (
58B511F01A9E6C8500147676 /* Debug */,
58B511F11A9E6C8500147676 /* Release */,
);
defaultConfigurationIsVisible = 0;
defaultConfigurationName = Release;
};
/* End XCConfigurationList section */
};
rootObject = 58B511D31A9E6C8500147676 /* Project object */;
}
================================================
FILE: ios/RNSoundPlayer.xcodeproj/project.xcworkspace/contents.xcworkspacedata
================================================
<?xml version="1.0" encoding="UTF-8"?>
<Workspace
   version = "1.0">
   <FileRef
      location = "self:">
   </FileRef>
</Workspace>
================================================
FILE: package.json
================================================
{
  "name": "react-native-sound-player",
  "version": "0.14.5",
  "description": "Play or stream audio files in ReactNative on iOS/Android",
  "main": "index.js",
  "types": "index.d.ts",
  "keywords": [
    "reactnative",
    "react-native",
    "sound",
    "player",
    "audio",
    "streaming"
  ],
  "repository": {
    "type": "git",
    "url": "git+https://github.com/johnsonsu/react-native-sound-player.git"
  },
  "author": {
    "name": "Johnson Su",
    "email": "johnsonsu@johnsonsu.com"
  },
  "prettier": {
    "trailingComma": "es5",
    "tabWidth": 2,
    "semi": true,
    "singleQuote": false
  },
  "license": "MIT"
}