A React Native component to show audio waveform with ease in react native application
```sh
npm install @yoliani/react-native-audio-waveform
```
---
A React Native package featuring native modules for generating and rendering audio waveforms. Designed to efficiently produce visual representations for pre-recorded audio files and dynamically draw waveforms in real-time during audio recording within React Native applications.
---
Demo previews: Audio Playback Waveform · Audio Record Waveform · Audio Waveform with Speed
- Installation
- Usage and Examples
- Properties
- Example Code
- License
Here's how to get started with react-native-audio-waveform in your React Native project:
##### 1. Install the package
```sh
npm install @simform_solutions/react-native-audio-waveform react-native-gesture-handler
```
###### --- or ---
```sh
yarn add @simform_solutions/react-native-audio-waveform react-native-gesture-handler
```
##### 2. Install CocoaPods in the iOS project
```bash
npx pod-install
```
##### Know more about react-native-gesture-handler
##### 3. Add audio recording permissions
##### iOS
If you want to use recorder features on iOS, you have to add the NSMicrophoneUsageDescription permission in Info.plist with a description based on your use case.
Here is a sample Info.plist entry with a description:
```xml
<key>NSMicrophoneUsageDescription</key>
<string>This app requires microphone access to record audio.</string>
```
##### Android
If you want to use recorder features on Android, you have to add the RECORD_AUDIO permission in AndroidManifest.xml.
```xml
<uses-permission android:name="android.permission.RECORD_AUDIO" />
```
#### 1. Static waveform
When you want to show a waveform for a pre-existing audio file, use the waveform in static mode. The forward ref is type-safe: if you pass static mode, you can only access the methods available for static mode, and calling any other method will reject its promise.
Check the example below for more information.
```tsx
import {
  Waveform,
  type IWaveformRef,
} from '@simform_solutions/react-native-audio-waveform';

const path = ''; // path to the audio file for which you want to show waveform
const ref = useRef<IWaveformRef>(null);

<Waveform
  mode="static"
  ref={ref}
  path={path}
  candleSpace={2}
  candleWidth={4}
  scrubColor="white"
  onPlayerStateChange={playerState => console.log(playerState)}
  onPanStateChange={isMoving => console.log(isMoving)}
/>;
```
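Once the static waveform is mounted, playback can be driven through the ref. The sketch below uses the `IWaveformRef` methods documented later in this README (`startPlayer`, `pausePlayer`, `stopPlayer`); the `PlayerControls` wrapper name is illustrative, and the `FinishMode` export is assumed here:

```tsx
import React, { useRef } from 'react';
import { Button, View } from 'react-native';
import {
  Waveform,
  FinishMode, // assumed export; finish mode enum referenced by startPlayer()
  type IWaveformRef,
} from '@simform_solutions/react-native-audio-waveform';

// Illustrative wrapper: a static waveform plus buttons that drive it via the ref.
const PlayerControls = ({ path }: { path: string }) => {
  const ref = useRef<IWaveformRef>(null);

  return (
    <View>
      <Waveform mode="static" ref={ref} path={path} candleSpace={2} candleWidth={4} />
      <Button
        title="Play"
        onPress={() => ref.current?.startPlayer({ finishMode: FinishMode.stop })}
      />
      <Button title="Pause" onPress={() => ref.current?.pausePlayer()} />
      <Button title="Stop" onPress={() => ref.current?.stopPlayer()} />
    </View>
  );
};

export default PlayerControls;
```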
#### 2. Live recording waveform
When you want to record audio and show a waveform for that recording, create the waveform in live mode. As with static mode, the ref methods are type-safe.
Check the example below for more information.
```tsx
import {
  Waveform,
  type IWaveformRef,
} from '@simform_solutions/react-native-audio-waveform';

const ref = useRef<IWaveformRef>(null);

<Waveform
  mode="live"
  ref={ref}
  candleSpace={2}
  candleWidth={4}
  onRecorderStateChange={recorderState => console.log(recorderState)}
/>;
```
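The recording itself is also driven through the ref. Continuing the example above, a minimal sketch of the flow using the `startRecord`/`stopRecord` methods documented under IWaveformRef (the handler names are illustrative):

```tsx
// Illustrative: start a live recording, then stop it and read the file path.
const onStartPress = async () => {
  const started = await ref.current?.startRecord(); // resolves with whether recording started
  console.log('recording started:', started);
};

const onStopPress = async () => {
  const filePath = await ref.current?.stopRecord(); // resolves with the recorded file path
  console.log('recorded file:', filePath);
};
```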
You can check out the full example at Example.
---
| Props | Default | Static Mode | Live Mode | Type | Description |
| ------------------------------ | ------- | ----------- | --------- | ---------------------------------------------------------- | ----------- |
| mode\* | static | ✅ | ✅ | 'live' or 'static' | Type of waveform. It can be either `static` for a resource file or `live` if you want to record audio |
| ref\* | - | ✅ | ✅ | IWaveformRef | Ref provided to the waveform component. If the waveform mode is `static`, some methods on the ref will throw an error, and the same applies for `live`. Check IWaveformRef for more details about which methods these refs provide. |
| path\* | - | ✅ | ❌ | string | Used for `static` mode. It is the resource path of the audio source file. |
| playbackSpeed | 1.0 | ✅ | ❌ | 1.0 / 1.5 / 2.0 | Playback speed of the audio player. Note: currently only Normal (1x), Faster (1.5x), and Fastest (2.0x) are supported; any value greater than 2.0 is automatically adjusted to normal playback speed |
| candleSpace | 2 | ✅ | ✅ | number | Space between two candlesticks of the waveform |
| candleWidth | 5 | ✅ | ✅ | number | Width of a single candlestick of the waveform |
| candleHeightScale | 3 | ✅ | ✅ | number | Scaling height of the waveform candlesticks |
| maxCandlesToRender | 300 | ❌ | ✅ | number | Maximum number of candlesticks rendered in the waveform |
| containerStyle | - | ✅ | ✅ | StyleProp\<ViewStyle\> | Style of the container |
| waveColor | #545454 | ✅ | ✅ | string | Color of the waveform candlesticks |
| scrubColor | #7b7b7b | ✅ | ❌ | string | Color of the candlesticks that have already been played |
| onPlayerStateChange | - | ✅ | ❌ | (playerState: PlayerState) => void | Callback that returns the player state whenever it changes |
| onPanStateChange | - | ✅ | ❌ | (panMoving: boolean) => void | Callback that returns a boolean indicating whether audio seeking is active |
| onRecorderStateChange | - | ❌ | ✅ | (recorderState: RecorderState) => void | Callback that returns the recorder state whenever it changes. Check RecorderState for more details |
| onCurrentProgressChange | - | ✅ | ❌ | (currentProgress: number, songDuration: number) => void | Callback that returns the current progress of the audio and the total song duration |
| onChangeWaveformLoadState | - | ✅ | ❌ | (state: boolean) => void | Callback that returns the loading state of the waveform candlesticks |
| onError | - | ✅ | ❌ | (error: Error) => void | Callback that returns errors for the static audio waveform |
| showsHorizontalScrollIndicator | false | ❌ | ✅ | boolean | Whether to show a scroll indicator while the live waveform is being recorded and its total width exceeds the parent view width |
##### Know more about ViewStyle, PlayerState, and RecorderState
---
#### IWaveformRef Methods
#### For Static mode
#### startPlayer()
```ts
startPlayer({
  finishMode?: FinishMode;
}): Promise<boolean>
```
Starts playing the audio with the specified finish mode. If finish mode is not specified, it defaults to FinishMode.stop.
It returns a boolean indicating whether playback has started.
#### stopPlayer()
```ts
stopPlayer(): Promise<boolean>
```
It returns a boolean indicating whether playback has stopped.
#### pausePlayer()
```ts
pausePlayer(): Promise<boolean>
```
It returns a boolean indicating whether playback is paused.
#### resumePlayer()
```ts
resumePlayer(): Promise<boolean>
```
It returns a boolean indicating whether playback has resumed.
#### stopAllPlayers()
```ts
stopAllPlayers(): Promise<boolean>
```
Stops all players at once and frees their native resources. Useful on unmount!
It returns a boolean indicating that all players were stopped.
#### stopAllWaveFormExtractors()
```ts
stopAllWaveFormExtractors(): Promise<boolean>
```
Stops all the extractors used to build the audio waveform and frees their native resources. Useful on unmount!
It returns a boolean indicating that all extractors were stopped.
#### stopPlayersAndExtractors()
```ts
stopPlayersAndExtractors(): Promise<[boolean, boolean]>
```
Combines stopAllWaveFormExtractors and stopAllPlayers in one call to free up the maximum possible resources. Very useful on unmount!
It returns an array of two booleans indicating whether all players and all waveform extractors were stopped.
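Following the unmount notes above, a typical cleanup might look like this sketch, assuming the component holds an `IWaveformRef` in a `ref` variable:

```tsx
import { useEffect } from 'react';

// Release native players and waveform extractors when the component unmounts.
useEffect(() => {
  return () => {
    ref.current
      ?.stopPlayersAndExtractors()
      .catch(error => console.warn('waveform cleanup failed', error));
  };
}, []);
```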
#### For Live mode
#### startRecord()
```ts
startRecord({
  encoder: number;
  sampleRate: number;
  bitRate: number;
  fileNameFormat: string;
  useLegacy: boolean;
  updateFrequency?: UpdateFrequency;
}): Promise<boolean>
```
Starts a new audio recording with the given parameters. It returns whether the recording was started.
Check UpdateFrequency to know more.
> Note: Before starting the recording, the user must allow NSMicrophoneUsageDescription for iOS. You can check the permissions by using _checkHasAudioRecorderPermission_ from _useAudioPermission_. Check useAudioPermission to know more about various methods.
#### stopRecord()
```ts
stopRecord(): Promise<string>
```
It returns a string representing the recorded audio file path.
#### pauseRecord()
```ts
pauseRecord(): Promise<boolean>
```
It returns a boolean indicating whether the recording is paused.
#### resumeRecord()
```ts
resumeRecord(): Promise<boolean>
```
It returns a boolean indicating whether the recording has resumed.
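A small sketch tying these together: toggling between pause and resume based on the current recorder state. The `recorderState` value here is assumed to come from the `onRecorderStateChange` prop; the state handling is illustrative:

```tsx
// Illustrative: pause a running recording, or resume a paused one.
const onTogglePress = async () => {
  if (recorderState === RecorderState.recording) {
    await ref.current?.pauseRecord();
  } else if (recorderState === RecorderState.paused) {
    await ref.current?.resumeRecord();
  }
};
```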
#### useAudioPermission
By using the useAudioPermission hook, you can check and ask the user for the NSMicrophoneUsageDescription microphone permission.
#### checkHasAudioRecorderPermission()
This method checks whether the user has permission to use a microphone for recording new audio. It will return PermissionStatus.
You can use this method as shown below:
```ts
const hasPermission: PermissionStatus = await checkHasAudioRecorderPermission();
```
#### getAudioRecorderPermission()
This method lets you ask for NSMicrophoneUsageDescription permission from the user. It will return PermissionStatus.
By combining this with checkHasAudioRecorderPermission you can ask for permission and start recording if permission is granted.
Check out the following example:
```ts
let hasPermission = await checkHasAudioRecorderPermission();

if (hasPermission === PermissionStatus.granted) {
  startRecording();
} else if (hasPermission === PermissionStatus.undetermined) {
  const permissionStatus = await getAudioRecorderPermission();
  if (permissionStatus === PermissionStatus.granted) {
    startRecording();
  }
} else {
  Linking.openSettings();
}
```
---
#### PlayerState
```ts
enum PlayerState {
  playing = 'playing',
  paused = 'paused',
  stopped = 'stopped',
}
```
#### RecorderState
```ts
enum RecorderState {
  recording = 'recording',
  paused = 'paused',
  stopped = 'stopped',
}
```
#### UpdateFrequency
```ts
// Update frequency in milliseconds
enum UpdateFrequency {
  high = 250.0,
  medium = 500.0,
  low = 1000.0,
}
```
#### PermissionStatus
```ts
enum PermissionStatus {
  denied = 'denied',
  undetermined = 'undetermined',
  granted = 'granted',
}
```
---
You can check out the example app for react-native-audio-waveform in Example
To use the example app, you first need to run the command below:
```bash
cd example && npx react-native-asset
```
> Note: If link-assets-manifest.json file already exists then make sure to delete that before running npx react-native-asset command.
This command will add our example audio sample files to the iOS bundle so that we can access them inside the iOS app.
```sh
yarn
yarn example ios // For iOS
yarn example android // For Android
```
Support it by joining the stargazers of this repository. ⭐
For bugs, feature requests, and discussion, please use GitHub Issues, GitHub New Feature, GitHub Feedback
We'd love to have you improve this library or fix a problem 💪
Check out our Contributing Guide for ideas on contributing.
- Check out our other available awesome mobile libraries