
Backup Copilot Pro supports Dropbox, Google Drive, and Amazon S3 out of the box. But what if you need Wasabi, Backblaze B2, DigitalOcean Spaces, or your own custom storage backend? This developer guide shows you how to build custom cloud storage providers by extending Backup Copilot Pro’s provider architecture, with complete code examples and implementation details.
Understanding the Cloud Provider Architecture
Backup Copilot Pro uses an abstract cloud provider interface that all storage backends implement. This architecture enables consistent behavior across different cloud services while allowing provider-specific implementations.
Base Provider Class: All providers extend the BKPC_Cloud_Provider abstract class, which defines the required methods:
abstract class BKPC_Cloud_Provider {
abstract public function authenticate($credentials);
abstract public function upload($local_file, $remote_path, $options = []);
abstract public function download($remote_path, $local_file);
abstract public function delete($remote_path);
abstract public function list_files($remote_dir);
abstract public function get_storage_info();
}
Your custom provider implements these methods using your chosen storage service’s API.
Required Methods Overview
Each method serves a specific purpose in the backup workflow:
authenticate($credentials): Validates API credentials, OAuth tokens, or connection parameters. Returns true on success, throws exception on failure.
upload($local_file, $remote_path, $options): Uploads backup file to cloud storage. Handles chunked uploads for large files, progress callbacks, and resumable sessions.
download($remote_path, $local_file): Downloads backup from cloud to local server for restoration. Must handle partial downloads and resume capability.
delete($remote_path): Removes backup file from cloud storage. Called during retention policy enforcement.
list_files($remote_dir): Returns array of files in remote directory with metadata (size, modified date, path). Used for backup inventory.
get_storage_info(): Returns current storage quota usage and available space. Displayed in plugin dashboard.
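Putting the interface together, a custom provider is a single class with these six methods. The stub below is only a sketch of the overall shape (the class name and comments are illustrative); the rest of this guide fills in real implementations:
class BKPC_Cloud_Provider_Example extends BKPC_Cloud_Provider {
    public function authenticate($credentials) {
        // Validate credentials against your storage service and throw on failure.
        return true;
    }
    public function upload($local_file, $remote_path, $options = []) {
        // Transfer the archive; report progress via $options['progress_callback'] if set.
        // Return a remote identifier or URL for the stored file.
    }
    public function download($remote_path, $local_file) {
        // Fetch the archive back to the local server for restoration.
        return true;
    }
    public function delete($remote_path) {
        // Remove an expired backup during retention cleanup.
        return true;
    }
    public function list_files($remote_dir) {
        // Return entries shaped like ['path' => ..., 'size' => ..., 'modified' => ...].
        return [];
    }
    public function get_storage_info() {
        // Report usage in bytes; use -1 where the service exposes no quota.
        return ['used' => 0, 'total' => -1, 'available' => -1];
    }
}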
Building an S3-Compatible Provider
S3-compatible storage services (Wasabi, Backblaze B2, DigitalOcean Spaces) share a common API. Here’s a complete implementation:
class BKPC_Cloud_Provider_S3_Compatible extends BKPC_Cloud_Provider {
private $endpoint;
private $bucket;
private $access_key;
private $secret_key;
private $client;
public function __construct() {
$this->endpoint = get_option('bkpc_s3_endpoint');
$this->bucket = get_option('bkpc_s3_bucket');
$this->access_key = get_option('bkpc_s3_access_key');
$this->secret_key = $this->decrypt(get_option('bkpc_s3_secret_key'));
}
public function authenticate($credentials) {
require_once __DIR__ . '/aws-sdk/aws-autoloader.php'; // bundled AWS SDK for PHP autoloader
try {
$this->client = new Aws\S3\S3Client([
'version' => 'latest',
'region' => 'us-east-1',
'endpoint' => $credentials['endpoint'],
'credentials' => [
'key' => $credentials['access_key'],
'secret' => $credentials['secret_key']
],
'use_path_style_endpoint' => true
]);
// Verify credentials by checking access to the configured bucket
$this->client->headBucket(['Bucket' => $credentials['bucket']]);
return true;
} catch (Exception $e) {
throw new Exception('S3 Authentication failed: ' . $e->getMessage());
}
}
public function upload($local_file, $remote_path, $options = []) {
$file_size = filesize($local_file);
// Use multipart upload for files > 100MB
if ($file_size > 100 * 1024 * 1024) {
return $this->multipart_upload($local_file, $remote_path, $options);
}
try {
$result = $this->client->putObject([
'Bucket' => $this->bucket,
'Key' => $remote_path,
'SourceFile' => $local_file,
'ServerSideEncryption' => 'AES256'
]);
if (isset($options['progress_callback'])) {
call_user_func($options['progress_callback'], $file_size, $file_size);
}
return $result['ObjectURL'];
} catch (Exception $e) {
throw new Exception('Upload failed: ' . $e->getMessage());
}
}
private function multipart_upload($local_file, $remote_path, $options) {
$uploader = new Aws\S3\MultipartUploader($this->client, $local_file, [
'bucket' => $this->bucket,
'key' => $remote_path,
'before_initiate' => function () use ($options) {
// Store upload ID for resume capability
},
'before_upload' => function ($command) use ($options) {
if (isset($options['progress_callback'])) {
// Update progress
}
}
]);
try {
$result = $uploader->upload();
return $result['ObjectURL'];
} catch (Aws\S3\Exception\S3MultipartUploadException $e) {
// Resume failed upload
$uploader = new Aws\S3\MultipartUploader($this->client, $local_file, [
'bucket' => $this->bucket,
'key' => $remote_path,
'state' => $e->getState()
]);
$result = $uploader->upload();
return $result['ObjectURL'];
}
}
public function download($remote_path, $local_file) {
try {
$this->client->getObject([
'Bucket' => $this->bucket,
'Key' => $remote_path,
'SaveAs' => $local_file
]);
return true;
} catch (Exception $e) {
throw new Exception('Download failed: ' . $e->getMessage());
}
}
public function delete($remote_path) {
try {
$this->client->deleteObject([
'Bucket' => $this->bucket,
'Key' => $remote_path
]);
return true;
} catch (Exception $e) {
throw new Exception('Delete failed: ' . $e->getMessage());
}
}
public function list_files($remote_dir) {
try {
$results = $this->client->listObjects([
'Bucket' => $this->bucket,
'Prefix' => $remote_dir
]);
$files = [];
foreach ($results['Contents'] ?? [] as $object) { // 'Contents' is absent when no objects match the prefix
$files[] = [
'path' => $object['Key'],
'size' => $object['Size'],
'modified' => $object['LastModified']
];
}
return $files;
} catch (Exception $e) {
throw new Exception('List files failed: ' . $e->getMessage());
}
}
public function get_storage_info() {
// S3 doesn't provide quota info via API
return [
'used' => 0,
'total' => -1, // Unlimited
'available' => -1
];
}
}
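The S3 API exposes no account-level quota, which is why get_storage_info() above returns placeholders. If you want the dashboard to show approximate usage anyway, one option is to sum the sizes of the backups the plugin manages. This is only a sketch, and the backups/ prefix is an assumed location for your archives:
public function get_storage_info() {
    // Approximate usage by summing our own backup objects (one list request per call).
    $used = 0;
    foreach ($this->list_files('backups/') as $file) {
        $used += $file['size'];
    }
    return ['used' => $used, 'total' => -1, 'available' => -1];
}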
Implementing an FTP/SFTP Provider
For legacy systems or custom infrastructure, FTP/SFTP support may be needed:
class BKPC_Cloud_Provider_SFTP extends BKPC_Cloud_Provider {
private $connection;
private $sftp;
private $host;
public function authenticate($credentials) {
if (!function_exists('ssh2_connect')) {
throw new Exception('PHP SSH2 extension required');
}
$this->connection = ssh2_connect(
$credentials['host'],
$credentials['port'] ?? 22
);
if (!$this->connection) {
throw new Exception('Cannot connect to SFTP server');
}
if ($credentials['auth_type'] === 'password') {
$auth = ssh2_auth_password(
$this->connection,
$credentials['username'],
$credentials['password']
);
} else {
$auth = ssh2_auth_pubkey_file(
$this->connection,
$credentials['username'],
$credentials['public_key'],
$credentials['private_key'],
$credentials['passphrase']
);
}
if (!$auth) {
throw new Exception('SFTP authentication failed');
}
$this->sftp = ssh2_sftp($this->connection);
$this->host = $credentials['host']; // remembered for building the remote URL returned by upload()
return true;
}
public function upload($local_file, $remote_path, $options = []) {
// Cast the SFTP resource to int for the stream wrapper (required on PHP 5.6.28+)
$remote_stream = fopen('ssh2.sftp://' . intval($this->sftp) . $remote_path, 'w');
$local_stream = fopen($local_file, 'r');
if (!$remote_stream || !$local_stream) {
throw new Exception('Unable to open streams for SFTP upload');
}
$file_size = filesize($local_file);
$uploaded = 0;
while (!feof($local_stream)) {
$buffer = fread($local_stream, 8192);
fwrite($remote_stream, $buffer);
$uploaded += strlen($buffer);
if (isset($options['progress_callback'])) {
call_user_func($options['progress_callback'], $uploaded, $file_size);
}
}
fclose($local_stream);
fclose($remote_stream);
return "sftp://{$credentials['host']}{$remote_path}";
}
public function download($remote_path, $local_file) {
return ssh2_scp_recv($this->connection, $remote_path, $local_file);
}
public function delete($remote_path) {
return ssh2_sftp_unlink($this->sftp, $remote_path);
}
public function list_files($remote_dir) {
$handle = opendir('ssh2.sftp://' . intval($this->sftp) . $remote_dir);
$files = [];
while (false !== ($file = readdir($handle))) {
if ($file !== '.' && $file !== '..') {
$path = $remote_dir . '/' . $file;
$stat = ssh2_sftp_stat($this->sftp, $path);
$files[] = [
'path' => $path,
'size' => $stat['size'],
'modified' => $stat['mtime']
];
}
}
closedir($handle);
return $files;
}
public function get_storage_info() {
// Query free space on the remote server via `df -k .` (POSIX-style output assumed)
$stream = ssh2_exec($this->connection, 'df -k .');
stream_set_blocking($stream, true);
$output = stream_get_contents($stream);
// The last line of df -k output contains: filesystem, 1K-blocks, used, available, ...
$lines = array_filter(explode("\n", trim($output)));
$fields = preg_split('/\s+/', trim((string) end($lines)));
if (count($fields) >= 4) {
return ['used' => $fields[2] * 1024, 'total' => $fields[1] * 1024, 'available' => $fields[3] * 1024];
}
return ['used' => 0, 'total' => 0, 'available' => 0];
}
}
Registering Custom Providers
Make your provider available in Backup Copilot Pro:
add_filter('bkpc_cloud_providers', 'register_custom_providers');
function register_custom_providers($providers) {
$providers['s3_compatible'] = [
'name' => 'S3-Compatible Storage',
'class' => 'BKPC_Cloud_Provider_S3_Compatible',
'icon' => 'dashicons-cloud',
'settings_fields' => [
'endpoint' => ['type' => 'text', 'label' => 'Endpoint URL'],
'bucket' => ['type' => 'text', 'label' => 'Bucket Name'],
'access_key' => ['type' => 'text', 'label' => 'Access Key'],
'secret_key' => ['type' => 'password', 'label' => 'Secret Key']
]
];
$providers['sftp'] = [
'name' => 'SFTP',
'class' => 'BKPC_Cloud_Provider_SFTP',
'icon' => 'dashicons-admin-site',
'settings_fields' => [
'host' => ['type' => 'text', 'label' => 'Host'],
'port' => ['type' => 'number', 'label' => 'Port', 'default' => 22],
'username' => ['type' => 'text', 'label' => 'Username'],
'password' => ['type' => 'password', 'label' => 'Password']
]
];
return $providers;
}
Implementing OAuth 2.0 Authentication
For providers requiring OAuth (Google Drive, Dropbox):
public function authenticate($credentials) {
// Step 1: Redirect to OAuth consent screen
if (!isset($credentials['code'])) {
$auth_url = 'https://provider.com/oauth/authorize?' . http_build_query([
'client_id' => $credentials['client_id'],
'redirect_uri' => admin_url('admin.php?page=bkpc-oauth-callback'),
'response_type' => 'code',
'scope' => 'file.read file.write'
]);
wp_redirect($auth_url);
exit;
}
// Step 2: Exchange code for access token
$response = wp_remote_post('https://provider.com/oauth/token', [
'body' => [
'code' => $credentials['code'],
'client_id' => $credentials['client_id'],
'client_secret' => $credentials['client_secret'],
'redirect_uri' => admin_url('admin.php?page=bkpc-oauth-callback'),
'grant_type' => 'authorization_code'
]
]);
if (is_wp_error($response)) {
throw new Exception('Token exchange failed: ' . $response->get_error_message());
}
$body = json_decode(wp_remote_retrieve_body($response), true);
if (empty($body['access_token'])) {
throw new Exception('Token exchange failed: no access token in response');
}
// Store tokens securely
update_option('bkpc_provider_access_token', $this->encrypt($body['access_token']));
update_option('bkpc_provider_refresh_token', $this->encrypt($body['refresh_token']));
update_option('bkpc_provider_token_expires', time() + $body['expires_in']);
return true;
}
private function get_access_token() {
$expires = get_option('bkpc_provider_token_expires');
// Refresh token if expired
if (time() >= $expires) {
$this->refresh_access_token();
}
return $this->decrypt(get_option('bkpc_provider_access_token'));
}
private function refresh_access_token() {
$refresh_token = $this->decrypt(get_option('bkpc_provider_refresh_token'));
$response = wp_remote_post('https://provider.com/oauth/token', [
'body' => [
'refresh_token' => $refresh_token,
'client_id' => get_option('bkpc_provider_client_id'),
'client_secret' => $this->decrypt(get_option('bkpc_provider_client_secret')),
'grant_type' => 'refresh_token'
]
]);
$body = json_decode(wp_remote_retrieve_body($response), true);
update_option('bkpc_provider_access_token', $this->encrypt($body['access_token']));
update_option('bkpc_provider_token_expires', time() + $body['expires_in']);
}
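With tokens stored, every API call should fetch a current access token through get_access_token() and send it as a bearer token. The endpoint and response format below are placeholders for whatever your storage service actually exposes:
public function list_files($remote_dir) {
    // Hypothetical REST endpoint; substitute your provider's real file-listing API.
    $response = wp_remote_get('https://provider.com/api/files?path=' . rawurlencode($remote_dir), [
        'headers' => ['Authorization' => 'Bearer ' . $this->get_access_token()]
    ]);
    if (is_wp_error($response)) {
        throw new Exception('List files failed: ' . $response->get_error_message());
    }
    return json_decode(wp_remote_retrieve_body($response), true);
}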
Error Handling and Retry Logic
Robust providers handle transient failures:
public function upload_with_retry($local_file, $remote_path, $options = []) {
$max_retries = 3;
$retry_delay = 5; // seconds
for ($attempt = 1; $attempt <= $max_retries; $attempt++) {
try {
return $this->upload($local_file, $remote_path, $options);
} catch (Exception $e) {
if ($attempt === $max_retries) {
throw $e;
}
// Log retry attempt
error_log("Upload attempt {$attempt} failed: {$e->getMessage()}. Retrying in {$retry_delay}s");
sleep($retry_delay);
$retry_delay *= 2; // Exponential backoff
}
}
}
Security Best Practices
Protect credentials and user data:
Encrypt Sensitive Data:
private function encrypt($value) {
if (!defined('BKPC_ENCRYPTION_KEY')) {
throw new Exception('Encryption key not defined');
}
// Use a random IV per value and prepend it to the ciphertext; a fixed IV weakens AES-CBC
$iv = openssl_random_pseudo_bytes(16);
$ciphertext = openssl_encrypt($value, 'AES-256-CBC', BKPC_ENCRYPTION_KEY, OPENSSL_RAW_DATA, $iv);
return base64_encode($iv . $ciphertext);
}
private function decrypt($value) {
if (!defined('BKPC_ENCRYPTION_KEY')) {
throw new Exception('Encryption key not defined');
}
$raw = base64_decode($value);
$iv = substr($raw, 0, 16);
return openssl_decrypt(substr($raw, 16), 'AES-256-CBC', BKPC_ENCRYPTION_KEY, OPENSSL_RAW_DATA, $iv);
}
Validate Inputs:
public function authenticate($credentials) {
$required = ['endpoint', 'bucket', 'access_key', 'secret_key'];
foreach ($required as $field) {
if (empty($credentials[$field])) {
throw new Exception("Missing required field: {$field}");
}
}
// Validate endpoint is proper URL
if (!filter_var($credentials['endpoint'], FILTER_VALIDATE_URL)) {
throw new Exception('Invalid endpoint URL');
}
// Continue authentication...
}
Testing Custom Providers
Thorough testing ensures reliability:
class BKPC_Cloud_Provider_Tests {
public static function test_provider($provider_class, $credentials) {
$provider = new $provider_class();
$results = [];
// Test authentication
try {
$provider->authenticate($credentials);
$results['auth'] = 'PASS';
} catch (Exception $e) {
$results['auth'] = 'FAIL: ' . $e->getMessage();
return $results; // Can't continue without auth
}
// Test upload
$test_file = tempnam(sys_get_temp_dir(), 'bkpc_test_');
file_put_contents($test_file, 'Test backup content');
try {
$provider->upload($test_file, '/test-backup.txt');
$results['upload'] = 'PASS';
} catch (Exception $e) {
$results['upload'] = 'FAIL: ' . $e->getMessage();
}
// Test list files
try {
$files = $provider->list_files('/');
$results['list'] = count($files) > 0 ? 'PASS' : 'FAIL: No files found';
} catch (Exception $e) {
$results['list'] = 'FAIL: ' . $e->getMessage();
}
// Test download
$download_file = tempnam(sys_get_temp_dir(), 'bkpc_download_');
try {
$provider->download('/test-backup.txt', $download_file);
$results['download'] = file_get_contents($download_file) === 'Test backup content' ? 'PASS' : 'FAIL: Content mismatch';
} catch (Exception $e) {
$results['download'] = 'FAIL: ' . $e->getMessage();
}
// Test delete
try {
$provider->delete('/test-backup.txt');
$results['delete'] = 'PASS';
} catch (Exception $e) {
$results['delete'] = 'FAIL: ' . $e->getMessage();
}
// Cleanup
unlink($test_file);
unlink($download_file);
return $results;
}
}
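You can run the harness from a WP-CLI command or a temporary admin page. The credential values below are placeholders, and the call assumes your provider’s saved settings (bucket, keys) match what you pass in:
$results = BKPC_Cloud_Provider_Tests::test_provider('BKPC_Cloud_Provider_S3_Compatible', [
    'endpoint'   => 'https://s3.example-region.wasabisys.com', // placeholder endpoint
    'bucket'     => 'my-backups',
    'access_key' => 'YOUR_ACCESS_KEY',
    'secret_key' => 'YOUR_SECRET_KEY'
]);
foreach ($results as $check => $status) {
    error_log("{$check}: {$status}");
}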
Performance Optimization
Optimize for large backups:
Chunked Uploads: Upload large files in chunks (5-10MB) to handle network interruptions and show progress.
Concurrent Connections: Use multiple connections for faster transfers when API supports it.
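For S3-compatible targets, parallelism is typically configured on the multipart uploader. The snippet assumes the aws-sdk-php MultipartUploader and its part_size/concurrency options; confirm the option names against the SDK version you bundle:
$uploader = new Aws\S3\MultipartUploader($this->client, $local_file, [
    'bucket'      => $this->bucket,
    'key'         => $remote_path,
    'part_size'   => 10 * 1024 * 1024, // 10MB parts (the S3 minimum is 5MB)
    'concurrency' => 3                 // parts uploaded in parallel
]);
$result = $uploader->upload();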
Compression: Compress before upload to reduce bandwidth:
public function upload($local_file, $remote_path, $options = []) {
// Compress if not already compressed
if (!preg_match('/\.(zip|gz|bz2)$/i', $local_file)) {
$compressed = $local_file . '.gz';
$this->compress_file($local_file, $compressed);
$local_file = $compressed;
$remote_path .= '.gz';
}
// Upload compressed file
return $this->do_upload($local_file, $remote_path, $options);
}
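The compress_file() and do_upload() helpers above are yours to implement; they are not part of the provider interface. A minimal gzip-based compress_file() might look like this:
private function compress_file($source, $destination) {
    $in = fopen($source, 'rb');
    $out = gzopen($destination, 'wb9'); // maximum gzip compression
    if (!$in || !$out) {
        throw new Exception('Unable to open files for compression');
    }
    while (!feof($in)) {
        gzwrite($out, fread($in, 512 * 1024)); // 512KB read chunks
    }
    fclose($in);
    gzclose($out);
    return $destination;
}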
Publishing Your Provider
Share your provider with the community:
- Package as WordPress Plugin: Create a standalone plugin users can install (see the sketch after this list)
- Documentation: Provide setup instructions and API credentials guidance
- WordPress.org Repository: Submit to official repository for distribution
- GitHub: Open source on GitHub for community contributions
- Support: Provide support channels for users
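A minimal plugin wrapper might look like the following; the plugin header, file name, and the plugins_loaded check are assumptions about how you package the provider, not requirements of Backup Copilot Pro:
<?php
/**
 * Plugin Name: Backup Copilot Pro - S3-Compatible Storage
 * Description: Adds an S3-compatible storage provider to Backup Copilot Pro.
 * Version: 1.0.0
 */

add_action('plugins_loaded', function () {
    // Only register the provider when Backup Copilot Pro (and its base class) is active.
    if (!class_exists('BKPC_Cloud_Provider')) {
        return;
    }
    require_once __DIR__ . '/class-bkpc-cloud-provider-s3-compatible.php'; // hypothetical file name
    add_filter('bkpc_cloud_providers', 'register_custom_providers');
});
Deferring registration to plugins_loaded keeps the wrapper from fataling if Backup Copilot Pro is deactivated.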
Conclusion
Custom cloud storage providers extend Backup Copilot Pro to any storage backend. Whether integrating S3-compatible services, legacy FTP/SFTP systems, or proprietary storage APIs, the provider architecture enables consistent functionality while accommodating provider-specific implementations.
Key implementation requirements: authentication, upload/download with chunking and progress tracking, error handling with retries, security through credential encryption, and thorough testing. Follow these patterns and your custom provider will integrate seamlessly with Backup Copilot Pro’s backup workflows.
Call to Action
Need a custom storage solution? Contact our team for custom development services or get Pro to build your own provider using our SDK!

