Integration Guide
NexStorage's S3-compatible API makes it easy to integrate with a wide range of tools, applications, and frameworks. This guide provides examples of how to connect NexStorage to popular tools and development environments.
Command-Line Tools
AWS CLI
The AWS Command Line Interface works seamlessly with NexStorage:
1. Install the AWS CLI:
# For macOS
brew install awscli
# For Ubuntu/Debian
apt-get install awscli
# For Windows (using PowerShell)
choco install awscli
2. Configure a profile for NexStorage:
aws configure --profile nexstorage
Enter your NexStorage credentials when prompted:
AWS Access Key ID: YOUR_NEXSTORAGE_ACCESS_KEY
AWS Secret Access Key: YOUR_NEXSTORAGE_SECRET_KEY
Default region name: us-east-1
Default output format: json
3. Use the profile with the --endpoint-url override:
aws s3 ls --profile nexstorage --endpoint-url https://s3.nexstorage.nexvecta.com
4. For convenience, create an alias (bash/zsh):
echo 'alias nex="aws --profile nexstorage --endpoint-url https://s3.nexstorage.nexvecta.com"' >> ~/.bashrc
source ~/.bashrc
# Now you can use the simplified command
nex s3 ls
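If you are on a recent AWS CLI v2 release (2.13 or later), you can also store the endpoint in the profile itself so the --endpoint-url flag is no longer needed. A minimal sketch, assuming the nexstorage profile created above; the flag shown earlier always works regardless of version:
# Persist the endpoint in ~/.aws/config for the nexstorage profile
aws configure set endpoint_url https://s3.nexstorage.nexvecta.com --profile nexstorage
# The endpoint override is now picked up automatically
aws s3 ls --profile nexstorage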
s3cmd
The s3cmd tool provides a simple way to interact with NexStorage:
1. Install s3cmd:
# For macOS
brew install s3cmd
# For Ubuntu/Debian
apt-get install s3cmd
# For Windows (using pip)
pip install s3cmd
2. Configure s3cmd with your NexStorage credentials:
s3cmd --configure
Use these settings:
- Access Key: Your NexStorage Access Key
- Secret Key: Your NexStorage Secret Key
- Default Region: us-east-1
- S3 Endpoint: s3.nexstorage.nexvecta.com
- DNS-style bucket+hostname: s3.nexstorage.nexvecta.com
- Use HTTPS: Yes
3. Save the configuration and start using s3cmd:
# List all buckets
s3cmd ls
# Create a bucket
s3cmd mb s3://my-new-bucket
# Upload a file
s3cmd put myfile.txt s3://my-bucket/
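s3cmd can also keep a local directory and a bucket in sync, which is handy for simple backups. A short sketch, using a hypothetical local-folder directory and my-bucket bucket:
# Mirror a local directory to NexStorage, removing remote files that were deleted locally
s3cmd sync --delete-removed local-folder/ s3://my-bucket/backup/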
MinIO Client (mc)
The MinIO Client works well with NexStorage due to the S3-compatible API:
1. Install the MinIO Client:
# For macOS
brew install minio/stable/mc
# For Linux
wget https://dl.min.io/client/mc/release/linux-amd64/mc
chmod +x mc
sudo mv mc /usr/local/bin/
# For Windows
# Download from https://dl.min.io/client/mc/release/windows-amd64/mc.exe
2. Configure a NexStorage alias:
mc alias set nexstorage https://s3.nexstorage.nexvecta.com YOUR_ACCESS_KEY YOUR_SECRET_KEY
3. Use the client with your NexStorage alias:
# List all buckets
mc ls nexstorage
# Create a bucket
mc mb nexstorage/my-new-bucket
# Upload a file
mc cp myfile.txt nexstorage/my-bucket/
# Synchronize a directory
mc mirror --watch local-folder nexstorage/my-bucket
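mc can also generate temporary download links (presigned URLs) for objects stored in NexStorage. A quick sketch, assuming myfile.txt was uploaded as above:
# Create a presigned download link that expires after 24 hours
mc share download --expire 24h nexstorage/my-bucket/myfile.txt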
Programming Languages
Python with boto3
The AWS SDK for Python (boto3) provides a powerful interface for interacting with NexStorage:
1. Install boto3:
pip install boto3
2. Create a NexStorage client:
import boto3
# Create a session with your credentials
session = boto3.session.Session()
# Create an S3 client
s3_client = session.client(
    's3',
    endpoint_url='https://s3.nexstorage.nexvecta.com',
    aws_access_key_id='YOUR_ACCESS_KEY',
    aws_secret_access_key='YOUR_SECRET_KEY'
)
# List all buckets
response = s3_client.list_buckets()
for bucket in response['Buckets']:
    print(f"Bucket: {bucket['Name']}")
# Create a new bucket
s3_client.create_bucket(Bucket='my-python-bucket')
# Upload a file
s3_client.upload_file('local-file.txt', 'my-python-bucket', 'remote-file.txt')
# Download a file
s3_client.download_file('my-python-bucket', 'remote-file.txt', 'downloaded-file.txt')
3. Use the resource-level interface for higher-level operations:
# Create an S3 resource
s3_resource = session.resource(
    's3',
    endpoint_url='https://s3.nexstorage.nexvecta.com',
    aws_access_key_id='YOUR_ACCESS_KEY',
    aws_secret_access_key='YOUR_SECRET_KEY'
)
# Copy all objects between buckets
source_bucket = s3_resource.Bucket('source-bucket')
destination_bucket = s3_resource.Bucket('destination-bucket')
for obj in source_bucket.objects.all():
    source = {'Bucket': 'source-bucket', 'Key': obj.key}
    destination_bucket.copy(source, obj.key)
JavaScript (Node.js)
Use the AWS SDK for JavaScript to integrate NexStorage with Node.js applications:
1. Install the AWS SDK:
npm install aws-sdk
2. Create an S3 client:
const AWS = require('aws-sdk');
// Configure the S3 client
const s3 = new AWS.S3({
  endpoint: 'https://s3.nexstorage.nexvecta.com',
  accessKeyId: 'YOUR_ACCESS_KEY',
  secretAccessKey: 'YOUR_SECRET_KEY',
  s3ForcePathStyle: true, // Required for NexStorage
  signatureVersion: 'v4'
});
// List all buckets
s3.listBuckets().promise()
  .then(data => {
    console.log('Buckets:');
    data.Buckets.forEach(bucket => {
      console.log(`- ${bucket.Name}`);
    });
  })
  .catch(err => {
    console.error('Error:', err);
  });
// Create a bucket
s3.createBucket({ Bucket: 'my-js-bucket' }).promise()
  .then(data => {
    console.log('Bucket created successfully');
  })
  .catch(err => {
    console.error('Error creating bucket:', err);
  });
// Upload a file
const fs = require('fs');
const uploadParams = {
  Bucket: 'my-js-bucket',
  Key: 'example.txt',
  Body: fs.createReadStream('local-file.txt')
};
s3.upload(uploadParams).promise()
  .then(data => {
    console.log('Upload success:', data.Location);
  })
  .catch(err => {
    console.error('Error uploading file:', err);
  });
3. Use with Express.js for file uploads:
const express = require('express');
const multer = require('multer');
const AWS = require('aws-sdk');
const app = express();
const upload = multer({ storage: multer.memoryStorage() });
// Configure S3 client
const s3 = new AWS.S3({
  endpoint: 'https://s3.nexstorage.nexvecta.com',
  accessKeyId: 'YOUR_ACCESS_KEY',
  secretAccessKey: 'YOUR_SECRET_KEY',
  s3ForcePathStyle: true,
  signatureVersion: 'v4'
});
app.post('/upload', upload.single('file'), (req, res) => {
  const params = {
    Bucket: 'my-app-uploads',
    Key: `${Date.now()}-${req.file.originalname}`,
    Body: req.file.buffer,
    ContentType: req.file.mimetype
  };
  s3.upload(params).promise()
    .then(data => {
      res.json({ success: true, url: data.Location });
    })
    .catch(err => {
      console.error(err);
      res.status(500).json({ success: false, error: err.message });
    });
});
app.listen(3000, () => {
  console.log('Server running on port 3000');
});
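With the server running, you can exercise the upload route from the command line. A minimal test, assuming a local file named local-file.txt and the my-app-uploads bucket from the example:
# Send a multipart upload to the Express endpoint (the form field name must be "file")
curl -F "file=@local-file.txt" http://localhost:3000/upload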
Java
Use the AWS SDK for Java to integrate NexStorage with Java applications:
1. Add the SDK dependency to your Maven project:
<dependency>
  <groupId>com.amazonaws</groupId>
  <artifactId>aws-java-sdk-s3</artifactId>
  <version>1.12.261</version>
</dependency>
2. Create an S3 client:
import com.amazonaws.auth.AWSStaticCredentialsProvider;
import com.amazonaws.auth.BasicAWSCredentials;
import com.amazonaws.client.builder.AwsClientBuilder.EndpointConfiguration;
import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.AmazonS3ClientBuilder;
import com.amazonaws.services.s3.model.Bucket;
import com.amazonaws.services.s3.model.PutObjectRequest;
import com.amazonaws.services.s3.model.S3Object;
import java.io.File;
import java.io.IOException;
import java.io.InputStream;
import java.util.List;
public class NexStorageExample {
    public static void main(String[] args) throws IOException {
        // Set up credentials
        BasicAWSCredentials credentials = new BasicAWSCredentials(
            "YOUR_ACCESS_KEY",
            "YOUR_SECRET_KEY"
        );
        // Create endpoint configuration
        EndpointConfiguration endpointConfig = new EndpointConfiguration(
            "https://s3.nexstorage.nexvecta.com",
            "us-east-1"
        );
        // Create S3 client
        AmazonS3 s3Client = AmazonS3ClientBuilder.standard()
            .withEndpointConfiguration(endpointConfig)
            .withCredentials(new AWSStaticCredentialsProvider(credentials))
            .withPathStyleAccessEnabled(true)
            .build();
        // List all buckets
        List<Bucket> buckets = s3Client.listBuckets();
        System.out.println("Your NexStorage buckets are:");
        for (Bucket bucket : buckets) {
            System.out.println("- " + bucket.getName());
        }
        // Create a bucket
        s3Client.createBucket("my-java-bucket");
        // Upload a file
        s3Client.putObject(
            new PutObjectRequest(
                "my-java-bucket",
                "example.txt",
                new File("local-file.txt")
            )
        );
        // Download a file
        S3Object object = s3Client.getObject("my-java-bucket", "example.txt");
        InputStream content = object.getObjectContent();
        // Process the input stream as needed, then close it
        content.close();
    }
}
3. For Spring Boot applications:
@Service
public class StorageService {
    private final AmazonS3 s3Client;
    public StorageService(
        @Value("${nexstorage.access-key}") String accessKey,
        @Value("${nexstorage.secret-key}") String secretKey,
        @Value("${nexstorage.endpoint}") String endpoint
    ) {
        BasicAWSCredentials credentials = new BasicAWSCredentials(accessKey, secretKey);
        EndpointConfiguration endpointConfig = new EndpointConfiguration(endpoint, "us-east-1");
        this.s3Client = AmazonS3ClientBuilder.standard()
            .withEndpointConfiguration(endpointConfig)
            .withCredentials(new AWSStaticCredentialsProvider(credentials))
            .withPathStyleAccessEnabled(true)
            .build();
    }
    public void uploadFile(String bucketName, String key, InputStream fileStream) {
        ObjectMetadata metadata = new ObjectMetadata();
        s3Client.putObject(bucketName, key, fileStream, metadata);
    }
    public S3Object downloadFile(String bucketName, String key) {
        return s3Client.getObject(bucketName, key);
    }
}
Analytics and BI Tools
Apache Spark
Connect Apache Spark to NexStorage to process large datasets:
import org.apache.spark.sql.SparkSession
val spark = SparkSession.builder()
  .appName("NexStorage Integration")
  .master("local[*]")
  .getOrCreate()
// Configure Hadoop AWS settings
spark.sparkContext.hadoopConfiguration.set("fs.s3a.endpoint", "s3.nexstorage.nexvecta.com")
spark.sparkContext.hadoopConfiguration.set("fs.s3a.access.key", "YOUR_ACCESS_KEY")
spark.sparkContext.hadoopConfiguration.set("fs.s3a.secret.key", "YOUR_SECRET_KEY")
spark.sparkContext.hadoopConfiguration.set("fs.s3a.path.style.access", "true")
spark.sparkContext.hadoopConfiguration.set("fs.s3a.impl", "org.apache.hadoop.fs.s3a.S3AFileSystem")
// Read data from NexStorage
val df = spark.read.csv("s3a://my-data-bucket/path/to/data.csv")
// Process the data
val resultDf = df.filter(df("column") > 100).groupBy("category").count()
// Write results back to NexStorage
resultDf.write.parquet("s3a://my-results-bucket/path/to/results")
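The s3a:// scheme requires the Hadoop AWS connector on the Spark classpath. One way to pull it in is the --packages flag when launching Spark; a sketch, assuming a Spark build against Hadoop 3.3.x (match the hadoop-aws version to your Hadoop version):
# Launch a Spark shell with the S3A connector and its bundled AWS SDK dependency
spark-shell --packages org.apache.hadoop:hadoop-aws:3.3.4
# The same flag works for spark-submit
spark-submit --packages org.apache.hadoop:hadoop-aws:3.3.4 my-job.jar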
Tableau
Connect Tableau to NexStorage using the AWS S3 connector:
1. In Tableau Desktop, select Connect > To a Server > More... > Amazon S3
2. Configure the connection:
- Authentication: AWS Access Key
- Access Key ID: Your NexStorage Access Key
- Secret Access Key: Your NexStorage Secret Key
- Default Region: us-east-1
- S3 Endpoint Override: s3.nexstorage.nexvecta.com
3. Click Sign In and select the bucket containing your data
4. Choose the file format (CSV, Excel, etc.) and connect to your data
Power BI
Connect Microsoft Power BI to NexStorage:
1. In Power BI Desktop, select Get Data > More... > Azure > Azure Blob Storage
2. Enter the NexStorage endpoint URL:
https://s3.nexstorage.nexvecta.com
3. Select Account Key as the authentication method and enter your NexStorage credentials
4. Click Connect and navigate to your data
Backup and Archiving Solutions
Veeam Backup & Replication
Configure Veeam to use NexStorage as a backup repository:
1. In the Veeam Backup & Replication console, navigate to Backup Infrastructure > Backup Repositories
2. Right-click and select Add Backup Repository
3. Select Object Storage > S3 Compatible
4. Configure the repository:
- Service Point: s3.nexstorage.nexvecta.com
- Region: us-east-1
- Access Key: Your NexStorage Access Key
- Secret Key: Your NexStorage Secret Key
- Enable S3 Path Style Access: Yes
- Bucket: Your backup bucket name
5. Complete the wizard and start using NexStorage for backups
Duplicati
Configure Duplicati to back up your data to NexStorage:
1. In Duplicati, click Add backup
2. Select Configure a new backup
3. Set a name and encryption options for your backup
4. For storage type, select S3 Compatible
5. Configure the connection:
- Server: s3.nexstorage.nexvecta.com
- Port: 443
- Use SSL: Yes
- Bucket name: Your backup bucket
- Storage region: us-east-1
- Storage class: Standard
- Access ID: Your NexStorage Access Key
- Access Key: Your NexStorage Secret Key
- Use S3 Path Style: Yes
6. Complete the setup by selecting what to back up and scheduling options
Continuous Integration/Deployment
GitHub Actions
Store build artifacts in NexStorage using GitHub Actions:
name: Build and Store Artifacts
on:
  push:
    branches: [ main ]
jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
      - name: Set up Java
        uses: actions/setup-java@v2
        with:
          java-version: '11'
          distribution: 'adopt'
      - name: Build with Maven
        run: mvn package
      - name: Upload to NexStorage
        uses: shallwefootball/s3-upload-action@master
        with:
          aws_key_id: ${{ secrets.NEXSTORAGE_ACCESS_KEY }}
          aws_secret_access_key: ${{ secrets.NEXSTORAGE_SECRET_KEY }}
          aws_bucket: 'ci-artifacts'
          source_dir: 'target'
          destination_dir: 'builds/${{ github.sha }}'
          endpoint: 'https://s3.nexstorage.nexvecta.com'
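If you would rather not depend on a third-party action, the same upload can be done with the AWS CLI (preinstalled on GitHub-hosted Ubuntu runners) in a plain run step. A sketch, assuming the workflow exports the secrets as AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY environment variables:
# Upload the build output with the AWS CLI; credentials are read from the environment
aws s3 cp target/ "s3://ci-artifacts/builds/${GITHUB_SHA}/" \
  --recursive \
  --endpoint-url https://s3.nexstorage.nexvecta.com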
Jenkins
Configure Jenkins to store build artifacts in NexStorage:
pipeline {
    agent any
    environment {
        NEXSTORAGE_CREDS = credentials('nexstorage')
    }
    stages {
        stage('Build') {
            steps {
                sh 'mvn package'
            }
        }
        stage('Upload Artifacts') {
            steps {
                sh '''
                    export AWS_ACCESS_KEY_ID=${NEXSTORAGE_CREDS_USR}
                    export AWS_SECRET_ACCESS_KEY=${NEXSTORAGE_CREDS_PSW}
                    aws s3 cp target/ s3://jenkins-artifacts/builds/${BUILD_NUMBER}/ \
                        --recursive --exclude "*" --include "*.jar" \
                        --endpoint-url https://s3.nexstorage.nexvecta.com
                '''
            }
        }
    }
}
Content Management Systems
WordPress
Configure WordPress to store media files in NexStorage:
1. Install the WP Offload Media plugin
2. Configure the plugin with your NexStorage credentials:
- Provider: Custom
- Access Key: Your NexStorage Access Key
- Secret Key: Your NexStorage Secret Key
- Region: us-east-1
- Bucket: Your media bucket
- Custom Endpoint: s3.nexstorage.nexvecta.com
- Force Path Style: Yes
3. Save settings and start uploading media to NexStorage
Drupal
Use the S3 File System module to store Drupal files in NexStorage:
1. Install the S3 File System module:
composer require drupal/s3fs
drush en s3fs
2. Configure the module at /admin/config/media/s3fs:
- S3 Access Key: Your NexStorage Access Key
- S3 Secret Key: Your NexStorage Secret Key
- S3 Bucket: Your media bucket
- S3 Region: us-east-1
- Custom S3 Endpoint: s3.nexstorage.nexvecta.com
- Use Path Style API Endpoints: Yes
3. Save the configuration and run:
drush s3fs-refresh-cache
Big Data Ecosystems
Hadoop
Configure Hadoop to use NexStorage as a data source and sink:
1. Add the following to your core-site.xml:
<property>
  <name>fs.s3a.endpoint</name>
  <value>s3.nexstorage.nexvecta.com</value>
</property>
<property>
  <name>fs.s3a.access.key</name>
  <value>YOUR_ACCESS_KEY</value>
</property>
<property>
  <name>fs.s3a.secret.key</name>
  <value>YOUR_SECRET_KEY</value>
</property>
<property>
  <name>fs.s3a.path.style.access</name>
  <value>true</value>
</property>
<property>
  <name>fs.s3a.impl</name>
  <value>org.apache.hadoop.fs.s3a.S3AFileSystem</value>
</property>
2. Use the S3A filesystem in your Hadoop jobs:
hadoop fs -ls s3a://my-data-bucket/
# Run a MapReduce job
hadoop jar hadoop-mapreduce-examples-3.3.1.jar wordcount \
s3a://my-data-bucket/input s3a://my-data-bucket/output
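DistCp also works against the S3A filesystem, which is useful for bulk copies between HDFS and NexStorage. A sketch, using hypothetical paths:
# Copy a directory tree from HDFS into a NexStorage bucket
hadoop distcp /data/events s3a://my-data-bucket/events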
Apache Kafka Connect
Use the Kafka S3 Sink Connector to store Kafka data in NexStorage:
{
  "name": "s3-sink",
  "config": {
    "connector.class": "io.confluent.connect.s3.S3SinkConnector",
    "tasks.max": "1",
    "topics": "my-topic",
    "s3.region": "us-east-1",
    "s3.bucket.name": "kafka-data",
    "s3.part.size": "5242880",
    "flush.size": "1000",
    "storage.class": "io.confluent.connect.s3.storage.S3Storage",
    "format.class": "io.confluent.connect.s3.format.json.JsonFormat",
    "partitioner.class": "io.confluent.connect.storage.partitioner.DefaultPartitioner",
    "schema.compatibility": "NONE",
    "s3.credentials.provider.class": "com.amazonaws.auth.DefaultAWSCredentialsProviderChain",
    "aws.access.key.id": "YOUR_ACCESS_KEY",
    "aws.secret.access.key": "YOUR_SECRET_KEY",
    "s3.endpoint": "https://s3.nexstorage.nexvecta.com",
    "s3.path.style.access": "true"
  }
}
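To activate the connector, submit this configuration to the Kafka Connect REST API. A sketch, assuming the JSON above is saved as s3-sink.json and a Connect worker is listening on localhost:8083:
# Register the S3 sink connector with the Connect worker
curl -X POST -H "Content-Type: application/json" \
  --data @s3-sink.json \
  http://localhost:8083/connectors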
Next Steps
Now that you've learned how to integrate NexStorage with various tools and platforms, you might want to explore: