Force-pushed from d990e6f to ded9e89.
Just pushed a rebase. I had to solve some merge conflicts, mostly because of ….
Marking this ready for review. Both …. The signature of …. Note that this PR by itself does not remove all instances of …. Also note that …. As is, this PR should address #26 fully.
Unsure why this would get partial test failures. E.g. Backblaze having 1 more file in the bucket than expected, while all other tests pass. You might want to run this locally with your secrets to see if you notice anything @good-lly.
Somehow fixing the JSDoc made the tests go green? I guess that means some of the tests might be flaky, rather than some bug having snuck in? |
It was not the JSDoc change, just a re-run of the tests … that's something that happens after failed tests on some endpoints. Backblaze in particular seems lazy when it comes to deleting objects, and its cache makes them reappear on the next listing …
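One way test code could tolerate this kind of eventually consistent listing is to poll until the deleted key stops showing up. A hypothetical helper sketch, not part of the repository: `client.listObjects('/')` mirrors the s3mini-style client used elsewhere in this thread, and the retry counts are illustrative:

```javascript
// Hypothetical test helper: poll a bucket listing until a deleted key stops
// appearing, to tolerate eventually consistent backends such as Backblaze.
// `client` is assumed to expose listObjects('/') returning an array of
// objects with a Key property (or null for an empty bucket).
const waitForDeletion = async (client, key, { tries = 5, delayMs = 1000 } = {}) => {
  for (let attempt = 0; attempt < tries; attempt++) {
    const objects = (await client.listObjects('/')) ?? [];
    if (!objects.some(o => o.Key === key)) return true; // gone from the listing
    await new Promise(resolve => setTimeout(resolve, delayMs));
  }
  return false; // still listed after all retries
};
```

Whether this belongs in the shared test setup or just around the Backblaze endpoint is a judgment call; it only hides the laziness, it does not fix it.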
Great job @Zegnat and thank you for the contribution! Btw: have you tried to run `test:perf`? I am not able to make ….
@good-lly mmm, a bit odd, I am uncertain why it worked for me when I created the PR with the screenshots. But the fix seems to be this:

```diff
diff --git a/tests/perf/performance.test.js b/tests/perf/performance.test.js
index ca9f0a3..242c71e 100644
--- a/tests/perf/performance.test.js
+++ b/tests/perf/performance.test.js
@@ -33,9 +33,9 @@ const BUCKET_NAME = ENDPOINT.split('/')[3];
 const BUCKET = BUCKET_NAME || 'core-s3-dev-local';
 
 const SIZES = {
-  small: { key: 'bench-1MiB' + now, buf: randomBytes(1 * 1024 * 1024) },
-  medium: { key: 'bench-8MiB' + now, buf: randomBytes(8 * 1024 * 1024) },
-  large: { key: 'bench-100MiB' + now, buf: randomBytes(100 * 1024 * 1024) },
+  small: { key: 'bench-1MiB' + now.getTime(), buf: randomBytes(1 * 1024 * 1024) },
+  medium: { key: 'bench-8MiB' + now.getTime(), buf: randomBytes(8 * 1024 * 1024) },
+  large: { key: 'bench-100MiB' + now.getTime(), buf: randomBytes(100 * 1024 * 1024) },
 };
 
 const collectStreamHelper = async source => {
@@ -153,7 +153,7 @@ const ensureBucket = async adapter => {
 const runSuite = async () => {
   console.log(`Running performance tests against bucket "${BUCKET}"`);
 
-  const sdks = [makeAws(), makeMinio(), makeS3mini()];
+  const sdks = [makeAws(), makeMinio(), makeS3mini(), await makeAwsLite()];
 
   for (const [label, { key, buf }] of Object.entries(SIZES)) {
     const bench = new Bench({ iterations: 10, time: 0 });
```

It looks like there is something in the string representation of the Date object that is tripping up aws-lite.
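The failure mode is easy to see in isolation: concatenating a `Date` coerces it through `toString()`, which puts spaces, colons, parentheses, and a timezone offset into the object key, while `getTime()` yields digits only. A minimal sketch (not library code):

```javascript
const now = new Date();

// String coercion of a Date embeds spaces, colons, and parentheses in the
// key, e.g. "bench-1MiBMon May 05 2025 12:34:56 GMT+0200 (…)". Presumably
// some SDKs sign or URL-encode such keys differently.
const dateKey = 'bench-1MiB' + now;

// The epoch timestamp is digits only, so the key round-trips cleanly:
const epochKey = 'bench-1MiB' + now.getTime();

console.log(/^bench-1MiB\d+$/.test(epochKey)); // true
console.log(/[ :]/.test(dateKey));             // true: whitespace and colons in the key
```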
Running the benchmarks. I am a little surprised. First my results, then exactly how you could reproduce it yourself.

We should build a before and after version of S3Mini, with (ddb7a1d) and without (9600168) this Web Crypto PR. We will then run the performance tests using those two builds, but using the performance script from the current latest dev (6782a97).

Prepare the builds:

```shell
git checkout ddb7a1d
npm ci
npm run build
mv dist/s3mini.min.js tests/perf/s3mini.after.min.js

git checkout 9600168
npm ci
npm run build
mv dist/s3mini.min.js tests/perf/s3mini.before.min.js

git checkout dev
npm ci
npm run build
```

Then apply this patch:

```diff
diff --git a/tests/perf/performance.test.js b/tests/perf/performance.test.js
index ca9f0a3..ea7f13b 100644
--- a/tests/perf/performance.test.js
+++ b/tests/perf/performance.test.js
@@ -9,6 +9,8 @@ import {
 import * as Minio from 'minio';
 import awsLite from '@aws-lite/client';
 import { S3mini } from '../../dist/s3mini.min.js';
+import { S3mini as S3miniBefore } from './s3mini.before.min.js';
+import { S3mini as S3miniAfter } from './s3mini.after.min.js';
 import { Bench } from 'tinybench';
 import { printTable } from 'console-table-printer';
 import { randomBytes, randomUUID } from 'node:crypto';
@@ -33,9 +35,9 @@ const BUCKET_NAME = ENDPOINT.split('/')[3];
 const BUCKET = BUCKET_NAME || 'core-s3-dev-local';
 
 const SIZES = {
-  small: { key: 'bench-1MiB' + now, buf: randomBytes(1 * 1024 * 1024) },
-  medium: { key: 'bench-8MiB' + now, buf: randomBytes(8 * 1024 * 1024) },
-  large: { key: 'bench-100MiB' + now, buf: randomBytes(100 * 1024 * 1024) },
+  small: { key: 'bench-1MiB' + now.getTime(), buf: randomBytes(1 * 1024 * 1024) },
+  medium: { key: 'bench-8MiB' + now.getTime(), buf: randomBytes(8 * 1024 * 1024) },
+  large: { key: 'bench-100MiB' + now.getTime(), buf: randomBytes(100 * 1024 * 1024) },
 };
 
 const collectStreamHelper = async source => {
@@ -117,15 +119,15 @@ const makeAwsLite = async () => {
   };
 };
 
-const makeS3mini = () => {
-  const client = new S3mini({
+const makeS3mini = (suffix, version) => {
+  const client = new version({
     accessKeyId: ACCESS_KEY,
     secretAccessKey: SECRET_KEY,
     endpoint: ENDPOINT,
     region: 'auto',
   });
   return {
-    name: 's3mini',
+    name: `s3mini ${suffix}`,
     get: k => client.getObject(k),
     put: (k, b) => client.putObject(k, b),
     list: () => client.listObjects('/'),
@@ -153,7 +155,7 @@ const ensureBucket = async adapter => {
 const runSuite = async () => {
   console.log(`Running performance tests against bucket "${BUCKET}"`);
 
-  const sdks = [makeAws(), makeMinio(), await makeAwsLite(), makeS3mini('before', S3miniBefore), makeS3mini('after', S3miniAfter)];
+  const sdks = [makeAws(), makeMinio(), await makeAwsLite(), makeS3mini('before', S3miniBefore), makeS3mini('after', S3miniAfter)];
 
 for (const [label, { key, buf }] of Object.entries(SIZES)) {
     const bench = new Bench({ iterations: 10, time: 0 });
```

Then run the tests:

```shell
npm run test:perf
```

(For what it is worth, I think I preferred the extra styling from @monstermann/tinybench-pretty-printer over plain console-table-printer.)
Marking this as a draft for now.
Fixes #26.
This was originally branched off of a018fff and not yet rebased. I should be able to get to that tomorrow. Just wanted to put the work out here, because I saw changes to `dev` that were made in preparation for this.

Note that the Web Crypto API is all asynchronous. This may have some minor performance implications for the library, as awaits are known to have those. Probably no difference on the macro scale though.

Also note that not every `Buffer` can be removed, as Web Crypto uses `ArrayBuffer` and the currently used `Buffer` is a Node.js class rather than something from ECMAScript. This means platforms like Cloudflare Workers will still need Node.js compatibility turned on even though the dependency on `node:crypto` is gone.