
postpack is not called after yarn pack #7924

Closed
thoov opened this issue Feb 21, 2020 · 3 comments
Labels
fixed-in-modern This issue has been fixed / implemented in Yarn 2+.

Comments

@thoov
Contributor

thoov commented Feb 21, 2020

Bug description

`postpack` is not called after `yarn pack` has completed. Also, the success message about creating the tarball during `yarn pack` no longer shows up. This only happens on Node 12.16.*.

What is the current behavior?

When executing the command `yarn pack`, yarn invokes the user-defined script `prepack` before packing but does not invoke `postpack`.

What is the expected behavior?

When executing the command `yarn pack`, yarn should invoke the user-defined script `prepack` before packing and `postpack` after.

Steps to Reproduce

  1. Create a blank project (yarn init)
  2. Add the following to package.json
"scripts": {
  "prepack": "echo \"Hello\"",
  "postpack": "echo \"World\""
}
  3. Run yarn pack

Expected Output:

yarn pack v1.22.0
$ echo "Hello"
Hello
$ echo "World"
World
success Wrote tarball to "/private/tmp/yarn-test/yarn-test-v1.0.0.tgz".
✨  Done in 0.06s.

Actual Output:

yarn pack v1.22.0
$ echo "Hello"
Hello

Environment

  • Node Version: 12.16.0 & 12.16.1
  • Yarn v1 Version: 1.22.0
  • OS and version: OSX 10.15.3
@thoov
Contributor Author

thoov commented Feb 21, 2020

cc @haochuan

@stefanpenner
Contributor

This issue appears to continue into node@^14

stefanpenner added a commit to stefanpenner/yarn that referenced this issue Jun 15, 2020
stefanpenner added a commit to stefanpenner/yarn that referenced this issue Jun 16, 2020
In some versions of Node.js (such as 12.16.2), `yarn pack` was no longer running the `postpack` hook. Debugging the code in question (following example) led us to notice that the following await never progressed: the stream's `error` and `close` handlers were never invoked. In fact, the process appeared to exit prematurely.

```js
// src/cli/commands/pack.js

await new Promise((resolve, reject) => {
  stream.pipe(fs2.createWriteStream(filename));
  stream.on('error', reject); // reject is never invoked
  stream.on('close', resolve); // resolve is never invoked
  // reached
});

// never reached
```

What is going on here?

As it turns out, the above stream code is unsafe and only appeared to function due to a specific implementation detail of `zlib.Gzip`. Once that implementation detail changed in Node.js, the process would exit while awaiting the above promise, leaving the code that triggers the `postpack` hook unreachable.

The facts:

1. A Node.js process exits once its event queue has been drained and it holds no outstanding referenced handles (such as I/O or timers).
2. `stream.pipe(…)` does not add a task to the event queue.
3. `new Promise(…)` does not add a task to the event queue.
4. Prior to Node 12.16, an implementation detail of `zlib.Gzip` added a task to the event queue.
5. nodejs/node@0e89b64 changed that behavior (confirmed via bisect).
6. In Node 12.16 (and several other versions), `yarn pack` would exit prior to invoking the `postpack` hook.
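Facts 1 through 3 can be demonstrated in isolation with a short sketch (hypothetical, not taken from the yarn source): awaiting a promise that never schedules work on the event queue, and holds no handles, does not keep the process alive.

```javascript
// Sketch of facts 1-3: a pending promise alone does not keep Node.js running.
let reached = false;

async function main() {
  // No timer, socket, or stream handle is created here, so nothing keeps
  // the event loop alive once the synchronous code finishes.
  await new Promise(() => {});
  reached = true; // never reached
}

main();

process.on('exit', code => {
  // Fires as the event loop drains; the await above is still pending,
  // yet the process exits cleanly.
  console.log(`reached=${reached} code=${code}`);
});
```

This is exactly the shape of the `yarn pack` bug: once `zlib.Gzip` stopped putting a task on the event queue, nothing kept the process alive, and it exited before `resolve` could ever run.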

Mystery solved!

That's a lot going on, so how can one safely use streams?

Luckily, Node.js has a newer utility (node >= 10): `const { pipeline } = require('stream');`. This higher-level utility has fewer caveats than `stream.pipe` / `stream.on(…)`, and Node.js's own documentation has been reworked to use `pipeline` for all but the most specific low-level operations.

In short, rather than:
```js
stream.pipe(otherStream);
```

under most circumstances it is likely wise to use:
```js
const { pipeline } = require('stream');

pipeline(stream, otherStream, err => {
  // node-style callback
});
```

And if you are using streams with promises, consider first promisifying `pipeline`:

```js
const { promisify } = require('util');
const pipeline = promisify(require('stream').pipeline);
```
@merceyz
Member

merceyz commented Jan 2, 2021

@merceyz merceyz closed this as completed Jan 2, 2021
@paul-soporan paul-soporan added the fixed-in-modern This issue has been fixed / implemented in Yarn 2+. label Jan 2, 2021
copybara-service bot pushed a commit to google/safevalues that referenced this issue Jan 27, 2021
…ight place in the package

postpack doesn't get invoked by yarn, but this has been fixed in the latest version: yarnpkg/yarn#7924

PiperOrigin-RevId: 354042434
copybara-service bot pushed a commit to google/safevalues that referenced this issue Jan 27, 2021
…ight place in the package

postpack doesn't get invoked by yarn, but this has been fixed in the latest version: yarnpkg/yarn#7924

PiperOrigin-RevId: 354055453

4 participants