build: build v8 with -fvisibility=hidden -fvisibility-inlines-hidden #56290
Conversation
FWIW changing compile/link options will invalidate any stored CCACHE entries, which would result in more of V8 having to be recompiled. Generally most of the x64 CI machines in Jenkins are now 4 GB RAM + swap (2 GB). Anything running on a machine with
In case anyone is curious, here is a bar chart I generated with ChatGPT about the busiest hours of the CI, using data from https://ci.nodejs.org/job/node-test-pull-request/api/json?tree=builds[timestamp,url,result] (past 100 jobs)
Dem 8 hour workday Europeans kicking off a quick CI run right before dinner, huh? That's a cool graph!
Force-pushed from 3752a7e to ad5b353.
(Unless you live in Spain, where 8PM is considered early for dinner.) In all seriousness, I think the spike is probably due to an overlap of working hours between Europe and the Americas; in any case 16-18 CET would be the time to avoid testing something that invalidates the ccache. I dumped the script to https://gist.github.com/joyeecheung/6006b7ee781a2a619021767e0c8f2c01 though I have no idea if it works locally.

Rebased and added -fvisibility-inlines-hidden. It seems the Node.js files are missing the flags too, though I am not yet sure if all the other headers are set up correctly for us to apply the flags directly without breaking addons, so I've commented that part out.
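(For reference, a minimal sketch of the kind of script that could produce such a per-hour count from that endpoint; the endpoint and its `tree` query are taken from the comment above, while the UTC bucketing and the text-based chart are assumptions rather than the contents of the gist.)

```python
# Rough sketch only: count node-test-pull-request builds per hour (UTC).
# The Jenkins endpoint and tree filter come from the discussion above;
# the bucketing and output format are illustrative, not the actual gist.
import json
import urllib.request
from collections import Counter
from datetime import datetime, timezone

URL = ("https://ci.nodejs.org/job/node-test-pull-request/api/json"
       "?tree=builds[timestamp,url,result]")

with urllib.request.urlopen(URL) as resp:
    builds = json.load(resp)["builds"]

# Jenkins timestamps are milliseconds since the epoch.
hours = Counter(
    datetime.fromtimestamp(b["timestamp"] / 1000, tz=timezone.utc).hour
    for b in builds
)

# Print a crude text bar chart of builds per hour.
for hour in range(24):
    print(f"{hour:02d}:00 UTC  {'#' * hours.get(hour, 0)}")
```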
Started a CI, though I noticed that test-orka-macos11-x64-1 had been offline for a couple of days, creating a very long osx build queue. I just cleaned it up and brought it online, so I'm not sure how well it will fare once this CI gets its turn in the osx jobs. If it doesn't look great in the build load, then not adding
Looks like the build is still timing out on some platforms, need to try again in a less busy hour... There was an error on AIX that doesn't show up on other platforms: https://ci.nodejs.org/job/node-test-commit-aix/55380/nodes=aix72-ppc64/testReport/junit/addons/heap-profiler/test/ AFAICT, it shouldn't be failing as
Seems strange that it worked 3 weeks ago but not in the recent run. Symbol exporting is different on AIX -- we have https://github.com/nodejs/node/blob/main/tools/create_expfile.sh to generate an exports file containing the symbols that addons use (see lines 2 to 6 at 529b56e).
We haven't touched it in five years -- I'm not even sure it's needed anymore since https://gcc.gnu.org/gcc-7/changes.html has...
@abmusse Is this something you can look into?
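(For anyone unfamiliar with the AIX side: the linker there does not use ELF-style visibility; it consumes an explicit exports file listing the symbols that may be bound to, which is what create_expfile.sh produces for addons. Below is a very rough illustration of that idea in Python; it is not the actual tools/create_expfile.sh, and the nm invocation, the symbol filtering, and the node.exp output name are simplified assumptions.)

```python
# Illustrative only, not tools/create_expfile.sh: shows the general idea of
# building an exports file by listing the global symbols defined in a set of
# object files and writing them, one per line, for the AIX linker to consume.
import subprocess
import sys

def collect_exported_symbols(objects):
    symbols = set()
    for obj in objects:
        # "nm -g" prints external symbols; typical lines are "address type name".
        out = subprocess.run(["nm", "-g", obj], capture_output=True,
                             text=True, check=True).stdout
        for line in out.splitlines():
            parts = line.split()
            # Keep defined symbols; skip undefined references ("U").
            if len(parts) >= 3 and parts[-2] != "U":
                symbols.add(parts[-1])
    return sorted(symbols)

if __name__ == "__main__":
    # "node.exp" is used here only as a placeholder output path.
    with open("node.exp", "w") as f:
        f.write("\n".join(collect_exported_symbols(sys.argv[1:])) + "\n")
```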
@joyeecheung Since this still occurs (https://ci.nodejs.org/job/node-test-commit-aix/55692/nodes=aix72-ppc64/testReport/junit/addons/heap-profiler/test/) please skip the setting on AIX. We (IBM/RH) will add it to the list of things to look at on AIX.
Force-pushed from ad5b353 to b6bf41b.
Force-pushed from b6bf41b to 94b8874.
Rebased this after I realized how big the binary has become on Linux recently. This currently gives

Trying to see if the CI cache fares better with the flag changes now...
        'V8_TLS_USED_IN_LIBRARY', # Enable V8_TLS_LIBRARY_MODE.
      ],
    }],
    ['OS=="mac"', {
Can this be moved up above V8_TLS_USED_IN_LIBRARY? That way it stays closer to the comments above, which are less relevant to V8_TLS_USED_IN_LIBRARY.
Would you mind if I leave it to a follow-up PR? Trying to get the cache of this into the CI first instead of retrying for flakes and making a mess of the cache in the CI...
Merged it to put the cache in; feel free to open another PR to move them closer.
Landed in 3437e1c
PR-URL: #56290
Reviewed-By: Richard Lau <[email protected]>
Reviewed-By: James M Snell <[email protected]>
This refactors node_test.go to add a new "nightly" test, that gets the latest nightly build of Node, runs a docker container, and then runs the test as normal. While creating this, I also discovered a real issue in our support for the v26 prerelease (slated for release in April): nodejs/node#56290 causes one of the symbols we rely on to no longer be exported, so we need to grab it a different way. This commit also fixes that issue.

Split from #56275 since it seems to cause gcc on some machines in the CI to time out or run out of memory. Trying to see if it's just a CI hiccup or if it's something that needs to be worked around.