Merged
18 changes: 16 additions & 2 deletions eng/pipelines/coreclr/superpmi-collect.yml
@@ -30,8 +30,7 @@ jobs:
jobTemplate: /eng/pipelines/common/build-coreclr-and-libraries-job.yml
buildConfig: checked
platforms:
# Linux tests are built on the OSX machines.
# - OSX_x64
- OSX_arm64
- Linux_arm
- Linux_arm64
- Linux_x64
@@ -46,6 +45,16 @@ jobs:
- Linux_x64
- windows_x64

# superpmi-collect-job that targets macOS/arm64 depends on coreclr binaries produced by the macOS/x64 job
- template: /eng/pipelines/common/platform-matrix.yml
parameters:
jobTemplate: /eng/pipelines/coreclr/templates/build-job.yml
buildConfig: checked
platforms:
- OSX_x64
jobParameters:
testGroup: outerloop

- template: /eng/pipelines/common/platform-matrix.yml
parameters:
jobTemplate: /eng/pipelines/common/templates/runtimes/build-test-job.yml
@@ -62,6 +71,7 @@ jobs:
platforms:
# Linux tests are built on the OSX machines.
# - OSX_x64
- OSX_arm64
- Linux_arm
- Linux_arm64
- Linux_x64
@@ -83,6 +93,7 @@ jobs:
platforms:
# Linux tests are built on the OSX machines.
# - OSX_x64
- OSX_arm64
- Linux_arm
- Linux_arm64
- Linux_x64
@@ -105,6 +116,7 @@ jobs:
platforms:
# Linux tests are built on the OSX machines.
# - OSX_x64
- OSX_arm64
- Linux_arm
- Linux_arm64
- Linux_x64
@@ -127,6 +139,7 @@ jobs:
platforms:
# Linux tests are built on the OSX machines.
# - OSX_x64
- OSX_arm64
- Linux_arm
- Linux_arm64
- Linux_x64
@@ -148,6 +161,7 @@ jobs:
platforms:
# Linux tests are built on the OSX machines.
# - OSX_x64
- OSX_arm64
#TODO: Need special handling of running "benchmark build" from inside TMP folder on helix machine.
# - Linux_arm
# - Linux_arm64
6 changes: 5 additions & 1 deletion eng/pipelines/coreclr/templates/run-superpmi-collect-job.yml
@@ -106,7 +106,7 @@ jobs:
steps:
- ${{ parameters.steps }}

- script: $(PythonScript) $(Build.SourcesDirectory)/src/coreclr/scripts/superpmi_collect_setup.py -source_directory $(Build.SourcesDirectory) -core_root_directory $(Core_Root_Dir) -arch $(archType) -mch_file_tag $(MchFileTag) -input_directory $(InputDirectory) -collection_name $(CollectionName) -collection_type $(CollectionType) -max_size 50 # size in MB
- script: $(PythonScript) $(Build.SourcesDirectory)/src/coreclr/scripts/superpmi_collect_setup.py -source_directory $(Build.SourcesDirectory) -core_root_directory $(Core_Root_Dir) -arch $(archType) -platform $(osGroup) -mch_file_tag $(MchFileTag) -input_directory $(InputDirectory) -collection_name $(CollectionName) -collection_type $(CollectionType) -max_size 50 # size in MB
displayName: ${{ format('SuperPMI setup ({0})', parameters.osGroup) }}

# Create required directories for merged mch collection and superpmi logs
@@ -159,6 +159,10 @@ jobs:
artifactName: 'SuperPMI_Collection_$(CollectionName)_$(CollectionType)_$(osGroup)$(osSubgroup)_$(archType)_$(buildConfig)'
displayName: ${{ format('Upload artifacts SuperPMI {0}-{1} collection', parameters.collectionName, parameters.collectionType) }}

# Ensure the Python azure-storage-blob package is installed before doing the upload.
- script: $(PipScript) install --user --upgrade pip && $(PipScript) install --user azure.storage.blob==12.5.0 --force-reinstall
displayName: Upgrade Pip to latest and install azure-storage-blob Python package

- script: $(PythonScript) $(Build.SourcesDirectory)/src/coreclr/scripts/superpmi.py upload -log_level DEBUG -arch $(archType) -build_type $(buildConfig) -mch_files $(MergedMchFileLocation)$(CollectionName).$(CollectionType).$(MchFileTag).mch -core_root $(Build.SourcesDirectory)/artifacts/bin/coreclr/$(osGroup).x64.$(buildConfigUpper)
displayName: ${{ format('Upload SuperPMI {0}-{1} collection to Azure Storage', parameters.collectionName, parameters.collectionType) }}
env:
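For reference, the upload step's `-mch_files` argument stitches several pipeline variables into one file name (`$(MergedMchFileLocation)$(CollectionName).$(CollectionType).$(MchFileTag).mch`). A sketch of that assembly with illustrative values (the concrete strings below are assumptions, not taken from the pipeline):

```python
# Illustrative values only; the real ones are pipeline variables.
collection_name = "libraries"
collection_type = "pmi"
mch_file_tag = "OSX.arm64.checked"

# Mirrors $(CollectionName).$(CollectionType).$(MchFileTag).mch
mch_file = "{}.{}.{}.mch".format(collection_name, collection_type, mch_file_tag)
print(mch_file)
```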
4 changes: 2 additions & 2 deletions eng/pipelines/coreclr/templates/superpmi-collect-job.yml
@@ -84,8 +84,8 @@ jobs:
- template: /eng/pipelines/common/download-artifact-step.yml
parameters:
unpackFolder: '$(Build.SourcesDirectory)/artifacts/tests/libraries_zipped/$(osGroup).$(archType).$(buildConfigUpper)'
artifactFileName: 'libraries_test_assets_${{ parameters.osGroup }}_x64_Release$(archiveExtension)'
artifactName: ${{ format('libraries_test_assets_{0}_x64_Release', parameters.osGroup) }}
artifactFileName: 'libraries_test_assets_${{ parameters.osGroup }}_$(archType)_Release$(archiveExtension)'
Contributor: good catch.

artifactName: ${{ format('libraries_test_assets_{0}_$(archType)_Release', parameters.osGroup) }}
displayName: 'generic libraries test artifacts'

# Unzip individual test projects
5 changes: 4 additions & 1 deletion src/coreclr/scripts/superpmi_benchmarks.py
@@ -17,7 +17,7 @@

import stat
from os import path
from os.path import isfile
from os.path import isfile, realpath
from shutil import copyfile
from coreclr_arguments import *
from jitutil import run_command, ChangeDir, TempDir
@@ -136,6 +136,9 @@ def build_and_run(coreclr_args, output_mch_name):
project_file = path.join(performance_directory, "src", "benchmarks", "micro", "MicroBenchmarks.csproj")
benchmarks_dll = path.join(artifacts_directory, "MicroBenchmarks.dll")

# Workaround https://github.com/dotnet/sdk/issues/23430
project_file = realpath(project_file)

if is_windows:
shim_name = "%JitName%"
corerun_exe = "CoreRun.exe"
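A minimal sketch of why the `realpath()` workaround above helps: dotnet/sdk issue 23430 mishandles project paths that travel through a symlink, so the script canonicalizes the `.csproj` path before handing it to the build. The directory and file names below are made up for illustration.

```python
import os
import tempfile

with tempfile.TemporaryDirectory() as tmp:
    # A real project directory plus a symlinked alias to it (hypothetical names).
    real_dir = os.path.join(tmp, "micro")
    os.mkdir(real_dir)
    project = os.path.join(real_dir, "MicroBenchmarks.csproj")
    open(project, "w").close()

    link = os.path.join(tmp, "alias")
    os.symlink(real_dir, link)
    aliased = os.path.join(link, "MicroBenchmarks.csproj")

    # Both spellings name the same file; realpath() collapses the alias.
    matches = os.path.realpath(aliased) == os.path.realpath(project)
    print(matches)
```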
31 changes: 21 additions & 10 deletions src/coreclr/scripts/superpmi_collect_setup.py
@@ -22,12 +22,12 @@
# 4. Lastly, it sets the pipeline variables.
#
# Below are the helix queues it sets depending on the OS/architecture:
# | Arch | windows | Linux |
# |-------|-------------------------|--------------------------------------------------------------------------------------------------------------------------------------|
# | x86 | Windows.10.Amd64.X86.Rt | |
# | x64 | Windows.10.Amd64.X86.Rt | Ubuntu.1804.Amd64 |
# | arm | - | (Ubuntu.1804.Arm32)[email protected]/dotnet-buildtools/prereqs:ubuntu-18.04-helix-arm32v7-bfcd90a-20200121150440 |
# | arm64 | Windows.10.Arm64 | (Ubuntu.1804.Arm64)[email protected]/dotnet-buildtools/prereqs:ubuntu-18.04-helix-arm64v8-20210531091519-97d8652 |
# | Arch | windows | Linux | macOS |
# |-------|-------------------------|--------------------------------------------------------------------------------------------------------------------------------------|----------------|
# | x86 | Windows.10.Amd64.X86.Rt | | - |
# | x64 | Windows.10.Amd64.X86.Rt | Ubuntu.1804.Amd64 | OSX.1014.Amd64 |
# | arm | - | (Ubuntu.1804.Arm32)[email protected]/dotnet-buildtools/prereqs:ubuntu-18.04-helix-arm32v7-bfcd90a-20200121150440 | - |
# | arm64 | Windows.10.Arm64 | (Ubuntu.1804.Arm64)[email protected]/dotnet-buildtools/prereqs:ubuntu-18.04-helix-arm64v8-20210531091519-97d8652 | OSX.1100.ARM64 |
#
################################################################################
################################################################################
@@ -47,6 +47,7 @@
parser.add_argument("-source_directory", help="path to source directory")
parser.add_argument("-core_root_directory", help="path to core_root directory")
parser.add_argument("-arch", help="Architecture")
parser.add_argument("-platform", help="OS platform")
parser.add_argument("-mch_file_tag", help="Tag to be used to mch files")
parser.add_argument("-collection_name", help="Name of the SPMI collection to be done (e.g., libraries, tests)")
parser.add_argument("-collection_type", help="Type of the SPMI collection to be done (crossgen, crossgen2, pmi)")
@@ -196,6 +197,11 @@ def setup_args(args):
lambda unused: True,
"Unable to set arch")

coreclr_args.verify(args,
"platform",
lambda unused: True,
"Unable to set platform")

coreclr_args.verify(args,
"mch_file_tag",
lambda unused: True,
@@ -383,28 +389,33 @@ def main(main_args):
superpmi_src_directory = os.path.join(source_directory, 'src', 'coreclr', 'scripts')
superpmi_dst_directory = os.path.join(correlation_payload_directory, "superpmi")
arch = coreclr_args.arch
platform_name = coreclr_args.platform.lower()
helix_source_prefix = "official"
creator = ""
ci = True
if is_windows:
if platform_name == "windows":
helix_queue = "Windows.10.Arm64" if arch == "arm64" else "Windows.10.Amd64.X86.Rt"
else:
elif platform_name == "linux":
if arch == "arm":
helix_queue = "(Ubuntu.1804.Arm32)[email protected]/dotnet-buildtools/prereqs:ubuntu-18.04-helix-arm32v7-bfcd90a-20200121150440"
elif arch == "arm64":
helix_queue = "(Ubuntu.1804.Arm64)[email protected]/dotnet-buildtools/prereqs:ubuntu-18.04-helix-arm64v8-20210531091519-97d8652"
else:
helix_queue = "Ubuntu.1804.Amd64"
elif platform_name == "osx":
Contributor: can you also update the comments at the top of the file?

helix_queue = "OSX.1100.ARM64" if arch == "arm64" else "OSX.1014.Amd64"
Contributor: You do have helix queue for OSX_x64 but you aren't running the collection. Is it intentional?

Author: Yes, I don't think the macOS/x64 and Linux/x64 collections will be that different. I added OSX.1014.Amd64 to the file for consistency. I can remove it if you want.


# create superpmi directory
print('Copying {} -> {}'.format(superpmi_src_directory, superpmi_dst_directory))
copy_directory(superpmi_src_directory, superpmi_dst_directory, verbose_output=True, match_func=lambda path: any(path.endswith(extension) for extension in [".py"]))

if is_windows:
if platform_name == "windows":
Contributor: nit: This can continue to be is_windows?

Author: Well, it could. But is_windows in fact means is_host_windows, while when preparing the artifacts to publish to Helix we actually mean is_target_windows. I thought it would make more sense to update the condition to reflect our intention.

Author: Decided to leave this as is for the reason I described.

acceptable_copy = lambda path: any(path.endswith(extension) for extension in [".py", ".dll", ".exe", ".json"])
else:
acceptable_extensions = [".py", ".dll", ".json"]
acceptable_extensions.append(".so" if platform_name == "linux" else ".dylib")
# Need to accept files without any extension, which is how executable file's names look.
acceptable_copy = lambda path: (os.path.basename(path).find(".") == -1) or any(path.endswith(extension) for extension in [".py", ".dll", ".so", ".json"])
acceptable_copy = lambda path: (os.path.basename(path).find(".") == -1) or any(path.endswith(extension) for extension in acceptable_extensions)

print('Copying {} -> {}'.format(coreclr_args.core_root_directory, superpmi_dst_directory))
copy_directory(coreclr_args.core_root_directory, superpmi_dst_directory, verbose_output=True, match_func=acceptable_copy)
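The refactored copy filter above can be sketched like this: every platform keeps `.py`/`.dll`/`.json`, Linux additionally keeps `.so`, non-Linux (macOS) keeps `.dylib`, and files without any extension pass through because that is how native executables are named.

```python
import os

def make_core_root_filter(platform_name):
    # Base extensions kept on every platform, plus the native-library suffix.
    acceptable_extensions = [".py", ".dll", ".json"]
    acceptable_extensions.append(".so" if platform_name == "linux" else ".dylib")
    # Extensionless basenames (native executables) are always accepted.
    return lambda p: (os.path.basename(p).find(".") == -1
                      or any(p.endswith(ext) for ext in acceptable_extensions))

osx_filter = make_core_root_filter("osx")
print(osx_filter("libcoreclr.dylib"), osx_filter("corerun"), osx_filter("libcoreclr.so"))
```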