Each Dockerfile instruction runs in its own intermediate container, so an environment variable exported by a RUN, COPY, or CMD instruction is not passed on to the next instruction's container. You need ENV to set an environment variable that persists across layers, but ENV cannot execute a command.
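A minimal shell sketch of the same point: each `sh -c` below stands in for a separate RUN instruction, and each gets a fresh environment (the variable name and demo string are only for illustration):

```shell
# First "RUN": exports a variable, which dies with this shell.
sh -c 'GCC_LINE="gcc (demo) 9.4.0"; export GCC_LINE; echo "first: $GCC_LINE"'

# Second "RUN": a fresh shell, so the variable is gone.
sh -c 'echo "second: ${GCC_LINE:-<unset>}"'   # prints: second: <unset>
```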
In Azure DevOps, you could use a self-hosted agent (running in Docker) to create a variable.
Here are the steps:
Step 1: Create a self-hosted agent running in Docker.
Step 2: In the build pipeline, run gcc --version | head -n 1
Here is a blog about creating a self-hosted agent running in Docker.
Update:
You could try adding a container resource to the Azure pipeline; then you can run the script on that container.
Here is a doc about this feature.
Here is the YAML example:
resources:
  containers:
  - container: python
    image: python:3.8

trigger:
- none

pool:
  vmImage: ubuntu-16.04

steps:
- script: |
    gcc --version | head -n 1
    echo "##vso[task.setvariable variable=test]$(gcc --version | head -n 1)"
  displayName: 'Run a multi-line script'
  target:
    container: python
    commands: restricted
- script: |
    echo "$(test)"
  displayName: 'Run a multi-line script'
- task: PowerShell@2
  inputs:
    targetType: 'inline'
    script: |
      $token = "PAT"
      $url="https://dev.azure.com/{Organization Name}/_apis/distributedtask/pools/{Pool Id}/agents/{AgentID}/usercapabilities?api-version=5.0"
      $token = [System.Convert]::ToBase64String([System.Text.Encoding]::ASCII.GetBytes(":$($token)"))
      $JSON = @'
      {
        "Gcc-version":"$(test)"
      }
      '@
      $response = Invoke-RestMethod -Uri $url -Headers @{Authorization = "Basic $token"} -Method PUT -Body $JSON -ContentType application/json
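For reference, `##vso[task.setvariable]` in the first step is just a specially formatted line on stdout that the agent parses. A sketch of building that line in plain shell (the fallback string is only for machines without gcc installed):

```shell
# Capture the first line of gcc's version output (fall back to a demo string).
gcc_line=$(gcc --version 2>/dev/null | head -n 1)
[ -n "$gcc_line" ] || gcc_line="gcc (demo) 9.4.0"

# Emitting this line is what makes the agent set the pipeline variable "test".
echo "##vso[task.setvariable variable=test]$gcc_line"
```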
Result:

The REST API is used to update the agent's capabilities.
Note: only the user-defined capabilities can be changed manually.
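The same PUT can be sketched with curl; the placeholders ({Organization Name}, {Pool Id}, {AgentID}), the PAT, and the demo version string are the same assumptions as in the PowerShell task above:

```shell
PAT="PAT"  # personal access token; Agent Pools (read & manage) scope assumed
AUTH=$(printf ':%s' "$PAT" | base64)

# PUT sends the supplied JSON as the agent's user capabilities.
curl -s -X PUT \
  -H "Authorization: Basic $AUTH" \
  -H "Content-Type: application/json" \
  -d '{"Gcc-version":"9.4.0"}' \
  "https://dev.azure.com/{Organization Name}/_apis/distributedtask/pools/{Pool Id}/agents/{AgentID}/usercapabilities?api-version=5.0"
```

Since the request body becomes the user-capability set, include any existing user capabilities you want to keep.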
On the other hand, you could still create a self-hosted agent running in Docker.
Then you could run the same script directly on that agent and get the tool version.