Azure CLI install: Homebrew or FU

Microsoft’s only official installer for the Azure CLI on macOS is Homebrew. That’s a strange position to take: there is no official package manager for macOS, and however much Microsoft and GitHub may like Homebrew, it shouldn’t be the only option. Microsoft does mention one alternative:

If you can’t use homebrew to install the Azure CLI in your environment, it’s possible to use the manual instructions for Linux. This process isn’t officially maintained to be compatible with macOS. Using a package manager such as Homebrew is always recommended.

https://learn.microsoft.com/en-us/cli/azure/install-azure-cli-macos#other-installation-methods

Unfortunately, those manual Linux instructions have not worked for years and don’t even work on Linux. It’s just an absurdly hostile position to take. I almost dropped down to an OrbStack Linux VM, but I gave it one more college try to fix Microsoft’s script to work with MacPorts, which is not difficult. You need to install python3, pip3, and py-virtualenv (which will be @20.26.6 as of this writing) as prerequisites, and then patch out Microsoft’s ham-fisted use of an ancient virtualenv 16.7.11 from July 20, 2021, downloaded without its dependencies!

Yes, Microsoft is not really trying because I was able to make this work pretty easily.

With MacPorts on macOS 15, install python312, py-pip, and py-virtualenv.
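
In practice that’s one command (MacPorts will pull in the py312 subports):

sudo port install python312 py-pip py-virtualenv

Then point the python3, pip3, and virtualenv selections at the 3.12 versions and download Microsoft’s install script: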

sudo port select --set pip3 pip312   
sudo port select --set python3 python312 
sudo port select --set python python312 
sudo port select --set virtualenv virtualenv312
curl -L -O https://aka.ms/InstallAzureCli 

Edit the InstallAzureCli script to comment out the last part, which runs the Python install script that the bash script downloads:

# python_cmd=python3
# if ! command -v python3 >/dev/null 2>&1
# then
#   if command -v python >/dev/null 2>&1
#   then
#     python_cmd=python
#   else
#     echo "ERROR: python3 or python not found."
#     echo "If python is available on the system, add it to PATH."
#     exit 1
#   fi
# fi
#
# chmod 775 $install_script
# echo "Running install script."
# $python_cmd $install_script < $_TTY

Edit install.py to change the create_virtualenv(tmp_dir, install_dir) function to use the system virtualenv instead of downloading its own:

def create_virtualenv(tmp_dir, install_dir):
    # download_location = os.path.join(tmp_dir, VIRTUALENV_ARCHIVE)
    # print_status('Downloading virtualenv package from {}.'.format(VIRTUALENV_DOWNLOAD_URL))
    # response = urlopen(VIRTUALENV_DOWNLOAD_URL)
    # with open(download_location, 'wb') as f: f.write(response.read())
    # print_status("Downloaded virtualenv package to {}.".format(download_location))
    # if is_valid_sha256sum(download_location, VIRTUALENV_ARCHIVE_SHA256):
    #     print_status("Checksum of {} OK.".format(download_location))
    # else:
    #     raise CLIInstallError("The checksum of the downloaded virtualenv package does not match.")
    # print_status("Extracting '{}' to '{}'.".format(download_location, tmp_dir))
    # package_tar = tarfile.open(download_location)
    # package_tar.extractall(path=tmp_dir)
    # package_tar.close()
    # virtualenv_dir_name = 'virtualenv-'+VIRTUALENV_VERSION
    # working_dir = os.path.join(tmp_dir, virtualenv_dir_name)
    # cmd = [sys.executable, 'virtualenv.py', '--python', sys.executable, install_dir]
    # exec_command(cmd, cwd=working_dir)
    cmd = ["virtualenv", "--python", sys.executable, install_dir]
    exec_command(cmd)
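
Before running the patched installer, it’s worth double-checking that port select took effect and the virtualenv on your PATH is the MacPorts one (you want the 20.x version, not 16.7.11):

command -v virtualenv
virtualenv --version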

Now the script will work:

python3 ./install.py
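
Assuming you accepted the installer’s default locations ($HOME/lib/azure-cli for the CLI and $HOME/bin for the az entry point), you can sanity-check the install:

~/bin/az --version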

If you are using zsh and you want completions, you need to turn on bashcompinit.

autoload -Uz bashcompinit
bashcompinit #bash completion support (required for az)

#enable command-line completion for az cli
if [ -f "$HOME"/lib/azure-cli/az.completion ]; then
  #requires bash autocompletion: bashcompinit
   . "$HOME"/lib/azure-cli/az.completion
fi

Backup Windows Server on EC2 with PowerShell and AWS CLI

When we first started working with Windows Server on Amazon EC2, we ran into the problem of how to back up SQL Server databases. My initial thought was to somehow mount S3 as a volume and have SQL Server write backup jobs to it. That turns out to be kind of hard.

After a little head-scratching, I realized that EBS volume snapshots are stored in S3. Therefore, all we need to do is mount an EBS volume, have SQL Server write backup jobs to it, and then snapshot that volume. Oh, and dismount it from the running Windows server before the snapshot to make sure we are taking clean snapshots.

It turns out not to be too hard to make this happen on Windows Server 2012 Core using a PowerShell script that drives the AWS CLI and does the hoodoo with the Windows volume manager to mount and dismount the EBS backup volume.
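
The full script is in the gist linked at the bottom, but the core of it is just a dismount-snapshot-remount cycle. Here is a minimal sketch of the idea, not the production script; the parameter names mirror the wrapper shown further down, Set-Disk does the offline/online dance, and the aws CLI takes the snapshot:

param(
    [int]$diskNumber,
    [string]$ec2VolumeID,
    [string]$description
)

# Take the backup disk offline so the file system is quiesced and the snapshot is clean
Set-Disk -Number $diskNumber -IsOffline $true

# Snapshot the EBS volume; EBS snapshots are stored in S3
aws ec2 create-snapshot --volume-id $ec2VolumeID --description $description

# Bring the disk back online for the next backup job
Set-Disk -Number $diskNumber -IsOffline $false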

This solution requires the PowerShell storage cmdlets, which ship with Windows Server 2012, and the Amazon Web Services Python-based command-line interface (AWS CLI). It works great on Windows Server Core.

The biggest problem I ran into was that the script ran just fine when invoked manually, or when I triggered the job from schtasks while logged in, but it would hang and fail when run as a scheduled task in the middle of the night. Long story short: when schtasks runs a job with nobody logged in, it supplies the .DEFAULT environment instead of the environment of the user context configured for the task. That meant the AWS CLI didn’t receive the correct %USERPROFILE% environment variable and could not locate its config file with the user id and key.

The simplest solution was to wrap the invocation of the PowerShell script in a cmd batch script:

set USERPROFILE=C:\Users\Administrator\

powershell.exe -file "C:\Program Files\Invoke-BackupJob.ps1" -path E:\ -diskNumber 2 -ec2VolumeID vol-<id-number-here> -description "Important Backup" > "C:\Program Files\Utility\Log.txt"

Use the PowerShell Get-Disk cmdlet to figure out the disk number of the EBS volume in Windows, and the EC2 console or the AWS CLI to figure out the EBS volume ID in EC2.
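
For example (the instance ID is a placeholder):

Get-Disk | Format-Table Number, FriendlyName, SerialNumber, Size

aws ec2 describe-volumes --filters Name=attachment.instance-id,Values=i-<your-instance-id>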

The full PowerShell script is in this gist: https://gist.github.com/breiter/7887782