Packaging AuthProxy

modmedcorpit
Level 1

There’s something funny with the way you go about “installing” the authproxy software in Linux. It’s funny because you basically build openssl and python from source and then copy a bunch of files over. Although that “should work” in most Linux environments, it’s not the way you’re supposed to install software at all. Most Linux distributions have package managers that handle integrity, re-installation, upgrades, removal, and so on.

We have successfully packaged duoauthproxy for RHEL7 and compatible distributions, and in the process of doing so we’ve identified some issues with the upstream code.

Patched code

It all starts with openssl, the first piece of software that gets built. Every Linux distribution already ships a copy of openssl, but you chose to ship, build, and install your own, which suggests that you either need a specific version or made some custom changes; who knows. Then cpython gets built. This appears to be a regular altinstall build using the custom openssl, BUT the resulting build is way smaller than an altinstall from the official python sources, which suggests that it is heavily customized.

What are those customizations, one might ask? Well, who knows; the source is provided as-is, so one has to trust Duo’s openssl and cpython expertise. Here’s a suggestion:

Could you provide the patches that transform the upstream code into your versions? You could make applying them part of your installation (download upstream + patch) or simply publish them somewhere else, so customers are able to audit the modifications. This is the main pain point in this regard: we’re running critical software (a cryptographic engine and a language interpreter) with modifications that potentially have very little QA/auditing done compared to the upstream projects as a whole (openssl & cpython). This also applies to the customized python modules that you use (not the ones you own, but the others).
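Just to illustrate the “download upstream + patch” idea, here’s a minimal sketch; the URL, checksum, and patch names would be whatever you pin, and nothing here reflects your actual build:

import hashlib
import subprocess
import urllib.request

def fetch_verify_patch(url, sha256, patches):
    # download the pristine upstream tarball and verify its checksum
    tarball, _ = urllib.request.urlretrieve(url)
    with open(tarball, "rb") as f:
        digest = hashlib.sha256(f.read()).hexdigest()
    if digest != sha256:
        raise RuntimeError("upstream tarball checksum mismatch")
    subprocess.run(["tar", "xzf", tarball], check=True)
    srcdir = url.rsplit("/", 1)[-1].replace(".tar.gz", "")
    for patch in patches:
        # every modification is a reviewable diff against pristine upstream
        subprocess.run(["patch", "-p1", "-d", srcdir, "-i", patch], check=True)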

Fake versions

One effect of this customization frenzy is that you end up with custom releases, which would suggest using a custom versioning scheme, but you didn’t go that far. The problem is that you “ship” custom releases under standard version numbers, which makes the life of anyone trying to identify your software a living hell. Did you modify the code? Is it really version x.y.z? Well, it depends. Sometimes it might be, sometimes it definitely isn’t. We already went over your openssl & cpython issues, and those might or might not be standard versions; the situation with the python packages is just madness. Here’s a simple example:

duoauthproxy version 5.1.1, and a couple of versions before it, includes ldaptor 19.1.0, or so you would think. The thing is that the Duo-shipped ldaptor includes an md4 submodule that doesn’t exist in the upstream version, which means that the version in duoauthproxy is not really 19.1.0 but something else.

Could you please use the version metadata properly? There are several ways to go about it: you can do the trivial .postX solution, use a custom +duoX local version modifier, or, better yet, work with the upstream project to merge the changes there. By “faking” version numbers you’re turning all your shipped packages into “special versions”, which are harder to audit and package.
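For what it’s worth, PEP 440 already covers both options; a hypothetical setup.py for the patched ldaptor fork could look like this:

from setuptools import setup, find_packages

setup(
    name="ldaptor",
    # the "+duo.1" local version identifier marks the fork explicitly, so
    # pip, "pip freeze", and auditors can see this is not pristine 19.1.0
    version="19.1.0+duo.1",  # or "19.1.0.post1" for the trivial variant
    packages=find_packages(),
)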

Wheels

Not sure why, but you decided that the best way to ship python code was to provide the source trees. Although it mostly works, this hasn’t been the preferred approach in the python world for a while now. For pure python modules (most of the ones you use) this is a no-brainer: you take version X and build a wheel for it, or you download the pre-built wheel from pypi, or you just let pip do it for you. For C-based modules it might be more complicated (it was only cryptography, correct?), but it is still better to build a wheel and then install it.

Keep in mind that a wheel still contains the source code (at least for pure python modules), so you probably don’t need to worry about GPL & Co., but it is way easier to manage than a source tree.
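As a rough sketch (the vendor paths are made up), your build step could produce a wheel for every vendored source tree once and install from those afterwards:

import subprocess
import sys

# build a wheel for each vendored source tree instead of shipping the trees
VENDORED = ["./vendor/ldaptor", "./vendor/netaddr"]  # illustrative paths

for srcdir in VENDORED:
    subprocess.run(
        [sys.executable, "-m", "pip", "wheel",
         "--no-deps", "--wheel-dir", "wheels/", srcdir],
        check=True,
    )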

Virtual Environments

This is the intended way to customize a python environment and also to deploy your app. You only need a requirements file, assuming you can pin the exact versions you need and provide the wheels for the non-public modules (the ones not on pypi); pip can also download and build modules when wheels are missing, if I’m not mistaken. You could even host a private index that pip could use to install your non-public modules (if, for whatever reason, you don’t want to publish them to pypi). With this method your app installer would boil down to a simple requirements file (plus your custom openssl & python, plus some OS scripts, maybe).
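A deployment sketch along those lines, assuming the wheels directory from the previous section and an illustrative install path:

import subprocess
import venv

VENV_DIR = "/opt/duoauthproxy/venv"  # illustrative install path

# create the venv with whatever interpreter runs this script (your custom
# python, presumably) and install only from the pinned requirements file
venv.create(VENV_DIR, with_pip=True)
subprocess.run(
    [f"{VENV_DIR}/bin/pip", "install",
     "--no-index", "--find-links", "wheels/",
     "-r", "requirements.txt"],
    check=True,
)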

Technical debt

There are some development decisions that are puzzling, to say the least. Examples:

user & group are hardcoded into every script that needs it

# --- install script will customize this ---
USER_DEFAULT = None
GROUP_DEFAULT = None
# ------------------------------------------

What if you need to change the user or group for whatever reason? Then you need to change the actual code. A better solution would be to leverage a configuration file and a parsing module like json or configparser, or at least make it an optional command-line parameter with a sensible default.
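A minimal sketch with configparser plus a command-line override; the config path and the fallback names are assumptions, not your actual defaults:

import argparse
import configparser

def load_user_group(conf_path="/etc/duoauthproxy/runtime.ini"):
    cfg = configparser.ConfigParser()
    cfg.read(conf_path)  # a missing file simply leaves the fallbacks in place
    parser = argparse.ArgumentParser()
    parser.add_argument(
        "--user",
        default=cfg.get("runtime", "user", fallback="duo_authproxy_svc"))
    parser.add_argument(
        "--group",
        default=cfg.get("runtime", "group", fallback="duo_authproxy_grp"))
    args = parser.parse_args()
    return args.user, args.group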

util.py

# Keep the home directory so we can find config files, etc.
_home_dir = ""

def _set_home_dir():...

# do this when the module loads, i.e. before anybody calls chdir()
_set_home_dir()

I can’t recall where or when, but I’m pretty sure there used to be a chdir() call in the root of the execution path in some module (this code is there to save _home_dir before the chdir frenzy). Using cwd and chdir is a thing in bash (and other shells), but there are better ways to handle paths in python, namely os.path and pathlib. Either of these two modules will let you work with absolute paths and their derivatives, which ignore the cwd altogether (and that’s a very good thing).

        # This local file is <AUTHPROXY_INSTALL_DIR>/usr/local/lib/python2.7/site-packages/duoauthproxy/lib/util.py.
        # Find <AUTHPROXY_INSTALL_DIR>.
        _home_dir = os.path.abspath(
            os.path.join(os.path.dirname(__file__), *[".."] * 7)
        )

Skipping over the outdated comment (how many versions ago did you migrate to python 3?), that right there is a hell of an assumption. This code expects your installer to create a non-standard tree where the directories are placed in a specific way. Isn’t the standard python virtual environment tree good enough? You could at least use a command-line parameter with this guess as the default (instead of burying it so deep in the code). BTW, the same assumption is made in the authproxyctl script.
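Something like this would keep the current guess as the default while letting packagers override it (the environment variable name is made up):

import os
from pathlib import Path

def find_home_dir():
    # explicit override first; the variable name is illustrative
    override = os.environ.get("AUTHPROXY_HOME")
    if override:
        return Path(override).resolve()
    # otherwise fall back to the historical directory-layout assumption,
    # i.e. seven levels up from the directory containing this file
    return Path(__file__).resolve().parents[7]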

Conclusion

Although we’re able to build an RPM out of your source code, there are several changes you could implement to make the lives of all packagers a little bit better. Thank you in advance.

1 Reply

Amy2
Level 5

Hi @modmedcorpit, thank you for taking the time to write out such detailed and thorough feedback. You’ve asked some interesting questions here. We’ve shared your comments with the internal teams responsible at Duo for their consideration.

If you or anyone else in the future would like to file an official feature request, you can do so by contacting Duo Support, or your account executive or customer success manager (CSM) if you are a Duo Care customer. Learn more here.
