
Upack install does not install dependencies



  • Greetings.

    I am currently evaluating whether Universal Packages are the right tool for our department. It looks promising because it is so incredibly simple. Yet I cannot get dependencies to work. I have two packages.

    Package A:

    { "name" : "PackageA", "version": "1.0.0" }
    

    Package B:

    { "name" : "PackageB", "version": "1.0.0", "dependencies": [ "PackageA:1.0.0" ] }
    

    I have uploaded them both to ProGet and they show up correctly and they also correctly list their dependencies.

    Yet when I run "upack.exe install", only the package itself is installed, not its dependencies. Is this intended? Is there a reason why they are not managed? If not, what are they for?

    I am using upack.exe 2.0.0.1.
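
    For reference, an install invocation of this kind looks roughly like the following (the feed URL and target path are just placeholders):

    upack.exe install PackageB 1.0.0 --source=https://proget.example.com/upack/MyFeed --target=C:\install\PackageB

    Only PackageB's own files end up in the target folder; nothing from PackageA is fetched.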



  • Dependencies are part of the Universal Package and Feed specification because they're important for package management in general.

    However, the upack client doesn't install them, because it's primarily a simplified reference implementation of a general package client.

    We'd love to hear more about your use case (what are you using universal packages for, and how would dependencies be helpful in that scenario?), because maybe it's something we can include in the upack tool.



  • My use case is the following:

    We have a single "lib" package that is deployed alongside our product. It is a collection of .NET assemblies, resources (ZIP, CSS, HTML), documentation, among other things.

    This "lib" package is really nothing more than a collection of dependencies, a versioned snapshot of all of the contained components.

    Each of these components is in turn a package itself. To give an example:

    • Lib v1.0 contains ... PackageA v1.0, PackageB v2.0, PackageC v1.5
    • Lib v2.0 contains ... PackageA v1.0, PackageB v2.1, PackageC v1.5

    Each of these component packages is a separate project with a separate repository and a separate build configuration. PackageA might be a NuGet package. PackageB might come from NPM. PackageC might be just a folder on our file system. It would therefore come in handy if each build process created a .upack file on build, ready to be consumed by the "lib" package. This way we would not have to repackage everything each time one of the components gets updated.

    If upack also installed dependencies, I would create the "lib" package as a completely empty package with only a list of dependencies that get installed when installing the "lib" package.
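
    To illustrate (names and versions simply mirror the Lib v2.0 example above, padded to full version numbers), the manifest of such an empty "lib" package would contain nothing but:

    {
      "name": "Lib",
      "version": "2.0.0",
      "dependencies": [ "PackageA:1.0.0", "PackageB:2.1.0", "PackageC:1.5.0" ]
    }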

    I hope I have not written too much confusing stuff. Any suggestions?



  • Ah... that's very interesting, thank you for sharing!

    So basically, you're using Universal Packages for dependencies at "build" time (not literally compiling/building, but when creating a product "build" that is to be tested through a pipeline). That's pretty cool to see; it makes a lot of sense.

    A few follow-up questions:

    • Do these packages contain further dependencies on each other? Like, can "A.1.0" also require "Z3.4", and so on? It seems maybe not, based on what you wrote.
    • Moreover, it seems you're always using static versioning? It's not like you're going to say "Lib 2.0 contains A-latest, B-1.2, ...". I don't think so, based on my understanding.
    • Based on the above, the "lib" package's contents (conceptually) are simply the "unpackaged" contents of a well-known, specifically versioned set of other packages? I think so.

    All that said, once we find a nice solution for it... I would love to document this as a use case, because it's a really good way to handle things -- please let us know if you would be willing to help us with a case study of it! It would make for a great webinar for our audience, or it could be a blog article, and anonymous if sharing your company info is a problem...



  • Hi there. Thanks a lot for your update. As for your questions:

    1. These packages are probably never going to contain references to other "child" packages. Each component is built through our CI and then published to ProGet for the sole purpose of being consumed later on by the "lib" package. The lib package itself is also never going to be contained as a dependency in other packages. I don't know if this will change, but it is unlikely - at least for our scenario.

    2. Yes, we are using static versioning. Periodically (every 3 months, more or less) we ship a new version of our product. Every version we release requires a certain version of the "lib" package to run correctly. A certain version of the lib package contains certain fixed versions of each component. This is because our product is built against those exact versions.

    3. Exactly. As said in 1., each component has a separate build step called "package" at the end of the pipeline that takes the build artifacts and packages them up into a .upack. The "lib" package then just "collects" them together to have one single package that contains everything we need.

    At the moment I am using a hybrid method to still be able to leverage upack in our build process. I have the "upack.json" for the "lib" package that does not contain anything at all but a list of dependencies. I then use a PowerShell script that reads that .json file, uses upack to gather all dependencies into a folder, and then uses "upack" to package that folder up to build the "lib" package. This works, as in: it provides us with a package that contains everything we need. But the downside is that the "lib" package then duplicates the contents of its dependencies. We're talking about 7 MB here, so not really an issue, but still ... using dependencies during "upack install" would be cleaner.
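
    A minimal sketch of that script, assuming PowerShell with placeholder paths and feed URL (and noting that the exact "upack pack" arguments may differ between upack versions), looks something like this:

    # read the dependency list from the otherwise empty upack.json of the "lib" package
    $manifest = Get-Content "upack.json" -Raw | ConvertFrom-Json
    $staging  = "staging"
    $feed     = "https://proget.example.com/upack/MyFeed"   # placeholder feed URL

    New-Item -ItemType Directory -Path $staging -Force | Out-Null

    foreach ($dep in $manifest.dependencies) {
        # dependencies are written as "Name:Version"
        $name, $version = $dep -split ":"
        # gather each dependency into a subfolder named after the package
        $target = Join-Path $staging $name
        upack.exe install $name $version --source=$feed --target=$target
    }

    # repackage the staging folder as the "lib" package
    # (check the pack arguments against the upack documentation for your version)
    upack.exe pack $staging --targetDirectory=output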

    As for the use case and blog ... I am more than willing to help you here. I would really like to see upack grow, as it is - in my opinion - a helpful tool for products that don't already have a package ecosystem (like ours). How can I help you?



    As some background, we need to tread very carefully with dependencies; they're deceptively complicated, and every other package manager we've worked with has gotten them wrong. In fact, NPM seems to have given up on sensibility altogether with npm3 Non-determinism. Fortunately, Microsoft's NuGet team hasn't, but you can get a feeling for how challenging the problem is to document in their Dependency Resolution documentation. This is precisely why dependencies are so basic, and only a starting point, in the Universal Feed spec.

    Anyway, your use case is exactly something we want to solve... and it's something we've come across when designing Romp use cases as well. Let me share one.

    For example, ProGet has two installable components (Web, Service) that can be installed on different servers for load balancing / high availability purposes. However, most users would just want to install ProGet on a single server. Clearly, we need at least three romp packages: ProGet.Web, ProGet.Service, and ProGet.Full (which has both).

    But... doesn't it seem like ProGet.Full could just take dependencies on ProGet.Web and ProGet.Service? Of course, then we run into the same questions I asked you: it's always a static version (you must have the same versions of .Web and .Service), and there will never, ever be further dependencies that .Web and .Service require. So they're not really dependencies, and thus a pretty poor use case for a dependency feature.

    But this gives me an idea. What do you think about a "virtual package" concept, maybe we can call it a bundle? It's only metadata (let's say it's a JSON file), and it acts like a package in every other way (immutable, static, etc).

    In your case, maybe it looks like this:

    lib.2.0.0.ubundle
    { 
      "name" : "Lib", 
      "version": "2.0.0",
      "content": [
         "PackageA:1.0.0",
         { "name": "PackageB", "version": "2.0.0", "hash": "...", "virtualPath": "/lib/res/b" },
      ]
    }   
    

    Obviously on the ProGet/romp side, we might have additional properties that describe installation order and other variables for a "romp bundle" or something.

    Anyway it's just a rough idea. But you clearly seem to understand the problem space, so I want to ask your opinion :)



  • I understand the concept of a virtual package, and it's a nice approach - especially the "virtual path". With packages that are just plain generic folders, it's quite difficult for the package manager to figure out what to do with a sub-package and where to put it. In my case, the subfolder where the sub-package is placed has to match its name exactly. But that obviously doesn't hold true for everyone else.

    I also know all too well the troubles that plague systems with dependency management. They look so simple when in reality they are not.

    With that said:

    I'm not sure I understand when you want to resolve the virtual package.

    Option 1

    You resolve the virtual package during upack install.

    In this case: I'm not sure why you need to "invent" something new here. As I see it, virtual packages are then nothing more than a "stripped-down dependency management" with two restrictions: static versioning and non-recursive dependencies.

    Moreover: dependencies are currently nothing more than a part of the specification. The package manager (upack) completely ignores them. So why not use them? Right now one looks at the upack documentation, sees dependencies, and is happy ... but then gets disappointed by the lack of support for them.

    Option 2

    You resolve the virtual package during upack pack.

    In this case it would completely replace the custom script I talked about earlier. I would still end up with duplicated content, but that is the least of my concerns. But then I would think about removing dependencies from the documentation altogether ... or at least marking them as something that is not (yet?) supported.

    Option 2 is the one I would really love to see implemented.



  • Great points. You've given us lots to think about!

    As we work on our docs for v5, we're going to be clear on dependencies and on why upack.exe doesn't use them (but how other use cases might).

    Yes; I suppose that this "bundle" (virtual package?) is indeed a form of "stripped-down dependency management". But I see it differing in a few key ways:

    • conceptually, a "bundle" expresses "these 5 things are related", not "this 1 thing requires these 4 things"
    • simplicity: the limitations make it more accessible
    • generality: unlike dependencies, it generalizes well

    And then of course, if we take the virtual package concept further, we could even have the virtual package contain instructions describing where to download the actual content from... which is not dissimilar to what Docker for Windows does.

    As to the options...

    I think a regular "upack install" would do exactly as you say. It would simply request that package from the package download endpoint URL. On the server side, the bundle would be munged transparently and returned as a package (or content-only if you specify it).

    I guess if you did "upack pack" today, it would give you an error? Maybe we should use a different command (bundle? unbundle? repack?). Or not. But in any case... yes, it would do exactly what you specify -- except perhaps issue a small warning that you can't "re-bundle" it (i.e., split it back into the original packages).



  • Just as an update: we will be introducing these as "virtual packages" in Universal Feed v1.2, shipping in ProGet 5.0.

