Since the beginning of Bincrafters, our packages have been hosted on Bintray. In February, JFrog announced the sunset of Bintray (see also this Conan blog post). On May 1, 2021, Bintray will shut down forever, and with it our current Conan repository at https://api.bintray.com/conan/bincrafters/public-conan.
JFrog offers Artifactory as one way to host a Conan repository. Besides the option of hosting Artifactory yourself, they also offer managed Artifactory cloud instances, which include a free tier.
Since we build and provide binary packages for a huge number of configurations, we have always required a lot of storage space and network traffic. JFrog generously sponsored us by effectively making our Bintray account unlimited and by helping us cover CI costs.
We are thankful that JFrog decided to continue to sponsor us, even after the sunset of Bintray. We now have a sponsored Artifactory instance at bincrafters.jfrog.io.
To continue consuming Bincrafters packages, you need to change the remote URL from Bintray to our new Artifactory instance.
If you have named our remote bincrafters, you can execute the following command to update it:
$ conan remote update bincrafters https://bincrafters.jfrog.io/artifactory/api/conan/public-conan
Additionally, this new remote requires Conan clients to have revisions enabled. This was not the case with our Bintray remote.
If you have not enabled revisions yet, you can do so by executing
$ conan config set general.revisions_enabled=1
You can start using our new remote right now. We recommend switching to the new remote as soon as possible, to have enough time to solve potential unwanted side effects of these changes.
Our new remote should still offer all recipes and binary packages that you would get from our Bintray remote up to May 1, 2021, because our Artifactory remote integrates the Bintray remote transparently.
We decided to copy all recipes from our Bintray remote to Artifactory, but not the binary packages. This means that after May 1, our Artifactory remote will no longer provide old binary packages that previously existed.
One reason for this decision is that many of the recipes and their binary packages have become obsolete in the meantime and, for example, can now be found directly in the Conan Center Index. Where possible, we migrate our recipes there, and we welcome help from contributors with such migrations.
The other major reason is the enormous combined size of all binary packages, which we would need to transfer and verify. The disadvantages of keeping these packages around likely outweigh the advantages.
For new recipes and recipe revisions, we continue to build and provide binary packages via our new Artifactory remote.
In a nutshell, on May 1, 2021 the vast majority of currently existing binary packages will not be available anymore, but all recipes should continue to be available from our new Conan remote. If you notice missing recipes or similar unexpected behaviour, please reach out to us.
We recommend not relying on our remote to offer binary packages for a specific configuration. Instead, set a build mode like missing (or outdated) to tell your Conan client to build packages that are not available.
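For example, the build mode is passed on the command line (Conan 1.x syntax; this assumes a conanfile in the current directory):

```shell
$ conan install . --build=missing    # build only packages with no prebuilt binary
$ conan install . --build=outdated   # additionally rebuild binaries older than their recipe
```

With --build=missing, any configuration we did not prebuild is simply compiled on your machine instead of failing the install.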
If you notice missing recipes or other unexpected behaviour, run into problems, or have questions regarding these changes, please open an issue on GitHub or reach out to us on Slack.
Conan enables users to encapsulate C/C++ project dependencies, distribute them, and consume them in other projects. This involves the complex challenges of transitive dependencies, versioning, licensing, and so forth. Hopefully this high-level functionality is familiar and obvious to everyone based on experience with other ecosystems, because the features will likely become decreasingly familiar as we get into more detail.
With the operative word of “Source” meaning “Source Code”, Conan shares this functionality with a number of other “package managers for native languages.” Package recipes contain the sources (or a mechanism to obtain the sources) for a project and the ability to invoke the build system on the local machine. Furthermore, this includes the ability to recursively perform this process for the complete chain of transitive dependencies. This robust management and building of sources for projects and their dependencies is a central function of Conan, and the “Binary Package Management” feature which we will discuss next is built on top of it. However, it’s important to point out that many developers do not use the “Binary” features for a variety of reasons, and instead build all projects and dependencies from sources locally. There are a variety of valid reasons and use cases for doing this which we won’t go into, but the key point here is that it’s a well-supported workflow with Conan.
Of note, one reason that a number of other package managers have been springing up around C and C++ with this functionality is that it’s the first and easiest thing to implement in the problem domain of package management. Package managers for languages like Rust and Go operate exclusively in this mode and have little-to-no support for binary packages. Since many of the new C and C++ package managers borrow numerous features from these ecosystems, it’s easy to understand why they stop at source package management as well.
Managing binaries for libraries is a related-but-separate feature set from managing sources. Conan is unique among “package managers for native languages” in providing this functionality as a first-class part of the platform. There are binary package managers for operating systems such as Yum, Apt, Homebrew, and Chocolatey, but those are a different category altogether. Since it’s a fairly unique feature of the platform, it’s worth defining what it means in the context of native software development.
Conan captures compiled binaries into its packages so that multiple developers don’t have to redundantly compile the same binaries on each of their machines. A common case with Conan is to have a CI server like Jenkins re-compile libraries after each Git commit and push the compiled binaries to a shared Conan repository. Then, the Conan clients on all the developer machines can download updated binaries whenever they invoke Conan on a project that references those libraries. In many cases, this is much more efficient than the alternative. “Binaries” typically includes .lib, .a, .so, .dll, .dylib, and .exe files, but Conan can technically capture any type of file in these packages. Native software binaries are unlike binaries from other languages (like Java’s .jar files, for example). Native binaries are not universal, and two binaries must be “ABI compatible” to be used together. Thus, in order to capture, share, and consume binaries effectively, Conan must record a complete picture of “how the binaries were built” in the form of “ABI metadata” each time it captures files. This includes OS, architecture, compiler settings, debug or release, and so forth. This also means it has to keep track of many binaries for a single package, and provide mechanisms for searching, matching, downloading, and uploading in a way that accommodates both single-binary and multiple-binary operations. Conan typically captures these binaries immediately after running a build, as in the CI example mentioned earlier, but it can also capture binaries that were built outside Conan, such as binaries of an SDK provided by a vendor.
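Concretely, this ABI metadata is what a Conan profile captures. A typical profile for a Linux build might look like the following (the specific values here are illustrative):

```ini
[settings]
os=Linux
arch=x86_64
compiler=gcc
compiler.version=9
compiler.libcxx=libstdc++11
build_type=Release
```

Two binaries of the same recipe built with different values for any of these settings are stored and resolved as distinct binary packages.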
In the world of software development, package managers feed build systems. They obtain dependencies and make them available to build systems by passing along information such as file paths, compiler definitions, and other variables. Unlike other language ecosystems, C/C++ features a vast array of build systems, and even “meta-build systems”, each with their own proprietary formats. Conan intends to be a build-system-agnostic package manager, and therefore needs to be able to pass these common variables to whichever build system is used by a given project. To address this daunting challenge, Conan features an extensible “Generator” model, and there are over a dozen public generators available for all the most common build systems. When a developer runs a build through Conan, it calls the appropriate generator, which converts generic variable information into the proprietary variables compatible with the specified build system(s). It then writes these variables to files, which the build system then reads.
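As a small illustration, a conanfile.txt that requests the CMake generator might look like this (the zlib reference is just an example of the reference syntax of that era):

```ini
[requires]
zlib/1.2.11@conan/stable

[generators]
cmake
```

Running conan install against this file writes a conanbuildinfo.cmake file containing the include paths, library names, and definitions of all resolved dependencies, which the project’s CMakeLists.txt then includes.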
This generator system is one of the key features of Conan that gives it the flexibility to deal with the challenging diversity of the real world. This includes the known diversity of open-source build tools, along with incomprehensible diversity of private build tools. The generators make it truly trivial for organizations to write and maintain whatever custom generators they need to integrate with their existing build systems.
People use scripts around C and C++ all the time, including shell scripts, Python, Groovy, etc. There are countless tedious steps to perform when working with C and C++ projects, and in any professional setting or codebase with more than a few projects, scripting is practically a necessity. Many organizations maintain their own scripts in-house, per-organization, or per-project. While there’s a great deal of specifics in these scripts which cannot be generalized, there’s a vast amount of redundancy among them which can. Conan represents one generalized, open-source, cross-platform collection of such scripts which is consistent, feature-rich, extensively field-tested, and even unit-tested (how many organizations unit-test their in-house build scripts?). It provides this scripting framework because it has to in the service of being a package manager, and so it’s very well organized and intuitive to use.
Environment variables are one of the few “common coins” we have in the diverse world of C and C++. The vast majority of build systems and related tools enable settings and options to be controlled with environment variables. If you’re doing any sort of professional development, it’s likely that you set environment variables for this purpose, either manually via your operating system, shell scripts, or via a CI platform like Jenkins. Likewise, you probably also work with a team of developers, and there’s a need for everyone on the team who works with these tools to understand and use these variables consistently to achieve consistent builds across the codebase.
Both Conan recipes and Conan profiles provide mechanisms for managing environment variables while Conan performs its various operations. When working with enterprise environments and teams, profiles are a cornerstone of the Conan platform and workflows. Part of the power is in Conan’s ability to compose and distribute these profiles, which includes composing and distributing environment variables to both teams and CI servers. Much like many of the other value propositions of Conan, one of the critical characteristics of managing these variables with it is that it represents a separate abstraction layer, making it agnostic of the operating system, build system, meta-build system, etc.
There are a great many tools that surround the C and C++ development ecosystem. Developers install them using various mechanisms, including manual installation and their operating systems’ package managers. There are also enterprise management tools which enable tools to be deployed to all users in an organization. With Conan’s build_requires feature, build tools can be packaged, distributed, and “installed” just like any other Conan package, which represents a new option. There are over a dozen such packages featuring extremely popular build tools already on Conan Center or in the Bincrafters repository. Also, many organizations which use Conan internally package the unique build tools they use and distribute them to their developers and build servers this way.
You may wonder why we need yet another option for installing these tools, and why this method is so popular with Conan users. There are several reasons. For starters, there is often a direct relationship between a C/C++ project and the tool (or collection of tools) required to build it. Conan provides a unique ability to describe these relationships (library to build tool) with a very natural and robust experience which includes all the same functionality that normal “library” dependencies do. Crucially, builds that use tools from Conan packages are inherently much more deterministic thanks to some of the details of how Conan packages them. Additionally, developers sometimes have to work on libraries which require different versions of the same tools, so they often need multiple versions of a tool to easily co-exist on the same machine. Conan encapsulates the tools, by version, in a way that guarantees safe co-existence. With the build_requires feature being definable on a per-recipe or per-profile basis, it is extremely intuitive for developers to put default tools and versions in their default profile, with the flexibility to override them as needed for certain builds.
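For example, default tools can be declared in a profile. A hedged sketch (the package references and versions below are illustrative; actual names vary by repository):

```ini
[build_requires]
cmake_installer/3.13.4@conan/stable
ninja_installer/1.8.2@bincrafters/stable
```

Any build run with this profile typically resolves these tool packages first and puts their binaries on the PATH before invoking the project’s build.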
A toolchain is a vague but commonly used term, and here it is used to describe all the “inputs” to a build. Effectively, this is like a superset of the two items described above: the tools used to build something, and the settings fed into that build (by environment variables, settings files, command-line inputs, etc.). So, given that we’ve already described how Conan manages both environment variables and build tool installations, here we are pointing out that Conan can also manage the various configuration files used as inputs to the build tools. These files can be packaged, versioned, and distributed just like the build tools and environment variables.
As mentioned before, people already wrap their build systems with their own scripts to provide a layer between the developer and the build tools, and Conan simply provides a more mature and robust layer than most other scripted solutions. While scripts for build operations are vast and diverse, probably the most common thing people script is calling the build system. So, here are just a few of the novel “build-system wrapper features” which people often script themselves and which Conan provides (although there are several more):
I focused on this small and simple workflow because, in the world of C and C++ development, one of our most famous and ancient incantations, repeated thousands of times each day around the world, is mkdir build && cd build && some_build_command ... If questioned about why they use this specific workflow, I imagine many people would probably say something like “it’s simple and it works”, which I would not deny. However, the reality is that the workflow doesn’t end there, and so these commands bring along baggage in terms of the natural follow-up steps (cleaning the directory and re-running, creating separate directories for debug/release, etc.). These steps are just unsimple enough to get scripted most of the time, so when taken as a whole, it’s nice to have a set of scripts that manages our build directories intelligently and idempotently.
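For reference, the Conan flavor of that incantation looks like this (Conan 1.x commands, assuming a conanfile.py one directory up):

```shell
$ mkdir build && cd build
$ conan install ..   # resolve dependencies and write generator files
$ conan build ..     # invoke the recipe's build() method, i.e. the build system
```

The same two conan commands work regardless of which build system the recipe wraps, which is what makes them worth standardizing on.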
So now that we’ve outlined a variety of lesser-known features, maybe the above description is a more accurate representation of Conan which includes all the things that Conan can do that are not traditionally considered part of “package management”. Hopefully this gives readers who are new to Conan a bit more context with which they can read through the documentation and get a foothold in how Conan works, why it works the way it does, and whether or not they want to try using it in their codebase.
While enterprise development environments are almost always private and proprietary, they still form an ecosystem. Recently at the SwampUP devops conference there were a few dozen C and C++ developers and/or build engineers, almost all of them representing enterprise organizations and openly discussing their challenges around dependency management. Perhaps unsurprisingly, there are a lot of shared challenges. One thing that was certainly shared was that they had all done some research and/or testing with Conan and saw potential value for their build environment. The situation is actually similar in the #conan channel in the cpplang.slack.com Slack workspace. There are a number of users who represent enterprise development teams with very large-scale challenges and are looking toward Conan as a solution. They’re not as vocal as the OSS enthusiasts, but they’re generally easy to spot because they ask very different types of questions. We’re also starting to see fewer beginner questions about the basics of Conan, and more advanced questions surrounding migrations to Conan from legacy solutions, which is interesting and encouraging.
Certainly one of the most attractive things about Conan for these very large organizations is that it represents a single platform which is capable of wrapping around their most complex and obscure build scenarios, for both their internal enterprise code and whatever OSS libraries they use. While there are a number of other novel package managers popping up in the C++ ecosystem, they’re almost exclusively focused on trying to make the OSS library ecosystem simpler. None are designed to address these highly complex enterprise build environments. This effectively means that Conan is the only platform of its kind, and since many of these organizations already use JFrog’s Artifactory for managing other package types, they can simply add Conan repositories to their existing servers, making it overwhelmingly attractive as a solution.
So now that we’ve talked a bit about the landscape of enterprise package management, let’s talk about a few of the Conan features which actually solve problems and deliver value in these environments.
Package management (or the more general term “dependency management”) is the well-known marquee feature of Conan, and thus not intended to be the focus of this article. However, it’s important to clarify some points about how Conan dependency management differs from others. Conan provides four parallel components surrounding the packaging and management of dependencies for projects. While enabling users to package source files (as do other package managers), Conan also enables users to package compiled binaries, and the build tools used to compile the sources into binaries. It also lets users specify dependencies from other package managers, namely those built into the common operating systems. In particular, the management of binaries and build tools is one unique and powerful feature which is used extensively in the enterprise use-case, and relatively rarely in the OSS use-case (although that is changing slowly). By providing all of these slightly different “dependency management” features through a unified interface, platform, and syntax, it really simplifies the shape of the problem from an administrative point of view.
Note: For more detail about binary and tool packaging, check out our upcoming blog post titled “What is Conan?”.
Of all the under-appreciated features in Conan, profiles have to be the single most powerful and transformative. At SwampUP, virtually every presentation came from developers who use Conan in an enterprise codebase, and virtually all of them talked about their use of profiles. So what makes profiles so important?
In order to make a powerful and valuable transformation across many organizations in an industry, there has to be something really wrong. Something broken or horribly inefficient, and so complicated that it hasn’t been fixed in a simple and general way. Right? One would think so, but Conan profiles seem to defy this logic. What Conan profiles do is actually straightforward (although the road of creating and evolving the functionality was not). As it stands now, Conan profiles could potentially stand alone as a tool that would be generally useful across all of software development. Here’s what Conan profiles offer:
Note: For more detail about profiles, check out our upcoming blog post titled “What is Conan?”.
With the ability to compose and distribute the profile system among enterprise teams, Conan profiles have truly revolutionized a growing number of enterprise workflows and strategies in a way that would not be possible without them. It has proven to be an absolutely necessary abstraction layer to corral all the inputs to build processes.
Ref. Conan Profiles: https://docs.conan.io/en/latest/reference/commands/misc/profile.html
How many compilers are there for C and C++? Most developers are probably aware of at least 3 or 4, and might think that 10 is a generous guess. The most honest answer is probably that nobody knows. 50 probably sounds like a reasonable guess, but the real number is certainly far greater than that. Thus, Conan takes the approach of supporting an unlimited number of compilers: it defines a few well-known compilers for practicality and convenience, but gives developers the ability to define whatever compilers are used in their environment, along with their properties. One crucial property that Conan empowers users to define and control is the “ABI compatibility” of a compiler. For example, binaries built with different GCC minor versions are ABI compatible, but only from 4.9 onward. Similar situations exist with obscure enterprise compilers, and Conan is designed to handle them with ease. As with the problem of knowing all compilers, it’s impractical to try to know or define all possible architectures. Thus, again, the best solution for a tool like Conan is to provide sensible defaults but empower developers to define the ones they care about (which it does).
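In practice, this is done by editing settings.yml in the Conan home directory. A hedged sketch of declaring an in-house compiler alongside GCC (the acmecc name and its versions are invented for illustration):

```yaml
# Excerpt from ~/.conan/settings.yml
compiler:
    gcc:
        version: ["4.9", "5", "6", "7", "8", "9"]
    acmecc:                     # hypothetical in-house compiler
        version: ["1.0", "2.0"]
```

Once declared, the custom compiler can be used in profiles like any built-in one, and its settings participate in Conan’s binary package identification just like those of GCC or MSVC.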
Conan users who want to use it exclusively for obtaining dependencies for OSS projects rarely use obscure compilers and architectures, so this is a great example of a feature which is predominantly designed for enterprise projects and seems unnecessarily complicated to the newcomer. With that said, the IoT revolution and the explosion of custom architectures are making this a more relevant feature for people who develop for new and obscure hardware platforms as a hobby.
Have you ever heard of OpusMake?
http://www.opussoftware.com/
I hadn’t heard of it, but I was tasked with automating builds with it. Unfortunately, CMake was not usable in the codebase, so I needed a way to generate makefiles manually. Also, the syntax of OpusMake is proprietary and very different from GNU Make, so even if there had been an existing way to generate Makefiles, it wasn’t going to help me. Fortunately, it was trivial in Conan to write a custom generator that passed the information to OpusMake in its native format and to use it throughout our environment.
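A custom generator is, at its core, a translation from Conan’s generic dependency information into the build system’s native format. Stripped of the Conan plumbing, the heart of such a generator is a small rendering function like the sketch below (the OPUS_* variable names are invented for illustration and are not real OpusMake syntax):

```python
def render_opusmake(deps):
    """Render per-dependency include dirs and libs as make-style variables.

    deps maps package name -> {"include_dirs": [...], "libs": [...]};
    the OPUS_* variable names are hypothetical placeholders.
    """
    lines = []
    for name, info in sorted(deps.items()):
        prefix = "OPUS_%s" % name.upper()
        lines.append("%s_INCLUDES = %s" % (prefix, " ".join(info["include_dirs"])))
        lines.append("%s_LIBS = %s" % (prefix, " ".join(info["libs"])))
    return "\n".join(lines) + "\n"
```

In a real Conan 1.x generator, logic like this lives in the content property of a class deriving from Conan’s Generator base class, and Conan writes the returned text to a file that the build system then includes.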
This is a common situation in many enterprise organizations. Their requirements might include building projects with in-house, vendor provided, or just really old build systems, none of which would ever be natively supportable by any package manager (including Conan). So, similar to the virtues mentioned above (the features around profiles and settings), the crucial requirement around build systems is flexibility and Conan provides a really powerful model for that.
Ref. Conan Generators: https://docs.conan.io/en/latest/reference/generators.html
Hopefully you will come away from this post with a better understanding of why the Conan documentation makes it appear a little more complicated than NPM, Maven, and perhaps most of the other package managers for C and C++. And, if you work in a large-scale enterprise codebase, we hope we’ve highlighted some features that are relevant to your current challenges. In any case, we think it’s important to help those who are actively exploring Conan to see it clearly for all that it does, and hopefully learn a few things along the way.
Update 06/29/2018
The video of the Bincrafters talk from SwampUP has been posted:
https://www.youtube.com/watch?v=v2QiNTvco5E
In general, I have to say that this was one of the most well-organized and enjoyable conferences I’ve ever been to. I see why people who come once tend to return each year thereafter. What was probably most unique and memorable about this conference was the diversity of technology represented for such a small event. It was not dominated by a single vertical, language, cloud platform, or even a specific devops concept like containers. We had experts talking about every part of devops, from a vast array of companies and ecosystems.
There were a few clear messages among the Conan audience about devops for C and C++. One such message was that the open-source ecosystem is only part of the story of dependency management. Another was that enterprise devops teams and workflows need and want a solution which manages and deals with compiled binaries, not just sources. These might be the most important takeaways from the conference to be highlighted for the wider C and C++ community, and particularly for anyone involved with the C++ committee study group SG15. Understanding the perspectives of the large enterprises who are aggressively pursuing “package management” seems very relevant to the process of creating any successful standards in the realm of tooling interoperability.
Also of note, the package management discussions contained a lot of substance in the form of concrete details about the homegrown devops strategies these companies have tried in the past, along with feedback about the costs, benefits, and lessons learned from those strategies. The conference really shone a light on the under-representation of this enterprise user base in the public discussion around tooling, and how unfortunate that is given that this user base contains extremely wise and experienced build engineers.
The enterprise devops engineers had several things in common.
A few additional notes: virtually everyone was using some form of CI servers such as Jenkins to automate tests and builds, only a few were using docker for C and C++, and only a few were using CMake as a primary build system. To be fair, most didn’t seem to have a “primary” build system at all, because most had to support several in parallel.
There was an interesting meeting where engineers from many of the existing and potential companies using Conan gathered to vote on potential roadmap features. This was not to dictate the course of the future, but rather to gather and weigh feedback. There were 15 proposed items taken from existing feature requests. There were a few that stuck out as being relevant to Bincrafters, so we’ll focus on those.
Code sharing of Python code using Conan rather than pip was one such feature. This had been boiling up in the Slack channel in the days leading up to the conference, so it was no surprise that it was front and center in the meeting. @grafikrobot from the Bincrafters team had to do extensive work to implement this kind of functionality manually for the Boost recipes, so this one definitely has value for us. It was pretty unanimous among the rest of the group, and with a “prototype” already sketched out by the Conan team, this one seems imminent. This will probably be the easiest win.
The discussion around features related to new options for dealing with source code was an interesting one. It seemed important to a few people, but it was not a feature request we had seen in the community previously. The specific proposal was to bring the process of obtaining sources for each build from Git at build time into the local development flow (which traditionally uses the exports_sources feature instead of a source method). It also included implicitly locking a Conan package version to the Git commit hash that was HEAD at the time Conan was first called on the target recipe. There were other related requests around the sources for packages. Some were concerned that the sources being zipped up in the packages was a liability. Meanwhile, we’ve had cases in Bincrafters where we wanted the opposite (to bundle sources in packages which use the source method to fetch external sources at build time). One common theme was that pretty much everybody is working toward reproducible builds, which is good. Despite the interest in the specific feature proposed, there are some substantial challenges to implementing this functionality, so you probably shouldn’t expect to see it any time soon.
Different names have been kicked around for this feature, but we’ll call it workspaces for now. In short, it’s a feature designed to make it more familiar and slightly more efficient for developers to leverage Conan while editing multiple inter-dependent projects at the same time. Really, this feature is about enabling developers to keep using the same pattern of temporary build directories their projects are set up to use today (and thus less refactoring and re-training), and also to skip the potentially unnecessary copies of sources and binaries to the Conan cache, which occur regularly when builds are run through Conan. With this feature, users define a YAML file in a directory structure that Conan discovers, which tells Conan how the project folders are laid out in the user’s workspace. Based on the definitions in the file, Conan points to the appropriate temporary build directories for the compiled binaries when generating conanbuildinfo files (rather than the Conan cache). Much like Python code sharing, there’s an active prototype of this feature with many users interested in it, so it seems imminent.
Package revisions was probably the second most significant feature for Bincrafters, and the problem goes back as far as the existence of binary package managers, and still poses problems for extremely mature package managers such as APT and YUM. If you’re not familiar with this one, there are two known challenges surrounding semantic versioning with packages, one of which affects the OSS world, the other affects the enterprise.
In the OSS world, the challenge is meeting three conflicting objectives with nothing but a version field. Most everyone wants to maintain the existing versioning strategy because it is the most intuitive, whereby we use the version number from the upstream library as the version number for the package. However, the first competing objective is the widespread desire for package immutability. This is the principle that once a public package is uploaded to the official central repository, the binaries should not change for any reason, lest a “fix” or improvement might actually break users who have already worked around the bug or deficiency themselves. The last part of the conflict is the fact that there is no such thing as a perfect package with C and C++, and there are numerous valid reasons to update a package, even when it is in the central repository. While everyone agrees that we would prefer not to overwrite existing in-use binaries, there’s no good alternative when there’s a severe problem with a binary. There are proposed mitigations, but they are all undesirable, as each sacrifices something without solving the whole problem. It’s widely agreed that the best solution is a new first-class feature designed to solve this problem specifically; however, the specifics of how it would work have yet to be determined, let alone tested and validated.
In the enterprise world, there’s a different problem. Many C and C++ codebases don’t use semantic versioning for their internal modules. The term “live-at-head” coined by Titus Winters at CPPCon 2017 describes a strategy that is very common in C and C++, where developers all work off a super repo that contains all the modules, and they’re effectively “versioned together” by virtue of the source control system. In simple terms, the conclusion by many enterprise build engineers is that trying to migrate from this approach to any manual versioning strategy is a huge source of complexity, a burden to developers, and will be a constant source of problems. In the wide world of open-source software, this burden is absolutely necessary and warranted, but in enterprise development it’s often not. In these types of environments, it seems clear that we should be using the same strategy as source control systems, with hashes with timestamps that are completely automatic, yet still traceable. Furthermore, many organizations need to be able to correlate source control changes to specific binaries that were built from them. Achieving this in the current version of Conan involves a lot of manual engineering but could be done much more elegantly by adding automatic versioning features to Conan, along with a few others.
For Conan, the overarching challenge to providing all of these options is that the traditional strategy of manual version management has been baked into the core logic of the platform from day one, and countless other features rely on it to work a certain way. It’s overwhelming to try to imagine how to implement an alternative system which contains several potential implicit behaviors, and still works with all the packages and recipes which exist today.
The good news is that the concept of “package revisions” (as discussed at a high level) seemed like it could address both cases. The current idea is essentially an implicit “revision” field, incremented and applied automatically upon each build, which exists in addition to the manually specified version. Package queries would then default to a “latest revision” behavior, with simple and familiar ways of changing the behavior to “oldest” or locking on specific revisions. The bad news is that this feature is still a long way off because of its sheer scope. At best, we might hope to see it in beta as part of 2.0, but even that seems uncertain, and we probably won’t see a 2.0 release until 2020 anyway.
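To make the idea concrete, here is a minimal sketch of the proposed behavior. This is an illustrative model only, not Conan’s implementation: every name and method here is hypothetical, and it simply shows how automatic revisions would layer on top of a manual version, with queries defaulting to the latest revision unless one is pinned.

```python
from dataclasses import dataclass, field

@dataclass
class PackageIndex:
    # "name/version" -> list of revision ids, in upload order
    revisions: dict = field(default_factory=dict)

    def publish(self, ref: str, rev: str) -> None:
        # Each build appends an automatic revision instead of
        # overwriting the binaries of the existing version.
        self.revisions.setdefault(ref, []).append(rev)

    def resolve(self, ref: str, rev: str = None) -> str:
        revs = self.revisions[ref]
        if rev is not None:
            # Explicitly pinned revision, for reproducibility
            assert rev in revs
            return rev
        # Default behavior: latest revision wins
        return revs[-1]

idx = PackageIndex()
idx.publish("zlib/1.2.11", "a1f3")  # first build
idx.publish("zlib/1.2.11", "b7c2")  # rebuilt after a packaging fix

print(idx.resolve("zlib/1.2.11"))          # -> b7c2 (latest)
print(idx.resolve("zlib/1.2.11", "a1f3"))  # -> a1f3 (pinned)
```

The key point is that immutability is preserved (old revisions remain downloadable for anyone who pinned them), while maintainers can still publish fixes under the same version.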
Once again, it was a great conference, and the discourse among the enterprise C and C++ engineers was invaluable to me as a Bincrafter, as it certainly was to the Conan team as well. We truly hope that this trip report will shine more light on the importance of understanding and accommodating both OSS and enterprise use cases in Conan, as well as in other parts of the tooling ecosystem, like build systems and CI services.
After further discussion with the Conan team, we realized there were two distinct packaging scenarios the Conan client was trying to accommodate: “in-source” and “out-of-source”. The Bincrafters packages are all built “out-of-source”, meaning the conanfile.py lives in a separate GitHub repository from the library sources, but one of the long-term goals of Conan is to foster the growing number of packages where the library maintainers manage the conanfile.py in their own project (“in-source”).
The previous updates were focused on the “in-source” packaging workflow, though it seemed the workflow should be suitable for both scenarios. It did work, but it wasn’t optimized, and it came with a number of unfortunate side effects. We brought this feedback to the Conan team, and they aggressively sought ways to accommodate our scenario and optimize our workflow. This brought about the recognition that the two workflows are just fundamentally different, along with some insight into possible solutions. The good news is that they were able to dramatically improve the original “out-of-source” workflow with very little modification to Conan.
So, the new workflow brings back the intuitive linear workflow that we had prior to the November 2017 update. After users have a conanfile.py that looks like it’s ready to go, they can just jump right back to conan create as in:
$ conan create . bincrafters/testing
Most often, the source() method is pretty easy to debug and get right, and the first significant debugging problems occur in the build() method. So once we see that the source() method is fetching the sources correctly, we can focus on the build() method. Doing so is as simple as adding -k to our command (short for --keep-source):
$ conan create . bincrafters/testing -k
The -k option existed before, was very useful then, and continues to be useful now. With this, we no longer need to use the conan build command for our workflow (although again, it’s still great for the in-source packaging workflow).
So, after some amount of trial and error with the build() method, the package should eventually build the library properly. The next phase is ensuring that the package() method gathers all the proper files from the build directory and copies them to the package directory. This is often harder than it seems and frequently requires trial and error. For that, we just want to re-run the package() step multiple times, without re-running the source() or build() steps. This is exactly what the latest feature from Conan 1.1 enables: we can now simply add -kb to our command (short for --keep-build):
$ conan create . bincrafters/testing -kb
The -kb flag also implies -k, effectively causing Conan to fast-forward to the package() method and re-run it with whatever build directory has already been generated. This was the missing link for our “out-of-source” workflow.
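The flag behavior described above can be summarized with a small model. This is an illustrative sketch of which conan create steps re-run for each flag combination, not Conan’s actual implementation; the function name and step names are ours:

```python
def steps_to_run(keep_source=False, keep_build=False):
    """Model of which conan create steps execute, per the keep flags."""
    # -kb (--keep-build) implies -k (--keep-source): keeping the
    # build directory only makes sense if the sources are kept too.
    if keep_build:
        keep_source = True
    steps = []
    if not keep_source:
        steps.append("source")
    if not keep_build:
        steps.append("build")
    steps += ["package", "test_package"]
    return steps

print(steps_to_run())                  # full run, no flags
print(steps_to_run(keep_source=True))  # conan create . ... -k
print(steps_to_run(keep_build=True))   # conan create . ... -kb
```

With -kb, only package() and the test_package run, which is exactly the tight iteration loop needed when debugging the package() method.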
The final step of the conan create command which often needs to be run many times until successful is the test_package step. In this case, the best approach is to take advantage of the separate and dedicated conan test command:
$ conan test test_package bincrafters/testing
We discussed another --keep flag like --keep-package/-kp, but it was decided that this would be more trouble than it was worth, given that the conan test command already covers the case.
Also regarding test_package, we got some great new features in Conan 1.1. One of the lingering annoyances for our team was that building packages always left behind the test_package/build directory in each package folder. Once the test_package has run, this directory is almost always useless, and Conan now gives us a few flexible ways to clean it up automatically.
CONAN_TEMP_TEST_FOLDER - This is an environment variable that can be set to True to tell Conan to automatically clean up the test_package/build directory. You can pass this variable to Conan via the CLI when you call conan test by adding -e CONAN_TEMP_TEST_FOLDER=True, but for most use cases you probably want to set it in your default Conan profile.
temp_test_folder - This is a value in conan.conf that you can set to have the same effect as the environment variable above. The difference is that setting it in conan.conf is global, and will affect all recipes and profiles.
--test-build-folder - This is a flag to the conan test and conan create commands which allows you to specify a different build directory for the test_package. This doesn’t exactly help “automatically clean up” the directory, but it does allow you to choose to always put the build directory into some well-known temp directory rather than a subdirectory of your project folder. This is a handy piece of flexibility.
Again, it’s important to point out that the recommendations here are for optimizing “out-of-source” builds, and that the previous post is still largely relevant for “in-source” builds. However, for “out-of-source” builds, which still make up the majority of packages we see, this workflow brings a number of advantages.
We really hope other OSS packagers test this workflow for themselves and let us know their thoughts. The Conan team is really amazing when it comes to listening to feedback, and we try to help encourage the feedback loop from the community for that reason.
Old: Boost.System/1.66.0@bincrafters/stable
New: boost_system/1.66.0@bincrafters/stable
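The rename is mechanical: lower-case the upstream module name and replace the dot with an underscore. A small sketch of the mapping (this helper is hypothetical, purely for illustration, and not part of Conan or the Bincrafters tooling):

```python
def new_boost_name(old: str) -> str:
    """Map an old-style Boost package name (e.g. 'Boost.System')
    to the new lower-case convention (e.g. 'boost_system')."""
    prefix, _, module = old.partition(".")
    return f"{prefix.lower()}_{module.lower()}"

print(new_boost_name("Boost.System"))  # -> boost_system
print(new_boost_name("Boost.Random"))  # -> boost_random
```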
For the boost packages, users should begin migrating their package references to point to the new naming convention ASAP. Also of note, the boost packages were completely re-written during the rename, with a number of patches and bugfixes. If you discover a breaking change to your setup, please let us know (see github issues link at the bottom of this post).
On 3/15/2018, the old modular Boost packages with upper-case characters in their names will be removed from Bintray. Any local cache containing the packages will continue to work as long as the user does not clean that cache. The old GitHub repositories containing the old recipes will also be removed on 3/15/2018.
The overwhelming majority of Bincrafters packagers and community members agreed that establishing and following a convention of lower-case-only package names was preferred. As Conan has just come out of Beta, and we were preparing the packages for submission to Conan Center, we decided that now was our last opportunity to make this change, which we would be living with for years to come.
As usual, please let us know if you have any trouble with the new packages on our Github Community Issues list:
https://github.com/bincrafters/community/issues
Thanks again to the community members who have submitted feedback and issues so far. It has substantially improved the quality and direction of Bincrafters, and we hope to continue to see this kind of collaboration.
If you want to help support the Bincrafters team and the modular Boost packages, now is your chance. We have a very simple need, and so the contributions we are looking for are extremely small. In short, we need tests. Boost has ~35 libraries which require compilation, and we’ve written tests for each of those. However, there are ~100 header-only libraries, and so far we’ve only created test_packages for a few of them. So, if you want to contribute and know how to write a test_package for Conan, please choose a library from the list at the bottom of this post.
We’ve been asked about Conan Center many times, and the answer is Yes! We do plan on submitting to Conan Center in the near future, and we have the support of the Conan team. In the meantime, the more usage feedback we get in our github issues list and slack channel the better: https://github.com/bincrafters/community/issues
Also, it’s worth noting that one of the requirements for libraries to be included in Conan Center is a proper test_package.
There are ~135 libraries in Boost, so obviously there are a few nuances related to the packages which users might want to know. We’ve created a page in our documentation to capture all these usage details, and will continue to add to it over time.
Usage Notes for Boost Packages
Here are some highlights:
- FindBoost.cmake: CMake references to Boost::xyz and Boost_FOUND now work
- all: bugfixes… bugs and important flags fixed everywhere
- all: packages now built with both libcxx options (libstdc++ and libstdc++11)
- boost_iostreams: zlib, bzip2, and lzma compressors are enabled by default
- boost_python: improved version and Python path detection; better override options
- misc: C++11-only libraries include log, context, coroutine, fiber

We are currently in the process of rebuilding 1.65.1 and 1.64.0 with the new changes. We hope this will be ready by 2/20/2018.
Please feel free to fork the testing/1.66.0 branch, add a test_package, then submit a PR back for review. For an example, see the Boost Random test_package.
If you follow @conan_io or @bincrafters on twitter, you have probably started to see some new package announcements for Conan Center. If not, we encourage you to take a trip over there and browse the current listing. In any case, what we really want to point out is the bigger picture of what’s taking place with respect to the prioritization of packages. Broadly speaking, Bincrafters have begun by submitting a number of packages in the following categories:
These might not seem terribly exciting to Conan users because in many cases they are header-only, or are relatively easy to build, and don’t do anything flashy. However, the world of packaging is a world of layers, and these represent the low-level “foundational” libraries on which countless others are built. So, we simply had to start with these and make sure they were mature and stable before moving on to the “really cool stuff.” As the Conan team has already helped us work through this first group, we’ve started on the next group. So, in the coming weeks, you should also start to see packages from the following categories hit Conan Center:
Again, these libraries can be described as “foundational”, but perhaps just a slightly higher layer than logging and testing. These libraries also have the distinction of being used by multiple members of the Bincrafters team on their own projects. As stated on the Bincrafters blog, we put significant priority on the packages our contributors need to use, because meeting our own packaging needs is what brought us together around Conan in the first place.
Boost, Qt, wxWidgets, FFMpeg, Xiph, OpenCV, Folly, CGAL… etc.
These are the projects that we know users really struggle with building and maintaining. Now that our CI is really streamlined, flexible, and maintainable, we can actually start to approach these projects with maximum efficiency and a sense of confidence. Also, crucially, there have been significant improvements and patches in Conan itself which will dramatically streamline the way we write some of these recipes.
Again, with the recent improvements in our standards and templates, we can now actually get back to the vast number of “regular libraries” on the backlog (as well as those stalled in the current pipeline). This will result in much more visible strides, including mature packages for libraries like ZeroMQ, SQLite, CPPRestSDK, and countless others. The goal is to have a steady stream of these flow from our team to Conan Center throughout the next year. In the past, many of these were waiting on some of the “foundational” dependencies mentioned before, or raised some number of questions about maintainability or technicalities. For example, how to handle Makefile-only libraries on Windows, how to handle libraries which need pthread on Linux or ws2_32 on Windows, how to manage environment variable propagation, whether or not to include MinGW in our CI: these topics were all under discussion over the past 5 months. During that time, we were learning so much that the templates were changing almost daily. The good news is that we’ve now chosen strategies for most of these, which really frees us up to get more things done without having philosophical quandaries about every new package. We’ll continue to evolve, but not as rapidly as before.
After so much groundwork and the stability promise of Conan 1.0, we feel we can FINALLY start getting into the fun part of the mission which we should all remember: enabling a modern package-based developer experience like we have in other languages. This means packaging many more “single-purpose” “best-in-breed” libraries as opposed to super-libraries, and offering modern library authors a clear and mature strategy for adding conanfile.py to their project and maintaining it long-term. The doors are now open for substantially more progress than Conan.io saw last year, and so we encourage you to stay tuned to @conan_io or @bincrafters on twitter for the rest of 2018.
After writing this post, we discovered that the PYENV commands are necessary on TravisCI, and only CircleCI can run Conan properly without the PYENV install. Apologies for any confusion this may have caused.
Most of the packagers in Bincrafters use a common set of shell scripts with TravisCI to build packages on Linux and OSX. These shell scripts are generated by the conan new command when using the --ci flags. We took this code for granted, and only took a closer look at it recently as we tested CircleCI for OSX builds. We recently discovered that the majority of the scripts were related to PYENV setup and configuration, and that this was not necessary. We removed the code and saw a 2 minute reduction in build times.
The details of this are documented in an open ticket here:
https://github.com/conan-io/conan/issues/2385
As a result, we urge anyone using TravisCI to build Conan packages for OSX to remove the code related to PYENV. Here is a simple example of such modifications:
https://github.com/conan-io/conan/issues/2385
Currently, the Bincrafters team is exploring several other CI services to grapple with the load of OSX builds we generate. TravisCI has served us well, by performing thousands of builds at no cost to us or our community (it’s incredible really). We will continue to use them for Linux builds for the foreseeable future. The TravisCI team has indicated that they are working on infrastructure changes to improve the situation around OSX builds for OSS projects. We hope this involves enabling us to pay for dedicated OSX builds similar to how Appveyor works for Windows builds, but they have not been willing to make any commitments around this.
In the meantime, we have found CircleCI’s “2.0” builds with their OSX capabilities to be the closest thing to what we need. We are currently engaged in a trial of their service, and will be building a vast number of Conan packages on CircleCI over the next two weeks. Of note, at this time CircleCI does not support Xcode 7.x, only 8.x and 9.x. So, we may choose to end support for Xcode 7.x builds in our packages. If this affects you, please let us know by opening a ticket here:
https://github.com/bincrafters/community/issues
Bincrafters maintains a Github repository strictly for holding package “Templates” here:
https://github.com/bincrafters/conan-templates
This repository is used by members of the Bincrafters team, and community members who want to build packages consistent with the standards and conventions of Bincrafters. If you use this repository, please note that due to the situations explained above, numerous files relating to CI have changed, including the README.md file, which has badges for each CI provider we use.
It’s a good practice to check this repository every time you start a new package, and any time you go back to update or refresh an existing package, because it receives minor updates on a regular basis, and major updates at least once a month.
Bincrafters is continuing to rebuild all our packages, now using these latest improvements in our CI standards and conventions. Hopefully, this will improve our productivity and turnaround times, and enable us to tackle a truly massive number of packages this year.
Over the past several months, the Conan.io team has been rapidly evolving the platform. As with any platform like Conan, this is a very turbulent period, which makes it difficult for early adopters to keep up. The 1.0.0 release signifies a major milestone for compatibility and stability, after which the Conan team will be settling down on breaking changes. Obviously, this makes our team very happy. We certainly had been working hard to evolve our packaging standards, conventions, and maintenance strategies in step with Conan before the release. However, we’ve been eagerly awaiting 1.0.0 so that we could go back through all the old recipes and bring them up to speed (with confidence that we weren’t going to have to re-do it again in the near future).
The past 2 years of aggressive and creative use by the Conan community produced a wealth of new insight into the problem domain. This enabled the Conan team to develop and provide first-class solutions to a variety of old and complex challenges. Many of the complex challenges tackled by the recent changes are very subtle, so we wanted to highlight some of the value and importance of them for posterity.
First, Conan packagers can now tell Conan to run a build with a “target platform”, which can be different from the platform Conan is running on. This enables a number of exciting cross-building scenarios which have recently become possible with containers and the Windows Subsystem for Linux. The actual mechanisms come into play primarily in packages which contain C and C++ build tools, examples of which include the cmake_installer and mingw_installer, as well as the 10+ build tools packaged by Bincrafters. Of note, some of these packages and scenarios involving Cygwin, MSYS, and the Windows Subsystem for Linux also received new helper methods to make cross-building even more convenient.
Second, a long-standing problem has been corrected in the Conan Package Tools (“CPT”) project regarding ABI compatibility for packages compiled with GCC. Starting with GCC 5.0, changes in the minor version number of GCC do not affect the ABI compatibility of compiled binaries. Previously, the CPT templates and documentation led users to compile a library once for each minor GCC version, which resulted in a significant number of unnecessary builds and wasted storage space. The existing situation was inefficient and unnecessarily restrictive, making it very painful in certain circumstances. Unfortunately, this effectively requires the Bincrafters team to re-run the Travis CI jobs for all our packages, but we’re actually happy to do it because of the improvement it represents. Also, as it turns out, we needed to do it anyway as part of our own internal standards improvements.
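The effect of this fix can be sketched as a version-bucketing rule. The following is an illustrative model only (the function and version list are examples, not CPT code): from GCC 5 onward, one binary per major version suffices, while pre-5 compilers still need a build per minor version.

```python
def abi_version(gcc_version: str) -> str:
    """Bucket a GCC version into its ABI-compatible build slot."""
    major, _, minor = gcc_version.partition(".")
    if int(major) >= 5:
        # 5.1, 5.2, 5.4 ... are ABI-compatible -> one "5" package
        return major
    # Pre-5, the minor version matters (e.g. 4.8 vs 4.9)
    return f"{major}.{minor}"

# Five compiler versions collapse into four builds instead of five.
builds = sorted({abi_version(v) for v in ["4.9", "5.1", "5.4", "6.3", "7.2"]})
print(builds)  # -> ['4.9', '5', '6', '7']
```

Across the many minor GCC releases a CI matrix typically covers, this bucketing is what eliminated the wasted builds and storage described above.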
Third, the CLI commands have been the most volatile element in Conan for at least 6 months, and it definitely feels like they’ve landed in the right arrangement now. At the very least, they are more consistent and logical now. On one hand, the CLI commands might seem almost cosmetic. Sure, it’s been inconvenient to repeatedly re-learn commands, but command syntax is not exactly a big-picture item. However, the command changes were part of a fundamental change in the recommended workflow and the way recipe authors create a package, and that is fundamental to the platform. What many developers don’t realize is that there are two major workflows/use-cases for Conan which are very different, and the hard part about the command-line syntax was accommodating both. For the record, the two workflows are “out-of-source” packaging (like Bincrafters), and “in-source” packaging (common in private enterprise scenarios).
The stabilization with 1.0.0 is primarily about ensuring that existing recipes and packages created on 1.0.0 remain stable and supported long-term. However, it doesn’t mean that new features won’t continue to come. The Bincrafters team is continuing to work on and plan a number of features for Conan, which we’ll be submitting as PRs in the coming months. This includes additional CLI flags (non-breaking), more features around Windows subsystems, and better support for sharing code between recipes. Also, we are planning to test out some functionality to make git submodules easier to port to Conan, as well as potential container integration. Of course, all this is on top of our primary goal, which is to bring more great libraries to Conan Center.
Once again, we want the community to know that the Bincrafters team is still working diligently and constantly to refactor and republish our packages. If you are affected, we apologize. One of the advantages of the Bincrafters team overseeing so many packages is that the community can be confident that any packages broken by the 1.0.0 changes will be fixed. However, admittedly, this time around it’s taking longer than it should, but that is largely due to a highly unfortunate confluence of factors. Please continue to stay tuned to Twitter and Slack for updates.