Packaging Drupal Modules, or not?

So John wrote down his experiences deploying Drupal sites with Puppet.

It's not a secret that I've been thinking about similar stuff and how I could get to the best possible setup.

John starts off by using Puppet to download Drush... while I want to use rpm for that.

I want my core infrastructure to be fully packaged... not downloaded and untarred. I want to be able to reproduce my platform in a couple of months with the exact same versions I'm using now... not with the version that happens to be on ftp.drupal.org at that point in time, or with ftp.drupal.org being down.

Now the next question, of course, is what the core infrastructure is.
Where does the infrastructure end and where does the application start? There's little discussion about having a Puppet-created vhost, an Apache conf.d file, a matching .htaccess file if wanted, and the appropriate settings.php for a multisite Drupal config.
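As an illustration, the kind of per-site piece a Puppet template would render might look like this (hostname, Drupal version, and paths are assumptions, not from the post):

```
# conf.d/www.example.com.conf -- illustrative vhost a Puppet template might render
<VirtualHost *:80>
    ServerName www.example.com
    DocumentRoot /var/www/drupal-6.14
    <Directory /var/www/drupal-6.14>
        AllowOverride All
    </Directory>
</VirtualHost>
```

The matching sites/www.example.com/settings.php would be templated the same way for the multisite setup.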

There's also little doubt in my mind about using drush to run the updates, manage the Drupal site, etc. Reading John's article made me think further about what I want packaged, and when.

John's post led to a discussion on #infra-talk with Karan and some others about getting all Drupal modules packaged for CentOS.

In a development environment I probably want periodic drush updates getting the latest modules from the interwebs and potentially breaking my devs' code, but making sure that when you put a site in production it will be on a fairly up-to-date platform, and not on the platform you started developing on 24 months ago.
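A minimal sketch of that periodic dev refresh, assuming drush is on the path (the path, user, and schedule are illustrative assumptions):

```
# /etc/cron.d/drush-dev-update -- illustrative only; path, user and schedule are assumptions
# Nightly: pull the latest contrib modules (pm-update also runs pending DB updates)
0 3 * * * www-data cd /var/www/dev.example.com && drush -y pm-update
```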

In a production environment, however, you only want tested updates of your modules, as indeed they will break code.

It's probably going to be a mix-and-match setup: a local rpm/deb repo with packaged modules that have been tested and validated in your setup, and drush to enable or configure them for that production setup.

But also a CI environment where drush will get the new modules from the interwebs when needed and package them for you.
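As a sketch of what that CI packaging step could emit, a minimal RPM spec for one contrib module might look like this (module name, version, and install path are assumptions, not from the post):

```
Name:           drupal6-views
Version:        6.2.8
Release:        1%{?dist}
Summary:        Views contrib module for Drupal 6, repackaged from a drush download
License:        GPLv2
Source0:        views-6.x-2.8.tar.gz
BuildArch:      noarch

%description
The Views module, fetched once with drush in CI and repackaged so that
production machines install a tested, versioned artifact from the local
repo instead of pulling from ftp.drupal.org at deploy time.

%prep
%setup -q -n views

%install
mkdir -p %{buildroot}/var/www/drupal/sites/all/modules/views
cp -a . %{buildroot}/var/www/drupal/sites/all/modules/views

%files
/var/www/drupal/sites/all/modules/views
```

Running createrepo over a directory of such rpms would then give you the local yum repository for production machines.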

To me that sounds better than grabbing all the available Drupal modules, packaging them (even automated), and preparing a repository of modules of which only a small percentage will actually be used by people.

But I need to think about it some more :)

Comments

#1 Jochen Lillich: There's a better solution

Let's see: you'd like to have different stages of the same code (Drupal core, contrib modules, own modules) which differ in their versions. You'd like to experiment with bleeding-edge versions and at the same time take a more conservative approach when it comes to production. You'd also like to reproduce the exact configuration you used last summer before that big relaunch project.

I'd say that's a classic example of an SCM use case.

My company will soon be offering a Drupal cluster hosting service targeted at websites that need high performance and high availability. And the way we'll deploy the application code is via a distributed SCM software, probably Bazaar or Git.

That way, our customers can manage their code in each lifecycle phase -- development, testing, staging and production. If they want to test new code, they can create a new branch and eventually merge it back into the main branch. They can designate a staging branch that will automatically be deployed onto the staging server. After testing on the staging server, they can push their version onto the production branch, which will automatically be deployed on the production cluster nodes.
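A minimal sketch of that branch-per-stage flow, assuming git is the chosen SCM (repo contents and branch names are illustrative):

```shell
#!/bin/sh
# Illustrative branch-per-stage flow; file and branch names are assumptions.
set -e
repo=$(mktemp -d) && cd "$repo"
git init -q
git config user.email dev@example.com
git config user.name Dev

echo "site code" > index.php
git add index.php && git commit -q -m "initial site code"

git branch staging       # auto-deployed to the staging server
git branch production    # auto-deployed to the production cluster

git checkout -q -b feature/new-module    # experiment on a branch
echo "new module" > module.php
git add module.php && git commit -q -m "add new module"

git checkout -q staging && git merge -q feature/new-module   # promote to staging
git checkout -q production && git merge -q staging           # promote to production
```

A post-receive hook watching the staging and production branches would then check out the new revision on the matching servers.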

In conclusion, with SCM, you have all the versioning control you need without sacrificing flexibility to a packaging system.


#2 Kris Buytaert: Only part of a different problem :)

The problem you describe is both different and only a part of the bigger problem.

First of all... I'm not trying to have different versions of the same code around; yes, the way I plan on deploying stuff will have as a side benefit that you can do that :)

And secondly, your SCM-based solution only takes into account the code for these modules, not the external requirements such as the vhost config, DNS configs, settings.php, and database updates.

Obviously, using an SCM is a required part of the full solution, but it won't solve everything.


#3 Jochen Lillich: Yes, it's part of a solution

I didn't mean to imply that SCM would solve all configuration management problems. But it should be an integral part of the complete picture.

Other important parts of our infrastructure are the software packages for the system (OS, LAMP stack) and configuration automation software like Puppet or Chef (whose configuration is again under version control). By combining those parts and tools, you get a highly automated system under tight control.


#4 Kris Buytaert: drupal6, drupal5

One of the suggestions I made when trying to push Drupal 6 forward in EPEL was to have multiple packages isolating the specific versions.

I think you have to agree with me that you don't want different minor (x.y) instances on one machine, certainly not for security reasons.

Doesn't that bring the whole discussion back to using drush mm?

Or am I overlooking something ?


#5 adrian: the problem is harder than rpm

Because you can have multiple (major and minor) versions of Drupal and all its modules installed at the same time at the platform level (i.e. /var/www/drupal-x.y), and then in each of those profiles/$Profilename and sites/$url.

rpm and friends are built to manage a single version of each package installed at a time, and don't deal well with what Drupal needs.
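Concretely, the multi-version layout described above might look something like this (version numbers and site names are illustrative):

```
/var/www/drupal-5.20/
/var/www/drupal-6.14/
    profiles/
        mycustomprofile/
    sites/
        all/modules/
        www.example.com/
            settings.php
            modules/
```

One rpm per tree (e.g. a drupal-6.14 package owning its own directory) is a possible way around the single-version assumption, at the cost of many near-duplicate packages.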

I was designing the drupal ports collection to take all these issues into account - http://drupal.org/node/112692