Dragonfly migration or upgrade workflow?


(Epipheus) #1

Is there a simple way to deploy from a local machine to production? Also, is there a simple workflow for rolling out changes from local to production? For example, Dragonfly saves everything in a path that includes the environment name. So if you do all of your editing and updates locally, is there a smooth way to deploy? Last question as part of this: is there a simple way to go from a local Dragonfly deploy to S3?


(Aaron Russell) #2

This is a good question. There isn’t a built-in or standard way of doing this, so it will involve a bit of manual work.

The way I normally build a new site is that it gets developed locally and the first “version” of content and images get worked on in a local dev environment. When the site is ready to deploy, we copy the database and any assets over to the production environment. The database can be migrated using pg_dump and pg_restore, and then the assets can just be copied over to the new environment making sure to maintain the same directory structure.
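As a sketch of that database step (the database names, host, and user below are placeholders, not details from this thread), the pg_dump/pg_restore migration might look something like:

```shell
# Dump the local development database in PostgreSQL's custom format
pg_dump -Fc -f site.dump my_site_development

# Restore into the production database (assumes the target database
# already exists and you have credentials for it)
pg_restore --no-owner --clean --if-exists \
  -h db.example.com -U deploy -d my_site_production site.dump
```

The custom format (`-Fc`) lets pg_restore handle ordering and dependencies; `--clean --if-exists` drops any existing objects first so the restore can be re-run.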

If assets are being stored on the filesystem, then yes the environment is in the path (I’m not sure why, that’s just the way it is), so you’ll need to copy everything to the correct remote path according to the environment. Eg:

  • From Local: public/system/dragonfly/development/*
  • To Remote: public/system/dragonfly/production/*
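A hedged sketch of that copy using rsync (the hostname and remote app path are placeholders):

```shell
# Note the environment segment of the path changes from
# development (local) to production (remote)
rsync -avz public/system/dragonfly/development/ \
  deploy@example.com:/var/www/my_site/public/system/dragonfly/production/
```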

If you’re copying to S3, again we just copy everything over, but in this case we copy everything from dragonfly/development to the bucket root. Eg:

  • From Local: public/system/dragonfly/development/*
  • To S3: bucketname:/*
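Assuming the AWS CLI is installed and configured, that copy could be sketched like this (the bucket name is a placeholder); note the files land at the bucket root, with no environment prefix:

```shell
aws s3 sync public/system/dragonfly/development/ s3://my-bucket/ \
  --acl public-read
```

The `--acl public-read` flag makes the uploaded objects publicly readable, which is the sort of permissions detail that needs care; note that buckets created with ACLs disabled will reject it, in which case a bucket policy is the alternative.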

With S3, a bit of care is needed to ensure permissions etc. are correct.

I’d also add that I see this as a one-off manual task as part of putting a new site live. Once the site is live and clients have control of it, I don’t worry about keeping things in sync - it’s too much hassle. So if we’re creating new features or pages for an existing, established site, instead of worrying about copying things over, we’ll just create the new content twice - once locally whilst we’re developing it, and then again in production.


(Epipheus) #3

Thanks Aaron. Deploying my first live PushType site now actually, so this is timely.


(Epipheus) #4

Is there any magic needed to ensure S3 is referenced instead of local storage? I’m very much interested in testing this, as DigitalOcean’s S3-compatible object storage now gets a CDN for free.


(Aaron Russell) #5

No magic needed, just ensure the appropriate config in config/initializers/push_type.rb:

# For S3 storage, remember to add to your Gemfile:
# gem 'dragonfly-s3_data_store'
PushType.setup do |config|
  config.dragonfly_datastore = :s3
  config.dragonfly_datastore_options = {
    bucket_name:        ENV['S3_BUCKET'],
    access_key_id:      ENV['AWS_ACCESS_KEY_ID'],
    secret_access_key:  ENV['AWS_SECRET_ACCESS_KEY']
  }
end