Would be great to do a guide on how to use https://circleci.com to automatically deploy a static build of the website to Amazon S3 or GitHub Pages whenever the website repo is modified. This would eliminate the need for DocPad to be hosted on dynamic servers just for that rebuilding functionality.
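For reference, once the CI service has checked out the repo, the whole job reduces to a couple of commands. This is only a sketch: the bucket name and the `out` directory are placeholders, and the exact steps would live in the CI service's config file.

```shell
# Build the static site from the checked-out repo.
npm install
./node_modules/.bin/docpad generate

# Deploy the generated output, e.g. to S3 (bucket name is a placeholder)...
aws s3 sync ./out s3://your-bucket-name --delete

# ...or to GitHub Pages, by pushing the generated output to the gh-pages branch.
```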
this is awesome, nice find, thanks ben
This was completed quite a while ago, for CircleCI and TravisCI:
Surge.sh also supports Travis CI http://surge.sh/help/integrating-with-travis-ci
Another very powerful technique that I’m about to try out on my current project is using AWS Lambda to bi-annually update a DocPad website from a JSON file fetched from an API.
This has the potential to add a dynamic effect to any DocPad website, and if you invoke AWS Lambda fewer than 1 million times a month it’s also free. There is already a popular framework built atop AWS Lambda for creating extendable serverless applications, and of course it’s called Serverless:
My thought is to use serverless-plugin-cronjob (which uses AWS CloudWatch to schedule the cron job) to bi-annually trigger a Lambda function that builds the DocPad project on AWS’ side, updates it with new data from a JSON file pulled from an API, and then pushes it to the project’s GitHub gh-pages branch to deploy it.
The project I’m working on is a DocPad skeleton, so it’ll have multiple deployments from different users onto GitHub Pages. The skeleton will come with a custom Serverless DocPad plugin that requires the GitHub gh-pages repo URL and credentials to be able to deploy to that branch (I’m still not sure how to write to the GitHub branches more securely/privately from within the running Lambda function). The following is a diagram of the proposed technique:
I’m open to more elegant implementation ideas. Once I get a working example I’ll write a guide on this technique. There are also a bunch of open-source projects that do similar things to AWS Lambda, like TaskMill.io, and Google has Google Cloud Functions.
We’re trying out this approach because the website’s content is API-driven but can’t depend on the webmasters refreshing their deployments bi-annually. The alternative before this was to download the JSON file client-side every time a user visited the website and construct the page’s elements from that JSON. That isn’t ideal for a static site generator, especially on an eCommerce platform where milliseconds directly affect conversion rates.
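To make the trade-off concrete, the difference is where this kind of function runs. Below is a minimal sketch (the product list shape is a made-up example, not the project’s real data): run it in the browser after a fetch and every visitor pays a network round trip before the markup exists; run it during `docpad generate` and the markup ships in the initial static HTML.

```javascript
// Sketch: turning the API payload into markup. Client-side, this runs
// after a fetch on every page view; at build time, it runs once and the
// result is baked into the static output.

function renderProducts (products) {
  return '<ul>\n' +
    products.map(function (p) {
      return '  <li><a href="' + p.url + '">' + p.name + '</a></li>'
    }).join('\n') +
    '\n</ul>'
}
```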
Noting the status update on this.
Surge.sh deploys were used for many years, then we moved to Cloudflare’s Stout, and most recently to Zeit’s Now. You can find the latest conventions at these two repos: