Faster builds with dependency caching

June 27, 2017

One of the key principles behind the design of Pipelines was quick developer feedback on committed changes. After all, machines are cheap but your developers’ time isn’t. This principle influenced how we chose to price Pipelines: unlimited concurrency, so builds aren’t waiting in a queue. Running builds in Docker containers also means your build scripts start executing much sooner than they would on VMs.

With most languages, builds start by downloading dependencies, which accounts for a significant share of each build’s time. Dependencies seldom change between builds, and when they do, the change is usually incremental. Builds can be faster if dependencies don’t need to be downloaded from scratch each time.

Today, we are excited to announce that Pipelines supports caching between builds. Early adopters in our Alpha group have seen build times (and build minutes) reduced by up to 50%!

Adding a cache is simple. Here’s an example for adding a cache for node_modules.


image: node:8
pipelines:
  default:
    - step:
        caches:
          - node
        script:
          - npm install
          - npm test

We’ve pre-defined caches for several popular build tools. If your build tool doesn’t have a pre-defined cache, you can still define a custom cache in your bitbucket-pipelines.yml file. Caches are per repository and can be shared between your pipelines.
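For example, here’s a minimal sketch of a custom cache for a Ruby project that caches Bundler’s vendor/bundle directory; the cache name, path, and commands are illustrative rather than one of the pre-defined caches.

definitions:
  caches:
    bundler: vendor/bundle
pipelines:
  default:
    - step:
        caches:
          - bundler
        script:
          - bundle install --path vendor/bundle
          - bundle exec rake test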

With this addition, we hope your developers are waiting less and coding more!

Enabled caching already? Tell us on Twitter @Bitbucket how much it has shaved off your build time.

New outbound IP addresses for webhooks

June 21, 2017

Bitbucket webhooks are used by teams every day to test, analyze, deploy, and distribute great software to millions of people.

In a few weeks, we will be making a change to our network configuration that will route webhook traffic through different IP addresses. We plan to make this change no earlier than Monday, July 10th.

If you’re using webhooks and have firewall rules to allow Bitbucket to communicate with your servers, you’ll need to add the new IPs to your rules before Monday July 10th.

The current source IP address range is:

The new source IP addresses will be:

As always, you can consult our documentation for the current list of supported IPs for webhooks.

New in Bitbucket Server 5.1: Signed commits, PR deletion, and more

June 7, 2017

Last month we announced the beginning of our Bitbucket Server & DC 5.x series with an enhanced focus on helping our customers achieve DevOps success. Today we’re taking aim at the management side of DevOps by making the administration of your development toolset easier with Bitbucket Server and Data Center 5.1.

Keep reading to learn about GPG signed commits, pull request deletion, and other new features in 5.1.

Download Bitbucket Server 5.1

GPG signed commits

For those practicing DevOps, Git has become the preferred version control system for its flexibility and speed. This flexibility is powerful but also extremely hard to control without a tool like Bitbucket. For organizations that need to run a tight ship (e.g. adhering to regulatory requirements), Bitbucket’s permissions, merge restrictions, and hooks help immensely. Today we’re adding an additional security measure with the Verify Commit Signature hook, which rejects any commit that is not signed with a GPG key. Coupling this with the committer verification hook, you can be assured that all the commits in Bitbucket are valid and secure. To learn more about toggling hooks on and off, see our repository hooks guide.
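On the developer side, producing signed commits takes only a couple of Git settings. A rough sketch, assuming GPG is installed and a key pair already exists (the key ID below is a placeholder):

$ git config --global user.signingkey 1A2B3C4D   # tell Git which GPG key to sign with
$ git config --global commit.gpgsign true        # sign every commit by default
$ git commit -S -m "Add payment validation"      # or sign a single commit explicitly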

Pull request deletion

Next up is a highly requested improvement to our pull request workflow: the ability to delete pull requests. Have you ever created a pull request by mistake? Or found a pull request to be obsolete? In Bitbucket Server 5.1, irrelevant pull requests can now be deleted instead of declined, leaving your PR history nice and clean. Pull request deletion is enabled by default for pull request authors and repository administrators.


Search improvements

Last year we brought code search to Bitbucket Server, allowing teams to search for code across all repositories stored in Bitbucket.

For teams making extensive use of forks, the process of building an index for search can use a fair amount of disk space. In Bitbucket Server 5.1, we’re introducing a way for administrators to keep search disk space under control by limiting what actually gets indexed. For example, you can restrict the index process to exclude synced forks, which reduces disk space and provides refined search results.

In addition, we’ve also updated our Elasticsearch guides for Data Center customers, providing more guidance on deploying an Elasticsearch instance.

Download Bitbucket Server 5.1

One more thing: In Bitbucket Server 5.1, we’re laying the foundation for project level settings, allowing project admins to configure items, such as branching model and permissions, across all repositories in a project. To learn more about project level settings and other improvements and bug fixes in 5.1, see our release notes.

Speed up your builds with Pipelines command duration

May 31, 2017

Every team wants fast feedback from their CI system, which is why we’ve just added command durations to all Pipelines builds. Knowing which parts of your build take the longest gives your team the information to speed up your build and shorten your team’s feedback loop.

You can now see the duration for each command directly in the Pipelines log viewer. We’ve picked a consistent duration format for quick scanning, so your team can easily work out where your build time is going.

 

So what are you waiting for? Get building with Pipelines and make your build the fastest it can be.

Links in Bitbucket Pipelines logs

May 23, 2017

Here’s a small but useful improvement we added last month to Bitbucket Pipelines. You no longer need to copy and paste URLs from logs into your browser as Bitbucket will automatically turn them into links. Convenient for jumping to your deployed application!


Try Bitbucket Pipelines

Document changes with required issue keys in Bitbucket Cloud

May 16, 2017

“Why was this change made again?” Issue keys referenced in commit messages help answer this question by providing a link with more context around why a particular change was made. They’re useful for everyone from new team members getting their bearings with a repo, to quality engineers reviewing the latest release, or a budding startup getting back to their weekend project.

Issue key references are so important that some teams need every commit to include a reference. These teams rely on issue keys to maintain diligent documentation or to automatically generate release notes or changelogs to verify every change. Historically, they could do this with pre-commit Git hooks on each developer’s machine to validate messages, but this solution adds a layer of overhead that becomes difficult to manage. For these organizations, we’re introducing the ability to require issue keys for commits in Bitbucket Cloud.

Required issue keys in Bitbucket Cloud

Requiring issue keys ensures every change references an issue in its commit message. Bitbucket automatically converts mentioned issue keys (e.g. ‘PROJ-1234’ or ‘#1234’) into links to your issue tracker, so it’s easy to connect a change with its backstory.

Issue keys aren’t just a reference. They can be used to automate workflow actions: you can add comments or transition an issue to a different state just by mentioning it, making it easier for your team to trace and validate changes later. If you combine this with the new JIRA issue details view, you can view and comment on JIRA issues without even leaving Bitbucket. As a result, including issue keys has become a best practice, and requiring them in commit messages will make it easier to scale your workflow.

Get started with required issue keys

If you’re new to Bitbucket, sign up for a Bitbucket Cloud account. If you’re an existing Bitbucket user and are already using JIRA Software, make sure it’s connected to Bitbucket. This connection enables Bitbucket to automatically link to your JIRA Software issues. If you’re using the Bitbucket issue tracker, issues are already linked automatically.

Then, navigate to the “Links” section of your repository settings and enable “Require issue keys in commit messages.”

Once enabled, Bitbucket will only accept pushes to your repo if all pushed commit messages contain issue keys.

Once all commit messages include keys (e.g. using interactive rebase to update the message), the push will succeed, and the issue keys are automatically linked across Bitbucket, like in this pull request.
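If a push is rejected, the fix is usually a quick reword. A sketch of what that might look like (the commit count and issue key are hypothetical):

$ git rebase -i HEAD~2      # mark the offending commits as "reword"
# update each message to include a key, e.g. "PROJ-1234 Fix login redirect"
$ git push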

Try Bitbucket, it’s free

Not using JIRA Software and Bitbucket together?

JIRA Software is the preferred issue tracker of over 350,000 Bitbucket teams. Teams who have JIRA Software and Bitbucket integrated release 14% more often and close 23% more issues (when compared to teams using just one of those products).
Learn more.

Have more specific questions about this post? Reach out to us on Twitter to get the information you need.

#Forthecode that does amazing things.

May 9, 2017

Bitbucket’s mission is to help teams solve real-world problems and accomplish their unique goals to change the world. Our customers select Bitbucket because it supports professional teams building things with a purpose. We get inspired hearing about all the ways our customers are using our products to move our world forward.

We want to share what you’re doing with the rest of the world. Join our #Forthecode program and share your company’s mission and all the hard work done to accomplish it.

#Forthecode contest

Share your company’s mission for a chance to be featured on our blog, our social media channels, and win some pretty cool swag! 

How to enter:

Click on the Tweet button below and tell us how you’re using @Bitbucket to accomplish your company’s mission with the hashtag #Forthecode.

For example, Atlassian’s mission is to unleash the potential in every team. We would tweet: “@Atlassian uses @Bitbucket #Forthecode that unleashes the potential in every team.”

To kick us off, check out some of the awesome ways our customers are using Bitbucket to accomplish their mission.

@Trulia uses @Bitbucket #Forthecode that finds your next home.

Bitbucket helps Trulia with large-scale agile development, providing home buyers and sellers essential information about the real estate market.

@Splunk uses @Bitbucket #Forthecode that makes big data easy.

Bitbucket helped Splunk transition to Git and develop software that makes big data scalable and usable.

@Orbitz uses @Bitbucket #Forthecode that takes you on your next vacation.

Bitbucket helped Orbitz migrate to Git to develop the world’s leading online travel site.

What will your code do?

Tell us how you’re using @Bitbucket to accomplish your mission using the hashtag #Forthecode, or give Bitbucket a try today!

Try Bitbucket, it’s free

Introducing code aware search for Bitbucket Cloud

May 2, 2017

Code aware search

Save time combing through usage results with a semantic search that ranks definitions above usages or variable names. Sign up for Bitbucket Cloud to take it for a spin.

Get started, it’s free

The search for code search is finally over: Bitbucket Cloud is launching a Public Beta of code aware search, specifically built for teams who have many repos or large code bases.

What makes Bitbucket Cloud’s search “code aware”? Rather than simply indexing your code as text, we built a semantic search that has our systems do the grunt work for you. Bitbucket Cloud analyzes your code syntax, ensuring definitions matching your search term are prioritized over usages and variable names. Assuming your team is re-using code effectively, the ratio of usages to definitions will increase as your codebase grows, making this a big time saver on larger projects.

For example, if you search for “FastHashMap”, which document would you want to appear first?


public class FastHashMap {
   /* ... */
}

or


import foo.bar.FastHashMap
 
public class SomeOtherClass {
    public void doSomething() {
        FastHashMap fastHashMap = new FastHashMap();
    }
}

You’d prefer the class definition, right? Let’s take a deeper look at how we built our code aware search to provide the most relevant search results at a fast pace.

How code search works in Bitbucket Cloud

Search indices built using traditional text indexers will usually return the usage result first because it contains a higher number of exact matches for your search term. In code bases where the same class or function is used many times, developers are often left trawling through page after page of usage results trying to hunt down the definition.

We took a different approach: by boosting the definitions matching your search term, the result you want is likely to rank much higher (usually #1) in the search results. Our algorithm boosts definitions for a wide range of type categories including classes, functions, enums, structs, and interfaces.  We prioritized building a code aware search scoped to team and user accounts over a global search functionality. This way, we hope to quickly give our users the relevant results they want instead of the hassle of checking out a repo locally and searching using an IDE.

To compare this live, you can search for the common class “QueryBuilders” on the Elasticsearch repo. In GitHub, it shows up as the 6th result on the 18th page (at time of writing). In Bitbucket Cloud, the class definition shows up as the first result.

Languages, filters, and operators

Code aware search outperforms traditional search approaches for statically typed languages like Java that tend to repeat type names when importing, declaring, and instantiating types. However, Bitbucket Cloud’s code aware search is also highly effective for a range of other popular languages including JavaScript, Python, Ruby, and PHP, among others.

Since code aware search is built for source code, we also index the characters . and _ that are commonly used in identifiers. This means you can get more precise results for compound search terms such as class, function, and variable names like “foo_bar.baz”.

Additionally, we allow you to restrict search results by using modifiers and operators. You can use modifiers to filter by a particular language or file extension (like “ext:css” or “lang:ruby”) or limit search to specific repos (repo:elasticsearch). Projects can use operators (like AND, OR, and NOT) to narrow down or broaden results in case you get too many.
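Putting that together, a few illustrative queries (the repositories and terms are just examples):

FastHashMap lang:java
foo_bar.baz ext:css
QueryBuilders repo:elasticsearch AND NOT test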

For a full list of the capabilities and search query considerations with code search in Bitbucket Cloud, check out our documentation.

Try Bitbucket Cloud’s Code Aware Search

During the Beta, 25% of all Bitbucket repositories have had their code indexed and are ready to start searching. If your code hasn’t been indexed yet, it’s easy to request: head to the search bar in Bitbucket’s side navigation, click “View code search results”, and then “Enable code search” on the search results page. We’re throttling these requests so we can monitor performance as we scale.

If you’re ready to use a fast and relevant code search, sign up for a Bitbucket Cloud account, create a repository, and index your code. If you’re already a Bitbucket customer, you can find code search in your sidebar and further documentation on it here.

Get started, it’s free

Have more specific questions about this post? Reach out to us on Twitter to get the information you need.

Bitbucket Server 5.0 & Bamboo 6.0: Bringing DevOps to the Enterprise


According to our recently released DevOps Maturity Report with xMatters, 43% of respondents in organizations of 1000 or more said they either didn’t know of or didn’t have a DevOps initiative in place. With proven benefits like faster time to market and better release quality, why aren’t more enterprises embracing DevOps?

The answer is simple: changing the way teams work isn’t easy. For teams of all sizes, the journey to DevOps can be fraught with roadblocks. For those in the enterprise, challenges like lack of visibility, deep-rooted cultural silos, disconnected tools, and complicated compliance requirements are magnified, making DevOps adoption nearly impossible.

For years, Bitbucket Server and Bamboo have been tackling these challenges, helping software teams build better software faster. Today we’re taking it up a notch with Bitbucket Server and Data Center 5.0 and Bamboo 6.0, giving development teams the freedom, speed, and automation they desire while meeting the demanding needs of their enterprise organization.

DevOps for the enterprise

Adopting DevOps in the enterprise is more than just better communication across operations and development, modern continuous integration practices, or the type of version control in place. Things like compliance and scale become just as important. Tooling must provide freedom and structure, scalability and performance – things that are not often found together.

Atlassian tools have the unique ability to make DevOps workflows a reality while ensuring traceability, availability, and security all remain intact. In Bitbucket Server and Data Center 5.0 and Bamboo 6.0, we’re upping the ante with a committer verification Git hook and updates to smart mirrors.

Committer verification

Git and distributed version control have many benefits out of the box, but controlling access and workflows isn’t one of them. For example, without a Git management tool, a developer can push commits that others have written to the central repository.

This creates problems for organizations with strict security and compliance requirements. Bitbucket lets you address this through permissions and workflow controls including Git hooks. In 5.0, we’ve added a new committer verification hook, which enforces that only the author of a commit can push those changes back to Bitbucket Server or Data Center. Now you can sleep easy knowing that only authorized code changes can make it to your repositories.
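On the developer side, the main thing to check is that your local Git identity matches the Bitbucket account you push with; roughly (the values are placeholders):

$ git config --global user.name "Jane Developer"
$ git config --global user.email "jane@example.com"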


Smart mirroring gets smarter

Smart mirroring in Bitbucket Data Center is a hassle-free way of providing geographically dispersed teams a read-only copy of the repository. By pulling updates from a local mirror, teams can avoid the pain of high-latency and low-bandwidth clone operations. All authentication, permissions, and updates are controlled by the master Data Center instance, keeping admin maintenance to a minimum.

In Bitbucket Data Center 5.0 we’re introducing authentication caching – a way for end users to maintain mirror access even in the event of short outages. Instead of communicating with the main server for every login event, credentials are now cached on the mirror for 5 minutes at a time. If the network connection is patchy or the main server is offline, users can still fetch and clone using the cached credentials. You can rest easy knowing that Bitbucket Data Center’s active-active clustering, disaster recovery, and now authentication caching ensure your code will always be available.

Productivity at scale

With DevOps, development teams boost productivity through workflow automation, tighter communication across teams, and easier access to information (e.g. build and development status on JIRA issues and pull requests). Bitbucket Server & Data Center 5.0 and Bamboo 6.0 bring these key elements to the enterprise with more transparency and modernization of the release pipeline.

Bamboo Specs

First up, we’re modernizing the way builds are configured with Bamboo Specs, the ability to define Bamboo plan configurations as code. Changing build configurations no longer requires edits in the Bamboo user interface; instead, configurations can be stored as code. Beyond simplifying application builds, defining plan configurations as code provides benefits such as code reuse, proper code reviews, versioning, and so on. The best part? Bamboo Specs is native to Bamboo: no plugins or additional duct tape required.
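To give a feel for plans as code, here’s a minimal sketch of a Specs class. The builder classes and package names below follow the Bamboo Specs tutorial, but treat the exact signatures as illustrative rather than canonical; check the Specs reference for your Bamboo version.

import com.atlassian.bamboo.specs.api.BambooSpec;
import com.atlassian.bamboo.specs.api.builders.plan.Job;
import com.atlassian.bamboo.specs.api.builders.plan.Plan;
import com.atlassian.bamboo.specs.api.builders.plan.Stage;
import com.atlassian.bamboo.specs.api.builders.project.Project;
import com.atlassian.bamboo.specs.builders.task.ScriptTask;
import com.atlassian.bamboo.specs.util.BambooServer;

@BambooSpec
public class PlanSpec {
    public static void main(String[] args) {
        // A plan with one stage, one job, and a single script task
        Plan plan = new Plan(
                new Project().key("PROJ").name("My Project"),
                "Build and test", "BUILD")
                .stages(new Stage("Default stage")
                        .jobs(new Job("Default job", "JOB1")
                                .tasks(new ScriptTask().inlineBody("mvn clean test"))));

        // Publish the plan to your Bamboo server (URL is a placeholder)
        new BambooServer("http://localhost:8085").publish(plan);
    }
}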


Tip:
Did you know you can use our Bamboo Specs exporter feature to automatically create a Spec out of your existing plans? Learn more

Tighter integrations between Bitbucket Server and Bamboo

Next up we’re breaking through silos and introducing tighter integrations between Bitbucket Server, Bamboo, and JIRA Software Server. Development teams already benefit from tracking work items through JIRA Software to commits and pull requests in Bitbucket Server, and builds and deployments in Bamboo.

Traditionally, the link between JIRA Software and Bitbucket has been on the commit level (i.e. commit A pertains to JIRA issue ABC-123). In Bitbucket Server 5.0 we’re adding repository level shortcuts allowing teams to connect a repository to any related asset, like a JIRA project. Repository shortcuts make it easy for everyone on the team to find and jump to repository information. Simply link to your JIRA board, Confluence space, Bamboo plans, HipChat room, or whatever else that’s important to you.


Bamboo’s integration with Bitbucket Server is also getting an upgrade. Currently, we display build status (e.g. pass/fail) on related commits, branches, and pull requests in Bitbucket Server, but until now teams were unable to see Bamboo builds in progress or trigger builds on pull request creation. With the addition of in-progress build status and pull request aware builds in Bamboo, developers gain more control over when their builds kick off and can monitor progress from inside Bitbucket. This frees up Bamboo build agent resources and cuts down on unnecessary build noise. Even better, if you’re not using plan branches already, now you can, knowing that every pull request will get built. For more information on all of the Bitbucket Server and Bamboo integration enhancements, check out our integration guides.


… there’s room for some fun too

Productivity at scale isn’t all integrations and workflow. In Bitbucket Server & Data Center 5.0, let your team know how you really feel with emoji and HipChat emoticon support for comments. To find the perfect emoji, type “:” to bring up the list of available options.

Making DevOps in the enterprise possible

Changing the way your teams work, adopting new tools, and learning new technologies is hard and doesn’t happen overnight. When you add disconnected tools, complicated compliance requirements, and scalability to the mix it makes “going DevOps” that much more difficult. No matter where you are in your DevOps journey, Atlassian provides the guidance and tools to help you succeed. If you’re ready to modernize your version control and continuous integration practices, give Bitbucket Server/Data Center and Bamboo a try.

Download Bamboo 6.0

 

Download Bitbucket Server 5.0

For more information on other improvements and bug fixes in Bitbucket Server & Data Center 5.0 and Bamboo 6.0, check out our Bitbucket and Bamboo release notes.

Continuous deployment of a static website with Bitbucket Pipelines

April 24, 2017

Guest post

This guest post was written by David Von Lehman from Aerobatic, a simple yet powerful solution for static website publishing.

In this blog post we’ll look at how to use Bitbucket Pipelines to automatically build a website using a static site generator. This example will use Jekyll, but the same formula will work with any generator including Hugo, Middleman, Pelican, Gatsby, and many more. We’ll automatically deploy the built site to Aerobatic, a dedicated static website hosting platform. Finally we’ll explore how to combine the capabilities of Bitbucket, Pipelines, and Aerobatic to enable a production release workflow based on branches and pull requests that is optimized for teams.

Step 1 – Create new Jekyll site in Bitbucket

Create a new Jekyll site locally. We can do this by running the jekyll new command.

$ jekyll new jekyll-pipelines
$ cd jekyll-pipelines
$ bundle install

To verify the site builds correctly locally, you can run jekyll serve and take a look at http://localhost:4000.

Now that we have a website, create a Bitbucket repo and push up the code. Our sample repo is named jekyll-pipelines. The full source code is available at https://bitbucket.org/aerobatic/jekyll-pipelines.

$ git init
$ git remote add origin ssh://git@bitbucket.org/BitbucketUser/jekyll-pipelines.git

Make sure the _site directory appears in the .gitignore file. Since Bitbucket Pipelines will be building the site from scratch, we don’t want the build output in source control.
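For example, from the command line (the commit message is arbitrary):

$ echo "_site/" >> .gitignore
$ git add .
$ git commit -m "Initial Jekyll site"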

Ok, now go ahead and push to your repo:

$ git push -u origin master

Step 2 – Setup Aerobatic

In this tutorial, we’ll be deploying our Jekyll site to Aerobatic, a specialized static website hosting service. It just takes a minute to get up and running:
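If you don’t have the aerobatic-cli installed locally yet, grab it from npm (as noted later, the aerobatic/jekyll Docker image used by Pipelines already ships with it):

$ npm install aerobatic-cli -g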

Now we need to create an Aerobatic website for this repo. At the root of the project run the following command:

$ aero create --name jekyll-pipelines

To keep things consistent, I’m naming the website the same as the repo. This will create a file called aerobatic.yml that you’ll want to commit to Git.

Before running the pipeline, we need to set the AEROBATIC_API_KEY Pipelines environment variable. Environment variables can be set either at the repo level or the account level. We recommend setting it at the account level so you don’t have to repeat this step for future projects.

Retrieve your API key by running the following:

$ aero apikey

Paste it into the value box and click the Secured box.

Step 3 – Create bitbucket-pipelines.yml

Now we need to configure Bitbucket Pipelines to build our Jekyll site whenever a push is made. First make sure to enable Bitbucket Pipelines on your repo and create a new bitbucket-pipelines.yml file. Since Jekyll is Ruby based, we’ll want to specify a Docker image that has Ruby and Bundler already installed. Aerobatic has published an image to Dockerhub called aerobatic/jekyll that has Ruby already installed along with the necessary low-level libraries required to build most native gem extensions. You can check out the Dockerfile at https://github.com/aerobatic/docker/blob/master/jekyll/Dockerfile.

Speaking of Docker, there are 3 Aerobatic images available on Dockerhub: aerobatic/jekyll, aerobatic/hugo, and aerobatic/node. Each is based off the ultra-small Alpine base image and has the aerobatic-cli pre-installed, which avoids each Pipelines build having to npm install it from scratch.

Here’s the entire bitbucket-pipelines.yml file:

image: aerobatic/jekyll
clone:
  depth: 1
pipelines:
  default:
    - step:
        script:
          - '[ -f Gemfile ] && bundle install'
          - 'echo "url: https://__baseurl__" >> _config.yml'
          - bundle exec jekyll build
          - aero deploy --directory _site

The script section is the actual set of commands that will be carried out inside the provisioned Docker container.

  1. The first line will run bundle install if a file named Gemfile exists (which in our case it will).
  2. The second line appends a value to _config.yml, overriding the url config setting. Even if the URL is defined earlier in the file, Jekyll will take the last value. The value “https://__baseurl__” is a special value that Aerobatic will replace at runtime with the appropriate site URL.
  3. Next, bundle exec jekyll build builds the site.
  4. Finally, deploy to Aerobatic by running aero deploy. The --directory _site option tells it that the files to deploy are located in the _site directory where Jekyll wrote the generated site. Normally aero is installed by running npm install aerobatic-cli -g, but since we are using the aerobatic/jekyll image, it is already present.

Step 4 – Trigger a build

That’s it for setup; now let’s trigger a build. Commit the bitbucket-pipelines.yml file to your repo and push, which should trigger your first build. If all goes according to plan, you’ll see each command complete successfully in the Pipelines log.

And just like that, you have a first class git push based deployment workflow for your website. With this setup it’s even possible to use the browser editor to make content updates or author simple blog posts without ever leaving Bitbucket.

Deploying with pull requests

The simple workflow above works great when one or two people maintain the site. But what about a larger team with multiple developers, content contributors, and stakeholders? In agile software development projects, Git pull requests have emerged as the preferred workflow for promoting changes through a series of deploy stages culminating in production. With Bitbucket Pipelines and Aerobatic, this same workflow is easily achieved for static website deployments. Let’s assume the same repository structure suggested in the Bitbucket Pipelines guides:

master: Your integration branch
staging: Use this branch to trigger deployments to staging
production: Use this branch to trigger deployments to production
features/xxx: All feature branches

 

In addition to the production instance of the website, we also need a staging instance. Aerobatic provides a feature called deploy stages that makes this really easy – just pass a --stage option to the aero deploy command.

$ aero deploy --stage staging

This command will deploy the build output to the URL https://jekyll-pipelines--staging.aerobatic.io. In the bitbucket-pipelines.yml we use branch workflows to configure a different target stage for the production and staging branches:

image: aerobatic/jekyll
clone:
  depth: 1
pipelines:
  default:
    - step:
        script:
          - '[ -f Gemfile ] && bundle install'
          - 'echo "url: https://__baseurl__" >> _config.yml'
          - bundle exec jekyll build
  branches:
    master:
      - step:
          script:
            - '[ -f Gemfile ] && bundle install'
            - 'echo "url: https://__baseurl__" >> _config.yml'
            - bundle exec jekyll build
            - aero deploy --directory _site
    staging:
      - step:
          script:
            - '[ -f Gemfile ] && bundle install'
            - 'echo "url: https://__baseurl__" >> _config.yml'
            - bundle exec jekyll build
            - aero deploy --directory _site --stage staging

NOTE: The Aerobatic free plan is limited to shared *.aerobatic.io domains, but deploy stages also work with custom domains available on the Pro Plan.

Now the workflow becomes: develop on feature branches, open pull requests into staging to preview changes on the staging URL, and merge to master to deploy to production.

Using Bitbucket permissions you can lock down the workflow to whatever degree you like. For example, you can require that the staging and master branches only be updated via pull requests, and specify which users have permission to approve pull requests. A good rule of thumb is to allow everyone to merge to staging, but only senior personnel to update master (and by extension, deploy to production). This is configured under branch permissions in your repository settings.

See the Bitbucket docs for more details on configuring branch permissions.

Protecting the staging URL

One lingering detail is preventing the general public from stumbling across the staging site URL. This can be addressed via the Aerobatic password-protect plugin that is declared in the aerobatic.yml file:

id: b74e6fb8-e747-4fb4-bd1b-1f92804ace5c
deploy:
  ignore: []
  directory: .
plugins:
  - name: password-protect
    stages: [staging]
    options:
      password: $PASSWORD
  - name: webpage

The stages property specifies that the password-protect plugin only applies to https://jekyll-pipelines--staging.aerobatic.io. More details can be found at https://www.aerobatic.com/blog/password-protect-a-jekyll-site/.

Content as Code

Over the last several years there has been a trend within DevOps to manage as much of a software system as possible, including configuration settings and infrastructure definitions, as plain text files committed to version control right alongside the rest of the source code. You may have heard the terms “Configuration as Code” or “Infrastructure as Code” that refer to this approach. Aerobatic encourages the practice via the aerobatic.yml file, which defines metadata, deploy settings, and runtime behaviors (such as plugins) for the website.

There’s a strong argument to be made that this same practice should apply to website content including markdown files, images, etc. Rather than locking content up in a CMS database with its own proprietary mechanisms for backups, auditing, history, approvals, etc., just put it in Git or Mercurial and treat it like any other source asset. With the deployment workflow described above, you’ll then have a universal build pipeline regardless of whether the change was committed by a developer or a content contributor.

Now this does present a paradigm shift for content editors who are accustomed to a less technical CMS interface. Fortunately, there is a new breed of CMS tools and services that bridge the gap, providing a friendly editing interface while using version control as the underlying data store. Examples include CloudCannon, Forestry.io, DatoCMS, kirby, and other flat-file CMSes.

Doing More with Plugins

The password-protect plugin is just one of many plugins offered by Aerobatic that provide enhanced functionality beyond what is possible with vanilla static hosting. All plugins are configured in the aerobatic.yml file; the full list is covered in the Aerobatic documentation.

Conclusion

That’s it – we now have a first-rate, team-based deployment workflow complete with a private staging environment, a streamlined approval workflow, and fully automated deployments – all with no infrastructure to maintain and no DevOps engineering. Everything is being treated “as code”, safely stashed away in Bitbucket where pull requests, branches, history tracking, and all the other benefits afforded by version control can be applied. This includes the site html, templates, css, JavaScript, configuration, and content.

This same setup works not only with static site generators like Jekyll or Hugo, but also sophisticated single page web applications such as React or Ember. You can learn more about continuous deployment of static sites with Aerobatic and Bitbucket over on the Aerobatic blog.

Happy coding (and deploying)!