Bitbucket Deployments is now available to all Bitbucket Cloud teams

April 17, 2018

With more teams adopting CI/CD and deploying code on a regular basis, keeping up-to-date with what has been deployed has become more difficult. Teams need a single place to track their deployments so they can build and release their software quickly with confidence.

In December, we announced our early access program for Bitbucket Deployments, a new way to track, view and monitor deployments from within Bitbucket. Today, we’re excited to announce that Bitbucket Deployments is available to all Bitbucket teams, with a beautiful new design to top it off.

Track your deployments

The Bitbucket Deployments dashboard automatically tracks all the deployments you run through Pipelines, giving your team visibility over what’s running in each environment and the status of each deployment. There’s also a complete history of all deployments to each environment, which you can view by clicking the history icon.

Environments in Bitbucket Deployments are configured out of the box as test, staging, and production, and your team can choose to use one or more of these environments as required by your workflow.
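
As a rough sketch (the branch name and deploy scripts are placeholders), steps are mapped to these environments with the deployment keyword in bitbucket-pipelines.yml:

pipelines:
  branches:
    master:
      - step:
          name: Deploy to test
          deployment: test            # one of the built-in environments
          script:
            - ./deploy.sh test        # placeholder deploy script
      - step:
          name: Deploy to staging
          deployment: staging
          script:
            - ./deploy.sh staging
      - step:
          name: Deploy to production
          deployment: production
          script:
            - ./deploy.sh production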

This improved visibility is already helping teams dramatically improve their CD workflows, like the team at Fran’s Chocolates in Seattle, WA:

“Before adopting Bitbucket Pipelines, our delivery process was like the wild west. There was nothing to stop broken code from being deployed or see what had been deployed where. Bitbucket Deployments was the final step in getting transparency and control over our deployments.” – James Sweeney, CTO at Fran’s Chocolates

Preview and promote deployments between environments

For many teams, an important part of their workflow is a manual review of changes that will land as part of their deployment. Bitbucket supports this workflow by allowing you to configure a manual deployment step which will pause the build before doing the deployment. Any build artifacts that are configured by the build will be kept for up to a week so that you can run your deployment step at a later point in time.
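
Here’s a minimal sketch of that setup (the build and deploy scripts are placeholders): the step marked trigger: manual pauses the pipeline until someone chooses to run it, and it reuses the artifacts produced by the earlier build step.

pipelines:
  default:
    - step:
        name: Build
        script:
          - ./build.sh            # placeholder build script
        artifacts:
          - dist/**               # kept for later steps
    - step:
        name: Deploy to staging
        deployment: staging
        trigger: manual           # waits for someone to hit Deploy
        script:
          - ./deploy.sh staging   # placeholder deploy script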

Where a manual deployment is available, Bitbucket Deployments shows a Promote button on the dashboard. This gives your team control over when to promote a build to a new environment, and lets you preview the specific commits and code changes before hitting Deploy.

A complete record of what code was deployed when

By clicking on the history icon, you can see a history of all earlier deployments to an environment. Clicking on any deployment on the dashboard will load a summary showing the changes that were deployed, along with information about when it was deployed and by whom.

With a detailed history of past deployments, investigating problems becomes much faster and simpler. Your team can confirm the cause of an issue by inspecting the code and commits that were deployed, all without leaving the deployment dashboard.

A new design to better support your workflow

Along with many functional improvements, we’ve done a complete visual redesign of the deployment dashboard for this release. The initial state is clean and shows the most relevant status information. A deployment workflow bar across the top clearly communicates the flow of builds through your deployment environments.

For teams with a manual promotion workflow, we’ve highlighted and expanded the promotion buttons to make it easier to perform this common action.

We’ve also revised the environment history list with the aim of speeding up the most frequent interactions. You can now expand deployments in the environment history and quickly see the commits that were included in each. This makes the task of scanning through recent deployments to find the cause of a bug much quicker.

This is just the beginning…

We have a ton more exciting things in the pipeline for Bitbucket Deployments, including deployment concurrency control, rerunning failed deployments, and Jira Software integration. We’d also love to hear your feedback about how you’re using Bitbucket and how we could make Deployments better.

To get involved, follow @Bitbucket on Twitter for updates and shoot us a tweet with how Deployments is working out for you.

Try Bitbucket Deployments

 

Bitbucket code search API is now available

April 5, 2018

Last year we shipped the most requested feature for Bitbucket Cloud – code aware search – and we’re delighted with your feedback and responses.

We’re excited to announce that we’ve published the Bitbucket Cloud code search API, delivering the same love to machines and opening up code search for your needs.

All code search features and more

Our goal for the code search API was simple – to give consumers access to all the search features they already know from the Bitbucket UI. That includes account scopes, the powerful query language, and modifiers. So what would a search for the common class QueryBuilders in the Elasticsearch repo look like?

You can get the same results by making a request to:

https://api.bitbucket.org/2.0/teams/%7B6f461d1e-a3dd-433b-a0e3-7a69daf6ea47%7D/search/code?search_query=repo%3Aelasticsearch%20QueryBuilders

where search_query is just the URL-encoded version of:

repo:elasticsearch QueryBuilders

Match highlighting

We haven’t forgotten about search result highlighting. With the new code search API, consumers get access to that information as well.

Here’s how it’s represented in the response body:

{
  "content_matches": [
    {
      "lines": [
        {
          "line": 39,
          "segments": [
            {
              "text": " */"
            }
          ]
        },
        {
          "line": 40,
          "segments": [
            {
              "text": "public abstract class "
            },
            {
              "text": "QueryBuilders",
              "match": true
            },
            {
              "text": " {"
            }
          ]
        }
      ]
    },
    {
      "lines": [
        {
          "line": 728,
          "segments": []
        },
        {
          "line": 729,
          "segments": [
            {
              "text": "    private "
            },
            {
              "text": "QueryBuilders",
              "match": true
            },
            {
              "text": "() {"
            }
          ]
        }
      ]
    }
  ]
}

For a full list of the capabilities and search query considerations with code search in Bitbucket Cloud, check out our documentation. If you’re ready to use a fast and relevant code search, sign up for a Bitbucket Cloud account, create a repository, and index your code. If you’re already a Bitbucket customer, you can start using code search API today!

What will you build?

If you have feedback about code search or anything else Bitbucket related, hit us up on Twitter – we’d love to hear what you’ve got planned for the API.

Meet Bitbucket Cloud’s new chatbot

April 4, 2018

Bitbucket Cloud’s new chatbot features a wide range of notification types, plenty of interactivity, and some smart configuration features that will supercharge your team’s development workflow. The bot is available today for Slack, and is coming soon to Atlassian Stride and other leading chat platforms.

Smarter, by default

Debates have sparked over the benefits of real-time communication versus its productivity costs. Being constantly inundated with notifications—both automated and human—can take its toll on any team trying to focus. By analyzing the usage of Bitbucket’s existing HipChat, Slack, and Stride integrations, we’ve carefully balanced the default notifications to provide an out-of-the-box experience customized for your team.

Wait, “an out-of-the-box customized experience” – how does that make sense? Well, when you create a chat subscription, Bitbucket performs an automated analysis of your repository and configures notifications tailored to your usage patterns.

By default, your team will be notified about important events like pull request activity and commit comments across the entire repository. However, we try to be a bit smarter about less critical notifications: you’ll be notified about pushes, merges, and builds only for your primary branch—typically master or default, depending on whether you’re using Git or Mercurial.

Bitbucket detects if you’re using a Gitflow branching strategy, and will automatically notify your team when a new feature branch is created. And these are just the defaults! All notifications are fine-grained, configurable, and able to be bound to specific branches or branch patterns. Different notifications from the same repository can be configured to be sent to different channels—for example, you might send pull request events to #dev and builds or deployments to #ops—so you can tailor each repository subscription to suit your team’s needs.

Complete tasks from your channel

Bitbucket’s new chat notifications are more than just informative – you can now perform key Bitbucket tasks without leaving your channel. With a click on the notification, any member of the team (with appropriate permissions) can create a pull request from a newly pushed branch, re-run a failed build pipeline, or reply to a pull request comment.

But our favorite new feature has to be the ability to “nudge” reviewers, to gently remind your teammates that you needed that code reviewed yesterday. You no longer have to passive-aggressively click the approve/unapprove button to ping your team about it.

The notification is updated with the latest status, so the rest of the team stays in the loop with what’s happening with your builds and PRs.

Context, when you need it

Alongside notifications, Bitbucket provides contextual information from your repositories, if and when you need it. If you mention a pull request, Bitbucket will show you a summary of the salient details, highlighting the outstanding work—missing approvals, failed builds, and uncompleted tasks—required to get it merged. This helps ensure your code flows smoothly from development to production. The message buttons also work in Slack’s mobile app—so you can merge the second you get that reviewer approval or green build, even if you’re on the move!

Inline code snippets

Developers often send file references to each other to share code examples, or to indicate the area to make a particular change. If granted permission, Bitbucket will respond to the mention of a source URL by inlining the relevant code into Slack at the specified commit. If you include a line number in the URL, Bitbucket will center the snippet on that line, saving you a trip out of your chat room.

Your personal Bitbucket bot is a click away

Bitbucket Cloud for Slack is available today, coming soon to Stride, and will eventually support other leading chat platforms.

Get started with Slack now:

If you’re not using Stride or Slack, tell us which chat platform you’re using by clicking on the “give feedback” button in Bitbucket and help us prioritize our roadmap for future integrations. Happy chatting!

Take me to Bitbucket

 

Speed up your build with parallel steps in Pipelines

March 27, 2018

When we built Bitbucket Pipelines, one of our goals was to make a tool that developers love. And if there’s one thing developers love, it is getting their builds finished more quickly. Last year, we added dependency caching and detailed timing information to help speed up your builds.

Today, we’re excited to share that parallel steps are now available in Pipelines to speed up your builds even further, allowing you to run groups of tests at the same time and get your testing done faster.

The team at Paper Interactive, one of our early access customers, is already seeing the benefits of this:

“Parallel steps in Pipelines has cut our CI time by two-thirds, saving our developers hours every week.” – Shane Fast, Co-founder and CTO at Paper Interactive

Simple configuration

Configuring parallel steps in Pipelines is simple – just add a set of steps in your bitbucket-pipelines.yml file inside a parallel block. These steps will be started up in parallel by Pipelines so they can run independently and complete faster.

We assume you already know how to split up your tests into batches and run each batch via the command line. How you split them up is up to you – it could be unit vs integration tests, or separating a large number of similar tests into batches of even size. Some test runners can automatically split tests into batches for you, or you can just manually split tests into several test suites to see what performance benefit you will get.

Here’s a full Node.js application pipeline demonstrating an initial build step, a set of parallel testing steps, followed by a deployment step at the end:


image: node:9-alpine
 
pipelines:
  default:
    - step:
        name: Build
        script:
          - npm install
          - npm run build  # assumes a "build" script exists in package.json
        artifacts:
          - dist/**  # copy these files to later steps
    - parallel:
        - step:
            name: Unit tests
            script:
              - npm run test/unit
        - step:
            name: Integration tests
            script:
              - npm run test/integration
        - step:
            name: Browser tests
            script:
              - npm run test/browserstack
    - step:
        name: Deploy to test
        deployment: test
        script:
          - npm run release -- --version $BITBUCKET_BUILD_NUMBER  # "--" passes the flag through to the release script
          - npm run deploy/test

Saving your team time

The steps you configure to run in parallel will kick off at the same time in our auto-scaling build cluster, and will run to completion before the next serial step runs. Parallel steps are primarily designed for large suites of automated tests, but they can also be used for large compute tasks that can be parallelized.

Running steps in parallel gives you feedback faster. This saves valuable developer time that would otherwise be wasted waiting for the build. Pipelines will still bill you for all the minutes needed to execute all your parallel steps, so there’s no change to the cost of the build. (And if you run out of minutes, it’s only $10 to buy 1000 more for the month – crazy cheap compared to the developer time you’ll save.)

We’ve kept our existing limit of 10 steps per pipeline, as our data shows our customers don’t currently need more than this. We also expect running steps in parallel to have decreasing benefits beyond 10 steps, as the fixed overhead to start and stop the build becomes the limiting factor. If you are using Pipelines and find this limit affecting what you can run, please raise an improvement request and we’ll consider increasing it in the future.

To learn more about parallel steps, see our Pipelines configuration guide.

If you have feedback about parallel steps or anything else Bitbucket related, hit us up on Twitter. Happy (parallel) building!

Try Bitbucket free

Git LFS now available in Bitbucket Pipelines

March 20, 2018

Bitbucket Pipelines now has built-in Git LFS support, allowing you to seamlessly build, test and deploy LFS files in your builds!

To enable it, just add lfs: true in the clone section of your bitbucket-pipelines.yml. If you don’t enable this feature, your clone will continue to behave as before.


clone:
  lfs: true
 
pipelines:
  default:
    # ... rest of Pipelines configuration ...

How does it work?

Behind the scenes, Pipelines now always uses an LFS-enabled Git client when it clones your repository, allowing LFS files to be downloaded as part of your Pipelines build. However, we’ve decided to maintain the existing behavior of not cloning LFS files by default, since they can be slow to download and aren’t needed for many types of builds. So where LFS is not specifically enabled, Pipelines passes a parameter to the clone command to prevent it from pulling LFS files.

The LFS clone configuration applies to all pipelines and steps in your build configuration. If you want to pull LFS files only in specific pipelines or steps, you’ll need to use the same workaround as before: include an appropriate Git client in your build image, and pull with SSH authentication to retrieve the files.
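
Here’s a rough sketch of that per-step workaround. It assumes your build image already includes git and git-lfs, that an SSH key with access to the repository is configured for Pipelines, and that build-with-lfs and build.sh are placeholders:

pipelines:
  custom:
    build-with-lfs:
      - step:
          name: Build with LFS files
          script:
            # point origin at the SSH URL so git-lfs can authenticate with the Pipelines SSH key
            - git remote set-url origin git@bitbucket.org:$BITBUCKET_REPO_OWNER/$BITBUCKET_REPO_SLUG.git
            - git lfs install
            - git lfs pull          # fetch LFS objects in this step only
            - ./build.sh            # placeholder build script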

When you try it out, you might be wondering why you don’t see git lfs commands in the Build setup of Pipelines. While researching this feature, we found that git clone is now as performant as the deprecated git lfs clone. This wrapper command previously performed a standard clone and git lfs pull but is no longer recommended.

This should be a great improvement for all our Git LFS and Pipelines users. Please let us know what you think!

Support for large builds in Bitbucket Pipelines

February 20, 2018

There are software teams out there that want to take advantage of Bitbucket Pipelines but haven’t been able to trim down their projects to fit. We have some good news.

Support for large builds

Large builds now allow software teams to take advantage of twice the resources.

To take advantage of double the resources, set the size of the pipeline or step to 2x.

Tell me how to get it


options:
  size: 2x  # all steps in this repo get 8GB memory
 
pipelines:
  default:
    - step:
        size: 2x  # or configure at step level
        script:
          - ...

What’s the cost?

Steps that use double resources will consume twice the number of build minutes from your monthly allocation, effectively costing you twice as much. So we recommend configuring large builds only where you need additional memory or speed.

Our early stats show that large builds have decreased the average build time for customers that use them, due to decreased CPU contention on our hosts, but your results will depend on the specific tasks your build performs.

We’re sure the optional extra memory and CPU allocation will prove extremely useful to the growing number of professional teams that are relying on Bitbucket Pipelines for all their CI/CD needs.

Tell us what you think: @Bitbucket.

Bitbucket Cloud gears up for an amazing 2018

January 25, 2018

2017 was a momentous year for Bitbucket Cloud users. 17 million pull requests merged, 6 million repositories created, and over 10 million Pipelines builds run – it’s clear that individuals and teams alike got stuff done last year! We added a ton of improvements and functionality you requested, and loads of you have already started using features like Pipelines and code aware search too. Here’s a look at 2017 in numbers.

With 2017 in the rearview mirror the Bitbucket Cloud team has its sights set on an even more amazing 2018. While there’ll be some exciting surprises along the way, our aim – as always – is to help teams and individuals alike build better software, faster. Here’s a look at some of the areas we focused on last year that we’ll continue to improve on over the next 12 months.

Greater context and collaboration

Integration with the key tools in your development workflow will be a focus as Bitbucket Cloud becomes the all-in-one place for teams to get started on their projects. We understand that you want the right context on your work at the right time as well, and we’ll continue to make this as easy as possible for individuals and teams in 2018.

Update Jira issues within Bitbucket’s UI

Many Bitbucket Cloud teams use Jira Software or Trello to plan, track, and manage their work. We addressed many of the issues with context switching last year and brought more of your planning into the product, and we’ll continue to do so over the next 12 months.

For Jira Software we leveled up the existing integration by letting you update Jira issues within Bitbucket’s UI. Transitioning, commenting, the works – it’s now possible to do all of that (and more) right inside of Bitbucket.

Embedded Trello boards in Bitbucket

And for teams that use a simple visual project management tool like Trello, we wanted to bring all of your planning and tracking into one place and further simplify your development process. So we brought Trello boards inside of Bitbucket Cloud, turning Bitbucket into a tool that wasn’t just a single source of truth for your code, but a central place for you to collaborate with your entire team too.

In 2018 we aim to build on these integrations and look to further improve how you collaborate with your team and around your code – look out for deep integrations that offer chat services, security, monitoring, and more. And we’ve listened to all your valuable feedback and are looking to revamp the code review process, arguably an integral part of your experience in Bitbucket Cloud today.

Continuous integration and delivery

CI/CD continues to play a crucial role in the software development process and is an important part of the Bitbucket experience. Since its inception a little over a year ago, Bitbucket Pipelines has grown into a vital part of your development workflow. With over 10.7 million builds run, along with many of your requests and suggestions, 2017 was a year of progress for the team as we shipped 6 of your top 10 most requested improvements.

Bitbucket Deployments

2018 will be no different, and you can look forward to more improvements for Pipelines and Bitbucket Deployments. Tight integration with Jira Software, tracking microservice deployments across your environments, improved monitoring and insights into delivery speed – these are a sample of features to look forward to as we make Bitbucket Cloud your single source of truth to manage and track your code from development through code review, build, test, and deployment – all the way to production.

Want to know more about our plans for Bitbucket Deployments this year? Read this blog to find out more!

Code aware search

You asked and we delivered in 2017 – we were excited to announce the introduction of code aware search in Bitbucket Cloud, the top voted feature request with over 2600 votes! We went one step further than simply indexing your code though, implementing semantic search that prioritizes definitions matching your search term over usages and variable names, resulting in higher quality search results.

Search will continue to play an important part of the Bitbucket Cloud experience, and we’ll continue to improve the user experience and search result quality based on your feedback.

Looking ahead

The above is just a small taste of the new features at your fingertips, and what you can look forward to in 2018. We can’t wait to show how we’ll be improving how you deliver and collaborate around your code over the next 12 months.

Here’s to a great 2018!

Try Bitbucket free

 

Bitbucket Deployments and the future of continuous delivery

January 24, 2018

What you’ve seen over the past few weeks with Deployments and Pipelines in Bitbucket Cloud is just the beginning of our mission to help every software team adopt continuous delivery – and we’ve got some exciting ideas in store for how to improve the product. If you have feedback about how to improve Bitbucket Pipelines and Deployments, please share with us here. We’d love to hear from you. 

Jira integration for shared visibility

It isn’t enough to just know which commits are in each deployment; teams want a higher-level overview and deployment status information directly in their tracking tool. Bitbucket Deployments will appear in your Jira issues, and will also support workflow automation so you can progress issues automatically as deployments go out.

Jira Pipelines integration

Insights into delivery speed

Teams want to know whether their investment in continuous delivery is paying off. Soon, you’ll be able to see how often code is deployed to each environment. We’ll use the historical deployment data tracked in Bitbucket to provide you with figures and charts so your team can set goals and track their release cadence.

Microservices deployment tracking

For teams that run multiple services, you’ll be able to track deployments across all the repositories and services that make up your application, see which version is deployed to each environment, and get delivery speed insights across those services too.

Microservices tracking

Monitoring integrations

As your team is deploying to production, imagine seeing both the deployment logs and system metrics live and side-by-side as your deployment goes out. We already have great integrations with vendors like Raygun and Rollbar for production monitoring, but want to take these to the next level for teams doing frequent deployments.

Ready to try Bitbucket Deployments?

Learn more and let us know your thoughts.

Learn more

If you missed our original blog announcements for Bitbucket Deployments, you can find them on the Bitbucket blog.

Follow us on Twitter to be notified of the next updates from Bitbucket. Thanks!

Pipelines stocking stuffers: test reporting, Docker run and large builds

December 20, 2017

We’ve recently added a few tasty new features to Bitbucket Pipelines that we wanted to share with you before the holiday break. Here’s what Bitbucket Santa has put under the tree for our good Pipelines customers this December. (Naughty customers should close their browser window now!)

Test reporting with zero configuration

We’ve added test reporting to Bitbucket Pipelines, a highly requested feature from our customers. If you’re already generating test reports as part of your build process, you should already be seeing them picked up by Pipelines. They’ll appear in your build results, and soon in notifications too.

Zero-config test reporting in Bitbucket Pipelines

As well as making your builds blazingly fast, our goal with Pipelines is to take the pain out of configuring CI/CD builds. To that end, we are always looking for ways to reduce configuration options that bloat and complicate things.

All the other build tools we’ve seen require you to configure exactly where your build stores test reports. Whenever you add a new module to your project, you might need to go back and update this configuration, or remember to set it when you configure a new project.

Unlike these other tools, test reporting in Pipelines requires zero configuration. We automatically scan your build directory up to 3 levels deep for directories matching “test-reports”, “test-results” or similar, then parse the JUnit-style XML files we find there.

This common format is supported and automatically generated by many tools (we’ve seen almost 10% of Pipelines builds pick up test reports automatically), and for tools that don’t automatically generate reports, we have instructions in our documentation.
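
As an illustration, here’s a sketch of a step whose test runner writes JUnit-style XML into a test-results directory, which Pipelines then picks up with no reporting configuration at all (the Python image and pytest invocation are just one example of a tool that emits this format):

image: python:3.6

pipelines:
  default:
    - step:
        name: Run tests
        script:
          - pip install pytest
          # --junitxml writes a JUnit-style XML report into test-results/,
          # one of the directory names Pipelines scans for automatically
          - pytest --junitxml=test-results/junit.xml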

A few improvements to test reporting are still in progress, like showing steps with successful tests, but we’re happy to make our first iteration available to you now.

Run Docker containers and docker-compose files

We’re also excited to announce that Pipelines now offers complete hosted Docker support, allowing you to build, run and test your Docker-based services in any configuration that doesn’t require privileged mode on the host. This includes using docker-compose to start a set of microservices up for testing on Pipelines.
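
As a rough sketch (it assumes a docker-compose.yml at the repository root, a placeholder test script, and a build image that already includes docker-compose), a step that spins up services for testing might look like this:

pipelines:
  default:
    - step:
        name: Integration tests
        services:
          - docker                      # enables the Docker daemon for this step
        script:
          - docker version              # sanity check that Docker is available
          - docker-compose up -d        # start the services defined in docker-compose.yml
          - ./run-integration-tests.sh  # placeholder test script
          - docker-compose down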

Back in April 2017, we introduced the ability to build, tag and push Docker images from Pipelines up to your preferred registry. However, due to the security model of the Docker daemons in our shared infrastructure, we couldn’t offer the ability to run Docker containers and related tasks.

Thanks to some innovative work by our engineering team, we’ve extended our Docker authentication plugin to support all Docker commands while still preventing privileged commands from being run.

Because we now allow starting arbitrary containers via docker run, we also now enforce a 1GB memory limit on Docker usage in Pipelines, which was previously unlimited. You can read more about this change and how it might affect you in our infrastructure changes documentation.

Large builds: double resources for double cost

Ever since we launched Pipelines, we’ve had demand to support ever larger builds. We started offering 2GB of memory by default, and increased that to 4GB shortly after launch. Allowing builds with even more memory is a common request.

But one does not simply increase the resources allocated to customers on a hosted build service. Our default allocation factors in our hosting costs, typical customer needs, and maintaining a competitive price. To offer configuration options for larger builds, our scheduling and auto-scaling systems have to handle builds of varying sizes while continuing to keep queue times to an absolute minimum. Fortunately, the latter is what we’ve been able to achieve.

I’m pleased to announce that Pipelines now offers 8GB of memory, and similarly doubled resources (CPU, network, etc), as a new option for customers. Simply configure size: 2x in your Pipelines YAML file, either globally or on a specific step, to take advantage of this.

Steps that use double resources will consume twice the number of build minutes from your monthly allocation, effectively costing you twice as much. So we recommend configuring large builds only where you need additional memory or speed.


options:
  size: 2x  # all steps in this repo get 8GB memory
 
pipelines:
  default:
    - step:
        size: 2x  # or configure at step level
        script:
          - ...

Our early stats show that large builds have decreased the average build time for customers that use them, due to decreased CPU contention on our hosts, but your results will depend on the specific tasks your build performs.

We’re sure the optional extra memory and CPU allocation will prove extremely useful to the growing number of professional teams that are relying on Bitbucket Pipelines for all their CI/CD needs.

Happy holidays!

We hope you enjoy these great new additions to Bitbucket, and thanks to the thousands of new customers that joined us this year.
Please let us know how Bitbucket and/or Pipelines is helping you and your team to build great software. We’re always excited to hear your stories.

Bitbucket Deployments: flexibility meets CD best practices

December 13, 2017

We believe development tools are more powerful when they provide flexibility but also steer teams towards the practices that help them succeed. In the design of Bitbucket Deployments in Bitbucket Cloud, we’re using our experience working with thousands of customers to recommend and reinforce the practices we see working for teams in the trenches.

Designed to fit your workflow

Our first goal was to support a wide variety of deployment workflows, and to provide visibility and confidence across all of them.

Regardless of how your team does deployments, we’ve designed Bitbucket Deployments to support your tracking model.

Multi-cloud support: AWS, GCP, Azure, and more

Support for multiple cloud vendors is also part of the Bitbucket ethos, and that includes multi-cloud support with our deployments features.

Bitbucket has deployment integrations with all the major cloud hosting platforms, whether that’s Amazon AWS, Microsoft Azure, Google Cloud Platform, or a Kubernetes cluster. By following these easy guides, you can start deploying today with Bitbucket Pipelines and tracking it through Deployments.
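
As one illustration (the S3 bucket, the dist/ build output, and the awscli-based deploy are assumptions, with AWS credentials and region stored as repository variables), a Pipelines step that deploys and shows up on the Deployments dashboard might look like this:

image: python:3.6

pipelines:
  branches:
    master:
      - step:
          name: Deploy to production
          deployment: production          # tracked on the Deployments dashboard
          script:
            - pip install awscli
            # the AWS CLI reads AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY and
            # AWS_DEFAULT_REGION from repository environment variables
            - aws s3 sync dist/ s3://$PRODUCTION_BUCKET --delete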

Built-in best practices for CD teams

While we provide a lot of flexibility for teams using Bitbucket Deployments, we also want to have some guardrails that guide teams towards best practices in continuous delivery.

First, we’ve found the “deployment pipeline” approach of deploying the same code (and ideally same build artifacts) to each environment in order dramatically improves the speed and confidence of teams in their release process. Bitbucket Deployments is designed around this idea of helping you progress quickly through each step in your defined release process, rather than jumping around haphazardly.

Second, the use of a “staging” environment is a key part of how many of the best teams work. It is used to validate once-off release changes like database schema changes or data migrations against replicated production data, or as a checkpoint for verification of single service changes in a multi-service environment. We want to encourage teams using push-button deployments to adopt a process that includes a staging environment.

Lastly, we’ve pre-configured the environment names in a way that we believe works for the vast majority of teams, who have test (or UAT/QA), staging (or pre-prod), and production (or live) environments. We want to encourage teams to use one, two, or three environments in this pattern. Running different code in more than three shared environments tends to be an anti-pattern, leading to developer confusion and an unclear release pipeline from code to production.

You can read more about our approach and best practice recommendations to deployments in our Bitbucket Deployments Recommendations.

Up next in the deployments journey…

If you’ve been following along with the blogs or on Twitter, you’d know that we’ve introduced the new Bitbucket Deployments, dug into some of the features, and now explained how we combine flexibility and best practices.

Next time we’ll look at some of the customers who have adopted Bitbucket Deployments and see what benefits they’re getting from the features already. Follow us on Twitter to be notified when the next post is out.