Git Large File Storage (Git LFS) now in Bitbucket Cloud

July 18, 2016

In recent years, software teams across all industries have adopted Git thanks to its raw speed, distributed nature, and powerful workflows. Modern software teams are also increasingly cross-functional, made up not only of developers but also designers, QA engineers, tech writers, and more. To be successful, these teams need to collaborate not just on raw source code but also on rich media and large data files.

It’s no secret, though, that Git doesn’t handle large files very well, and they quickly bloat your repositories. We are therefore excited to announce that, following Bitbucket Server’s lead earlier this year, Git LFS is now available in beta for Bitbucket Cloud to improve the handling of your large assets. So even if your files are really, really large, Bitbucket Cloud allows your team to efficiently store, version, and collaborate on them.

Why should you care about Git LFS?

Git was optimized for source code – it merges and compresses easily and is relatively small, so it is perfectly feasible to store the full history everywhere. That same design makes Git inefficient and slow when tracking large binary files. For example, if your designer stores a 100 MB image in your Git repository and modifies it nine times, your repository could bloat to almost 1 GB in size, since binary deltas often compress poorly. Every developer, build agent, and deploy script cloning that repository would then have to download the full 1 GB history of changes, which can lead to drastically longer clone times. Just imagine what would happen if your designer made 99 changes to that file.

A common solution to this inherent flaw in Git is to track these large files outside of Git, in local storage systems or with cloud storage providers, but that introduces a whole new set of problems. Separating your large files from your repository requires your team to manually sync and communicate every change just to keep the code working.

With the addition of Git LFS support, you can say goodbye to all these problems and track all your files in one place in Bitbucket Cloud. Instead of bloating your Git repository, large files are kept in parallel storage and only lightweight references are stored in the repository, making it smaller and faster. The next time your team clones a repository with files stored in Git LFS, only the references and the large files that are part of your checked-out revision get downloaded, not the entire change history.
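Getting started takes just a couple of commands once the Git LFS client is installed – roughly like this (the *.psd pattern and the file name below are only examples; track whichever file types suit your project):

# one-time setup: install the Git LFS hooks and filters
git lfs install

# tell Git LFS which file types to manage (the pattern is just an example)
git lfs track "*.psd"

# the tracked patterns live in .gitattributes – commit it along with your large files
git add .gitattributes design/hero.psd
git commit -m "Track Photoshop files with Git LFS"
git push origin master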

For those interested in a longer explanation of how Git LFS works and how to migrate your existing repository, watch this presentation by Tim Pettersen, Atlassian Developer Advocate, on Tracking huge files with Git LFS.

When is Git LFS right for you?

Generally, if you want to use Git to easily version your large files, Git LFS is the right choice. To call out just a few cases in which Git LFS will make your life easier, here’s a short list:

  • Game developers working with large textures, 3D, audio and video files
  • Mobile developers catering for higher and higher display resolutions
  • Web developers building pages with rich media
  • Software teams handling checked-in dependencies
  • Researchers working with huge data sets
  • Multimedia producers and designers
  • QA engineers using database snapshots for functional tests
  • Technical evangelists who store their presentation slides in Git

Git LFS support is built right into SourceTree

If you’re like us and use SourceTree, tracking files with Git LFS is one click away. Simply right-click any file type you’d like to track and select the “Track in Git LFS” option. SourceTree will do the rest for you.

What you will get with the Git LFS Beta

  • Track everything in one place – Track your large assets alongside your source code and stop worrying about manually versioning them in a separate storage system.
  • Store, version and share large files – Version all your large files with Git, no matter how large they are. With Git LFS you can store huge audio files, graphics, videos or any other binaries efficiently. During the beta period, you get 1 GB of storage for up to 5 users. If you are on a paid plan, your storage is based on your user tier.
  • Free up your repository – Store more in your Git repository. Git LFS stores your large files externally and keeps your actual Git repository lightweight.
  • Clone and fetch faster – Only download what you need. Git LFS will only fetch those large files that you check out and not the entire history of changes.
  • Just do git – No need to learn anything new. Git LFS integrates seamlessly into your existing Git workflow and doesn’t require you to learn new commands or use new tools.

Git going right away

If you’re ready to get started, sign up for a Bitbucket Cloud account. If you’re already using Bitbucket Cloud, enable Git LFS with one click on any repository you’d like to try it on by heading to the “Git LFS Beta” section in the left sidebar.

Sign up for Bitbucket

Universal 2nd Factor (U2F) now supported in Bitbucket Cloud

June 22, 2016

Last week, we released support for FIDO Universal 2nd Factor (U2F) in Bitbucket Cloud. FIDO U2F is an emerging standard for two-step verification that uses a physical USB key to digitally sign a challenge from a trusted website. The standard is designed to let small USB tokens, mobile phones, and other devices act as a secure second factor for 2FA without the overhead of installing drivers or client-side software.

What does this mean for you?
You may have heard about some high-profile breaches and the subsequent unauthorized publication of stolen user credentials in the past few weeks. Two-step verification on your Bitbucket Cloud account ensures that your data stays protected even if someone else gets hold of your password.

With U2F, instead of having to enter a TOTP (Time-based One-time Password) every time you want to log in to Bitbucket Cloud, you can simply press a button on a small USB device plugged into your computer. You are also less vulnerable to phishing attacks since security keys will only sign challenges that match the proper domain for the website.

Visit two-step verification settings to add your key. If you do not already have two-step verification enabled, you’ll need to enable it before you can use your U2F key with Bitbucket Cloud.

Special Yubikey promotion for Bitbucket users
You’ll need to purchase a security key that supports U2F in order to take advantage of this feature. We’re collaborating with Yubico, co-creator of the U2F protocol, to offer a limited-time discount: Bitbucket teams can purchase up to 10 keys at 25% off (while supplies last). You can find more information about the offer here.

We are proud to be among the first few websites to support this standard. “We applaud Atlassian for their support for the FIDO U2F protocol, by introducing this forward thinking strong public key cryptography two-factor authentication option to their user base,” said Jerrod Chong, VP Solutions Engineering, Yubico. Earning and keeping your trust is part of our customer commitment. Learn more about 2FA and U2F.

Atlassian account is coming to Bitbucket Cloud

June 14, 2016

Forget about having to remember multiple passwords and say hello to a better Atlassian experience in the cloud. Bitbucket users are now being upgraded to Atlassian account. The integration of Atlassian account with all Atlassian Cloud services is currently in progress, and when it is complete, you will be able to log in to Bitbucket, JIRA, Confluence, and HipChat with one Atlassian account. In addition, you can now use your Atlassian account to log in to SourceTree and our support and billing systems.

The upgrade process will ask you to verify the email address that we have on file. This should be quick and painless, and then it’s back to Bitbucket business as usual!

Already have an Atlassian account? No worries, we’ll join your Bitbucket account to your existing Atlassian account. And if you want to change the email address associated with your Bitbucket account, click “Try another email” during the migration process. If you use Google authentication to log in, you can keep doing so, provided it matches your Atlassian account email address.

Want to learn more about Atlassian account? Click here.

Git 2.9 is out!

June 13, 2016

Just a couple of months after version 2.8, Git 2.9 has shipped with a huge collection of usability improvements and new command options. Here are the new features that we’re particularly excited about on the Bitbucket team.

git rebase can now exec without going interactive

Many developers use a rebasing workflow to keep a clean commit history. At Atlassian, we usually perform an explicit merge to get our feature branches onto master, to keep a record of where each feature was developed. However, we do avoid “spurious” merge commits by rebasing when pulling changes to our current branch from the upstream server (git pull --rebase) and occasionally to bring our feature branches up to date with master (git rebase master). If you’re into rebasing, you’re probably aware that each time you rebase, you’re essentially rewriting history by applying each of your new commits on top of the specified base. Depending on the nature of the changes from the upstream branch, you may encounter test failures or even compilation problems for certain commits in your newly created history. If these changes cause merge conflicts, the rebase process will pause and allow you to resolve them. But changes that merge cleanly may still break compilation or tests, leaving broken commits littering your history.

However, you can instruct Git to run your project’s test suite for each rewritten commit. Prior to Git 2.9 you could do this with a combination of git rebase --interactive and the exec command. For example:

git rebase master --interactive --exec="npm test"

would generate an interactive rebase plan which invokes npm test after rewriting each commit, ensuring that your tests still pass:

pick 2fde787 ACE-1294: replaced miniamalCommit with string in test
exec npm test
pick ed93626 ACE-1294: removed pull request service from test
exec npm test
pick b02eb9a ACE-1294: moved fromHash, toHash and diffType to batch
exec npm test
pick e68f710 ACE-1294: added testing data to batch email file
exec npm test

# Rebase f32fa9d..0ddde5f onto f32fa9d (20 command(s))

In the event that a test fails, rebase will pause to let you fix the tests (and apply your changes to that commit):

291 passing
1 failing

1) Host request “after all” hook:
Uncaught Error: connect ECONNRESET 127.0.0.1:3001

npm ERR! Test failed.
Execution failed: npm test
You can fix the problem, and then run

        git rebase --continue

This is handy, but needing to do an interactive rebase is a bit clunky. As of Git 2.9, you can perform a non-interactive rebase with exec:

git rebase master -x "npm test"

Just replace npm test with make, rake, mvn clean install, or whatever you use to build and test your project.

git diff and git log now detect renamed files by default

Git doesn’t explicitly store the fact that files have been renamed. For example, if I renamed index.js to app.js and then ran a simple git diff, I’d get back what looks like a file deletion and addition:

diff --git a/app.js b/app.js
new file mode 100644
index 0000000..144ec7f
--- /dev/null
+++ b/app.js
@@ -0,0 +1 @@
+module.exports = require('./lib/index');
diff --git a/index.js b/index.js
deleted file mode 100644
index 144ec7f..0000000
--- a/index.js
+++ /dev/null
@@ -1 +0,0 @@
-module.exports = require('./lib/index');

A rename is technically just an add and a delete, but this isn’t the most human-friendly way to show it. Instead, you can use the -M flag to instruct Git to attempt to detect renamed files on the fly when computing a diff. For the above example, git diff -M gives us:

diff --git a/index.js b/app.js
similarity index 100%
rename from index.js
rename to app.js

What does -M stand for? renaMes? Who cares! As of Git 2.9, the git diff and git log commands will both detect renames by default, unless you explicitly pass the --no-renames flag.

git clone learned --shallow-submodules

If you’re using submodules, you’re in luck – for once! 🙂 Git 2.9 introduces the --shallow-submodules flag, which lets you grab a full clone of your repository and then recursively shallow clone any referenced submodules to a depth of one commit. This is useful if you don’t need the full history of your project’s dependencies. For example, if you have a large monorepo with each project stored as a submodule, you may want to clone with shallow submodules initially and then selectively deepen the few projects you want to work with. Another scenario is configuring a CI or CD job that needs to perform a merge: Git needs the primary repository’s history in order to perform its recursive merge algorithm, and you’ll likely also need the latest commit from each of your submodules in order to actually perform the build. However, you probably don’t need the full history for every submodule, so retrieving just the latest commit will save you both time and bandwidth.
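For example, a clone of a large monorepo with shallow submodules might look like this – the repository URL and submodule path are placeholders – and you can deepen an individual submodule later if you end up needing its full history:

# full clone of the primary repository, one-commit-deep clones of its submodules
git clone --recursive --shallow-submodules https://bitbucket.org/acme/monorepo.git
cd monorepo

# later, fetch the complete history for just the submodule you're working on
git -C projects/widgets fetch --unshallow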

Overriding .git/hooks with core.hooksPath

Git’s comprehensive hook system is a powerful way to tap into the lifecycle of various Git commands. If you haven’t played with hooks yet, take a quick look at the .git/hooks directory in any Git repository. Git automatically populates it with a set of sample hook scripts:

$ ls .git/hooks

applypatch-msg.sample pre-applypatch.sample pre-rebase.sample
commit-msg.sample pre-commit.sample prepare-commit-msg.sample
post-update.sample pre-push.sample update.sample

Git hooks are useful for all sorts of things – enforcing commit message conventions, running tests before a push, and more. New in Git 2.9, the core.hooksPath configuration setting lets you point Git at a directory of hooks outside the repository, so a team can share a single set of hooks across every clone instead of copying scripts into each .git/hooks directory.
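A quick sketch of how that might look (the paths here are only examples):

# share one hooks directory across all of your repositories
git config --global core.hooksPath ~/.githooks

# or override the hooks location for a single repository
git config core.hooksPath /path/to/team-hooks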

App passwords are here in Bitbucket Cloud

June 6, 2016

Keeping your code secure is crucial. That’s why last year we added support for 2-factor authentication. However, with 2FA enabled, third-party applications that only understand a username and password can no longer access your Bitbucket repositories. Today, we’re excited to announce application-specific passwords (a.k.a. app passwords), which give you that access back. App passwords let applications access Bitbucket’s API via HTTPS when 2-factor authentication is enabled on your Bitbucket account.

For example, you can use an app password in SourceTree to get full desktop access to your repositories when you have 2FA enabled. From the command line, you can make API calls with the app password instead of the account password, like:

curl --user bitbucket_user:app_password https://api.bitbucket.org/1.0/user/repositories
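The same app password can also be used for Git itself over HTTPS – for example (the username, password, and repository below are placeholders):

# clone over HTTPS, supplying the app password in place of your account password
git clone https://bitbucket_user:app_password@bitbucket.org/bitbucket_user/my-repo.git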

Granular scopes
You can set the scope of app passwords when you create them and give each application exactly the access it needs. We show you the last time each app password was used to access Bitbucket, and if you want, you can easily revoke an app password if something changes.

You can create and manage app passwords in the Access Management section of your account settings, or check out the docs to learn more.

Happy (and secure) coding!

Bitbucket Pipelines Beta: continuous delivery inside Bitbucket

May 24, 2016

Software has changed the world faster than almost any other industrial innovation, and it’s only picking up speed. Companies are moving from infrequent, large code deployments to frequent, small, and agile deployments. This trend is having a huge impact on current software development processes. For example, in one of our most recent customer surveys, more than 65% of software teams said they are practicing some form of continuous delivery. It’s becoming the norm for software teams.

But implementing continuous delivery is not easy. Setting up build agents is complicated. Developers have to constantly juggle between different tools. And most of the time, the build is sitting in a queue, or you’re burying yourself in log files digging for information about failures.

Bitbucket Pipelines
That is, until now. Bitbucket Cloud is introducing Pipelines to let your team build, test, and deploy from Bitbucket. It is built right into Bitbucket, giving you end-to-end visibility from coding to deployment. With Bitbucket Pipelines there’s no CI server to set up, no user management to configure, and no repositories to synchronize. Just enable it in one click and you’re ready to go.
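Builds are described by a bitbucket-pipelines.yml file committed to the root of your repository. A minimal sketch for a Node.js project might look like the following – the Docker image and the script commands are only examples, and the exact options available may differ in your setup:

# bitbucket-pipelines.yml (minimal sketch; image and commands are assumptions)
image: node:4
pipelines:
  default:
    - step:
        script:
          - npm install
          - npm test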

Pipelines is also a great fit for branching workflows like git-flow. Anyone on your team can adapt the build configuration to match the structure of your branches. Here is a quick look at some of the salient features of Bitbucket Pipelines:

Bring your own services to Bitbucket Pipelines
We worked very closely with some of the leaders in the industry so you can bring your own services to Bitbucket Pipelines, right out of the box. Whether you want to deploy, test, monitor code quality, or store artifacts, complete any workflow with the tool of your choice: Amazon Web Services, Ansible, bitHound, BrowserStack, buddybuild, Code Climate, JFrog, Microsoft Azure, npm, SauceLabs, Sentry, Sonatype, and TestFairy.

“Pipelines provided us with the perfect opportunity to bring the power of automated code quality analysis to Bitbucket users. We’re excited about the awesome potential of Pipelines and they’re only just getting started!” – Michael Bernstein, VP of Community, Code Climate.

Check out our integrations page for more details.

Continuous Delivery options from Atlassian
We believe that the best way to provide our customers with a top-notch cloud CD solution is to build the service natively within Bitbucket Cloud. That’s why we built Bitbucket Pipelines and also why today, we’re announcing the end-of-life for Bamboo Cloud, which will be discontinued starting Jan 31, 2017. While Bamboo Cloud has helped many customers to adopt CD, we realized that we would not be able to deliver the experience and the quality of service that our customers need. If you’re a Bamboo Cloud customer, click here to learn more about the migration options.

If you want to build and ship behind the firewall, we’re still heavily investing in Bamboo Server as an on-prem CD solution.

Get early access to Bitbucket Pipelines
With Bitbucket Pipelines we want to empower every team to accelerate their releases. No more time wasted on setup and maintenance, just focus on the work you love. You can sign up for the Bitbucket Pipelines beta program today and request early access.

Sign up for the beta program

Pin a comment in issues for Bitbucket Cloud

May 11, 2016

Every few weeks, we set aside a week for what we like to call “Innovation Week”. During that time, our developers can work on whatever they like – feature enhancements, new features, bug fixes, performance improvements, etc. Today, we’re releasing one of the projects from our most recent Innovation Week: the ability to pin a comment in issues as an “Official response”:

Repository administrators can now pin their comment to the top of an issue’s comments section. It will be highlighted as an “Official response” right below the issue’s description and can be used for communicating a resolution status, highlighting open questions, or providing additional context and clarification. It will be particularly useful for large open-source repositories or repositories owned by organizations with large user bases.

This feature is available now for all users. Happy pinning!

Build webhook integrations faster with new troubleshooting tools

May 5, 2016

Last year, we released a set of new webhooks to make integration with Bitbucket Cloud easier and more powerful. The response has been overwhelming. We recently added new troubleshooting tools to better support all of you developing integrations using Bitbucket webhooks.

Request details page
We’ve added a “View details” link next to each webhook request on the log table. The new “Request details” page includes a lot of useful information about the webhook request and response, including headers, bodies, and other specifics that will help you debug.

Resend request button
While developing or testing an integration, sometimes you need to trigger the same webhook multiple times, making code changes between each request. This can be very time-consuming and slow down your development. To alleviate this, we’ve added a “Resend request” button to the “Request details” page. Now you can resend the original request with a single click – no need to go through the steps to trigger the hook again. And if a request has been resent several times, each attempt is split into its own tab for easy reference.

Also, the request list view now displays an icon to indicate when a request has been attempted multiple times:

We hope these improvements will enhance your webhook development experience! If you’re new to Bitbucket webhooks, you can configure them in your repository settings. Are you inspired to build new, exciting integrations? Clone our webhook-listener demo project to quickly get started with Bitbucket webhooks.
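If you prefer to script things, webhooks can also be created through the 2.0 REST API instead of the settings UI. A rough sketch – the account, repository, and listener URL below are placeholders:

# register a webhook that fires on every push (owner, repo, and URL are placeholders)
curl -X POST -u bitbucket_user:app_password \
  -H "Content-Type: application/json" \
  -d '{"description": "Test listener", "url": "https://example.com/webhook", "active": true, "events": ["repo:push"]}' \
  https://api.bitbucket.org/2.0/repositories/bitbucket_user/my-repo/hooks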

Renewing our certificate

May 4, 2016

It’s almost time to replace our main bitbucket.org certificate, so we’re planning to do that at 00:00 UTC on May 6, 2016.

Most users shouldn’t notice any difference. However, if you’re using Mercurial over HTTPS, then you might see this error message after the new certificate is installed:

abort: certificate for bitbucket.org has unexpected fingerprint
3f:d3:c5:17:23:3c:cd:f5:2d:17:76:06:93:7e:ee:97:42:21:14:aa
(check hostfingerprint configuration)

If you do see that error, you’ll need to update the host fingerprint for bitbucket.org in your ~/.hgrc or Mercurial.ini configuration file:

[hostfingerprints]
bitbucket.org = 3f:d3:c5:17:23:3c:cd:f5:2d:17:76:06:93:7e:ee:97:42:21:14:aa
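If you’d like to verify the new fingerprint yourself rather than copying it from this post, one way to do so (assuming the OpenSSL command-line tools are installed) is:

# print the SHA-1 fingerprint of the certificate currently served by bitbucket.org
openssl s_client -connect bitbucket.org:443 -servername bitbucket.org < /dev/null 2>/dev/null \
  | openssl x509 -noout -fingerprint -sha1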

Happy branching!

Got questions? We’ve got answers. Git, agile and JIRA/Bitbucket answers to be exact.

May 3, 2016

Are you a JIRA Software or Bitbucket user? How about an agile fan or Git enthusiast? We’re looking at you. To honor one of Atlassian’s core values, Don’t *%# the customer, we want you to use us as a resource. Today, Atlassian’s product managers and developer advocates are uniting to give your team a leg up on any JIRA Software/Bitbucket, Git, or agile questions you may have.

How, you ask?

Here’s the deal: today we’ll be answering questions on Twitter from 12pm to 7pm PST, getting to as many questions as we can, as fast as we can, until we sign off. Here’s how to join in the fun:

  1. Ask a question on Twitter using the hashtag #AskJIRABB
  2. Be sure to follow both @JIRA and @Bitbucket to see your answer and follow along with the conversation
  3. It’s as simple as that

To get the conversation started, here are some example questions we can help you with:

Get the idea? Awesome. Gather your questions, your dev team, and join the conversation.